In this interview, Jim McKeeth, the lead developer evangelist of Embarcadero Technologies, talks about the current and future states of wearables. He explains how thought input is changing the way we control different devices, as well as what excites him most about wearables.
Josiah Renaudin: Today I’m joined by Jim McKeeth, the lead developer evangelist of Embarcadero Technologies. Jim, thank you very much for joining us.
Jim McKeeth: Oh, it’s my pleasure.
Josiah Renaudin: First, could you tell us just a bit about your experience in the industry?
Jim McKeeth: I’ve been a software developer for twenty-plus years, developing on different platforms with different tools and languages, all sorts of different things, for quite a while. As a kid, I remember my parents got me a Commodore 64 and I just … As soon as I got playing with BASIC programming, I knew that’s what I wanted to do. I wanted to be a software developer. I wanted to program computers. That’s what I’ve done ever since then. There may have been other jobs, but always on the side, I would be tinkering with the computer and software development and figuring out how to do things.
Josiah Renaudin: Absolutely. Actually, Commodore 64, I mean, it was a little bit before my time, but that is something I did play around with. I had a Nintendo originally, and a Commodore 64 is something I played around with when I was a kid, so I can absolutely relate to that.
Now, you have a keynote at the upcoming Mobile Dev + Test conference. It’s titled, “Thought: The Future of Mobile and Embedded Application Input.” Can you give a brief explanation of how you came up with this specific topic for the discussion?
Jim McKeeth: I’ve been with Embarcadero Technologies for about a year and a half. Part of my job as a developer evangelist is to teach and inspire developers to go and try new things and accomplish cool stuff, and my manager said, “A great way to do that would be to come up with some cool hardware, cool gadgets and technology, and figure out ways to do demos around that.” I was like, “Oh, yeah. That sounds like a great idea. I love it.” I had mentioned previously to a friend of mine that I was looking at this Emotiv EPOC headset, which is an EEG headset that you wear on your head; it picks up the electrical pulses from your brain and translates them into signals that software can then respond to.
I was like, “Well, that could be a great one,” so I grabbed one of those, and I got a Parrot AR.Drone. I originally just figured out how to use each of them individually. That was cool. Then I was like, “You know what would be really, really cool is if you could actually use the headset, the Emotiv EPOC, to fly the Parrot AR.Drone.” I spent a few weekends on that, figuring out how to make those two work together, and it came out really interesting. Then, in the process of all this, I’ve done a lot of research into where is this at in the industry? Where are we going with the whole idea of brain computer interfaces? How can we read the brain? What’s the science behind this? Then, also, what are the potentials for writing to the brain?
I built the session around the state of the technology, where we’re at, where things are going, and then explaining the specifics of this implementation and then demonstrating that. It never fails. I’ve done this presentation around the world. I’ve been to Brazil and Australia, and all across the United States, and it’s always just an incredible presentation. People love it.
Josiah Renaudin: Yeah, and a lot of your presentation has to do with thought input, and that’s something you were just mentioning was controlling different pieces of technology with your brain.
Before we move forward, can you just give your definition of what thought input is, before we get into the deeper questions?
Jim McKeeth: The Emotiv EPOC, and there are actually other systems as well, uses what’s called EEG, or electroencephalography. Basically, the idea is that the neurons in your brain, when they fire, produce slight fluctuations in voltage. You could actually take a multimeter set to its most sensitive DC voltage range, hold the electrodes to your scalp, and see the numbers change. That’s the basic physicality of how it works: the fluctuations in voltage.
The software then takes that and analyzes it, based on the positions of multiple electrodes across your scalp, to observe patterns in your brain. You have to start by getting a baseline, and then you find the differences between different thought patterns. You train it to say, “This thought pattern means this.” Then when you repeat that thought pattern, it recognizes it again and triggers the event in the software that, in this case, causes the Parrot AR.Drone to maybe go up or rotate to the left or land, or something along those lines.
Thought input is, literally, recognizing specific thought patterns in the brain and triggering events in software and computerized systems.
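The “recognized pattern triggers an event” loop Jim describes can be sketched in a few lines of Python. This is only an illustration of the idea, not the Emotiv or Parrot SDKs: the pattern labels, the `Drone` class, and its methods are all hypothetical stand-ins for whatever the real training software and drone API provide.

```python
# Sketch of thought input as an event loop: each trained thought-pattern
# label maps to a software command. The Drone class and the labels
# ("push", "turn", "rest") are hypothetical, not the real AR.Drone API.

class Drone:
    """Stand-in for a drone control interface."""
    def __init__(self):
        self.altitude = 0
        self.heading = 0

    def ascend(self):
        self.altitude += 1

    def rotate_left(self):
        self.heading = (self.heading - 90) % 360

    def land(self):
        self.altitude = 0


def make_command_map(drone):
    # The user trains one distinct thought pattern per command;
    # the recognizer emits the matching label when it sees that pattern.
    return {
        "push": drone.ascend,
        "turn": drone.rotate_left,
        "rest": drone.land,
    }


def handle_patterns(drone, recognized_labels):
    """Dispatch each recognized thought-pattern label to its command."""
    commands = make_command_map(drone)
    for label in recognized_labels:
        action = commands.get(label)
        if action:
            action()


drone = Drone()
# Pretend the headset recognized this sequence of trained patterns:
handle_patterns(drone, ["push", "push", "turn", "rest"])
print(drone.altitude, drone.heading)  # prints "0 270": turned left, then landed
```

The point is the indirection: the headset software never knows about drones; it only reports “pattern X recognized,” and the application decides what event that triggers.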
Josiah Renaudin: I think one part that’s really interesting about it is, not … the average person doesn’t really understand how exactly it works. How are these thoughts controlling things? But in movies, literature, and TV we often see this far-flung future where people are controlling advanced pieces of tech with their minds. We’ve seen it in different forms of media. Now, it’s always seemed far-fetched, at least to me. I know a lot of other people feel that way, but are we approaching this reality sooner than we might think? Is this becoming something that might be more mass market than we expect?
Jim McKeeth: I think so. The headset I’m using is the Emotiv EPOC, and it was one of the first really serious consumer brain-computer devices. Now, for years they’ve been using this sort of technology in labs and hospitals and places like that to peer into the brain, but this is actually a consumer device. They have a couple of new models coming out from Emotiv. The OpenBCI project just launched their product, I think just last month, which is an Arduino-based EEG brain-computer interface headset. There are a couple of other ones, like the NeuroSky, and other devices that have been around for a while, and then, like I said, OpenBCI just came out, plus the new Emotiv EPOCs and a couple of other next-generation devices on the way.
It’s not a single niche device. There are multiple devices out there, and we’re seeing next-generation devices coming out, as well. This is something that’s happening right now; this is not purely science fiction at this point. Also, Gartner puts out what they call the emerging technologies hype cycle, where they take all these different emerging technologies and plot their inflated expectations versus functionality over time, et cetera. They actually have the brain-computer interface listed on there. It’s all the way to the left, as a very early-stage technology, and they’re expecting it’s going to be over ten years’ time until we actually see this in a massively productive, everywhere scenario.
It is on the chart. It is something that is actually coming along, that is being used, and I think probably in the next few years we’ll definitely start to see a lot more about this as it gets more attention. In the next five years we might start to see more and more books about this and more devices, and it won’t be entirely uncommon. It’ll be like the early smartphones were, for example. Within ten-plus years’ time, Gartner’s expecting we’re going to see this in large-scale use.
Josiah Renaudin: That’s actually kind of similar to the next question I was going to ask you about. What phase do you think we’re in with this right now? Like you said, mobile is expanding at a very fast rate. A lot of people are using it. Almost everyone’s used to a touch screen and having these really powerful smartphones and it seems like we’re moving onto wearables like the iWatch and Oculus Rift. What stage are we in with thought input?
Jim McKeeth: I think we’re at an early stage for it, but I think we’re, kind of, at this unique time right now with all these different massively connected things. We have all the stuff you just listed, the wearables and stuff like that. I think these devices, the brain computer interface devices, could ride this wave, as well, and we could see it adopted, I think, much quicker because of the result of that. For example, some of the next-generation devices that are coming out are, instead of focusing on the idea of using your mind to control software, are instead using the … The devices are being used to help you evaluate your mental state, and so they help you be aware of your stress level, your anxiety level, and things along those lines.
They can then provide you biofeedback so that you can go through some exercises to help bring your stress level down, maybe some breathing exercises or meditation, et cetera.
Now, you know, we’ve done breathing exercises and meditation for years to control stress and anxiety, but having that immediate biofeedback available to you through these systems … I think that kind of use might see a faster adoption rate than using your brain to fly helicopters, as cool as that may be. As we see more and more adoption of brain-computer interfaces for this scenario, I think we might see them get further adoption in other areas.
Plus, everything is connected, and it’s not going to be long before we’re just going to expect that we’re going to have a Matrix-like experience. You know? Where we can plug our brains into the computer, learn Kung Fu in a matter of minutes, and so on and so forth. I think our expectations are rapidly accelerating and, as a result of that, I wouldn’t be surprised if we see a faster adoption.
Josiah Renaudin: Using your brain to control a helicopter, that’s all I really need to be impressed, but you’ve been researching this a lot. You’ve been digging into this a lot. Have you seen any recent tech in this department dealing with thought input that’s really impressed you? Has there been one in-development device that’s really stood out?
Jim McKeeth: Two things that have occurred that have been interesting since I started researching this. The first is that there is a technology called transcranial direct current stimulation. This is basically the reverse of the EEG. EEG reads the voltage fluctuations to understand what’s going on in your mind. Transcranial direct current stimulation, or TDCS, runs a small voltage through your brain in order to stimulate different parts of your brain and produce different results. This is something that’s being used for treating depression, epilepsy, et cetera, different brain disorders. Just in the last year this has become a do-it-yourself at home brain hacking phenomenon, if you will.
People are building and making their own devices, or buying them online, and running slight voltages through their brains to produce different results. Recently, DARPA did a study where they took snipers in sniper training and used this TDCS device with an accelerated learning montage. They found that the snipers using the TDCS device were over 50 percent more effective in the training program than those that were not.
It’s really exciting that they’re seeing those results. Universities have actually claimed they’re seeing people be twice as effective in their studying, using an accelerated learning montage through TDCS, as well. This is really exciting because, A, they’re seeing these massive results, people being more productive in their learning as well as other areas, and B, it’s an at-home technology. People are building these devices at home, buying these devices, and using them themselves.
When you see that sort of technology showing up in people’s hands, that’s exciting. Something else that was really interesting recently is a project called the Connectome project. Now, you may recall that a few years ago, the Human Genome Project was this huge, monumental task; they were trying to sequence all the human genes. Well, now the new task is trying to map all the connections in the brain, all the neurons and the synapses and whatnot.
The theory is that if you take a person’s brain and gather all this mapping information, you’ll actually capture who they are: their memories, their thoughts, their feelings, their emotions, even their personality. But it’s a big question mark: will it actually work? Recently they did this for a worm. Now, worms have much simpler brains than humans do. They took this worm’s brain, this worm’s connectome, and a guy built a software model of it and loaded it into a Lego Mindstorms robot. The cool thing was, they turned this robot on, and the robot behaved like a worm. They didn’t have to train it how to behave. They didn’t have to program it how to behave. Just using the map of the worm’s brain interconnections, it behaved like a worm.
That’s a much lower functionality than we have in our brains, but it’s really positive for the possibility. We’re looking at the beginnings of being able to upload someone’s brain, someone’s consciousness, into a computer, and live forever, essentially. I mean, that’s kind of an incredible, crazy technology right there.
Josiah Renaudin: Yeah, and I think you really answered my next question with that. It’s hard to look at where we are now and predict, well, this is where we’re going to be in ten years, this is where we’re going to be in twenty-five years. Like you said, the possibility of where this stuff can go, pretty much becoming like the movie The Matrix, it’s really something. I will ask you a little bit: what do you see maybe ten years down the line? Of course, we’re not going to solve it all and have everything figured out by then, but where do you see thought input devices at that point?
Jim McKeeth: I think ten years down the line we’re going to see that they’re more commonplace. Think back to the early days of voice input. Right? It was a gimmick. You’d try some dictation and you’re like, “Oh, I got three words right. You know? That’s not bad.” Now, Android and iPhones both have fairly reliable voice input. It regularly surprises me how effective it is. It still makes mistakes, but occasionally it’s like, “Wow. I can’t believe it got even half those words right.”
Josiah Renaudin: Yeah, absolutely.
Jim McKeeth: I think in ten years’ time we’re probably going to see the brain-computer interface moving toward that level of usability. I don’t think that in ten years’ time we’re going to see a total Matrix-type experience where it’s all mental input. Honestly, keyboards are incredibly good input devices, and if you think about it, we haven’t really seen huge changes to the keyboard in I don’t know how many years since it was introduced. It’s pretty much been the same keyboard. It’s very efficient at inputting information.
I think we’ll probably still see traditional input methods, but I think we’re going to see a combination. In some situations, or for some people, traditional input methods don’t work, and there we’ll see more adoption, maybe, of the brain-computer interface. But I think we’ll see it augmenting other input methodologies. You might have your headset picking up brain information while you’re typing on the keyboard, for example, as yet another channel of input.
Josiah Renaudin: To go off the beaten path a little bit here, have you read a book or seen a movie recently that you think most accurately portrays the direction we’re headed in when it comes to technology and thought input?
Jim McKeeth: You know, in my talk I reference a few science fiction movies like The Matrix and others. I’ve found that most science fiction has little nuggets, in my opinion, of where science is going, and they take these little nuggets and build a big Hollywood plot around them, and people get distracted and miss the little nuggets of possible future. For example, I saw the movie Transcendence, where they upload his brain to this computer, which is at least a possibility, but then they had to destroy the Internet in order to stop him, which I thought kind of missed the point.
There’s lots of movies out there that we see little bits of what is possible. Transcendence is one. Strange Days is another one. One I really liked is called Brainstorm. It’s actually from quite a while ago with Christopher Walken. It actually did a fairly, probably accurate, arc of the adoption of the technology and then the government, of course, steps in and tries to militarize it, yadda, yadda, yadda, so on and so forth. That was pretty fun, as well.
I think science fiction is a great indicator of where things are going. We tend to take inspiration and take our dreams and we put it in science fiction, and then we take the science fiction and it becomes our inspiration to implement our science in the next two years.
Josiah Renaudin: Absolutely. I do have one more question for you to wrap things up, and I do appreciate your time, Jim, but more than anything, what are you most excited for in the realm of tech moving forward?
Jim McKeeth: You know, the thing that always excites me is technologies that are put into the hands of end users, which they’re able to then use in ways the creators didn’t expect. Software development is the perfect example of that. You create software development tools, a compiler, et cetera, and you give them to your developers, and they take them and do brilliant things you never would have expected. That, on the whole, I think, is really what makes working in this sector so incredible: you see these technologies being used to develop the next technologies, and it’s moving so fast, at such a high rate. It’s just so neat to see creative things happening, and stuff you would never have expected a few years ago becoming commonplace.
I think with the Internet of Things, the idea of all these interconnected devices, and now the possibility of our brains being interconnected to the Internet, it’s just exciting to see where this is going to go next.
Josiah Renaudin: Fantastic. Once again, I appreciate your time, Jim, and I look forward to hearing more about thought input at your keynote in San Diego.
Jim McKeeth: Great. Thank you very much.
As lead developer evangelist at Embarcadero Technologies, Jim McKeeth is a key part of Embarcadero’s developer community outreach. With more than twenty years of programming experience, Jim travels the world speaking at conferences and sharing his excitement and knowledge. He holds a patent for the swipe to unlock and pattern unlocks used on both iPhone and Android phones, plus a number of other computer and software-related patents. When not traveling, Jim is an improvisational performer with ComedySportz Boise and enjoys spending time with his family.