In this on-site interview from STARWEST 2016, Jonathon Wright, the director of software engineering at CA Technologies and a speaker at the conference, joins Josiah Renaudin to discuss the Internet of Things, artificial intelligence, scaling for load, and virtual reality.
Josiah Renaudin: What's your strategy for understanding what sector of the software audience is actively working with artificial intelligence, VR, the Internet of Things, and any other new idea? When you talk about, let's say, mobile testing, everyone's using mobile devices. But people are just starting to work with VR now. It's new, and it feels like it's mostly early adopters on board. Many people see it as something that, maybe five years from now, will be much more integrated into everything. We're just getting started.
How do you make that something that is extremely useful for everyone right now?
Jonathon Wright: You talked about Sony VR earlier. Obviously, on the thirteenth of October, you've got the PlayStation VR headset coming out. I've done quite a few VR projects now. This is interesting because, obviously, the reason people talk about mobile is that it has become mainstream. VR is very similar. I've been doing projects, especially over the last two years, where VR has been stage one. For instance, drones. Part of it is that autonomous drones are great if you've got machine learning and AI at a certain level. There's work now where drones are starting to be able to understand where they can land based on video analytics, recognizing, "That's a tree, I can't land there."
That's maturing quite rapidly. When it comes to starting that journey toward maturity, typically things like VR start happening because people pilot drones, and headsets are a great option there. I did a project for a ... I talk about critical infrastructure a bit ... for a port, which was using a VR headset to move containers around, drive the bulk loaders, scoop things up, and so on. Part of that is making it unmanned. You take the manual operation, but it also provides input sources that can go through machine learning and become autonomous. VR has a very practical sense to it, which is partly why I talk about VR in this particular instance.
I talk about AI in the sense of things like the Echo, which is something you guys have had for about two years; it only just launched in the UK on Wednesday. If you look at Alexa's API (and I'm obviously talking about APIs primarily, so that's where the link comes in), you can build a third-party app in there. In our case, the example I've got here, which I've got on video, is something very similar to Hive, which you've got over here, where we can actually control our heating, et cetera. I actually do a bit of a demo where I say, "Okay, turn my thermostat to twenty-four."
Then, as a product company, we can model all the flows with all the different possibilities. You can see what would happen if someone changed the temperature when the thermostat was already at that temperature, and which branch the flow takes. Part of it is remodeling how you test the AI, but really all that is is sending requests from a third party. In this case, they call them skills: an API request coming from Amazon to your system to say, "Turn the thermostat down."
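To make the idea concrete, here is a minimal sketch of how a voice "skill" reduces to an ordinary API request. All of the names here (the intent name, slot layout, and handler function) are hypothetical, not taken from the Alexa Skills Kit itself; the point is only that a parsed voice command arrives as structured data and is translated into a call against the thermostat service, including the flow where the requested temperature is already set.

```python
# Hypothetical sketch: a parsed voice intent arriving as JSON-like data
# is translated into a command for the heating system.

def handle_intent(request: dict, current_temp: float) -> dict:
    """Translate a parsed voice intent into a thermostat command."""
    intent = request["intent"]
    if intent["name"] != "SetTemperatureIntent":
        return {"speech": "Sorry, I can't do that."}
    target = float(intent["slots"]["temperature"])
    if target == current_temp:
        # One of the flows worth modeling: the requested temperature
        # is already set, so no command is sent at all.
        return {"speech": f"The thermostat is already at {target:g} degrees."}
    # A real skill would now POST this command to the heating system's API.
    return {"speech": f"Setting the thermostat to {target:g} degrees.",
            "command": {"set_temperature": target}}

print(handle_intent(
    {"intent": {"name": "SetTemperatureIntent",
                "slots": {"temperature": "24"}}},
    current_temp=21.0))
```

Testing the AI then becomes a matter of replaying these requests with different slot values and starting states, rather than speaking to a physical device.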
The class of that thermostat is an IoT device; it just makes it more realistic. The same applies from a VR/AR point of view, and MR, which is the mixed reality stuff, with things like Pokémon Go. Last week, for the tech world stuff we did with you guys, we actually demonstrated all the possibilities of going through Pokémon. Part of that is also generating synthetic test data to actually go through all the paths. In the demonstration, I think it went to a billion different possibilities.
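The "billion possibilities" figure comes from combinatorics: when synthetic test data is the cross product of a handful of per-step choices, the path count multiplies at every step. The step names below are invented purely for illustration, but the mechanism is the same at any scale.

```python
# Illustrative only: a few made-up choice points, combined as a cross
# product, show how path counts multiply.
import itertools

steps = {
    "player_level": range(1, 11),                        # 10 options
    "pokemon_type": ["fire", "water", "grass", "electric"],
    "action": ["catch", "flee", "battle"],
    "network": ["wifi", "4g", "offline"],
}

paths = list(itertools.product(*steps.values()))
print(len(paths))   # 10 * 4 * 3 * 3 = 360 combinations from just four steps
print(paths[0])
```

Add a dozen more choice points of similar size and the count reaches billions, which is why generated data and automation, rather than manual play, are the only way to cover the space.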
Partly why we're doing this is not just because it's something in the news. Yes, it is very similar to an IoT device, but secondly, when they first launched it and we obviously had all the downtime, it was, as they said, an untestable platform: you had to use real devices, holding them up, using gestures. We partnered with Perfecto to prove that you could actually drive mobile phones using these algorithms that we generated.
Part of it was a practical example of something that's real. When I say that, I look at digital disruption and digital transformation as being about pivoting business ideas. In the case of Pokémon, Nintendo's stock increased by 50 percent.
Josiah Renaudin: Oh yeah, it was insane.
Jonathon Wright: It was insane. It's an example of how quickly you need to take that to market. Something like Harry Potter Go is coming out in a couple of months' time. Typically, the response to disruption is maybe nine to eighteen months; that's how quickly you can internally get the stuff working. Like you said, Sony has theirs, and you have Oculus and a couple of other ...
Josiah Renaudin: The Vive is out there.
Jonathon Wright: Yeah, the Vive. Part of it is you go, "We can either not enter that market, or we're going to have to accelerate our time to market." Other companies will be thinking the same: we can use that same model, but how quickly can you get this tested and into a production environment? Partly what I try to do is use more recent examples. Last week, Google AI published their machine learning and pattern analysis for taking images in and recognizing them.
Again, if you read the white paper, this is all around sample test data that feeds into a system that uses machine learning, and outcomes come out the other side. It's the same with Viv, which that gentleman talked about, which is in the Alexa/Siri area. The machine learning aspect of this uses natural language models to do the understanding. In this particular instance, they talk about recognizing the Golden Gate Bridge based on where you are and what your plans are, as a point of interest to add into a node.
A lot of what I go into is that what we're actually testing is these individual nodes, whether that's a sensor that controls heat or a sensor that calls some kind of API to send a request via a service. We're testing those nodes. That could be a UI, an API, a database, some kind of data, some kind of sensor; it could be VR.
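One way to read the node idea is that every node, whatever its physical form, can sit behind a common interface so a single harness exercises all of them the same way. The sketch below assumes that framing; the class and method names are illustrative inventions, not from any CA product.

```python
# Hedged sketch: heterogeneous "nodes" (a sensor, an API endpoint, and so
# on) behind one interface, so one generic check applies to any of them.
from abc import ABC, abstractmethod

class Node(ABC):
    @abstractmethod
    def send(self, request):
        """Deliver a request to the node and return its response."""

class HeatSensorNode(Node):
    def __init__(self, reading: float):
        self.reading = reading
    def send(self, request):
        # A real sensor node would read hardware; here it returns a value.
        return {"temperature": self.reading}

class ApiNode(Node):
    def send(self, request):
        # A real implementation would issue an HTTP call here.
        return {"status": 200, "echo": request}

def check_node(node: Node, request, expect_key: str) -> bool:
    """One generic check applied to any node type."""
    return expect_key in node.send(request)

print(check_node(HeatSensorNode(21.5), {}, "temperature"))  # True
print(check_node(ApiNode(), {"op": "set"}, "status"))       # True
```

The payoff is that a UI, a database, or a VR input source can each be wrapped as another `Node` subclass without changing the harness.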
I think the challenge now, especially in this industry, is that these are real things we're having to test, just like gaming many years ago. People like Chris ... I'm trying to think of his surname at the moment. He used to head up EA Games. Part of it was, how do you test these huge environments? You can do it manually with hundreds of gamers who spend all their time on it. I know there are a lot of Blizzard guys here today; in the UK, they've actually got the office above us at CA. They're having to understand how they can use these kinds of technologies to drive bots, or whatever they may be, to test the games. You can't physically test all the possibilities.
Jonathon is a strategic thought leader and distinguished technology evangelist specializing in emerging technologies, innovation, and automation, with over fifteen years of international commercial experience within global organizations. He is currently director of digital assurance at CA in Oxford in the UK. His practical experience and leadership in DevOps and digital assurance have made him an in-demand speaker at international conferences, including Gartner, Unicom, HPE Discover, Oracle Digital Forum, STARWEST, STAREAST, and EuroSTAR, where he was awarded Innovator of the Year (2014). He is the author of several books on test automation, as well as numerous online webinars, podcasts, and training courses on automation and API testing. With Jonathon's practical insights into real-world application of the core principles and methodologies underpinning DevOps and digital assurance, his presentations are not to be missed.