Analytics, Data, and How Testing Is Like Baseball: An Interview with Geoff Meyer

[interview]
Summary:

In this interview, Geoff Meyer, a test architect in the Dell EMC infrastructure solutions group, explains how test teams can succeed by emulating sports teams in how they collect and interpret data. Geoff explains how analytics can better prepare you for the changing nature of software.

Jennifer Bonine: All right. We are back with more virtual interviews. Geoff, thanks for joining me.

Geoff Meyer: Thanks, Jennifer, for inviting me.

Jennifer Bonine: It's so good to have you here, finally. Pretty exciting. You have a topic that I think is so relevant to so many people. We talked to a company right before you about how important it is to test with the lights on, so to speak. Having information and data.

Geoff Meyer: Yep.

Jennifer Bonine: Which leads us right into some of the stuff you're talking about.

Geoff Meyer: Yeah, yeah.

Jennifer Bonine: Why don't you ... If the folks out there maybe did or didn't see the Lightning Keynote, which was really a snippet into what you're talking about and what you've done a lot of homework and research on, maybe give them a little broader viewpoint.

Geoff Meyer: Thanks. My Lightning Keynote was "Unlovable Losers." That's really what my presentation was on: the Chicago Cubs winning the 2016 World Series Championship, and how they did it based on sabermetrics. I looked into what started with Moneyball, the Billy Beane and Oakland A's story, and ended with the Chicago Cubs winning the World Series.

Jennifer Bonine: Right.

Geoff Meyer: It was really a story about how analytics got into baseball reluctantly and became mainstream within sixteen years: the whole idea of using analytics and information to drive your process and efficiencies. I took that concept in my presentation and talked about how it was being adopted in different sports settings, like Kevin Kelley, the high school coach who never punts. Got rid of his punter.

Jennifer Bonine: Yes. Such an interesting story. I don't know if people know that. Maybe give them a little insight into ...

Geoff Meyer: That was another part of my research as I was getting into it. I asked, "What other areas are they using analytics in sports?" Kevin Kelley is a coach in Arkansas, at Pulaski Academy. As a high school coach who also teaches other courses, he had an affinity for statistics. He went looking at some NCAA football databases, and he discerned something out of the data. It was about two thousand games that he looked at. He found that in 80 percent of instances, the team that won the turnover battle won the football game.

Jennifer Bonine: Wow.

Geoff Meyer: And so he goes, "Okay. Interesting information. How can I turn that into something actionable?" He goes, "Looking at fourth down in football, when teams get to fourth down, standard convention has been that they punt the ball." They just kick it away.

Jennifer Bonine: Right.

Geoff Meyer: And he goes, "That's voluntarily handing the ball over. It's a voluntary turnover."

Jennifer Bonine: Right. Exactly.

Geoff Meyer: What he did is ... This is the key, is he took the data, what the data was telling him. It was a very insightful piece of information, but he had the courage and conviction to put it into action. He got rid of his punter. He just ...

Jennifer Bonine: He's like, "I'm not doing it."

Geoff Meyer: Yep. Doesn't do it. That was a really cool story.

Jennifer Bonine: Yeah, that's amazing. If you think about that, it changes behavior. It changes how people react to things. How they want to implement. There's some other ones that we've talked about around ... Even the legal ...

Geoff Meyer: Oh, the predictive justice.

Jennifer Bonine: Yeah. Predictive justice.

Geoff Meyer: And the interesting thing on the predictive justice: this was a story about Anne Milgram, who was attorney general of New Jersey about ten years ago. What she did is she went out and started giving judges better data to prevent offenders with a high probability of recidivism from getting back out onto the streets while they're waiting to come to trial. She was trying to keep her community protected. She came up with a system based on data, taking data in from around the country, to give judges more information at their fingertips when they're making these important judicial and presentencing decisions. The thing that made it relevant to me, from a testing perspective, and something for us and our community, is she wasn't giving analytics to judges to get rid of judges.

Jennifer Bonine: Right, no. This doesn't get rid of the judge.

Geoff Meyer: It was allowing the judge to make smarter decisions.

Jennifer Bonine: Exactly. Yeah.

Geoff Meyer: What I was telling my group today is the whole idea: I'm inviting you to embrace analytics in our test processes. Test is just another business process. Embrace it. It's not that we're trying to get rid of our testers; we're trying to empower them to apply their cognitive skills to other areas.

Jennifer Bonine: Yeah. It's like what you saw before with retail buyers in companies. They're trying to predict what someone will buy. They struggled for a long time, because there's something called retail analytics, right?

Geoff Meyer: Yep.

Jennifer Bonine: Which tells retailers, "Here's what we think people will buy. Here's when you should mark it down. Here's the price you should set." Using analytics and data, in databases, to say, "Predictively, here's how you drive your retail business." And they struggled, because they're just a business unit, like testing and other folks, where they said, "No. This is an art. This isn't a science."

Geoff Meyer: That's right.

Jennifer Bonine: "The data can't do that. We're so much smarter than all of that data." Now, it's a balance. That's what you strike, right? Is to say, "Absolutely we need the data because it helps you make better, more informed decisions." But then you put the art on top of it, of the individual and what they know, and layering in, hey ... The example of no punter, right? It didn't tell him, "Don't have a punter." What it told him is, "Here's something to look at. And then you make ..."

Geoff Meyer: Then he had to figure out how to turn that into an action.

Jennifer Bonine: Exactly. Yeah.

Geoff Meyer: That was a great example of it. Another thing that drove it home for me, as I started getting into the analytics of it and turning it predictive: I got wind of this when I read the World Quality Report from 2016 and '17, where they pointed out this trend toward predictive QA. As I got into it, one of the things I was looking at was how we've been doing test automation. Anybody that's been doing agile development has also built up a test automation competency. And one of the outputs of test automation is data and results. We generate a lot of test results.

Jennifer Bonine: Absolutely.

Geoff Meyer: When you take a look at analytics, the elemental resource of analytics is data. If your organization already has structured data, which it probably does in the form of logged defects, logged test cases, and stored test configurations, you've got a really good set of input data there. And then you also capture your output data, your test results, whether it's logs or test case failures. That can all be fed back into an analytics engine to start optimizing and looking for opportunities for you to identify. For us, it was trying to identify our high-value test configurations.
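
The loop Geoff describes, historical defect and result data in, a ranked list of promising test configurations out, can be sketched minimally. Everything below is illustrative: the configuration names, the record format, and the rank_configs_by_defect_yield helper are hypothetical stand-ins for whatever your defect tracker and test-result store actually hold, not Dell EMC's system.

```python
from collections import defaultdict

# Hypothetical records: each test run logs a configuration ID and whether it
# surfaced a defect. In practice these rows would come from your defect
# tracker and result store, not a hard-coded list.
test_results = [
    {"config": "r740-ssd-2cpu", "defect_found": True},
    {"config": "r740-ssd-2cpu", "defect_found": False},
    {"config": "r640-hdd-1cpu", "defect_found": False},
    {"config": "r640-hdd-1cpu", "defect_found": False},
    {"config": "r940-nvme-4cpu", "defect_found": True},
    {"config": "r940-nvme-4cpu", "defect_found": True},
]

def rank_configs_by_defect_yield(results):
    """Rank configurations by the fraction of runs that surfaced a defect."""
    runs = defaultdict(int)
    defects = defaultdict(int)
    for r in results:
        runs[r["config"]] += 1
        if r["defect_found"]:
            defects[r["config"]] += 1
    return sorted(
        ((cfg, defects[cfg] / runs[cfg]) for cfg in runs),
        key=lambda pair: pair[1],
        reverse=True,
    )

for cfg, yield_rate in rank_configs_by_defect_yield(test_results):
    print(f"{cfg}: {yield_rate:.0%}")
```

A real analytics engine would weigh far more signals (defect severity, configuration similarity, test cost), but the shape is the same: structured input data in, a prioritized list out, with engineers reviewing the ranking rather than building it by hand.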

Jennifer Bonine: Maybe to give them ... I don't know if we can say this. Just throwing out a hypothetical number of things you could test. It gets more complex as our industry gets more complex. Hypothetically, how many configurations could you have?

Geoff Meyer: In ours, and I'll throw out some of the numbers we ran into, because we took our first step into it this year with our proof of concept.

Jennifer Bonine: Just to get the concept of why you can't possibly do all of it.

Geoff Meyer: We knew that our test leads and test engineers churn a lot of their brainpower on trying to figure out which test configurations we're going to enter in our test execution phases, because it's really expensive. I'm in the server division at Dell EMC, and it's expensive to build our prototypes. It's expensive to pay for the prototypes for us to test. We're resource-constrained, just like many organizations.

Jennifer Bonine: Like everyone else.

Geoff Meyer: We've got a certain number of people, and we can only invest in a certain amount of equipment. We want to make sure we get it right. Just to give you a perspective: when we pulled in our analytics folks to help us look at this problem, they looked at the prior-generation configurations that we actually sold, they looked at our sales configuration history, and said, "Do you guys realize that you sold 500 unique ... 500,000 ..."

Jennifer Bonine: 500,000?

Geoff Meyer: "500,000 unique configurations of your prior generation?"

Jennifer Bonine: Right.

Geoff Meyer: And we go, "We knew it was a lot. We didn't know it was that much." We know that within our time constraints, and the number of systems that we can test, we can only test around a thousand, roughly. That one's a hypothetical. But to give you an order of magnitude ...

Jennifer Bonine: But take that as a percentage: 1,000 out of 500,000.

Geoff Meyer: We need to make sure we get those configurations right. We need to figure out the right configurations.
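
Those numbers frame the selection problem: pick roughly 1,000 configurations out of 500,000 sold. A minimal sketch, assuming the simplest possible notion of "high value" (units shipped), might look like the following. The sales figures, configuration names, and pick_high_value_configs helper are all hypothetical; the interview doesn't describe Dell EMC's actual selection criteria.

```python
# Hypothetical sales history: units sold per unique configuration. The real
# data set would have on the order of 500,000 rows from sales records.
sales = {
    "config-A": 120_000,
    "config-B": 90_000,
    "config-C": 45_000,
    "config-D": 5_000,
    "config-E": 500,
}

def pick_high_value_configs(sales_by_config, budget):
    """Pick the `budget` most-sold configurations and report what share
    of the shipped install base they cover."""
    ranked = sorted(sales_by_config.items(), key=lambda kv: kv[1], reverse=True)
    chosen = ranked[:budget]
    total = sum(sales_by_config.values())
    coverage = sum(units for _, units in chosen) / total
    return [cfg for cfg, _ in chosen], coverage

configs, coverage = pick_high_value_configs(sales, budget=3)
print(configs, f"covering {coverage:.1%} of the install base")
```

Even this naive cut shows why the approach is attractive: sales volume is usually heavily skewed, so a small test budget can still cover most of the shipped install base, and the engineers' judgment then refines the list rather than building it from scratch.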

Jennifer Bonine: Oh, yeah.

Geoff Meyer: Our test engineers are super smart folks; they've been doing this a long time and we trust them, but it takes them a while to do it. When I offered them this possible solution of condensing that timeline, with them as the feedback mechanism so that we can teach our analytics engine ...

Jennifer Bonine: The plus ...

Geoff Meyer: It gives us ... They're super excited.

Jennifer Bonine: Right.

Geoff Meyer: About taking our POC and putting it into operational mode, which is gonna be on our plan for 2018.

Jennifer Bonine: Yeah. This is really exciting, Geoff. If folks go, "Okay. You got me really excited about this data analytics and predictive analytics, and where this is taking us," where can they go to contact you or get more information?

Geoff Meyer: Yeah. Feel free to contact me at my email [email protected]. I'm gonna be doing the same presentation to STARCANADA in a couple of weeks.

Jennifer Bonine: Yes. STARCANADA.

Geoff Meyer: I'll be doing that there as well. I'll probably sign up to do some of the lunch things as well, over there, too.

Jennifer Bonine: Definitely contact Geoff—G-E-O-F-F—if you want to about more information. Really fascinating topic. I hope some of you get to research that a little bit. Thanks, Geoff.

Geoff Meyer: Yeah. Thanks for having me. All right.

Geoff Meyer

A test architect in the Dell EMC infrastructure solutions group, Geoff Meyer has more than thirty years of industry experience as a software developer, manager, program manager, and director. Geoff oversees the test strategy and architecture for more than 400 software and hardware testers across India, Taiwan, and the United States. He leads initiatives in continuous testing, predictive analytics, and infrastructure as a service (IaaS). Outside of work, Geoff is a member of the Agile Austin community, contributor to the Agile and STAR conferences, and an active mentor to veterans participating in the Vets4Quality.org program, which provides an on-ramp to a career in software quality assurance.
