The Viability of Context-Driven Testing: An Interview with Keith Klain

[interview]
Summary:

In this interview, Keith Klain, a software testing and quality management professional, discusses the lessons he's learned from selling software testing. He also explains why context-driven testing is commercially viable, as well as how to discern between what stakeholders want and what they need.

Josiah Renaudin: Welcome back to another TechWell interview. Today I am joined by Keith Klain, a software testing and quality management professional, and a keynote speaker at our upcoming STAREAST Conference. Keith, thank you very much for joining us today. 

Keith Klain: Great to be here. Thank you very much. 

Josiah Renaudin: Absolutely. Before we really dig into the meat of your keynote, can you tell us a bit about your experience in the industry? 

Keith Klain: Sure. I'm going on twenty-plus years' experience in software testing and quality management. I've worked at large financial services institutions all over the planet: I did a long run in London and Southeast Asia, I've worked all over India, and I've been back in the US for about four and a half years now running large global testing operations and consulting businesses. 

Josiah Renaudin: How difficult was it to leave such a certain thing to start something on your own? I mean, you look at a title like "head of global test center," that's a big thing. What was the process in your head when you were leaving that? 

Keith Klain: Right. Yeah, it's scary. I think that's one of the things I talk about during my keynote: what you think you know, and then what you really know once you leave a job like that. Those large, I call them, "enterprise IT jobs" are very dulling of the senses because, to put it in my frank way, you get "soft" from your ideas being accepted, since it's generally in people's best interests to accept them.

I learned a lot about what works, what doesn't work, in the real world by doing that. But to answer your question specifically, yeah, it's a leap of faith. I mean, I've been aligned towards the context-driven testing school of thought for a while now. I really believe that it's got the most commercially viable and best ideas on how to test software, and truly believe that it can be a commercial success, and not a lot of companies in the world have tried to do it that way. So yeah, you try and bank on yourself, but it's a scary venture. 

Josiah Renaudin: Yeah, well, it sounds like betting on yourself worked out in this case. You just mentioned context-driven testing. That's a great segue for me because, in your mind, what do you feel are the best methods for turning context-driven testing into a practical commercial approach? And what have you really done recently to move in this direction?

Keith Klain: I think, particularly when it comes to context-driven testing, the context-aware test strategy and context-aware information that aligns your test approach towards relevant information for your business is really the most important thing that comes out of the context-driven testing world. This is kind of paired with the skilled testing movement as well; that testing is a set of skills that can be learned and practiced and developed. Those two things, I view them as related but are different in a way. 

We tend to focus on the context-driven testing world, I think, a lot towards skilled testing. A lot of the people who are founders of the movement are artists and testers. The Michael Boltons, the James Bachs of the world are artists and testers. People who follow them tend to focus on the skill side to it, but there's a lot of information and great stuff that comes out of context-driven testing that's completely relevant to the commercial prospect of helping a business be successful, or at least make great decisions based on great information. 

One of the things that I've tried to do in making it a commercially viable proposition is aligning test missions to the business as quickly and iteratively as possible. That's what being context-aware is really about. We focus a lot on what do we need to know, who needs to know, when do we need to get that information to them, and using context-driven principles and skilled testing practices to generate that rather quickly. 

That's the primary thing. The alternate world, where a lot of the bad practices in software testing have been developed out of this idea of a factory mentality towards testing, is really the antithesis of that. That's where we can, again trying to carve out some of that market for us, make a very reasoned and demonstrable difference between us and our competitors. That's what I've really tried to focus on in selling this to businesses, and using the lessons I've learned in selling it to Barclays and other organizations as well. 

Josiah Renaudin: You have to have learned a lot at this point. Like you said, you have more than twenty years of experience and you've worked with a medley of different businesses. How often do you run into stakeholders who have a very specific need that's really just diametrically opposed to what they want? How do you handle these situations without pushing a client away, this kind of strange gap between what you see needs to be done and what they think needs to be done? 

Keith Klain: Well, I think the presumption of "I know what somebody needs and wants from the start" is again where you run into these tendencies of wanting to apply models again and again. That's where this factory mentality comes into quality models or maturity: the idea that we can replicate the same approach in any context. Being of the context-driven stripe, I would come in saying I need to ask a lot of questions to find out what they want and what they need, and help them align that. I've quite frequently run into stakeholders that ... They need help. They don't know how to articulate it in the commercially available language of software testing: test cases, and test counts, and all the metrics, and a bad test strategy, and all the other things. They don't know how that helps them get good information, so they're trying to navigate that and not getting what they want or really what they need. 

I run into that quite frequently because they've had decades and decades of being blasted by bad information from the software testing business. Some of that, to me, is aligning language and goals with what actual testing artifacts, deliverables, and objectives should be about. Once we help get that decoder ring in place, we can say, "Okay, here's what you really are looking for" (and I've run into this quite frequently, actually), and then we can figure out how to structure our test approach to get them what they need. What they think they want is typically aligned towards this kind of weird testing industry language, but it doesn't always translate into what they actually need; that's where we focus, particularly at the approach and strategy level. 

That happens more frequently than not; particularly when you're working with large enterprise organizations. These are non-technology companies that build tech: banks, insurance companies, telcos, those types of industries. 

Josiah Renaudin: How difficult can that be, when you run into these large industries that have such deeply ingrained bad practices due to persisting mistakes or just poor testing principles? I know each company's different, each situation is different, but how do you help change their mindset and institute these practices that you know work? Like you said again, you don't want to come in with this almighty, "I know everything. I know the solution to your problems," but is there a first step you have towards trying to remove these mindsets? 

Keith Klain: Well yeah. I think that first step is something that a lot of context-driven testers, or people who follow the school, miss, which is becoming context-aware ... Which would seem to be obvious, but a lot of times it's missed. That's really trying to understand, "What is going to work in this environment? What's going to work in this context?" The biggest contributor to context is people. Understanding who the people are, what's going to work there, and what's not, is the first step really. You're not going to have a hundred percent success rate in every single environment, but you define what success looks like. You're going to push an organization a certain part of the way, and they're going to have to take it the rest of the way; so understanding that environment and being attuned to it is really important, because not all organizations are going to be able to adopt new things. You have to find out how far you can push a place. 

It's funny because I've worked at UBS, Citigroup, Barclays ... some fairly large organizations. I always had this feeling whenever I left that ... Even at a place like Barclays, where we had a great deal of success in implementing context-driven testing, I always wished I had pushed harder, faster; we didn't get as far as I really wanted to go. There are stories I've heard out of some of these organizations that I've left that the impact is still being felt. 

Also, I think there's an ability to propagate things after you've left. There are people who used to work for me who have started communities in Asia and are now carrying what I built into new organizations. I think it's a bit of a movement, but you level your expectations as best you can, like me. You'll hear me say again and again when I talk: "Manage your own expectations." I should probably get that put on a T-shirt; it's a bit of a motto for me. You know? Just making sure you're managing your expectations about what you're actually going to be able to get done. 

Josiah Renaudin: Now I'm expecting you to wear a hat during this presentation that says "Managing your expectations." That would be fantastic. 

Keith Klain: That is my phrase, actually. I actually should get that copyrighted. 

Josiah Renaudin: You really should. So there were two terms that stuck out when I was reading your abstract, and those are "test case allergies" and "smartypants syndrome." If you don't mind, could you give a brief explanation of these two things so we can have at least a little bit of an understanding before the keynote? 

Keith Klain: Sure. One of the things that I find ... It's really about the pedantic use of language, particularly around test case allergies. There's nothing wrong with getting to the meaning of words, but I've been involved in conversations with testers who can't stand it when organizations or people use the word "QA" instead of "testing," where literally the two people are going back and forth saying, "Well, yes, we QA'ed that the other day." And then the person responds, "Well, did you test it too?" And they say, "Yes, we QA'ed it last week." And they say, "Okay, so when you tested it ..." Just literally this kind of ... I guess today you'd call it microaggressive correcting of people's language. 

Look, I understand the need for clarity of language and wanting to know what people mean, but I also tie this in: if you think you can help people test better, then a good analogy for how you should approach your job is a doctor analogy. You're not a GP, you're more like an ER surgeon. People aren't coming in for well visits. If they didn't need help, they wouldn't have asked you there. So let's not beat people up first over language; let's help solve some bigger problems. If they weren't sick, they wouldn't have asked you for help. 

Berating people because they don't use the right language can be very irritating, and I think that leads into what smartypants syndrome is about. There are a lot of folks who I've called out on this. There are also new people in the context-driven testing community, what James Bach refers to as "the tiger cub problem": you've taught people new skills, how to use language better, and a whole bunch of new techniques, but they're like a tiger cub just feeling its claws. They go around tearing the entire house up, which, again, isn't entirely helpful. Knowing how and where to use things appropriately is super important, because people can come off like smartypants. Nothing will prevent someone from letting you help them quicker than feeling like you're a smartypants. 

Josiah Renaudin: Yeah. I don't want to give away your entire keynote, but more than anything, just kind of to sum this up a bit, what central message are you hoping to leave with your keynote audience? 

Keith Klain: I tie my talks into kind of experience reports, so it's hopefully drawing some lessons from my failures and lessons that I've learned personally. There are four key practices, or I guess experiences, that I want people to draw from: how to align what you do towards the business; my views on what good test management looks like; how to integrate a software testing organization into the larger corporate culture of an organization; and then what I think are the characteristics of a good software tester ... Someone who is capable of doing the hard work of providing timely, relevant information to the business in a regular way, and the characteristics and traits I've seen, through my twenty years, in software testers who have that ability. 

Josiah Renaudin: Fantastic. Well, thank you very much, Keith. I appreciate your speaking with us today. I'm looking forward to hearing the full talk at STAREAST.

Keith Klain: You got it. We'll talk to you soon.

Keith Klain is the co-CEO of Doran Jones, a New York-based consulting company focused on agile development and testing services. For the past twenty years, Keith has built technology teams for global financial services and IT consulting firms worldwide. At Barclays, he ran the Global Test Center, a central technology service delivering software testing globally to investment bank and wealth management businesses. Keith designed the Software Testing Education Program with the Bronx-based non-profit Per Scholas that provides free training in software testing to people from non-traditional technology backgrounds. An active participant in the global software testing community, Keith received the 2013 Software Test Professionals Luminary award.
