Many job descriptions include a requirement for domain expertise to filter candidates for testing jobs. But is expertise really necessary before joining a team? Does it ensure a good tester? Justin Rohrman digs into his experiences in difficult business domains, what expertise means, and how it applies to software testing.
Domain expertise is the most popular way I have seen to filter candidates for software testing jobs before they even get into a building to talk with the team. Testers are expected to be masters of—or at least able to talk about—many different areas: testing, programming, development processes (agile, waterfall, Scrum), and the business. On top of that, the job description says something like “Candidate must have five years’ experience in financial services.”
I think filtering on domain expertise is usually the wrong way to get good testers.
Let’s take a deeper look at what expertise means, how it applies to software testing work, and some of my experiences working in difficult business domains.
I’ve worked in a few difficult business domains in my time as a software tester, but the ones that stand out the most are pricing science and behavioral health.
One of the first software jobs I had was at a pricing science company. We built software that ran Bayesian math algorithms to generate prices. If you have ever purchased an airline ticket, then chances are the price you paid was generated by that product.
I came into this company as a very junior tester. I had to learn about the business domain, professional software development, software testing, and more. I used a mixed strategy to become useful in the business domain quickly. The first thing I did was gain some primary source knowledge on pricing by reading a book about its basic principles. After that I spent time with our salespeople and our customers, talking with them and watching how they worked, to learn how people actually used the product.
After just a few months, I was able to talk about pricing, and our product, in a reasonable way. I started thinking in terms of deals, price adjustments, and price indexes.
This company valued learning and invested in it.
Another platform I worked on helped doctors document casework at behavioral health clinics. This company was very focused on domain expertise. Most product managers, support staff, and even a few people on the technical staff had spent time working at a behavioral health facility at some point.
The most complex part of this job was mapping clients, procedures, and doctors to insurance providers. Only a couple of people on the team were familiar with that part of the product. I wanted to develop some knowledge (and simplify the work) by spending time with them to document the state machine. They were too busy, and managers never supported my idea.
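To give a flavor of what "documenting the state machine" could have looked like, here is a minimal sketch. Everything in it is invented for illustration; the real mapping of clients, procedures, and doctors to insurers was far more complex, but even a simple table of states and allowed transitions makes tacit domain knowledge visible to the whole team:

```python
# Hypothetical sketch: a documented state machine for an insurance claim.
# All state names and transitions are invented for illustration, not taken
# from the actual behavioral health product.

# Each state maps to the set of states it may legally transition to.
TRANSITIONS = {
    "drafted": {"submitted"},
    "submitted": {"accepted", "rejected"},
    "rejected": {"submitted", "closed"},
    "accepted": {"paid"},
    "paid": {"closed"},
    "closed": set(),
}

def is_valid_path(path):
    """Check that a sequence of states follows the documented transitions."""
    return all(b in TRANSITIONS[a] for a, b in zip(path, path[1:]))

print(is_valid_path(["drafted", "submitted", "accepted", "paid"]))  # True
print(is_valid_path(["drafted", "paid"]))  # False: no direct transition
```

Once a transition table like this exists, a tester without contributory expertise can still spot when the product allows a path the documented machine forbids.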
I was able to find problems in new code, but I never made the jump into finding problems in the context of behavioral health and was never able to speak the language of the field.
This product was developed mostly by people who understood behavioral health rather than by skilled software developers, and it showed. I didn't last long, and it is still a mystery to me how I made it through the interview process.
Table of Expertise
Expertise is the word we use to describe a range of how well we can perform some skill. Harry Collins and Robert Evans wrote a book called Rethinking Expertise that goes in depth on a model they developed to help us talk about a person's ability:
On the far left we have beer-mat knowledge—this is what you get from reading a product label at a grocery store. Reading the label won’t tell you how the product was made, and without having some training you probably won’t be able to re-create that item, but you know just enough to describe the product to someone else now.
One level to the right, we have popular understanding. This describes the type of knowledge a person gets from reading "pop science" books, similar to what Malcolm Gladwell writes. At this level we know anecdotes and stories, but not much more.
After this is primary source knowledge. This comes from original works. For example, if you read the book Rethinking Expertise, you could say you have some primary source knowledge on expertise.
I could read about a subject and move through those three levels of expertise without ever having performed the skill I was reading about or talked with someone who has. According to Collins and Evans, you can have this level of knowledge in complete isolation from the actual work.
But the next two levels, interactional expertise and contributory expertise, are special. To have these, you need to do more than read about a subject. You have to step into the community where they are used and interact with the people who live there.
Interactional expertise is the ability to talk with a group of experts and not be recognized as an outsider. You know the lingo, you know the social cues, and you understand the topics that are important to that group. At one point, I had interactional expertise in front-end programming. My desk was right next to those developers, so I knew what programming languages and libraries they were using and what problems they were having, and I could talk with them about that in detail. That has slipped away over time because I don’t sit with them anymore and technologies and current topics have changed.
The next level, contributory expertise, is the ability to perform in the area. Contributory expertise can be broken down into its own spectrum.
One thing I find interesting is that someone can be a contributory expert but not have the lower levels of expertise. Collins calls this downward discrimination. We see this a lot in software testers who consistently find the biggest and most important bugs but can never describe what they were doing at the time, or why. The bugs just come to them like manna from heaven.
Working toward Expertise
Expertise is a temporary thing. Right now, I like to think I am a halfway decent software tester. I read, study, and practice in addition to my regular work. If I were to decide today that I didn’t want to be a software tester anymore, and that my real interest is in becoming a full-time goat farmer and folk artist, my expertise would slide back down the scale as things change in the industry and my skills get rusty. It takes work to move to the right, and it’s a constant battle to stay there.
Domain expertise might be important if you are working at NASA to fling rockets into space, or on financial services products that do high-speed trading. But most of us aren’t. Most of the time, I think including domain expertise in a job description is really a call for someone who understands the product being developed. My experience has been that this can happen very quickly in companies that are willing to spend the time. Organizations that invest in their employees’ professional development gain dedicated and skilled workers, and the employees themselves get the benefit—and the challenge—of moving along the scale of expertise.
So in a short paraphrase (always risky to do), your view is "it depends." Domain knowledge may be a good thing, and developing more cannot hurt.
So a question is: can a tester with beer-mat knowledge work on a team tackling a complex subject where another member of the test team has contributory expertise?
My experience is that this is how many test teams work in complex fields. Each person brings some expertise.
Justin, I couldn't agree with you more. Domain knowledge and expertise come with time, and testers can quickly build both by using their analytical skills. I worked at a company where the VP and CTO was convinced that domain knowledge was the only factor in creating an in-house testing team. They gave me customer service data entry folks as a QA team. What was interesting was that none of them had domain expertise or business acumen. That's another misconception: thinking that just because a person uses a system or application, they are an expert.