Plenty of articles talk about testing schools of thought, testing theory, and testing tools. But how often do we talk about what we actually do when we test?
If you search the web for test exercises, you’ll likely find some programs that simulate a triangle, a word count exercise, or a parking calculator. Most of these exercises predate the Internet. Some are Windows programs designed for a monitor with one-tenth the pixels; they fit in a corner of today’s screens. Many are entirely pen-and-paper. The few that are web-based predate mobile devices and responsive design.
Two years ago, I decided to do something about it. I ran the WorksHop on Test Design, or “WHaTDa,” in Columbus, Ohio. My public promise was to develop some test training material at the workshop and give it away.
We took a word problem and put it online, and if I do say so myself, it is fantastic. So fantastic, in fact, that I was selfish: My company used it as an interview problem and a training exercise.
But it’s time to keep my promise and provide a test exercise to everyone. Here’s the setup.
The Palindrome Exercise
You are a recently hired tester. The company has software it wants to ship at the end of the day that checks for palindromes—words that read the same backward as forward. (So “bob” is a palindrome, but “robert” is not.) You type in some text, click Submit, and the software tells you whether the text is a palindrome—pretty simple. The lead developer is out sick, but a junior developer is available who can fix bugs that you find; any fix needs time to retest, of course. The product owner is nontechnical, but he can answer questions.
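The core check itself is tiny, which is exactly what makes the exercise interesting: the naive implementation below is a sketch of what the junior developer might have written, not the actual code behind the site, and every question it begs is a test idea.

```python
def is_palindrome(text: str) -> bool:
    # Naive check: exact reversal, case-sensitive, no trimming,
    # no normalization of spaces or punctuation.
    return text == text[::-1]

print(is_palindrome("bob"))     # True
print(is_palindrome("robert"))  # False
print(is_palindrome("Bob"))     # False
```

Is “Bob” a palindrome? “race car”? The empty string? The naive check says no, no, and yes, and none of those answers is obviously right. That is a conversation for the product owner, not a decision for the tester or the developer alone.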
That’s enough detail for you to test the software. Here’s the website where you can find it. (You can ignore the Anagram section; that’s not finished yet.) Check it out and leave comments about your favorite bugs.
But before you go and do that, think about it for a moment. The real power in this exercise is where you can take it when you have someone else playing the role of product owner. Let’s talk about a few places this could go:
- What browsers are you testing it in? What mobile devices? When do you stop?
- How long will it take you to test?
- What makes a bug a bug? Which issues are bugs? Which are not?
- The product owner is worried about performance of the API—the call made on the submit button. Can you isolate the API? How would you performance test it?
- Can you find any potential security issues on the page? Accessibility? Internationalization?
- Can the software ship or not? (This often leads to a big stupid argument about the tester role, followed by “Can you at least make a recommendation? Can we have a conversation?”)
Testers who run through these exercises typically don’t do well. Instead, they fall back on a different set of skills. They’ll wiggle on the hook, trying to get the junior developer to do the research, or call the senior developer in her hospital bed. They’ll pull out dog-eared copies of How To Win Friends and Influence People or The 7 Habits of Highly Effective People. Or they won’t: some testers explain that the questions above are for the developers—not their job. Some automators say, “Just hand me the test cases and I’ll automate them.”
And a few—just a few—testers have the technical skills to model the risks, do the technical investigation, and handle the uneducated customer. That sort of conversation requires a new set of skills: the kind we increasingly see in demand from our customers.