Plenty of articles talk about testing schools of thought, testing theory, and testing tools. But how often do we talk about what we actually do when we test?
If you search the web for test exercises, you’ll likely find some programs that simulate a triangle, a word count exercise, or a parking calculator. Most of these exercises predate the Internet. Some are Windows programs designed for a monitor with one-tenth the pixels; they fit in a corner of today’s screens. Many are entirely pen-to-paper. The few that are web-based predate mobile devices and responsive design.
Two years ago, I decided to do something about it. I ran the WorksHop on Test Design, or “WHaTDa,” in Columbus, Ohio. My public promise was to develop some test training material at the workshop and give it away.
We took a word problem and put it online, and if I do say so myself, it is fantastic. So fantastic, in fact, that I was selfish: My company used it as an interview problem and a training exercise.
But it’s time to keep my promise and provide a test exercise to everyone. Here’s the setup.
The Palindrome Exercise
You are a recently hired tester. The company has software it wants to ship at the end of the day that tests palindromes—words that read the same backward as forward. (So “bob” is a palindrome, but “robert” is not.) You type in some text, click Submit, and the software tells you whether the text is a palindrome—pretty simple. The lead developer is out sick, but there is a junior developer who can fix bugs that you find. We need time to fix and retest, of course. The product owner is nontechnical, but he can answer questions.
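The behavior under test is simple enough to sketch in a few lines. The snippet below is only an illustration of the naive algorithm such an app might use, not the exercise's actual code; notably, it is case- and whitespace-sensitive, which is exactly the kind of ambiguity worth raising with the product owner.

```python
# A naive palindrome check, for illustration only; the exercise's real
# implementation is not published here. This version is case- and
# whitespace-sensitive, so "Bob" and "nurses run" both fail.

def is_palindrome(text: str) -> bool:
    """Return True if the text reads the same backward as forward."""
    return text == text[::-1]


if __name__ == "__main__":
    print(is_palindrome("bob"))         # True
    print(is_palindrome("robert"))      # False
    print(is_palindrome("nurses run"))  # False -- bug or spec?
```

Even this one-liner raises test-design questions: should case, spaces, and punctuation count?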
That’s enough detail for you to test the software. Here’s the website where you can find it: http://xndev.com/palindrome/. (You can ignore the Anagram section; that’s not finished yet.) Check it out and leave comments about your favorite bugs.
But before you go and do that, think about it for a moment. The real power in this exercise is where you can take it when you have someone else playing the role of product owner. Let’s talk about a few places this could go:
- What browsers are you testing it in? What mobile devices? When do you stop?
- How long will it take you to test?
- What makes a bug, a bug? Which issues are bugs? Which are not?
- The product owner is worried about performance of the API—the call made on the submit button. Can you isolate the API? How would you performance test it? (See the sketch after this list.)
- Can you find any potential security issues on the page? Accessibility? Internationalization?
- Can the software ship or not? (This often leads to a big stupid argument about the tester role, followed by “Can you at least make a recommendation? Can we have a conversation?”)
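If the submit button did call a back-end endpoint, a first pass at isolating and timing it might look like the sketch below. The URL and the "text" parameter are assumptions made for illustration; as a commenter notes further down, the live exercise appears to validate entirely in the browser, so treat this as a pattern rather than a recipe for this specific page.

```python
# Minimal latency probe for a hypothetical /palindrome API endpoint.
# ENDPOINT and the "text" parameter are assumptions, not a real contract.
import statistics
import time

import requests

ENDPOINT = "http://example.com/palindrome"  # hypothetical URL


def time_one_call(text: str) -> float:
    """Send one request and return the elapsed time in seconds."""
    start = time.perf_counter()
    response = requests.get(ENDPOINT, params={"text": text}, timeout=10)
    response.raise_for_status()
    return time.perf_counter() - start


def simple_latency_check(samples: int = 50) -> None:
    """Call the endpoint repeatedly and report median and ~p95 latency."""
    timings = sorted(time_one_call("racecar") for _ in range(samples))
    print(f"median: {statistics.median(timings):.3f}s")
    print(f"~p95:   {timings[int(samples * 0.95) - 1]:.3f}s")


if __name__ == "__main__":
    simple_latency_check()
```

Even at this level of sophistication, the conversation with the product owner changes: you can talk in numbers about what "fast enough" means before reaching for a heavier load-testing tool.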
Testers who take these exercises typically don’t do well. They have plenty of other skills, though: they’ll wiggle on the hook, trying to get the junior developer to do the research, or call the lead developer on her sick bed. They’ll pull out dog-eared copies of How to Win Friends and Influence People or The 7 Habits of Highly Effective People. Or they won’t. Some testers explain that the questions above are for the developers, not their job. Some automators say, “Just hand me the test cases and I’ll automate them.”
And a few—just a few—testers have the technical skills to model the risks, do the technical investigation, and handle the uneducated customer. That sort of conversation requires a new set of skills: the kind we increasingly see in demand from our customers.
Skills for the Modern Software Tester
In the 1990s, software came on physical disks and ship decisions were expensive. Today, companies are trying to ship more often, improve the quality of code on the first release, and notice and fix problems more quickly. Programmers are writing unit tests, creating integration tests, and even driving the user interface, and it’s all hooked into continuous integration. Superficial tester skills—the skills needed to find obvious problems—are becoming increasingly marginalized.
Modern testers don’t need to write full-stack applications, but they likely know enough about accessibility, internationalization, platforms, simulating Wi-Fi speeds, networking, HTML, CSS, TCP/IP, and JavaScript to be able to debug web applications. Many also know enough code to write a log analyzer, have the skills to analyze server logs, and understand enough about virtualization to create test machines and test servers, plus enough about continuous integration to debug it, if not set it up and run it.
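As a concrete example of the "write a log analyzer" skill, here is a minimal sketch that counts HTTP status codes in a web server access log. The file name and the common-log-format regular expression are assumptions; adapt them to whatever your server actually emits.

```python
# Count HTTP status codes in an access log (common/combined log format
# assumed). LOG_PATH is a placeholder file name.
import re
from collections import Counter

LOG_PATH = "access.log"
# Matches the status-code field that follows the quoted request string,
# e.g. ... "GET /palindrome HTTP/1.1" 200 512
STATUS_RE = re.compile(r'"\s+(\d{3})\s')


def count_statuses(path: str) -> Counter:
    """Return a Counter of HTTP status codes found in the log file."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            match = STATUS_RE.search(line)
            if match:
                counts[match.group(1)] += 1
    return counts


if __name__ == "__main__":
    for status, count in count_statuses(LOG_PATH).most_common():
        print(f"{status}: {count}")
```

A spike in 4xx or 5xx responses after a test session is often the first clue that something beyond the UI deserves investigation.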
That’s the nature of the modern web. Modern testers can specialize in a domain, such as databases, large text data sets, native mobile applications, APIs, or legacy systems, but to get the next job, testers will need to be able to learn new technologies.
And that’s just the hard tech skills. Of course, we also need to understand common failure modes, to have the people skills to get to know the customer, and to continue to learn to think.
For today, though, test the palindrome software. Pass it around. Let’s talk about how to make the exercise better—and about how we can all develop our testing skills for the future.
User Comments
I couldn't quickly tell how you want bugs reported. There's a spelling error when the word entered is not a palindrome: it should say "reversed," but instead it says "reserved."
Dear All,
I took this challenge via https://blog.gurock.com/the-palindrome-software-testing-challenge/. I observed that this testing challenge has related testing contexts in three different places, which I believe is a good thing.
1. @ justinrohrman.com/blog/recap-of-whatda-and-qa-or-the-highway/ - Focusing on strategy and test framing, along with test design techniques (http://justinrohrman.com/palindrome.html)
2. @ https://www.stickyminds.com/article/next-generation-exercises-software-t... - How to test for Product Owner concerns and technical testing aspects
3. @ https://blog.gurock.com/the-palindrome-software-testing-challenge/ - End user expectations (Fit for use so that Teaching Mission can be accomplished)
2nd and 3rd lead to (http://xndev.com/palindrome/)
All are closely related and make this small application a wonderful candidate for practice testing. Since I tested this from the 3rd perspective, my testing results will not meet the expectations of the other two contexts. I am not a web tester, and I know this limitation often causes problems for me when taking such web-based challenges, but that's part of learning new areas.
My work is listed here https://github.com/testanalyst/TestingPractice/blob/master/Exploring_Pal....
Honestly, I missed the "reversed" vs. "reserved" bug. This reminded me that I need to work on my assumptions, biases, and ignorance. Largely, I focused on oracle identification and test data identification, and I went beyond the simple, character-based definition of a palindrome because I interpreted the testing context I chose as requiring that level of testing.
I have a question as well...
Is the Palindrome application built using a JavaScript function? Everything seems to be validated on the client side, with no call to a server. So where is the (web?) API in the picture here? I also used Dev Tools and Fiddler to see if the Submit button click generates any web traffic; it doesn't, which is what I expected.
Now I am working on testing this application as per the hints given by Justin and Matt on their respective portals, to enhance my thinking and testing skills.
Please provide feedback on the testing results I posted. I will be glad to know my shortcomings.
Thanks
-Sandeep
@testanalystat
Hi there, first of all, let me thank you, Matt, and everyone here for the good articles and for letting me take part in this somehow. I would like to offer some observations about the exercise above. Defects? Faults?
1. It considers only one blank between the characters.
2. It doesn't treat pressing the Enter key as clicking the Submit button.
3. It should clear the result after the entry field is cleared.
4. It says "reserved" instead of "reversed."
5. It should limit the number of characters or tell the user the acceptable number.
6. The Submit button should be disabled until the entry field is filled.
Any comments will be very welcome. Thank you all, and see you another time.
Hi Matthew.
I've only been in software testing for around six months, so I'm trying to develop my skills and test myself. This is what I found while testing the palindrome exercise.
1) The program can't handle spaces between words. For instance, "nurses run" is a palindrome when spelled backward without the space, but the program seems not to think so.
2) If you add a bullet point at the end of your words, it takes it into account when spelling backward and says that it's wrong.
3) Entering nothing and pressing space says the text can be reversed; this should instead fire a validation message indicating nothing was entered.
4) When you enter very large palindromes and try to delete characters, it takes a while before the field starts updating with the deleted characters.
5) If you copy and paste an extremely large amount of text, the blinking cursor starts getting misplaced and ends up displayed far away from the end of the text.
6) If you navigate away from the page and then navigate back to it, the text is displayed back at the beginning instead of where you were last.
7) It won't recognize a palindrome if the capitalization doesn't match, so "Mom" is incorrect but "MoM" is correct.
8) If you enter a single letter, it says it can be spelled backward, but it's not really a word.
9) If you put unequal spaces at the beginning and end of a word, it says it cannot be reversed; it also seems to ignore these spaces when presenting the answer.
10) It says no when brackets are placed around a word.
11) Submit is not activated when pressing the Enter key.
12) The "!" after "Yes" and "No" should be placed at the end of the sentence, or perhaps better, replaced with a comma.
These are a few of the things I found while I did my testing. I don't have any real experience testing code, so I can't say anything about the API. I'm hoping to learn all of that soon enough. Thanks for providing the challenge, though :).
In the few minutes I had to look at this, here is what I observed:
* There isn't any validation for nonsensical words (I just entered a bunch of letters and got a result instead of a message)
* Does not recognize words with accent marks (didn't get to test other diacritical marks, but will)
* Does not recognize blank data (no data input; just click the submit button and it returns a result instead of a message)
* It does appear to be internationalized (tested with Russian characters/palindromes; examples: обо, шиш)