Learning To Learn (About Testing)

By Matthew Heusser

It is not every day that I find myself playing billiards with some of the best thinkers in software testing.

Still, there I was, in Westlake, Ohio, at the Marriott Residence Inn, the day before the conference. We had a pool table and five people: David Hoppe, Justin Rohrman, Peter Schrijver, Erik Davis, and me.

The next day we were set to discuss skills in software testing, so we settled in to play a game or two.

Except, of course, my brain doesn't work like that—I immediately began to think about skills in pool.

Growing Your Billiard Skills
First we spent an hour or so "just playing pool." Rack the balls, hit them with the cue ball, and on we go.

Most of us had never studied the game seriously. We could generally make the trivial shot, where you "just" want the ball to go straight. Periodically, though, we found ourselves trying to make a complex shot—striking the cue ball at just the right point so that the object ball would move off at a different angle.

This is, well, hard. We got it wrong a lot.

After an hour of playing, I noticed something: We weren't getting any better.

Each complex shot was very different from the shots before, which left us with no point of reference, no muscle memory to improve.

Building Muscle Memory
In traditional sports, the way to learn is to do the same activity over and over again. Serving in volleyball, making shots in basketball, or hitting the ball in billiards—you set up the same complex shot and do it over and over again until it is repeatable, and until you know why it works. Then, in the heat of the game, your body says, "Oh, this is kind of like that other thing I did a million times," and you don't think; you react.

I'm oversimplifying a bit here; this is a blog post, not a book. The point is that repeat performance (practice) builds skill in most human domains.

Even in programming we have the idea of the kata: doing the same exercise again and again, discovering different ways to write the code, until it becomes familiar.
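To make that concrete, here is the classic FizzBuzz kata, sketched in Python (my choice of exercise and language; any small, well-worn problem works). The code is deliberately trivial; the value is in re-implementing it in different styles until the variations feel familiar.

# fizzbuzz_kata.py -- a classic programming kata.
# The exercise: for 1..100, print "Fizz" for multiples of 3,
# "Buzz" for multiples of 5, "FizzBuzz" for both, else the number.
# The kata is to redo this again and again -- with if/elif chains,
# string building, lookup tables -- until each variation is familiar.

def fizzbuzz(n):
    """Return the FizzBuzz word for a single number."""
    word = ""
    if n % 3 == 0:
        word += "Fizz"
    if n % 5 == 0:
        word += "Buzz"
    return word or str(n)

if __name__ == "__main__":
    for i in range(1, 101):
        print(fizzbuzz(i))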

"Test Katas," on the other hand, make a lot less sense. You find where the bug is the first time, and on the second run through, finding it is going to be really easy. We talk about simulating a real project, where you get multiple "builds" to test, to create a more realistic (and practice-able) kata—but i haven't seen one yet, and I read about testing, well ... a lot.

In fact, if you go to a typical test conference, it is possible to go entire days without seeing any sort of realistic exercise, even one as simple as "here's a UI—how would you test it?"

Depending on the angle you look from, conference speakers seem more obsessed with the new and shiny, with automating the process, with measuring it, or with sidebar issues (hiring, firing, project management) than with the doing of the testing. The examples we do have tend to be trivial: "A triangle has three inputs ..." When they aren't trivial, we tend to focus on the technique ("equivalence class partitioning!") more than the skills.
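For reference, here is roughly what that triangle exercise looks like, as a quick Python sketch. The function and the particular classes are my own illustration, not from any specific workbook; notice how quickly the technique can be named, and how little that says about choosing good representatives.

# triangle.py -- the canonical "trivial" testing exercise:
# given three side lengths, classify the triangle.

def classify_triangle(a, b, c):
    # Equivalence class: non-positive sides are invalid.
    if a <= 0 or b <= 0 or c <= 0:
        return "invalid"
    # Equivalence class: violates the triangle inequality.
    if a + b <= c or a + c <= b or b + c <= a:
        return "invalid"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# One representative input per equivalence class -- naming the
# technique is easy; picking revealing representatives is the skill.
assert classify_triangle(3, 3, 3) == "equilateral"
assert classify_triangle(3, 3, 2) == "isosceles"
assert classify_triangle(3, 4, 5) == "scalene"
assert classify_triangle(1, 1, 3) == "invalid"  # inequality violated
assert classify_triangle(0, 4, 5) == "invalid"  # non-positive side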

The implication seems to be that coming up with test ideas is easy—and as far as confirmatory testing goes, that may be true. Take a spec, turn it sideways, and every "the system shall ..." becomes a test idea.
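A hypothetical example, sketched in Python with a password rule and function I invented for illustration: one "shall" statement, turned sideways into a pair of checks.

import unittest

# Invented system under test, for the sake of the example.
# Spec: "The system shall reject passwords shorter than 8 characters."
def accept_password(password):
    return len(password) >= 8

class TestPasswordSpec(unittest.TestCase):
    # Each "shall" turns more or less mechanically into a check.
    def test_short_password_rejected(self):
        self.assertFalse(accept_password("short"))

    def test_minimum_length_accepted(self):
        self.assertTrue(accept_password("12345678"))

if __name__ == "__main__":
    unittest.main()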

That's also a weak and shallow form of testing.

Creating exercises that teach real skill is hard. Kaner, Hoffman, and Padmanabhan worked on their Domain Testing Workbook for something close to a decade. At 463 letter-sized pages, the book is intimidating even to the most serious student of testing.

I am convinced that skill development is what we need.

Katas in testing are what we need.

We don't need lectures; we need thoughtful practice that ties in to our everyday work.

I've done a little bit of this; Parkcalc, a test exercise I developed, now yields 892 hits on Google, and the Miagi-Do School has its own web page too.

Still, as 2013 winds down, I can't help but think that we've got a long way to go.

Right now, as of December 16, 2013, I am in a position where I can choose my shot for next year, and I have a little time to focus on it. My biggest interest is skills in testing, real skills in testing. As the editor of StickyMinds.com, I will be looking to publish articles that talk about it, but I'll also be writing more than a few myself and speaking about it in public and in private. The assignments I'll be looking for will be those that allow me to practice skills while studying and documenting them.

I'd like to make 2014 the year of tester skills.

Will you join me?

What would you like to talk about—and what's your idea?

User Comments

4 comments
Kobi (Jacob) Halperin

Testing skills are based on experience, which links to a vast number of potential problems that no one can memorize, nor spend the time to thoroughly read about...

I am looking in another direction: taking these test ideas into an online DB, from which anyone could ask for a random idea while doing ET or even while planning tests.

Hopefully one could filter by field of interest or type of application, and have the ability to add ideas to the common DB to benefit from communal knowledge.

Still need to figure out how to implement that.


@halperinko - Kobi Halperin

December 22, 2013 - 2:39am
Jeremy Carey-Dressler

I have been thinking about this for a long time, and I feel like it is an area we need to grow in. We have plenty of theory but very little 'practical' practice. I suppose 'go test an open source project' works, but I would like to see something more structured, with a better 'expected outcome' than just "test it". I could imagine doing that by using my own open source work and providing builds in which I have some known bugs. Give some practical instructions, a method (blackbox, whitebox, tour, exploratory, etc.), and an area of the application, and say "go test this" for a timeboxed period. Provide some method for entering the bug details, and maybe once users are done, have them peer-grade each other. Perhaps blogs and Twitter would be enough. We are a little ways off from something like that, but I would love to see it.

December 27, 2013 - 1:20pm
Rob Black

I like "Jacob's" idea.  Basically a 'cookbook' format for testing different types of things.  I.e. common trends in mobile apps, security tests, database testing ideas, etc.  While it makes sense to write tests around the requirements initially, some organizations rely on the developers to vet their code against the requirements prior to handing the code to the test teams.  Thus the requirements are already validated in the solution and additional levels and types of testing need to be performed.  Having a place to go for ideas could catch on.  I also agree with "J CD", there is quite a bit of theory and problem solving technique available.  Along that line of thinking, there is very little information regarding common pitfalls of particular types of technology, data migration hurdles, common developer mistakes to test for, common architecture issues, process testing suggestions, etc.  

I realize that testing is an art that requires a great deal of science. Even artists have communities where seasoned artists can share ideas with aspiring artists on how to tackle problems of lighting and color in landscapes. StickyMinds is a great resource as a starting place, but we can take things much further and share real-world examples of tests or even bugs (without pointing fingers at any particular companies or their products, of course) so that others can glean knowledge from these experiences.

There have been a few attempts at crowdsourcing testing efforts, where applications are released in an environment where multiple people can test against the solution in the hope that defects can be uncovered quickly due to the volume of testers. Why not peruse the many different types of tests that other people have done in order to increase the test set for your own application or system?

Thoughts?

January 7, 2014 - 7:13pm
Srinivas Kadiyala

I would like to take this opportunity to practice the testing examples...

January 10, 2014 - 2:15am