With Extreme Programming, programmers are taking responsibility for writing their own unit tests. What work does this leave for testers? Some people think that XP saves costs by eliminating the need for testers. Does programmer testing really take the place of tester testing? In this column, Bret Pettichord offers ways for testers to provide value to XP teams.
I've found that there are at least three things that people can mean by "XP":
- What the books say1
- What teams coached and trained by the XP founders actually do
- What people who've only read the books are doing
According to the books, XP includes unit testing (done by the programmers) and acceptance testing (done by the "customers"). Programmers use unit tests merely to verify that the software works as they intended. The acceptance tests are necessary to validate that the software actually works the way the "customer" wants it to.2
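The distinction can be illustrated with a small sketch. The function and tests below are hypothetical, not from the XP books: a programmer's unit test verifies that the code does what the programmer intended, at the level of a single function.

```python
import unittest

def order_total(prices, discount_rate=0.0):
    """Hypothetical function: sum item prices and apply a discount."""
    subtotal = sum(prices)
    return round(subtotal * (1 - discount_rate), 2)

class OrderTotalUnitTest(unittest.TestCase):
    """Programmer's unit test: checks the code works as the programmer intended."""

    def test_no_discount(self):
        self.assertEqual(order_total([10.0, 5.0]), 15.0)

    def test_with_discount(self):
        # 10% off a $20.00 subtotal should be $18.00
        self.assertEqual(order_total([10.0, 10.0], discount_rate=0.1), 18.0)

if __name__ == "__main__":
    unittest.main()
```

A test like this says nothing about whether a discount is the right business behavior; that validation is what the "customer's" acceptance tests are for.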
Specifically, the "customer" is expected to write the user stories (similar to use cases) and then write tests for the user stories. The books are a little vague on who the "customer" is, which is why I've put it in quotes. The "customer" is the person who makes the business decisions. Outside XP, this person is often called the product manager or business analyst.3
Even teams well trained on XP often have trouble getting this "customer" to take the time to write the acceptance tests. So the acceptance tests often end up being written late, by programmers or testers instead. Indeed, studies have shown that acceptance testing is one of the least consistently followed of XP's practices.4 To help repair this deficiency, XP co-founder Ward Cunningham has recently developed an open-source testing framework for acceptance testing.5
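Cunningham's framework expresses acceptance tests as tables of example inputs and expected outputs that a "customer" can read and author. The sketch below is not that framework, just a minimal table-driven runner in the same spirit; the discount rule it checks is invented for illustration.

```python
# A minimal table-driven acceptance-test runner in the spirit of
# Cunningham's framework (not the framework itself). The business
# rule under test is hypothetical: orders of $100 or more get 10% off.

def charge(order_amount):
    """Hypothetical business rule under test."""
    if order_amount >= 100:
        return round(order_amount * 0.9, 2)
    return order_amount

# Each row reads like a customer-authored example:
# (order amount, expected charge)
ACCEPTANCE_TABLE = [
    (50.0, 50.0),    # below threshold: no discount
    (100.0, 90.0),   # at threshold: 10% off
    (200.0, 180.0),  # above threshold: 10% off
]

def run_acceptance_tests(table):
    """Evaluate each row; return (row, passed) pairs, one verdict per example."""
    return [(row, charge(row[0]) == row[1]) for row in table]
```

Because pass/fail is reported per row, the "customer" can see exactly which of their examples the software gets wrong, without reading any test code.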
The XP books don't claim that programmer testing supplants tester testing. Rather they claim that the total cost of testing is reduced, largely by avoiding the surprises of the interminable testing phases that often plague the final stage of software projects. But testing still must be done, both from programmer and customer perspectives.
There were some early claims that XP was going to put testers out of business. This was partly hyperbolic—meant to inspire programmers to do better unit testing. But I suspect it may also have been a response to experiences with testers who were obstreperous and unconcerned with the success of the larger project. Indeed, Lisa Crispin fought some early battles to find a place of respect for testers on XP teams; she later wrote a book describing her methods.6
With test-last programming, there was no doubt of the need for testers. But the XP context forces testers to articulate how they can be of service to a team. Many testers, trained in unsuitable methodologies, will fail.
Indeed, some believe XP tries to minimize the role of the black-box tester because of bad experiences with ineffective testers. Specific complaints are that testers
- Object to process improvement that doesn't conform to their traditionalist views
- Focus on bugs that are of little importance to the project
- Lack technical skills needed for serious contribution
- Lack education and insight into modern development methods and architectures
Cem Kaner has said, "Unless the level of technical skill in our field improves substantially, we can expect programming teams to continue to actively plan ways to work around low-functioning test teams. If we continue to combine low technical skill with process traditionalism, judgmentalism, and righteous political activism on projects, I think we'll see ongoing, reasonable, and justified sympathy for excluding testers from any serious roles in projects, among programmers, project managers, and development executives."7
If you are a tester on a team adopting XP, here's how to demonstrate your value to the team:
- Show how you provide a helpful perspective in the definition of software expectations (whether you call them "requirements" or "tests")—a perspective that is difficult for either programmers or "customers" but that makes a real difference for the success of the project
- Show how you can be satisfied with an information-provider role, not insisting on being a gatekeeper or quality policeman8
- Show how you can adapt to an iterative methodology, changing direction as the project changes, rather than insisting that the team stick to the plan9
- Show how you can function with a minimum of formal specifications, asking for more information when you need it and taking the initiative to document key information yourself when you see the need
Interestingly, these are not only the criteria for how testers can provide value to XP teams; they are also the ingredients for exploratory testing.10,11
Thanks to Ståle Amland and Elisabeth Hendrickson for comments on early drafts.
1By "the XP books" I mostly refer to Extreme Programming Explained by Kent Beck, and Planning Extreme Programming by Kent Beck and Martin Fowler.
2Correspondence on "Agile-Testing" mailing list, Ron Jeffries, 29 April 2002
3Planning Extreme Programming, Beck and Fowler, p. 17
4"Circle of Life, Spiral of Death," Ramachandran and Shukla, in XP/Agile Universe 2002 Proceedings, Wells and Williams, eds.
5Framework for Integrated Test, Ward Cunningham
6Testing Extreme Programming, Lisa Crispin and Tip House
7Correspondence on "Software-Testing" mailing list, Cem Kaner, 18 September 2002
8"Don't Become the Quality Police," Pettichord, Stickyminds.com, 1 July 2002
9"XP, Iterative Development, and the Testing Community," Kaner, Stickyminds.com, 21 October 2002
10"What is Exploratory Testing?" James Bach, Stickyminds.com, 29 January 2001
11"A Survey of Exploratory Testing," Brian Marick