Estimating Testing Time

Summary:

Testers are always facing a time crunch. As part of a recent assessment, a senior manager asked, "How long should the testing really take? It takes our testers from four, five, six, to thirty (insert your number of choice here) weeks, and we need it to take less time. Why can't it take less time, and how can we tell what's going on so we know how much testing we need?" In this column, Johanna Rothman answers with a timeline. By estimating how many testing cycles will be needed, plus how long each will take, she can map out the entire testing process. From this viewpoint, she is able to pinpoint where the process can be streamlined, thus reducing the time spent testing.

Partway through an assessment, the senior manager asked me, "How long should the testing take?"

The answer to the senior manager's question is, "It depends."

If you do test-driven development, there is rarely more than an iteration's worth of at-the-end testing. When I coach teams who are transitioning to test-driven development, I recommend they plan for one iteration at the end for final system test and customer acceptance test. Once the team has more experience with test-driven development, they can plan better. I have found there is always a need for some amount of customer acceptance testing at the end. The amount of testing time varies by project and how involved the customers were all along.

If you're using an Agile lifecycle even without test-driven development, I recommend starting with one iteration's length of final system testing. I am assuming that the developers are fixing problems and refactoring as necessary during an iteration, which is real Agile development, not code-and-fix masquerading as Agile.

But I suspect many of you are not yet using Agile lifecycles with short iterations.

If you're using any of the following . . .

  • an incremental lifecycle such as staged delivery where you plan for interim releases (even if the releases aren't to the customers)
  • an iterative lifecycle such as spiral or evolutionary delivery
  • a serial lifecycle such as phase gate or waterfall

. . . then planning for testing is difficult, because it depends so much on what the developers provide, and you won't know until you start testing.

I'd expect as much testing time as development time, but it doesn't have to come all at the end as final system test like it looks in the waterfall pictures. Any proactive work you do—such as reviews, inspections, working in pairs, unit testing, integration testing, building every night with a smoke test, fixing problems as early as they are detected—can all decrease the duration of final system test. If you're the project manager, ask the developers what steps they are taking to reduce the number of defects they create. If you're the test manager, work with the project manager and the developers to create a set of proactive defect-prevention practices that make sense for your project.

Wherever you are in the organization, recognize that final system test includes several steps: testing the product, finding defects, and verifying defects. Your first step is to separate these tasks when you estimate the duration of final system test.
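To make that separation concrete, here is a minimal sketch of estimating one cycle by summing its separately estimated tasks. The function name and every number in it are illustrative assumptions of mine, not figures from any real project:

```python
# Minimal sketch: one cycle of final system test estimated as the sum of
# its separate tasks. All numbers below are illustrative assumptions.

def one_cycle_days(run_planned_tests, investigate_and_report, verify_fixes):
    """Duration of a single test cycle, in days, as the sum of its parts."""
    return run_planned_tests + investigate_and_report + verify_fixes

# Example: 5 days to run the planned tests, 2 days to investigate and
# report the defects found, 2 days to verify fixes from the previous cycle.
cycle_estimate = one_cycle_days(run_planned_tests=5,
                                investigate_and_report=2,
                                verify_fixes=2)
print(f"One cycle: about {cycle_estimate} days")
```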

One question you should be able to answer is, "How long will one cycle of 'complete' testing take?" We all know we can't do complete testing, so your version of complete is the tests you planned to run, plus any exploratory or other tests you need to run in one cycle to provide enough information about the product under test. I realize that's vague and depends on each project. I don't know how to be more explicit because this is a project-by-project estimate. If you work with good developers, the cycle time can decrease a bit from the first cycle to the last, because the testers know how to set up the tests better and the product has fewer defects, which allows the testing to proceed faster.
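To turn a per-cycle answer into a duration for all of final system test, one simple model multiplies the first cycle's length across the number of cycles you expect, letting each later cycle shrink a little as testers get faster at setup and the product stabilizes. This is only a sketch; the cycle count, the 9-day first cycle, and the 10 percent shrink rate are assumptions you would replace with your project's own history:

```python
# Sketch: total final-system-test duration from an estimated number of
# cycles, where each later cycle is assumed to be a bit shorter than the
# one before it. Cycle count and shrink rate are assumptions.

def total_test_days(first_cycle_days, cycles, shrink_per_cycle=0.10):
    """Sum the durations of all cycles, shrinking each successive cycle."""
    total = 0.0
    cycle_days = float(first_cycle_days)
    for _ in range(cycles):
        total += cycle_days
        cycle_days *= (1.0 - shrink_per_cycle)  # later cycles run a bit faster
    return total

# Example: a 9-day first cycle, 4 cycles expected, each cycle ~10% faster.
print(f"Final system test: about {total_test_days(9, 4):.1f} days")
```

The harder number here is the cycle count, and it depends largely on what the developers hand you each cycle, which is another reason the proactive practices above pay off.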

About the author

Johanna Rothman

Johanna Rothman, known as the “Pragmatic Manager,” helps organizational leaders see problems and risks in their product development. She helps them recognize potential “gotchas,” seize opportunities, and remove impediments. Johanna was the Agile 2009 conference chair. She is the technical editor for Agile Connection and the author of these books:

  • Manage Your Job Search
  • Hiring Geeks That Fit
  • Manage Your Project Portfolio: Increase Your Capacity and Finish More Projects
  • The 2008 Jolt Productivity award-winning Manage It! Your Guide to Modern, Pragmatic Project Management
  • Behind Closed Doors: Secrets of Great Management
  • Hiring the Best Knowledge Workers, Techies & Nerds: The Secrets and Science of Hiring Technical People

Johanna is working on a book about agile program management. She writes columns for Stickyminds.com and projectmanagement.com and blogs on her website, jrothman.com, as well as on createadaptablelife.com.
