Evaluating Test Automation Tools for Government

Summary:

Test automation is hard to implement in government departments because many of them lack official software test teams. The few departments that employ software testers tend to have business-analyst-type testers who lack technical software-testing expertise, but who possess excellent knowledge of business logic. This is a problem from the perspective of test automation. In many departments, the business team performs function testing while the IT team performs system and load testing. The tool needs of the former are different from those of the latter, and this distinction is often not understood. This article discusses this and other lessons learned from a recent test automation tool evaluation performed for a Canadian federal government client.

After being involved in limited, informal evaluations of test automation tools for various companies, I recently had an opportunity to do the same more formally for a Canadian government client, which I refer to here as "the Agency." This article will share that experience.

One of the Agency's business departments wanted to augment its manual testing with automated testing. Since the department did not have anyone in-house with the expertise to do that, it brought in an outside consultant to evaluate the situation and make a recommendation.

The evaluation was performed in the following sequence of steps, which turned out to be very useful (a rough scoring sketch follows the list):

  • requirements development for test automation
  • criteria development based on these requirements
  • preliminary evaluation with three major vendors
  • analysis of the results of the preliminary evaluation
  • detailed evaluation with two vendors
  • analysis of the results of the detailed evaluation
  • prototype development using tools that won the evaluation
  • analysis of the results of the prototype
  • possible deployment of the tools

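The Agency's actual criteria and scores are not reproduced here, but a weighted scoring sheet is one simple way to make the criteria-development and analysis steps concrete. The Python sketch below is illustrative only: the criterion names, weights, vendor labels, and scores are all hypothetical.

```python
# A minimal weighted-criteria scoring sheet for a tool evaluation.
# Criterion names, weights, vendors, and scores are hypothetical examples,
# not the Agency's actual evaluation data.

criteria = {
    "object recognition in our application": 0.30,
    "scripting language and ease of maintenance": 0.25,
    "supported client/server protocols": 0.20,
    "local vendor training and support": 0.15,
    "licensing cost": 0.10,
}

# Each vendor receives a 1-5 score per criterion during the preliminary evaluation.
scores = {
    "Vendor A": [4, 3, 5, 4, 2],
    "Vendor B": [3, 4, 3, 5, 4],
    "Vendor C": [5, 2, 4, 2, 3],
}

weights = list(criteria.values())

for vendor, vendor_scores in scores.items():
    # Weighted total on the same 1-5 scale, since the weights sum to 1.0.
    weighted_total = sum(w * s for w, s in zip(weights, vendor_scores))
    print(f"{vendor}: {weighted_total:.2f} out of 5.00")
```

A spreadsheet serves the same purpose; what matters is that the criteria and their weights are written down before any vendor demonstrations begin, and that the same sheet is reused for the detailed evaluation of the short-listed vendors.
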
During the course of the evaluation, certain realizations stood out as key considerations in any tool evaluation in a government setting. They are discussed below.

Categories of Test Automation Tools
After I interviewed some of the key players, the first thing that became apparent was that test automation was a nebulous concept in the organization. People confused automated function testing with load testing. Those in the business department currently performing manual testing wanted function testing automated. The people in the IT group, on the other hand, who had a system-level perspective, wanted load testing. They all wanted one tool to do both. One of the first challenges I faced was convincing them that no one tool could do both.

Cost Differences between Function and Load Test Tools
There is a huge disparity in cost between function testing tools and load testing tools. Function testing tools typically range from $5,000 to $9,000 (Canadian dollars), while load testing tools start at around $25,000 and can easily cost $60,000 or more.

Current Canadian federal government purchasing rules mandate that any tool costing over $25,000 must be purchased through competitive bidding. The departments balked at this because of the sheer amount of work involved in bidding and the consequent delays; competitive bidding typically adds at least six months to the whole process. In addition, nontechnical criteria that have no bearing on the effectiveness of the tools tend to get introduced into the process.

Fundamental Difference between Function and Load Test Tools
There is a fundamental difference between function test tools and load test tools. Function test tools work at the GUI level while load test tools work at the protocol level. So, for a function test tool, the big question is: "Does it recognize all of the objects in the various application windows properly?" Object recognition for function test tools is never 100 percent. If object recognition is less than 50 percent, your test automation people will be forced to perform so many workarounds that it will defeat the objective of test automation.
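
One way to turn that 50 percent threshold into a concrete check during an evaluation is to inventory the controls on a representative application window and count how many of them the candidate tool maps to distinct, scriptable objects in a trial recording. The window and control names below are hypothetical; the sketch only shows the arithmetic.

```python
# Hypothetical inventory for one application window: the controls the manual
# testers need to drive, and the subset a candidate tool recognized as
# distinct, scriptable objects during a trial recording.
controls_in_window = {
    "client_id_field", "surname_field", "given_name_field",
    "date_of_birth_picker", "search_button", "results_grid",
    "status_bar", "clear_button",
}

recognized_by_tool = {
    "client_id_field", "surname_field", "given_name_field",
    "search_button", "clear_button",
}

recognition_rate = len(recognized_by_tool & controls_in_window) / len(controls_in_window)
print(f"Object recognition: {recognition_rate:.0%}")

# Below roughly 50 percent, scripters spend more time on coordinate-based
# workarounds than on actual test logic.
if recognition_rate < 0.5:
    print("Too many workarounds needed -- fails the function-test criterion.")
```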

For load testing tools, this question is irrelevant. The big question here is: "Does it recognize the client-server communication protocols properly?" For example, if your multitier client/server application uses IIOP (a CORBA protocol called Internet Inter-ORB Protocol), you'd better ask whether the load test tool can handle this protocol. Even if this protocol is listed as supported in the tool specifications, verify it in your environment.
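
To see why object recognition is irrelevant at this level, note that a load test script never drives the GUI at all; it replays traffic on the wire. The sketch below fakes a handful of concurrent virtual users with plain HTTP requests from Python's standard library. It illustrates the protocol-level idea only: the target URL is a placeholder, and an IIOP-based application would need a tool that can actually record and replay IIOP traffic.

```python
# Illustration only: a few concurrent "virtual users" hitting a server at the
# protocol (HTTP) level. No windows, no objects -- just requests on the wire.
import threading
import time
import urllib.request

TARGET_URL = "http://localhost:8080/search"  # hypothetical endpoint
VIRTUAL_USERS = 5
REQUESTS_PER_USER = 10

def virtual_user(user_id: int) -> None:
    """Send a series of requests and report the response time of each."""
    for i in range(REQUESTS_PER_USER):
        start = time.time()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=5) as response:
                response.read()
            elapsed_ms = (time.time() - start) * 1000
            print(f"user {user_id} request {i}: {elapsed_ms:.0f} ms")
        except OSError as exc:
            print(f"user {user_id} request {i}: failed ({exc})")

threads = [threading.Thread(target=virtual_user, args=(u,)) for u in range(VIRTUAL_USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```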

There Is a Business/IT Divide
Evaluations tend to be skewed in favor of macro-level criteria if the consultant has an IT background. Those with actual automated test script development backgrounds tend to favor the micro-level, script-development criteria instead.

About the author

Jayan Kandathil

Jayan Kandathil is a consultant based in Ottawa, Canada. He has spent the last seven years in various capacities as manual tester, systems engineer, test automation specialist, test team lead, QA manager, and test tools consultant in both the United States and Canada. He is certified as a Software Quality Engineer (CSQE) by the American Society for Quality. Email him at jayan.kandathil@eastmansoftware.com.
