Including a testing/QA component early in a software project necessarily prolongs the schedule, right? Not so, according to Ross Collard. In this, the second of a three-part series, Collard explains how overlapping test phases and emphasizing quality and testing early in the lifecycle can save time and trouble.
In the first article in this three-article series, I argued that speed doesn't necessarily sacrifice quality. Software can be developed faster and better, with the help of efficient testing and quality assurance activities. I listed the following ways of reducing the test cycle time and speeding the overall delivery process:
- managing testing like a "real" project
- strengthening the test resources
- improving system testability
- getting off to a quick start
- streamlining the testing process
- anticipating and managing the risks
- actively and aggressively managing the process
The first article in this series discussed points 1 and 2 in the above list. This, the second article, will discuss points 3-5. Later, the third article will finish the list, discussing points 6 and 7.
3. Improve System Testability
Build a cleaner system in the first place, so that fewer test-debug-fix rework cycles are needed. Several well-known quality practices help developers build cleaner systems: requirements modeling, component reuse, defensive programming, code inspections, unit testing, and source code control.
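Two of these practices, defensive programming and unit testing, can be sketched together in a few lines. The function below and its test are hypothetical illustrations, not code from any particular system:

```python
# A hypothetical parsing function written defensively: it validates its
# input up front instead of letting bad data propagate downstream.
def parse_port(value):
    """Convert a string to a TCP port number, rejecting invalid input."""
    if not isinstance(value, str) or not value.isdigit():
        raise ValueError(f"port must be a string of digits: {value!r}")
    port = int(value)
    if not 1 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

# A unit test exercising the normal path and both guard clauses.
def test_parse_port():
    assert parse_port("8080") == 8080
    for bad in ["not-a-port", "0", "70000", 8080]:
        try:
            parse_port(bad)
        except ValueError:
            continue
        raise AssertionError(f"accepted invalid input: {bad!r}")

test_parse_port()
```

Code written and tested this way tends to fail early and loudly during unit testing, rather than late and mysteriously during system testing.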
Establish test entry criteria. Obtain consensus from the project manager and developers on the criteria that will be used to determine whether the system is ready to test and the testware is ready.
Run a smoke test, and do not accept the system for testing until it passes. Waiting until the smoke test passes may appear to delay the start of testing, but, if handled right, publicizing the fact that a smoke test will be run can spur the developers to greater efforts to meet the test entry criteria.
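As an illustration, a smoke test can be as simple as a script that runs a handful of quick, shallow checks and gates acceptance of the build on the result. This is a minimal sketch; the named checks are hypothetical placeholders for whatever "does it start at all" probes fit a given system:

```python
# A smoke test is a short battery of quick, shallow checks: does the build
# start, do the core functions respond at all. Each check should take
# seconds, not minutes.
def smoke_test(checks):
    """Run each named check; return (passed, failures) so the result can
    gate acceptance of the build into formal testing."""
    failures = []
    for name, check in checks:
        try:
            if not check():
                failures.append(name)
        except Exception:
            failures.append(name)
    return (len(failures) == 0, failures)

# Hypothetical checks; real ones would ping the actual build.
checks = [
    ("config file parses", lambda: True),
    ("database reachable", lambda: True),
    ("login screen loads", lambda: False),  # any failure blocks entry
]
ok, failed = smoke_test(checks)
print("accept build" if ok else f"reject build, failed: {failed}")
```

Because the verdict is mechanical, there is no room to argue a build into test that cannot even pass these checks.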
Increase the early involvement of testers in the project. The testers need to climb the learning curve early and have enough time to master the system, determine how to test it, and prepare test cases and the test environment. The testers should actively participate in developing the overall project work plan.
This means that the testers need to be involved from the very beginning of the project. If this does not happen, people who are relatively ignorant about testing will commit the testers to unreasonable deadlines. The testers also will not buy into the dates that are externally (and perhaps arbitrarily) imposed on their testing efforts.
Ensure that the testers have a thorough understanding. They need to understand:
- the project goals and success factors
- the project context
- the system functionality
- the system risks and vulnerabilities
- the test equipment, procedures and tools
Use design for testability (DFT) reviews to instrument the system being tested and place probes into it, increasing observability. DFT is intended to give black-box testers access to the hidden internal behavior of the system (behavior that can be very short-lived).
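One common way to implement such a probe, sketched here under the assumption of a Python system, is a test-only wrapper that records the arguments and results of internal calls. The `apply_discount` function is a hypothetical stand-in for short-lived internal behavior a black-box tester could not otherwise observe:

```python
import functools
import logging

logging.basicConfig(level=logging.DEBUG, format="%(message)s")
probe_log = logging.getLogger("dft.probe")

def probe(func):
    """A test-only instrumentation point: records the arguments and result
    of an internal call so transient behavior is visible to testers."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        probe_log.debug("PROBE %s%r -> %r", func.__name__, args, result)
        return result
    return wrapper

@probe
def apply_discount(price, rate):
    # An internal, short-lived computation: only the final invoice total
    # is visible from outside, not this intermediate value.
    return round(price * (1 - rate), 2)

apply_discount(100.0, 0.15)  # the probe logs the call and its result
```

In production builds the probes can be disabled or compiled out, so the instrumentation costs nothing once testing is complete.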
Encourage a common system architecture and component reuse. Though these areas are not generally the primary concern of testers and QA analysts, a common architecture across a family of systems and planned component reuse can drastically shorten the test time and the overall development cycle time.
The development teams may be too busy or too partisan and involved to manage the overall architecture and systems framework, and similarly too busy or too partisan to manage a reusable component library. A better place to manage these activities may well be the QA group.
Stabilize the system being tested as early as possible. Place the system to be tested under change control and version control, and establish a cutoff date beyond which no changes are allowed except emergency showstopper fixes (the code freeze date).
Set the cutoff early, even if this means reducing functionality, in order to allow a decent period of time after