Many automation projects have windows of time that are deemed more critical than others. As a byproduct, project members may pay attention to test automation and its returns only during those periods, rather than focusing on the returns over the life of the implementation. While there are serious pitfalls to this "timeboxed" view of automation, Dion Johnson describes some situations where it may be acceptable and even necessary.
What I'm about to write may initially seem insane, but here goes: Although we constantly rail against it, low test-automation return on investment (ROI) is not always a bad thing.
I know what you're thinking. "This guy has lost his mind! He's drunk with power!" Just hear me out before you make up your mind.
To explain my crazy talk, I will borrow from the concept known as timeboxing. A timebox is a set window of time given to accomplish a set of tasks. In agile development, teams are tasked with producing new software releases, timeboxed to a specific number of weeks. The concepts in this article are not specific to agile, however, so some liberties will be taken with the concept of a timebox.
Given that most projects have periods of time that are deemed more consequential, more important, or at least faster-paced than others, for the purposes of this article I will define a timebox in terms of key snapshots in time, which may or may not coincide with a project's official timeboxes.
Mature organizations with well-defined processes generally focus on how they operate over the life of a project, while less mature organizations give significantly more attention to these timeboxed periods of a project. Working as an automator in a mature organization is the ideal situation. I once did a long-term contract for a relatively mature organization, and it was bliss. I functioned as the lead automation architect in charge of a team that designed and implemented a highly structured keyword framework, and we successfully implemented that framework for years with high returns on investment. In the time since I left the contract, it has continued to grow and thrive, spreading to other divisions within the organization and being ported to new tools and technologies. It was definitely one of the highlights and greatest success stories of my career.

Not every automation experience follows this story line, however, and too often we have a problem adapting to the reality with which we are presented. We attempt to force a mature test automation framework down the throat of a relatively immature, timebox-focused organization and then wonder why we are met with fierce resistance.
To understand why test automators often have this issue, let's turn to the basic ROI formula:

ROI = (Gain − Investment) / Investment
The investment is an expenditure of money, time, or effort with the expectation of future benefits, while the gain is the result of that investment. The lower the investment and/or the higher the gain, the better the ROI. ROI is tricky, however, because its outcome largely depends on the time period over which you calculate the investment and gains. Skilled automators are innately programmed to design and implement automated test frameworks that will produce a positive ROI over an extended period of time (EP ROI). This means they start tabulating costs and benefits from an early point in the automation implementation and continue tabulating to the present, and even to some date in the near future. Conversely, test automation ROI can be viewed with a timeboxed mentality that counts only the investments and gains within a specific timebox (TB ROI). This approach may produce some quick returns, but it will more than likely greatly reduce EP ROI, due to excessive maintenance during slow periods and a lack of reliability during subsequent timeboxed periods, especially as the functionality grows in size, complexity, and rate of modification.

To an automator, it seems fairly cut-and-dried: make the early investment in your automation framework for an ultimately high EP ROI. Why would anyone not understand this? Well, depending on the goals or maturity level of the organization, the only part of this explanation that project members will understand or care about is the part about the quick returns. So what we as automators need to do is recognize that sometimes this mindset is acceptable. Sure, implementing a mature framework would be ideal, but sometimes we need to be able to accept that which is acceptable. Let's take a moment to talk about "ideal," "acceptable," and "unacceptable."
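To make the TB ROI versus EP ROI distinction concrete, here is a minimal sketch using the basic ROI formula. The per-period gain and cost figures are invented purely for illustration; they stand in for a project whose critical timebox shows a strong return while the slow periods are dominated by script maintenance.

```python
# Hypothetical illustration: the same automation effort, measured over
# different accounting windows, yields very different ROI figures.

def roi(gain, investment):
    """Basic ROI: (gain - investment) / investment."""
    return (gain - investment) / investment

# Invented per-period numbers (say, staff-hours saved vs. spent).
# Period 0 is the critical timebox; periods 1-3 are slow periods
# dominated by maintenance of the automated scripts.
gains = [120, 10, 10, 15]
costs = [40, 30, 35, 30]

# Timeboxed view: count only the critical period.
tb_roi = roi(gains[0], costs[0])        # (120 - 40) / 40 = 2.0

# Extended-period view: count everything to date.
ep_roi = roi(sum(gains), sum(costs))    # (155 - 135) / 135, about 0.15

print(f"TB ROI: {tb_roi:.2f}")  # looks great inside the timebox
print(f"EP ROI: {ep_roi:.2f}")  # maintenance drags down the long view
```

The same investment reads as a 200% return or a roughly 15% return depending solely on which window you tabulate, which is exactly why automators and timebox-focused project members talk past each other.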
Ideal: We already know that the ideal situation is a mature framework that produces a high EP ROI.
Acceptable: High TB ROI, low EP ROI (Figure 3): While this is not ideal, this is the nature of how some organizations work. Therefore, when introducing test automation, you may have to do it within this mode of thinking. This situation is marked by quick automation with relatively high TB ROI and excessive maintenance between timeboxed periods that drives down the EP ROI. As long as the ROI is good during crucial moments, it doesn't necessarily matter how much time is wasted during other times on the project. Why? Without automation, that time would be wasted anyway.
Acceptable: High initial TB ROI, low EP ROI: This is even less ideal than the first "Acceptable" example, because it is marked by quick automation with relatively high ROI during the initial timeboxed period. But not only does it have excessive maintenance between timeboxed periods, it also has excessive maintenance in subsequent timeboxed periods. This may occur because changed or new functionality may be presented within the subsequent timebox. In addition, the application under test may undergo platform, domain, data, or object property changes within the subsequent timebox. As a result, the automated tests may need to be scrapped and recreated within the subsequent timeboxes. If, however, the tests are relatively simple, script development is relatively quick, and returns are relatively high within each timebox, then this less-than-ideal approach can be justified.
Unacceptable: Low TB ROI, low EP ROI (Figure 4): If both the TB ROI and EP ROI are low, then what's the point? Sadly, this is often the reality of timebox-focused automation. The environment is so dynamic that, absent a robust automated test framework, the automated tests can't keep up with the changes. In this situation, you may want to focus on automating processes as opposed to automating tests. (For more information, read "Is Your Automation Agile or Fragile," Automated Software Testing Magazine, May 2009.)
Unacceptable: High initial TB ROI, negative subsequent TB ROI: This typically is a result of overdependence on unreliable, volatile automated test scripts. When automated tests yield a positive ROI in the first timeboxed period, the team may develop too much confidence in the automated tests. So, when the second timebox comes up, they assume automation will successfully cover certain tests, while manual testers can focus on other tests. Instead, the automated tests break to the point of not being very useful when they are needed most, resulting in a heavy loss of productivity within the timeboxed period.
In the name of maintainability, robustness, and a host of other quality attributes, we as automators can sometimes create frameworks that aren't used enough to justify the necessity of those quality attributes. (From "Is Your Automation Agile or Fragile.")
So, while it is important for us to continue to move our projects closer to the "ideal," it's also important to realize that we must meet the project where it is in order to take it where it needs to be. This means understanding that a project's move toward process maturity is itself a process, and that the work isn't going to wait for that maturation to complete. Given some time and modest successes, the organization will naturally move in the direction of maturity and greater returns. As long as you can effectively manage expectations surrounding the timeboxed, "acceptable" solution, it's sometimes acceptable to think inside the box—or timebox—when it comes to our automated testing, if only for a short while.