What impact does cloud computing have on software testing? Matt Heusser takes a look at some of the opportunities, such as virtualization and distributed computing, and challenges associated with testing in the cloud.
Unless you've been under a rock, you've likely heard of cloud computing. But, you might not have a clear picture of how it impacts you as a tester.
Oh, sure, the impact is clear for programmers: The services they build will be different. Likewise, there is a clear change for system administrators and the folks in operations. In testing, we mostly get to hear slogans and feel anxiety.
But, cloud computing offers new forms of tools that can be valuable for testers. We should learn to use them.
What Is the Cloud?
Ten years ago, you could go to www.google.com and get results. You didn't exactly know where the Google web server was. Google was running a number of server farms, and you didn't care which server provided your request. For that matter, you might search on something else five seconds later and hit a different machine.
In that sense, the cloud has been a reality for years. If we count Telnet and email, it's been a reality for decades.
Yet, in another sense, the cloud era is very new. Now, we get to build cloud-based services for ourselves. Just as Google spread its work across 10,000 computers, we testers can, in some ways, spread out our own work.
This brings two very different ideas to the masses. The first is virtualization, which allows us to combine the services of many computers onto one physical machine. The second is distributed computing, harnessing a large number of computers as one.
In the bad old days, we used to have a thing called a test lab. It took up hundreds of square feet of space and contained between five and twenty computers. Each computer had different combinations of system software. We would have to install, test, and then possibly re-image the machine for the next test run.
Even if we only had to test different browsers, we'd likely need two different computers. Add different browsers and different versions of software—say, Microsoft Word to test cut and paste—and you'd be introducing a combinatorial explosion.
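The arithmetic behind that explosion is easy to sketch in Python; the browser, operating-system, and Word-version lists below are invented for illustration:

```python
from itertools import product

# Hypothetical test matrix: each new axis multiplies the configuration count.
browsers = ["IE6", "IE7", "Firefox 3", "Safari 4"]
operating_systems = ["Windows XP", "Windows 7", "OS X"]
word_versions = ["Word 2003", "Word 2007"]

configurations = list(product(browsers, operating_systems, word_versions))
print(len(configurations))  # 4 * 3 * 2 = 24 distinct setups to cover
```

Add one more axis, say three screen resolutions, and the count triples to seventy-two.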
Today, you can use virtualization software like VMware Fusion (plus a big hard drive) to get all those systems running on one computer. Right now, for example, I have three versions of Internet Explorer and three versions of Firefox available on my MacBook, plus Safari running native on the Mac. Re-imaging a “clean” device takes a half-dozen clicks and a thirty-second pause.
Likewise, if you are testing server software, getting test equipment can be expensive. Even one server per tester can be prohibitive, but several servers in order to check critical fixes? Forget about it.
Once again, virtualization comes to our rescue, with the ability to run several instances of the server software on one physical device. OpenVZ, for example, is a Linux virtualization tool with a scriptable interface. Combining the scripts with your install could make creating a test environment a one-line command. (We used OpenVZ at my last position at Socialtext, where building an environment was literally as easy as running “vest -v boxname -r buildname”).
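A minimal Python sketch of that kind of one-command wrapper, assuming the standard OpenVZ `vzctl` subcommands (`create`, `start`, `exec`); the container ID, OS template, and the `install-build` step are placeholders for your own environment, and by default the sketch only prints the commands rather than running them:

```python
import subprocess

def provision(ctid, template, build, dry_run=True):
    """Create, boot, and load a build into a fresh OpenVZ container."""
    steps = [
        ["vzctl", "create", str(ctid), "--ostemplate", template],
        ["vzctl", "start", str(ctid)],
        ["vzctl", "exec", str(ctid), "install-build", build],  # hypothetical install script
    ]
    for cmd in steps:
        if dry_run:
            print(" ".join(cmd))        # show what would run
        else:
            subprocess.check_call(cmd)  # actually provision the container
    return steps

steps = provision(101, "centos-5-x86_64", "nightly-1234")
```

Flip `dry_run` to `False` on a real OpenVZ host and you have the one-line environment build described above.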
We can get even more benefits by adding distributed computing into the mix.
Consider, for example, the typical GUI-driven test-automation strategy. Over time, the test programs tend to take longer and longer. Eventually, something happens—a test run stops in the middle for no apparent reason, developers check in fixes during a test run, or the lead manager doesn't want to wait forty-eight hours for the two-year-old test suite to finish. Suddenly, we have a problem.
Now, it's time for distributed computing to come to our rescue. Imagine splitting the test suite into a dozen tests (or a hundred) and spooling up fifteen computers, each to take a test, report back with results, and then take another test. If you max out a virtualization tool, you can rent space from a cloud-hosting provider for pennies per processor-hour. A test run only takes as long as your slowest test.
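A toy version of that dispatch loop: a shared queue of test names and fifteen worker threads standing in for rented machines, each pulling a test, recording a result, and pulling another. The test names and the `run_test` stub are invented for illustration:

```python
import queue
import threading

def run_test(name):
    return (name, "pass")  # stand-in for actually executing one test

def worker(tests, results):
    # Each "machine" keeps taking work until the queue is empty.
    while True:
        try:
            name = tests.get_nowait()
        except queue.Empty:
            return
        results.append(run_test(name))

tests = queue.Queue()
for i in range(100):
    tests.put(f"test_{i:03d}")

results = []
workers = [threading.Thread(target=worker, args=(tests, results))
           for _ in range(15)]
for w in workers:
    w.start()
for w in workers:
    w.join()

print(len(results))  # all 100 tests completed, spread across 15 workers
```

In a real setup each worker would be a separate rented machine rather than a thread, but the scheduling idea is the same: the run takes only as long as the slowest single test plus dispatch overhead.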
Or, consider model-driven test automation, which defines input and valid paths and then takes random walks through the software, looking for inconsistencies. Imagine using cloud computing to do a model-driven test run on a new build over lunch—not overnight.
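Here is the skeleton of such a random walk, against a made-up two-state login model; real model-based tools are far richer, but the loop is the same: pick a legal action at random, take it, and check an invariant after every step:

```python
import random

# Hypothetical model of a login screen: each state maps legal actions
# to the state they should lead to.
MODEL = {
    "logged_out": {"login": "logged_in", "retry": "logged_out"},
    "logged_in":  {"logout": "logged_out", "view_profile": "logged_in"},
}

def random_walk(steps, seed=42):
    rng = random.Random(seed)
    state = "logged_out"
    path = [state]
    for _ in range(steps):
        action, state = rng.choice(sorted(MODEL[state].items()))
        # Invariant check: the software should never land somewhere unmodeled.
        assert state in MODEL, f"walked into an unmodeled state: {state}"
        path.append(state)
    return path

path = random_walk(1000)
print(len(path))  # 1001 states visited: the start plus one per step
```

Spread thousands of these walks across rented machines and an overnight job becomes a lunchtime one.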
Is it just me, or is the future looking pretty bright? But wait, there's more!
Sure, you can rent space and use your own tool, but there is also a host of companies offering tools to do it for you with cloud computing. These range from load testing to functional testing to automatic backup. If you've ever wanted to press a single button and have dozens (Hundreds! Thousands! Tens of thousands!) of computers from all over the world attack your website, you can hire someone to do that. Just like tapping into your electric grid, if your socket is big enough, you can have a lot of power—you've just got to pay for it.
The cloud also enables a new sort of software: zero-install applications delivered through your browser. While these might not be considered classic “cloud computing” platforms, they still mean more and more organizational data is being shared on a bunch of computers "somewhere over the Internet." This means we'll be able to access more and more tools without an installer and share those tools and their data when we are at home, on the road, or even on our smart phones.
Limits and Challenges
Cloud computing has also made some huge promises on the development side. One of those promises is that as usage ramps up, your company will automatically be able to spool up new servers and redirect requests to the new servers. That means you'll need some sort of monitoring of performance and system resources, as well as some code—or to be able to rent it from a cloud-provisioning vendor.
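The decision at the heart of that monitoring code can be sketched in a few lines; the per-server capacity figure and the target utilization below are invented for illustration:

```python
def scaling_decision(requests_per_sec, capacity_per_server, servers,
                     target_utilization=0.6):
    """Return how many servers to add (positive) or retire (negative)."""
    # Servers needed to keep each one at roughly the target utilization.
    needed = max(1, round(requests_per_sec /
                          (capacity_per_server * target_utilization)))
    return needed - servers

print(scaling_decision(900, 100, 10))  # 900/(100*0.6) = 15 needed, so add 5
```

Testing this logic means probing the edges: what happens when load spikes faster than servers can boot, or when a mis-read metric tells the system to retire servers it still needs?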
Likewise, as a tester, you'll want to ask, "What happens if the cloud fails?" This is no idle question; a recent Amazon.com Elastic Compute Cloud outage brought down a half-dozen name-brand services, including Reddit and Foursquare. The bad press appeared in mass-media outlets including CNN and The Economist. Many Web 2.0 companies have decided to live with apparently random downtime for apparently random intervals; will your company be one of them?
You'll also want to worry about backups. What happens if the cloud service fails and restores from a backup that is an hour old? A day? A week? What if some of the data is just “lost”? Who owns it? How much risk exposure will your company take on, and how can you mitigate that risk? This problem has happened to me and may happen to you, and these are the open-ended, investigative questions that testers can help answer.
In addition to reliability, cloud computing faces another challenge: security. True "cloud computing" generally involves renting servers from an outsourced provider. That means migrating your data to the provider, which raises questions of trust, security, and, in some cases, federal regulation. Some of these are compliance issues, some legal, but a good tester who can analyze and communicate risks can be invaluable to this process.
Finally, make no mistake: A cloud conversion will cost time and money. Building test tools will take time that could be spent building new features; changing the software itself will cost even more money. Renting servers will create an ongoing expense. And, at a hundred computers at a time, a command-line mistake found overnight could cost a fair bit of money.
The Bottom Line
We don't need to rewrite all of our applications for testers to start seeing value from cloud computing. If you have to support multiple browsers, you could start today. At the same time, if you want to make a bigger move into cloud computing, you can—you'll just have to pay for it.