Getting Your New Web Test Automation Up and Running

Summary:
So you have the responsibility of a new team and getting an entirely new web automation test infrastructure up and running. Here are the hurdles, pitfalls, and successes one QA director encountered, along with the milestones the team defined to measure success, how they migrated their existing manual tests, and the path they took to establish the new web test automation initiative.

You’re excited about implementing a new web test automation initiative, but suddenly, it hits you: Where the heck do I start? Do I just start writing tests? What automation tool will I use? Should I set up some kind of infrastructure? Do I write tests locally on my machine and then port the environment over to some staging environment? What hurdles should I consider before I move forward? There are so many factors to think about.

Before you take a step forward, let’s just take a step back and consider what, exactly, you want to accomplish.

Test automation is not a new concept. There are numerous resources out there that discuss its pros and cons, as well as many different approaches to achieve successful test automation infrastructure.

Let me walk you through a scenario from when I was given responsibility for a new team and for getting its web automation test infrastructure up and running. The end goal was defined, but it was completely up to me to decide what path I would take to get there.

Here, I'll discuss the hurdles, pitfalls, and successes I encountered on my journey to build a new web automation test infrastructure at my company, along with how we migrated our existing manual tests. Hopefully, when all is said and done, you can use my experience to make your own process more efficient.

These are the milestones I defined to measure success and the path my team took to establish our new web test automation initiative.

Do Your Research

Like any other big task, you always want to start by doing your due diligence and researching all of the tools necessary.

The first issue to consider was what tool we would use. Is it scalable? How’s the maintenance? Is it something that can fit into the team’s existing ecosystem? What would be the learning curve for those who would maintain the automated tests? What about the existing development team’s infrastructure—does it integrate with that? And what are we going to do about reporting? We had to consider the team's familiarity with existing tools within the company, as well as who would maintain the tests, both short-term and long-term. We went over the same points when choosing the scripting language we'd write in.

After considering many factors, we decided to use an API testing tool for web automation testing and an analytics platform for reporting purposes. This combination addressed the majority of our questions, was easy to use, and didn’t require prior knowledge of any programming language.

Every company, every team, and even every individual will have a different set of questions to answer before moving forward, but the main point is to answer as many of your questions up front as you can in order to reduce the bottlenecks you may encounter later.

Define the Scope and Coverage of the Tests

Next, what should you define as the scope of tests to automate? Don’t be that person who tries to automate everything. These are web functional tests, so you have to focus on the high-traffic areas and most commonly used parts of the application’s web interface to get the most value out of your automated tests.

Because the application under test (AUT) was new to me, I had to work with both developers and QA to understand the current test cases and manual smoke test procedure. Their existing manual test cases were at a higher level, for exploratory testing, so the QA engineer couldn’t just point me to obvious test cases for automation. It was a constant collaboration in every sprint—and even, at times, in our daily stand-ups—to make sure we had the coverage we wanted to automate. Once the scope was defined, we then prioritized the coverage areas so I knew exactly what to work on first.

This is a good rule of thumb: Even if you know the application, you should always work in collaboration with the existing team when defining the scope.

Create and Maintain Automation Tests

With the infrastructure set up and both scope and priorities defined, I could finally begin creating the automated tests. I was excited to get to write my first set of automated tests.

For this project, I started by using the browser playback feature to get a good understanding of my API testing tool, then easily moved on to editing existing browser playback tests and creating my own. I’m not embarrassed to say that my first few tests were not implemented in an ideal manner. But that’s how we all learn—by trial and error.

My initial tests were so dependent on the environment that they could only be executed in a specific sequence, with no setup or teardown as part of each test. This obviously made them harder for other team members to maintain and troubleshoot.

We started using the tool's built-in capabilities to set up and tear down tests, reuse existing tests (such as shared tests that run as subsets of another test), and parameterize them so they were portable across different environments. It was easy to integrate REST API tests within our web automation functional tests, which made populating prerequisite data a heck of a lot easier. A single set of tests could be executed against different browsers seamlessly.
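To make the parameterization idea concrete, here is a minimal Python sketch of how a test suite might resolve its target environment from a variable instead of hard-coding URLs. The environment names, URLs, and the TEST_ENV variable are all hypothetical—real tools expose this through their own configuration—but the principle is the same one we relied on.

```python
import os

# Hypothetical environment table -- the names and URLs here are
# purely illustrative, not from the actual project.
ENVIRONMENTS = {
    "local": "http://localhost:8080",
    "staging": "https://staging.example.com",
}

def base_url(env=None):
    """Resolve the base URL from an (assumed) TEST_ENV variable so the
    same tests run unchanged against any environment."""
    env = env or os.environ.get("TEST_ENV", "local")
    return ENVIRONMENTS[env]

def page_url(path, env=None):
    """Build a full page URL from a path relative to the environment root."""
    return base_url(env).rstrip("/") + "/" + path.lstrip("/")
```

With this in place, switching the entire suite from local to staging is a one-variable change rather than an edit to every test.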

Occasionally we ran into a browser-specific issue, such as being unable to perform a click action on an element that is not visible. But the tool's built-in support for different wait conditions, its ability to execute arbitrary JavaScript, its rich documentation, and its active user forum became a savior to us.
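The wait-condition idea boils down to polling until the page is ready rather than clicking immediately. Here is a small, tool-agnostic sketch of an explicit wait; the function name and defaults are my own, not any particular tool's API.

```python
import time

def wait_until(condition, timeout=10.0, poll=0.25):
    """Poll `condition` until it returns a truthy value or the timeout
    expires -- the same idea behind an automation tool's built-in
    explicit waits (e.g., waiting for an element to become clickable)."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:
            return result
        time.sleep(poll)
    raise TimeoutError(f"condition not met within {timeout:.1f}s")
```

In Selenium terms, for example, `WebDriverWait(driver, 10).until(...)` with an expected condition plays this role, and `driver.execute_script("arguments[0].click();", element)` is one way to click an element that the standard click action rejects as not visible.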

Publish the Results Transparently

The last goal I had identified was the reporting aspect of the test results. Here, it’s all about visibility. This wasn’t some secret formula I concocted and wanted to keep to myself. On the contrary, I wanted everyone to be aware of the results so that the whole team was responsible for maintaining the tests.

I set up test results to be reported into a reporting and analytics platform. I was able to easily create a dashboard with multiple gadgets to display the test results, so they could be displayed on the big TV screen within our development department. This way the truth was clearly visible.
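For context, most reporting and analytics platforms can ingest JUnit-style XML, which makes it a convenient bridge between a test runner and a dashboard. Below is a hedged sketch of producing that format from raw results; the suite name and result tuples are invented for illustration.

```python
import xml.etree.ElementTree as ET

def results_to_junit(results):
    """Serialize a list of (test_name, passed, seconds) tuples into
    JUnit-style XML, a de facto format most reporting dashboards accept."""
    failures = sum(1 for _, ok, _ in results if not ok)
    suite = ET.Element("testsuite", name="web-smoke",
                       tests=str(len(results)), failures=str(failures))
    for name, ok, seconds in results:
        case = ET.SubElement(suite, "testcase", name=name,
                             time=f"{seconds:.3f}")
        if not ok:
            # A <failure> child marks the case as failed on the dashboard.
            ET.SubElement(case, "failure", message="assertion failed")
    return ET.tostring(suite, encoding="unicode")
```

Publishing a file like this after each run is usually all a dashboard gadget needs to chart pass rates over time.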

The only way we would benefit from this method of reporting down the line was if we kept the test results at 100 percent passing; otherwise, it would just be noise that no one cared about. Before I even started, I had established with development that keeping the tests maintained would be a team goal. They all agreed, and now, when I walk into the office every morning, I can easily look up and see where we stand with the results from the previous run.

Everyone Succeeds Together

Getting everything completed was never meant to be a one-person job—nor did I want it to be. It took a lot of collaboration and support from the team, including management.

One thing I learned is that you must stay on top of the tests. Keep them maintained and passing at 100 percent. Your automated tests are like a living organism that has to be looked after on a daily basis.

Do your research before diving into the project and you’ll be able to address some of the bottlenecks ahead of time, and do not hesitate to optimize your tests. All in all, this was a great learning experience for me, and I look forward to being thrown into another team and repeating the same process to establish an effective, transparent, and collaborative web test automation initiative.


StickyMinds is a TechWell community.
