As a software development engineer in test, or SDET, I participate in both the development of software and its testing. It is fascinating and exciting for me to see how many software organizations today are adopting test automation as part of their agile and DevOps journey.
As I see it, automated testing is not just an option to be explored anymore—it is a necessity. And this trend will continue to rise. Testing early and often with automation in the product lifecycle is not only helping us to find bugs faster, but also saving time and cost to a large extent. As a result, organizations are expecting their SDETs or test automation architects to design and develop robust, maintainable, intelligent test automation frameworks.
However, many test automation initiatives fail due to a lack of vision. SDETs need to ask themselves: Is the test automation framework I am developing reusable, maintainable, configurable, understandable, and scalable?
More often than not, when a request comes in to implement a framework from scratch or to accommodate a new requirement in an existing framework (sometimes within a short time frame), we forget the best coding practices in our rush to implement the changes quickly. We go ahead and make the changes as fast as possible, without any proper documentation.
As a result, the changes get merged to the master branch, and as the framework grows, this becomes a common practice among the other team members too. When the time comes to scale the automation framework to cover more projects, with requirements to add more functionalities or to migrate from a tool used in the framework to a new tool, it turns into massive technical debt for the entire team. This results in the team having to spend more time understanding and debugging the code, and more rework due to poor design and perpetuating antipatterns.
When developing an automation framework, we need to treat it like any other application development project and write its code as production code.
Here are three of the most important clean coding practices to keep in mind in order to build a scalable test automation framework.
Include Proper Documentation
It’s highly unlikely that a programmer working on a test automation framework will be writing the code alone. In an environment where automation code is added and updated by multiple people, proper documentation will not only help you organize your own code, but also help your peers understand what your code is actually trying to do. Later on, if you leave the team or someone wants to add functionality on top of your code, it will be easy for them to debug, update, perform unit testing, and analyze the results.
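As a minimal sketch of what documented automation code might look like, here is a hypothetical retry helper of the kind many frameworks carry for flaky UI or network steps. The helper name and its arguments are illustrative, not from any particular framework; the point is that the docstring tells the next engineer what the function does, what it expects, and what it raises:

```python
import time


def retry(action, attempts=3, delay_seconds=0.1):
    """Run `action` up to `attempts` times, pausing between tries.

    Flaky end-to-end steps are common in automation, so a documented,
    reusable retry helper saves every caller from writing their own loop.

    Args:
        action: A zero-argument callable; its return value is passed through.
        attempts: Maximum number of tries before giving up.
        delay_seconds: Pause between consecutive tries.

    Returns:
        Whatever `action` returns on its first successful call.

    Raises:
        The last exception raised by `action` if every attempt fails.
    """
    last_error = None
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except Exception as error:  # deliberately broad: retry any failure
            last_error = error
            if attempt < attempts:
                time.sleep(delay_seconds)
    raise last_error
```

A colleague who has never seen this code can tell from the docstring alone how to call it and what to expect on failure, without stepping through the loop in a debugger.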
In one of my previous projects, a colleague received a requirement to integrate a new test management tool into a test automation framework that already had another test management tool. Though the new implementation was simple and straightforward, it took him a lot of time just to debug and understand how the existing tool was integrated into the framework, all because of a lack of comments, documentation, and a consistent naming convention.
Avoid Code Duplication
Let’s say a project team that uses your framework has asked you to add new functionality that will allow them to test their web services and generate reports. After the web services coding is complete, you start thinking about writing code to generate the reports. But first, have you considered whether the framework’s existing reporting capability can handle this requirement? If you haven’t, you are setting yourself up for code duplication.
As the framework grows, unnecessary duplication of code will lead to more rework, and maintenance costs will start increasing. It’s much easier to fix a bug in the report generation code in one spot than to comb through the entire framework, debugging and correcting the code everywhere it appears.
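The web services scenario above can be sketched as a single, shared report generator that every suite reuses. The function name, fields, and suite names here are hypothetical; the design point is that there is exactly one place where results become a report, so a fix or a new field is made once:

```python
import json


def generate_report(suite_name, results):
    """Single place where raw test results become a report.

    Every test type (UI, API, web services) funnels through this one
    function instead of rolling its own reporting code, so a change to
    the report format is made here and nowhere else.
    """
    passed = sum(1 for result in results if result["status"] == "pass")
    report = {
        "suite": suite_name,
        "total": len(results),
        "passed": passed,
        "failed": len(results) - passed,
    }
    return json.dumps(report, indent=2)


# Both an existing UI suite and the new web-services suite reuse it:
ui_report = generate_report("ui-smoke", [{"status": "pass"}, {"status": "fail"}])
ws_report = generate_report("web-services", [{"status": "pass"}])
```

Had the web-services work shipped its own copy of this logic, every later reporting change would have to be found and repeated in both places.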
Keep the Code Simple
When coding, try not to complicate things. Keep it as simple (and readable) as possible!
I have observed that many programmers, experienced and novice alike, keep on adding code at a class or function level without quite understanding what message the class or function was supposed to convey in the first place. If you are writing three hundred lines of code at the function level, then you are possibly crossing the danger zone, where code understandability and quality start to fall apart. Similarly, if your class consists of thirty methods, it is probably not serving a single purpose and needs to be fragmented into smaller classes—and, if required, those classes need to be grouped inside different packages.
Your code should closely adhere to the Single-Responsibility Principle, which states that every module, function, or class should have responsibility over a single part of the software functionality. Later on, if you get a new requirement to test a particular functionality, you will know up front about the packages, classes, modules, and functions that will get affected.
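A minimal sketch of the Single-Responsibility Principle in an automation context might split an overgrown "test helper" into small classes that each do one thing. The class names and fixture data below are invented for illustration:

```python
class TestDataLoader:
    """Sole responsibility: supplying test data fixtures by name."""

    def __init__(self):
        # In a real framework this might read from files or a service;
        # an in-memory dict keeps the sketch self-contained.
        self._fixtures = {
            "valid_user": {"username": "alice", "password": "s3cret"},
        }

    def load(self, name):
        return self._fixtures[name]


class ResultFormatter:
    """Sole responsibility: turning a raw result into a display string."""

    def format(self, test_name, passed):
        return f"{test_name}: {'PASS' if passed else 'FAIL'}"
```

If a new requirement only changes how results are displayed, you know up front that `ResultFormatter` is the single class affected; a god class that loaded data, drove the browser, and formatted output would force you to touch (and risk breaking) all of it at once.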
As I matured as an SDET who loves automation, I realized that working on a framework team is not only about taking requirements from different teams and making them work. We also need to apply clean coding principles while programming, right from the very start.
It’s best to review each other’s code and ensure that the best programming practices have been followed before approving and merging the code. This way, we will build the foundation for a robust framework that will be both scalable and maintainable.
Really helpful. Would love to apply these at ZTABS. Thanks for sharing.
Something I would like to add:
In every automated test, document what's being tested in a business-readable language.
Stakeholders like to know what is actually being covered by the framework. It's very hard to prioritize what to automate if nobody even knows what's already automated.
I believe we have different coding practices based on what we want to achieve. You have your product code, but also unit tests. Much has already been written about these differences. Integration/contract/API testing falls somewhere in the middle.
I also think documentation and reusability requests can be detrimental to a project. They are both liabilities; you need to focus on creating only the right abstractions and the needed documentation. Unfortunately, when you first start on a greenfield project, you don't know what that is. The person who tends to complain about missing documentation is often also the one not reading or contributing to the existing docs. In your example, had everything been documented, it might have taken even longer: reading through the docs trying to grasp the system, then diving into the code and trying to grasp how the two relate. Sure, good documentation containing exactly what you need to know would avoid this problem, but you don't know what person X needs to know.
I don't know how many times I've read the docs, couldn't find what I needed, dived into the implementation, and then, when I went to update the documentation, found that not only was it already documented, it was written the way I would have written it, and I had even read it before.