A common question we hear in the test automation arena is "what's new in test automation?" Over the past few decades, automation has evolved rapidly. Ten years ago, those in the industry will remember test automation in a waterfall model, which slowed the pace of execution and presented a variety of challenges. Test cases were designed against finalized requirements and handed to a test automation team, with an estimate of how many test cases were to be automated per day. The expectation was to cover huge regression suites through test automation, since the scope mainly targeted regression testing.
Is it the same case now? Can we really plan to automate tens of thousands of test cases for regression?
Test automation coverage has spread across web applications, ERPs, standalone software, and mobile, to name a few. All the major UI automation tools support a vast variety of application streams. With the advent of CI/CD and agile development, delivery models with a faster time to market (TTM) are coming into force. Critical questions that one must ask include: "Given the challenge of Day 1 or Day 0 automation with little time to react, is it possible to design thousands of regression cases before starting test automation? What should the focus of test automation be? How can teams get ready for the latest test automation processes without compromising on quality?"
Let’s address scope and test coverage first. In the current fast-track development lifecycle, teams do not have time to plan thousands of UI test cases. On the other hand, we cannot compromise on quality. Doing so might put the product’s reputation at stake. So, what should be present in the regression suite? The answer is API/web services.
API and web services testing are not new to the market. Middle-level testing has been in use for some time; what matters most is how effectively we use it.
Test design teams should minimize UI automation. The cost and maintenance burden of UI automation keep increasing, driven by frequent requirement changes and sparse technical documentation. From the time we start automation until the point at which we see its ROI, there will be frequent changes to the UI. In this situation, putting too much effort into the UI can be costly. Instead, it is effective for teams to minimize UI automation to validating the basic flows of E2E use cases and their integration.
The remaining combinations of data validations can be covered through API and web services testing. APIs and web services are very fast, do not change frequently, are easy to automate, and can be handled with many freeware tools. Even in the UI suite that remains, it is better to plan E2E scenarios instead of bits and pieces; this realistically simulates how the end user will actually use the product.
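A minimal sketch of what such an API-level data validation might look like, using only the Python standard library. The `/orders` endpoint, its fields, and the in-process stub server standing in for a real service are all assumptions for illustration:

```python
# Hedged sketch: one API-level regression check against a hypothetical
# /orders endpoint. A tiny in-process HTTP server stands in for the
# real service so the example is self-contained.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubOrdersHandler(BaseHTTPRequestHandler):
    """Illustrative stub; a real suite would hit the deployed service."""
    def do_GET(self):
        body = json.dumps({"order_id": 42, "status": "SHIPPED", "items": 3}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence per-request logging
        pass

def check_order_api(base_url):
    """One data-validation case: status code plus payload field checks."""
    with urllib.request.urlopen(f"{base_url}/orders/42") as resp:
        assert resp.status == 200
        payload = json.loads(resp.read())
    assert payload["order_id"] == 42
    assert payload["status"] in {"NEW", "SHIPPED", "DELIVERED"}
    return payload

server = HTTPServer(("127.0.0.1", 0), StubOrdersHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = check_order_api(f"http://127.0.0.1:{server.server_port}")
server.shutdown()
print(result["status"])
```

Dozens of such data-combination cases can run in seconds, which is what makes the API layer the natural home for the bulk of a regression suite.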
For example, it is better not to write an automation script just to validate the functionality of a Submit button; we have moved past that generation of test automation. If there is an enhancement to a single control, that validation point should be covered as part of a flow, not as a standalone test case.
This way, we can drastically reduce the number of UI automation scripts, which reduces the cost of test automation development and maintenance. The approach is also effective because E2E flows stay very close to the end user's standpoint. The model works equally well for fast development cycles, for lightweight applications such as mobile apps, and even for complex scenarios such as cloud testing.
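The idea of folding a single-control check into a flow can be sketched as follows. The `CheckoutFlow` class is a hypothetical stand-in for a page-object model driving a real UI; the point is that the Submit validation lives inside an E2E journey rather than in its own script:

```python
# Hedged sketch: validating a control (Submit) inside an E2E flow
# rather than as an isolated test case. All names are illustrative.
class CheckoutFlow:
    def __init__(self):
        self.steps = []

    def login(self, user):
        self.steps.append("login")
        return self

    def add_to_cart(self, sku):
        self.steps.append(f"cart:{sku}")
        return self

    def submit_order(self):
        # The Submit-button validation point, covered within the flow.
        assert "login" in self.steps, "must be logged in before submitting"
        self.steps.append("submitted")
        return self

    def order_confirmed(self):
        return "submitted" in self.steps

# One E2E journey exercises login, cart, and Submit in a single script.
flow = CheckoutFlow().login("demo").add_to_cart("SKU-1").submit_order()
print(flow.order_confirmed())
```

One flow script like this replaces several fragmented per-control scripts, which is where the maintenance saving comes from.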
Another area in which test automation is being used aggressively is CI/CD: continuous integration and continuous delivery. Though test automation is primarily meant to test the application against its expected behaviour, automation teams have to think beyond the scope of test validations to certify a build before its release.
Test automation has a key role in integrating an automation suite into code delivery. The expectation is that the automation suite can execute unattended on every code drop, in any environment, and report build failures and successes. So, the scope of automation has grown from test validation to completely unattended build certification. Though the code needed to validate a scenario is the same, teams have to think through all the integrations required for unattended execution. Most integration tools on the market support this, and many are free to use, which encourages deployment teams to adopt them.
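An unattended run reduces to a runner that a CI server can invoke on every code drop and a pass/fail verdict it can gate the build on. A minimal sketch, assuming a hypothetical smoke suite (the test contents are illustrative stubs, not a real contract):

```python
# Hedged sketch: an unattended runner a CI server could invoke on
# every code drop. The suite body is a placeholder; a real suite
# would contain the API and E2E checks described above.
import unittest

class SmokeSuite(unittest.TestCase):
    def test_service_contract(self):
        # Stand-in for a real API assertion against the build under test.
        self.assertEqual(2 + 2, 4)

loader = unittest.TestLoader()
suite = loader.loadTestsFromTestCase(SmokeSuite)
runner = unittest.TextTestRunner(verbosity=0)
result = runner.run(suite)

certified = result.wasSuccessful()
# A real CI job would map this to the process exit code so the
# pipeline can fail the build automatically.
print("BUILD CERTIFIED" if certified else "BUILD FAILED")
```

The same runner works on a laptop or inside any CI tool, which is what "execute unattended in any environment" amounts to in practice.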
Finally, what is the cost of maintaining teams, and how should teams be built? Who is available on the market for test automation? Do all automation engineers need strong programming backgrounds? What happens to the people who migrated from commercial automation tools to open source tools?
It is too costly to demand that every automation engineer possess strong programming skills. Engineers who understand the expectation of unattended automation, have the zeal to learn cutting-edge technologies, and bring thorough knowledge of the product's functionality and the end user's perspective are a good start. Because we cannot expect fully-fledged support from open source communities, we still carry the responsibility for automation architecture and design. Thus, automation teams should be supported by a few test architects who drive, architect, design, and review the team's work. Having this in place at an early stage of development helps immensely, and coupled with a structured governance model for design, development, and delivery, it makes a huge difference. This essentially forms a proper pyramid of work control that drives towards higher quality.
Quality Delivery Manager