Meeting the Challenges of Automation Testing

Implementing Performance Testing Under CI/CD for Automation Testing – The challenge for many companies today is that while they experience a good deal of success automating functional testing, they have difficulty automating performance testing. Thus, performance testing is usually left out of the scope of CI/CD. This is mostly because of the complexity and breadth of large-scale performance testing. At some point the test scenario becomes so onerous that the only way to move forward is to do performance testing manually.

Fortunately, improving the breadth and degree of automated performance testing within CI/CD is possible. Most times, all that’s required is for the company to make a few changes in its approach. First, the company needs a well-informed understanding of the difference between functional testing and performance testing. Then, it needs to create a level-based test plan that determines the layers in the application stack most suitable for automated performance testing. Finally, there needs to be a realistic understanding among the company’s technical stakeholders and contributors about the limitations of automated performance testing within a CI/CD process. Once a company changes the way it thinks about automated performance testing and changes some of its behavior around the way it does the testing, improved results will follow.

The sections that follow take a closer look at the thinking described above.

Functional Testing vs. Performance Testing

Functional testing is about making sure that code logic works as expected. Performance testing, on the other hand, focuses on the operational behavior of the code. For example, a functional test will exercise a login component to discover whether a set of test credentials allows access to the expected resources. Either access is allowed or it’s not. Whether it takes 10 seconds for the test to run or 10 minutes, the time to execute is not relevant to the purpose of the test. The same can be said about CPU utilization. As long as the test passes, it doesn’t matter if it takes 10% of CPU capacity or 90%. Functional testing is all about logical behavior.
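To make the distinction concrete, a functional check of that login flow might look like the minimal sketch below. The login() function and the test credentials are hypothetical placeholders, not part of any particular framework.

```python
# A minimal functional-test sketch for the login scenario described above.
# The login() function and test credentials are hypothetical placeholders.

def login(username: str, password: str) -> bool:
    """Stand-in for the real authentication call."""
    return username == "qa_user" and password == "s3cret"

def test_valid_credentials_grant_access():
    # Only the logical outcome matters; elapsed time and CPU load are ignored.
    assert login("qa_user", "s3cret") is True

def test_invalid_credentials_are_rejected():
    assert login("qa_user", "wrong-password") is False
```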

Performance testing is different. Time does matter, as do CPU utilization, network latency and a host of other operational parameters. Going back to the login scenario mentioned above, performance testing a login process will measure the time it takes for a login request/response to complete. In fact, many Service Level Agreements dictate login access time. Thus, performance testing ensures that the process completes in the time expected. Also, performance testing will look at ancillary login metrics relevant to items such as data query execution, CPU utilization, load tolerance and system failure recovery.
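By contrast, a single-point performance check of the same login flow times the request/response and compares it against the agreed threshold. The endpoint URL, payload and 1-second SLA in the sketch below are illustrative assumptions, not values from any specific system.

```python
# A sketch of a single-point performance check for the same login flow.
# The endpoint URL, payload, and 1-second SLA threshold are illustrative only.
import time
import requests  # assumes the requests package is available

SLA_SECONDS = 1.0  # e.g., a Service Level Agreement of 1000 ms for login

def measure_login_latency(url: str, credentials: dict) -> float:
    start = time.perf_counter()
    response = requests.post(url, json=credentials, timeout=10)
    elapsed = time.perf_counter() - start
    response.raise_for_status()
    return elapsed

if __name__ == "__main__":
    elapsed = measure_login_latency(
        "https://example.test/api/login",
        {"username": "qa_user", "password": "s3cret"},
    )
    print(f"login round trip: {elapsed:.3f}s (SLA: {SLA_SECONDS}s)")
    assert elapsed <= SLA_SECONDS, "login exceeded its SLA"
```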

For the most part, functional tests can be automated without much difficulty – the script runs; either tests pass or they don’t. Automating performance testing is a harder undertaking. However, the difficulty of automating performance testing can be mitigated when the testing process takes a level-based approach.

Creating a Level-Based Test Plan for Implementing Automated Performance Testing

As mentioned above, automating performance tests can be a difficult undertaking. For example, going back to the login scenario described above: a test submits the authorization credentials to the login component. Successful execution of the login is supposed to take no longer than 1000 milliseconds. The test results show that the test takes 2000 milliseconds. Clearly the component is not performing to expectation. However, all we really know about is the request/response time at a single access point. We have no real idea about what’s happening further down the stack. In fact, we might not even be sure what the stack is. Yes, we can automate the request/response against that single point, but the reality is, it’s a trivial test. More is required.

The trick to automating performance testing in a meaningful manner is to take a level-based approach. Level-based performance testing is a process by which automated performance tests are executed on components at various levels of the technology stack.

Implementing level-based automation testing involves more than identifying test points and running tests. Before any performance testing takes place, systems and test engineers define exactly what the test stack is, from high-level load balancers and HTTP servers to low-level storage components. Also, the security apparatus and network infrastructure need to be well understood, as do the message queues, caching mechanisms and even the hardware array in play. All of it will come into play at some point, even during a simple login process.
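One way to keep that inventory actionable is to record the stack as data the test harness can iterate over. The level names, components and latency budgets in the sketch below are purely illustrative assumptions.

```python
# An illustrative, data-driven description of the test stack.
# Level names, components, and latency budgets are placeholders, not prescriptions.
TEST_STACK = [
    {"level": "edge",        "component": "load balancer", "budget_ms": 50},
    {"level": "web",         "component": "HTTP server",   "budget_ms": 150},
    {"level": "application", "component": "auth service",  "budget_ms": 300},
    {"level": "messaging",   "component": "message queue", "budget_ms": 100},
    {"level": "caching",     "component": "session cache", "budget_ms": 20},
    {"level": "storage",     "component": "user database", "budget_ms": 200},
]

def print_test_plan(stack):
    # Each entry becomes a candidate test point with its own performance budget.
    for entry in stack:
        print(f"{entry['level']:>12}: {entry['component']} "
              f"(budget {entry['budget_ms']} ms)")

if __name__ == "__main__":
    print_test_plan(TEST_STACK)
```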

After all the components are well known, performance monitoring mechanisms are implemented against the components at the various levels of the stack. Test validity depends on accurate measurement. Adequate system monitoring is essential.
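As a rough illustration, the sketch below samples host-level CPU and memory while a test callable runs. It assumes the psutil package and stands in for the metrics or APM backend a real monitoring setup would use.

```python
# A minimal monitoring sketch: sample host-level metrics while a test runs.
# Uses the psutil package (an assumption); a real setup would feed an APM or
# metrics backend rather than an in-memory list.
import threading
import time
import psutil

def sample_metrics(samples: list, stop: threading.Event, period: float = 0.5):
    while not stop.is_set():
        samples.append({
            "cpu_percent": psutil.cpu_percent(interval=None),
            "memory_percent": psutil.virtual_memory().percent,
            "timestamp": time.time(),
        })
        time.sleep(period)

def run_with_monitoring(test_callable):
    samples, stop = [], threading.Event()
    sampler = threading.Thread(target=sample_metrics, args=(samples, stop))
    sampler.start()
    try:
        test_callable()          # execute the performance test itself
    finally:
        stop.set()
        sampler.join()
    return samples               # measurements to correlate with test results
```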

Once the monitoring infrastructure is set, testing takes place. Performance testing, particularly automated performance testing, is best done in an isolated manner at each level of the stack. Execution behavior can be mocked or emulated to conform to the part of the use case being implemented.
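The sketch below illustrates the idea at the application level: a hypothetical AuthService is timed while its storage dependency is replaced by a mock that emulates a fixed response time, so only the service layer’s behavior is measured. Class names, methods and budgets are assumptions for illustration.

```python
# A sketch of level-isolated performance testing: the service layer is timed
# while its downstream storage is replaced by a mock that emulates a fixed
# response time. Class and method names are hypothetical.
import time
from unittest.mock import Mock

class AuthService:
    def __init__(self, user_store):
        self.user_store = user_store

    def authenticate(self, username, password):
        record = self.user_store.fetch_user(username)
        return record is not None and record["password"] == password

def test_auth_service_latency_with_mocked_storage():
    user_store = Mock()

    def slow_fetch(username):
        # Emulate a storage layer that always answers in roughly 5 ms.
        time.sleep(0.005)
        return {"password": "s3cret"}

    user_store.fetch_user.side_effect = slow_fetch

    service = AuthService(user_store)
    start = time.perf_counter()
    assert service.authenticate("qa_user", "s3cret")
    elapsed = time.perf_counter() - start
    # Budget for the service layer alone, excluding real storage behavior.
    assert elapsed < 0.050
```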

Running short, automated performance test scripts against various levels of the technology stack is a more realistic approach than a top-level assault on the system overall. There are just too many parts in play to be adequately accommodated by a single, high-level approach to performance test automation.
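A short, per-level check could be as simple as the following sketch, which a CI stage could run and fail fast when a budget is exceeded. The probe functions and budgets are stand-ins for real level-specific scripts such as a cache ping or a representative query.

```python
# A sketch of a short, per-level performance check suitable for a quick CI stage.
# The probe functions and budgets are hypothetical stand-ins for real
# level-specific scripts.
import sys
import time

def probe_cache():
    time.sleep(0.002)   # stand-in for a real cache round trip

def probe_database():
    time.sleep(0.015)   # stand-in for a representative query

LEVEL_CHECKS = [
    ("caching", probe_cache, 0.010),     # (level, probe, budget in seconds)
    ("storage", probe_database, 0.050),
]

def run_level_checks(checks) -> bool:
    ok = True
    for level, probe, budget in checks:
        start = time.perf_counter()
        probe()
        elapsed = time.perf_counter() - start
        passed = elapsed <= budget
        ok = ok and passed
        status = "PASS" if passed else "FAIL"
        print(f"[{status}] {level}: {elapsed * 1000:.1f} ms "
              f"(budget {budget * 1000:.0f} ms)")
    return ok

if __name__ == "__main__":
    sys.exit(0 if run_level_checks(LEVEL_CHECKS) else 1)
```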

Performance testing using a level-based approach allows for a good deal of test automation. Still, there are limitations to running automated performance testing under CI/CD. Most of these limitations are due to the intrinsic nature of CI/CD itself.

Understanding the Limits of Automated Performance Testing within CI/CD

Continuous Integration/Continuous Deployment (CI/CD) is commonplace in modern software development shops. The CI/CD sensibility is one in which tests execute automatically as soon as new code is committed into a source code repository. Then, automation testing moves the code from test to a staging environment, and eventually on to production. It’s a cornerstone methodology of forward-thinking development organizations. As mentioned earlier, functional testing is well suited to a CI/CD process. The logic of the code is tested. Either the tests pass or they don’t.

However, performance tests have limits when running within CI/CD. Performance tests are particularly sensitive to the details of the given runtime environment. While functional tests can run on a 4-core laptop or a 32-CPU rack server and produce the same results, the same is not true of performance testing. Infrastructure counts. Thus, in order for a performance test to be reliable, the infrastructure in which the tests run must be consistently appropriate to the need at hand. In fact, some performance tests require that the runtime environment be specially provisioned to support the purpose of the tests. Again, the physical environment counts and there are limitations. However, environment is not the only limitation when it comes to running performance tests under CI/CD. Performance test execution time is also an impediment.
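One common safeguard, sketched below with hypothetical names and a made-up results file, is to record basic environment details alongside every measurement so that results gathered on mismatched infrastructure are never compared directly.

```python
# A sketch that captures basic environment metadata with each performance
# result, so runs on mismatched infrastructure are not compared directly.
# The result structure and file name are illustrative assumptions.
import json
import os
import platform

def environment_fingerprint() -> dict:
    return {
        "hostname": platform.node(),
        "machine": platform.machine(),
        "cpu_count": os.cpu_count(),
        "python": platform.python_version(),
    }

def record_result(test_name: str, elapsed_seconds: float,
                  path: str = "perf_results.jsonl"):
    entry = {
        "test": test_name,
        "elapsed_seconds": elapsed_seconds,
        "environment": environment_fingerprint(),
    }
    with open(path, "a") as handle:
        handle.write(json.dumps(entry) + "\n")
```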

Performance tests can take a lot of time, particularly if the test creates a lot of virtual test nodes that execute tests simultaneously. CI/CD is about moving code from the developer environment onward, fast. Waiting around for a long-running, comprehensive, automated load testing process to complete is antithetical to the purpose of CI/CD. But this type of intense performance testing needs to take place. It just can’t happen within the typical CI/CD process. Rather, large-scale performance testing is best done just before code is released to production, outside the CI/CD pipeline. And that performance testing must take place in an environment that is provisioned to be nearly identical to the production environment.
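The sketch below shows the basic shape of such a test: a handful of threads act as virtual users and latencies are summarized afterward. A dedicated load-testing tool handles ramp-up, distribution and reporting far better; the target URL, user counts and credentials here are placeholders.

```python
# A deliberately small load-test sketch: threads act as virtual users and
# latencies are summarized afterward. URL, user count, and credentials are
# placeholders; real tooling is needed for production-scale load tests.
import statistics
import threading
import time
import requests

TARGET_URL = "https://example.test/api/login"
VIRTUAL_USERS = 20
REQUESTS_PER_USER = 10

latencies, lock = [], threading.Lock()

def virtual_user():
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        requests.post(TARGET_URL,
                      json={"username": "qa_user", "password": "s3cret"},
                      timeout=30)
        elapsed = time.perf_counter() - start
        with lock:
            latencies.append(elapsed)

if __name__ == "__main__":
    threads = [threading.Thread(target=virtual_user) for _ in range(VIRTUAL_USERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    latencies.sort()
    p95 = latencies[int(len(latencies) * 0.95) - 1]
    print(f"requests: {len(latencies)}, "
          f"median: {statistics.median(latencies):.3f}s, p95: {p95:.3f}s")
```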

Most CI/CD pipelines don’t support environments that match a production runtime; it’s too expensive. Thus, many companies will use cloud-based testing services to do this sort of mission-critical performance testing. A cloud-based performance testing service provisions testing environments and provides the automation testing tools required to suit the exact needs of the test on a pay-as-you-go basis. Companies can provision a large-scale computing environment intended to run for a few hours. Then, the tests execute and the results are gathered. At the end of the testing session, the environment is destroyed and billing stops. The costs incurred match the testing needs – no more, no less. It’s a pretty efficient deal. Standard CI/CD environments can’t accommodate the time it takes to provision a large runtime environment, let alone support the time it takes to run comprehensive, automated performance testing. Hence, the limitation of CI/CD and the value of dedicated testing environments.

Conclusion

Functional and performance testing are critical parts of the Software Development Lifecycle (SDLC). While many companies have little problem implementing appropriate automated functional testing within a CI/CD process, implementing automated performance testing is more daunting. However, CI/CD can support reliable, automated performance testing when a level-based approach is taken. Still, there will be times when automated performance testing will need to take place outside the CI/CD pipeline. In such cases, many companies will go to a cloud-based testing service that provides the automation testing tools, software and reporting capabilities necessary to perform stringent, long-running performance tests on a pay-as-you-go basis. Using a cloud-based service can be money well spent. You get the hardware you need as well as the runtime capacity required while only paying for what you use.

Whether a company goes cloud, hybrid or keeps its performance testing infrastructure in-house, the important thing is to make performance testing (from level-based testing focused on isolated components to full-scale, pre-release regression testing) an essential part of the company’s automated QA testing process, within and beyond the Continuous Integration/Continuous Deployment pipeline.

More Automation Testing?

Discover more automation testing, load testing and performance testing content on the Neotys Resources pages, or download the latest version of NeoLoad and start testing today.
