Agile Load Testing
Implementing reliable performance testing within a modern, Agile software development lifecycle (SDLC) takes some getting used to. Unlike traditional waterfall development, in which load testing is an isolated phase toward the end of the SDLC, load testing under Agile happens continuously. It’s a different mindset for doing business. As such, those doing Quality Assurance work in the world of scrums and sprints have to take a new approach to implementing useful performance testing in general, and load testing in particular. Understanding the nature and practice of Agile performance testing is the purpose of this article.
Understanding the Dynamics of Work Allocation in Self Directed Teams
The reason that Agile came about is that companies could no longer tolerate a distasteful fact: an estimated 68% of software projects under development will not succeed. Companies wanted a better way to make software because the way they had wasn’t working.
Agile offers that better way. The essential premise of Agile is that self-directed teams are the ones that work most efficiently. Agile demands that managers accept that the contributor is best qualified to determine the feasibility of implementing a product feature and setting the conditions of delivery of that feature.
It’s akin to making a cake. You can go to the baker and request a custom-designed cake for ten dollars, to be delivered the next day. However, if it turns out that the cake you want has to sell for twenty dollars and takes three days to make, you’re sunk. You can demand all you want, but the reality is what it is. Instead of making the demand, you’d have done better to describe to the baker the cake you want, then ask how much it will cost and how long it will take to deliver. If you can’t have the cake you want in a day at ten dollars, Agile says to work with the baker to figure out how to get an acceptable cake in the time you want, at the amount you want to pay. If a solution can’t be negotiated, then you need to change your budget or your delivery date, maybe both.
Agile says, when it comes to cost and delivery, the expert is the expert. Duh.
Creating an Agile Performance Testing Plan that Focuses on the Sprint
Thus, a question arises when teams become self-directed: how do you manage them? The answer is to focus on the sprint. A sprint is a predefined period in which work takes place. Sprints usually run about two weeks. The team meets before a sprint begins to work with the project manager to identify features in the backlog to implement, according to priority. Team members analyze each feature and use their expertise to determine the feasibility of implementing it given the time and resources available during the sprint. Once a feature is deemed deliverable within the sprint’s timeframe, team members commit to doing the work they’ve agreed to do. Commitment to success is a key element of Agile.
Agile may seem like an obvious path to success, but it gets tricky, particularly around testing, and especially around Agile load testing. Most modern developers follow Test Driven Development (TDD) to some degree. TDD fundamentally means the developer tests every line of code they write. The usual unit of a test is the function; thus the concept of unit testing.
All the tests that a developer writes must pass on the developer’s local machine. When the developer checks the code into a repository branch, the unit tests are run again in the company’s automated CI/CD process. Again, all the tests must pass. Failure can “break the build.”
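To make the TDD loop concrete, here is a minimal sketch in Python. The `apply_discount` function and its tests are hypothetical, but the shape is the one described above: one small test per behavior, and every test must pass locally and again when the CI server runs the suite.

```python
def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

# Unit tests, one per behavior; a runner such as pytest collects and
# executes these both on the developer's machine and in CI.
def test_basic_discount():
    assert apply_discount(100.0, 20) == 80.0

def test_zero_discount():
    assert apply_discount(50.0, 0) == 50.0

def test_invalid_percent_rejected():
    try:
        apply_discount(10.0, 150)
        assert False, "expected ValueError"
    except ValueError:
        pass
```

If any of these fail after a check-in, the build breaks, and the developer knows immediately which behavior regressed.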
TDD and unit tests at the developer level and within CI/CD are commonplace. Agile performance testing is harder because such testing goes beyond the scope of the developer’s activity. Typically, Agile load tests are performed by an independent mechanism that creates Virtual Users (VUs). Each VU exercises aspects of the code to ensure that it runs within acceptable performance parameters.
Testers will use a performance testing tool, such as NeoLoad, to create the number of virtual users needed to exercise the code. And here is the rub: all it takes is one sleepy-eyed test engineer making a mistake configuring the number of virtual users in a load testing tool to bring a testing process to its knees. Emulating 1,000 VUs to make requests against an API is usually acceptable. Emulating a million VUs is a nightmare!
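The VU idea can be sketched in plain Python. This is not how NeoLoad works internally, just an illustration of the concept: `fake_api_call` is a hypothetical stand-in for a real HTTP request, and the `MAX_VUS` guard rail catches exactly the kind of misconfiguration described above before it brings anything to its knees.

```python
import time
from concurrent.futures import ThreadPoolExecutor

MAX_VUS = 1000  # guard rail: refuse obviously misconfigured VU counts

def fake_api_call() -> float:
    """Stand-in for a real HTTP request; returns elapsed seconds."""
    start = time.perf_counter()
    time.sleep(0.001)  # simulate ~1 ms of server work
    return time.perf_counter() - start

def run_load_test(num_vus: int, requests_per_vu: int) -> dict:
    """Emulate num_vus concurrent virtual users and report latency stats."""
    if num_vus > MAX_VUS:
        raise ValueError(f"{num_vus} VUs exceeds the configured cap of {MAX_VUS}")

    def vu_session(_):
        # Each virtual user makes a series of requests.
        return [fake_api_call() for _ in range(requests_per_vu)]

    latencies = []
    with ThreadPoolExecutor(max_workers=num_vus) as pool:
        for session in pool.map(vu_session, range(num_vus)):
            latencies.extend(session)
    latencies.sort()
    return {
        "requests": len(latencies),
        "p95_ms": latencies[int(len(latencies) * 0.95)] * 1000,
    }

stats = run_load_test(num_vus=50, requests_per_vu=10)
print(f"{stats['requests']} requests, p95 = {stats['p95_ms']:.2f} ms")
```

A real tool does far more (ramp-up profiles, think time, protocol recording), but the essentials are the same: concurrency, measurement, and a sane upper bound on scale.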
Load testing can be a significant blocker if it’s not implemented properly under Agile. The intention of a sprint is to move fast and to move with purpose. Items that are blocked or delayed are deadly to the success of a sprint. Being a blocker is a big deal.
Many testers coming to Agile for the first time are accustomed to having the bandwidth to run load testing scenarios that take a good deal of time to execute. Creating a load test scenario that takes a few hours to run is perfectly acceptable as code gets closer to production release. However, implementing tests that take hours runs counter to the Agile sensibility. The dynamic in a sprint is to release often, fail fast, and fix. There are companies out there making daily releases to production. Holding up a deployment going from development to QA for a few hours so that load testing can be performed is an unacceptable impediment to a mission-critical testing process. An hour converts to a week in Agile time. So the question is, how can testers approach load testing within a sprint in a way that suits the needs of the sprint? The operative phrase is, “fail fast, fail often, and fix.”
Designing Agile Performance Testing
Key to designing a load test suite that meets the needs of Agile is to make load tests small enough to execute quickly, yet comprehensive enough to detect blatant performance issues. The best way to figure out how to do this is to do as Agile instructs: ask the expert. Thus, the first thing to do is make sure that those doing the testing are included in the sprint, from planning to end. Including testing personnel in all aspects of the sprint gives the testers the information they need on an ongoing basis to make sure that the testing activities provide value and are appropriate to the rhythm of the sprint.
Next, the load testers need to determine the testing methods to use. They also need to determine when testing will be administered, the metrics by which it will be measured, and how success will be defined. These items must be communicated to the team. The load testers need to be willing to accept feedback about the testing plans. The load tester is the expert, no doubt, but there is room for the opinions of others. When a team is mature and well functioning, information is presented and received without inhibition or defensiveness. As a result, feedback is typically given in a supportive manner, with an earnest desire to improve the overall effectiveness of the team. Opinions presented in an open, supportive manner will reveal useful insights that help the testers refine their work to more effective ends.
Finally, load testing needs to be conducted within the sprint, using automation, as defined by the testing plan.
The typical practice within a sprint is to have testing done in an automated manner during nightly builds, in a way that is fully integrated with the CI server. Nightly builds need to focus on non-regression performance testing. Regression tests can be time-consuming and thus need to be avoided as part of the nightly build, when testing runtime is at a premium. Larger load tests, including those that cover regression, are done at the end of the sprint, to ensure that the release can go into production.
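A nightly non-regression gate can be as simple as comparing the current run’s p95 latency against a baseline recorded from the last known-good build and failing the build on a significant regression. The numbers below are illustrative assumptions, not recommendations.

```python
BASELINE_P95_MS = 80.0   # p95 recorded from the last known-good build (assumed)
TOLERANCE = 0.20         # allow up to 20% regression before failing the build

def regression_gate(current_p95_ms: float,
                    baseline_ms: float = BASELINE_P95_MS,
                    tolerance: float = TOLERANCE) -> int:
    """Return a process-style exit code: 0 passes the nightly build, 1 breaks it."""
    limit = baseline_ms * (1 + tolerance)
    if current_p95_ms > limit:
        print(f"FAIL: p95 {current_p95_ms:.1f} ms > limit {limit:.1f} ms")
        return 1
    print(f"PASS: p95 {current_p95_ms:.1f} ms <= limit {limit:.1f} ms")
    return 0

exit_code = regression_gate(current_p95_ms=85.0)  # within tolerance: passes
```

Wiring the gate’s exit code into the CI server means a performance regression breaks the nightly build exactly the way a failing unit test does, which keeps the feedback loop inside the sprint.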
Things might not go according to plan. In the first few sprints, there will probably be mistakes. The trick is to understand that Agile accepts failure as part of the iteration process. Mistakes made in one sprint are remedied in the next. The challenge is to make sure that all members of the team are open and well coordinated. Typically open, honest, committed team interactions yield successful sprints. The trick is to walk the talk and do the work.
So, the last outstanding questions regarding Agile testing are: what constitutes a successful sprint, and when is done, done?
Software projects can go on for months when a clear understanding of completeness is not understood by all. The typical manifestation of this problem is feature creep. In the absence of an agreed-upon end state, stakeholders consider the project incomplete and keep requesting “one more thing” to make matters right. Sometimes “one more thing” is an additional feature. Other times “one more thing” is improved code performance. The sad fact is that without a concrete and documented definition of what the done state is, there is no “one more thing” to add that will make matters right.
The best way to determine that the tasks within a sprint are complete is to agree upon the metrics of success and then document them within the sprint’s project plan. The Agile way to define success is to write stories that describe the completed state, for example:
- As a user, I want the web site’s login page to accept valid user data and respond within 100 milliseconds. The scope of the test is to emulate one hundred concurrent virtual users logging in. The tool we’ll use to do the virtual user emulation is NeoLoad.
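A story written this way translates almost directly into an automated acceptance check. The sketch below encodes the story’s metrics as code; `login` is a hypothetical stand-in for the real endpoint, since in practice the emulation would be done by a tool such as NeoLoad, as the story specifies.

```python
import time
from concurrent.futures import ThreadPoolExecutor

CONCURRENT_USERS = 100     # from the story: one hundred concurrent virtual users
LATENCY_BUDGET_S = 0.100   # from the story: respond within 100 milliseconds

def login(username: str, password: str) -> bool:
    """Hypothetical stand-in for the real login endpoint."""
    time.sleep(0.001)  # pretend the server takes ~1 ms
    return bool(username and password)

def timed_login(_):
    start = time.perf_counter()
    ok = login("alice", "s3cret")
    return ok, time.perf_counter() - start

def story_is_done() -> bool:
    """True when every concurrent login succeeds within the latency budget."""
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = list(pool.map(timed_login, range(CONCURRENT_USERS)))
    return all(ok and elapsed < LATENCY_BUDGET_S for ok, elapsed in results)
```

When the check passes, the team has an objective, documented answer to “is it done?” rather than an opinion.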
Again, concrete counts. If “done” isn’t defined, neither is success.
Putting it All Together
The Agile way of software development is intended to increase the probability that code under construction ships successfully. The fundamental building block of Agile is the sprint. Sprints are time-boxed development periods in which features are implemented according to priority. The scope of work to be done during the sprint and the definition of success of that sprint are defined by the self-directed team doing the work.
Testers who are unaccustomed to working in sprints need to adjust the way they work to the fast-paced release cycles of Agile. This is particularly true of load testers. Effective testing in a sprint is focused on creating tests that are designed to execute quickly throughout the sprint. The focus of this type of load testing is to detect blatant performance problems. More stringent testing is better done later in the SDLC.
Agile testing is an important aspect of any software development process. Having it become an integral part of Agile sprints will have far-reaching impact for those teams and companies whose mission is to make quality software that counts. As those of us who have been doing it for a while have come to understand, making software that counts and getting it into the hands of the user community is what it’s all about.