Plenty of blog posts and articles published over the last few years make the same point: users are extremely sensitive to performance. Even the smallest increase in page load time noticeably impacts traffic. Today’s businesses have to make sure that their websites respond fast for all their front-end clients (web browsers, mobile apps, etc.).
The most frequently cited study was done by Amazon almost 10 years ago, showing that a 100 ms increase in latency results in a 1% reduction in sales. Google’s experiments in this area showed that user traffic took months to recover after they deliberately slowed page load times for certain users.
User experience is a combination of several dimensions, and response time has become one of the key factors. All companies delivering customer-facing applications should be aware of this and try to reduce performance-related risks as much as possible.
Many organizations have moved to agile in order to focus more on customer requirements and to be able to adapt functionality based on customers’ feedback. This methodology brought additional challenges for testers: more frequent releases to be tested, but in shorter timeframes! That’s why we need to adapt our working habits so we can run several testing activities at the same time. Why not combine functional testing activities with the tests run by performance engineers?
Functional testers are often afraid to talk to performance engineers, who have always delivered complex reports rather than a simple PASSED/FAILED verdict on performance. At the same time, performance engineers avoid talking to functional testers, who usually test manually and spend more time on reporting and planning than on issue investigation. Performance engineers also complain that their testing windows shrink because functional testers need more time to test their piece of the application.
Let’s make peace and work together to deliver better quality in a reduced amount of time!
Why automated functional testing is essential:
Functional testing is an essential part of the testing cycle as it validates and ensures the overall functionality of an application. However, functional testing can be expensive and is often more time consuming than unit or integration testing. As functional tests are still frequently carried out manually, the time needed to test an application is further increased.
With today’s trend towards agile software development, time is a precious commodity; there is simply no time to do the same things over and over again manually. The answer to this issue is a simple one: automation! To maintain a good level of test coverage, functional tests are still necessary and cannot be omitted to save time. As a result, functional tests have to be automated as well and run together with other test types like unit or integration tests. Only by automating functional tests is it possible to achieve overall test results quickly and repeatably: every week, every day, and even after every build if desired!
What are the benefits from combined functional testing and load/performance testing:
Many companies use a large number of test cases to achieve good test coverage of their applications. In most cases, the testing effort is prioritized by the risk of a given failure, in order to focus on tests of key use cases that must not fail. Testers then exercise these tests either manually or in an automated manner, and automation is required to increase test coverage while reducing the overall test effort.
Functionality has a higher priority than performance; that’s a fact and it makes a lot of sense. A very fast application that frequently crashes and doesn’t run in a stable manner is of no use at all. Functionality is the foundation upon which performance and ergonomics are built. Moreover, running load tests against an unstable application will just generate exceptions throughout the architecture, so the load generated won’t be relevant or usable at all; it only highlights erratic behavior of the application.
Load and performance testing requires a list of use cases to be scripted in the testing solution, similar to what is required in functional testing. The objective is to simulate representative production activity (current or future) in order to draw conclusions about end-user performance and infrastructure stability. However, use cases for performance testing are not selected with the same logic as tests for functional testing: in performance testing we select the use cases that are most frequently executed in production or that could have a technical impact on the architecture. As a result, it frequently happens that key functional scenarios are left out of the scope of performance testing because they are only used once or twice a day.
Covering the risk on those key scenarios by running the regression tests on an empty, idle environment won’t cover all requirements. It makes more sense to run regression tests for your key scenarios under realistic conditions: the network is busy, the infrastructure is loaded, databases are handling several requests at the same time, etc.
Combining NeoLoad and Ranorex
The current integration allows you to:
- Automate your regressions alongside a load test. Ranorex will drive NeoLoad by triggering a load test and increasing/decreasing the load depending on your regression test.
- Let performance engineers keep using their favorite load testing solution, which provides all the metrics related to user experience. They can validate the key scenarios in terms of performance, while the functional requirements are also covered because every Ranorex test execution is available to the functional tester.
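To illustrate the pattern behind this integration, here is a minimal Python sketch of a functional test suite that starts background load before its regression run and stops it afterwards. Note that `LoadTestDriver`, its endpoint paths, and the injected transport are illustrative assumptions for this article, not the actual NeoLoad or Ranorex API.

```python
# Hypothetical sketch: drive a load-testing controller around a
# functional regression run. Endpoint paths are assumptions, not
# the real NeoLoad REST interface.

class LoadTestDriver:
    """Starts background load before regression tests and stops it after."""

    def __init__(self, send_request):
        # send_request(method, path) is injected so the driver can
        # target any controller's REST interface (or a stub in tests).
        self.send_request = send_request

    def start_load(self, scenario):
        self.send_request("POST", f"/tests/start?scenario={scenario}")

    def stop_load(self):
        self.send_request("POST", "/tests/stop")

    def run_under_load(self, scenario, regression_suite):
        # Run each functional regression while the load scenario is active,
        # and always stop the load afterwards, even on failure.
        self.start_load(scenario)
        results = {}
        try:
            for name, test in regression_suite.items():
                results[name] = "PASSED" if test() else "FAILED"
        finally:
            self.stop_load()
        return results


# Usage with a stubbed transport (a real run would issue HTTP calls):
calls = []
driver = LoadTestDriver(lambda method, path: calls.append((method, path)))
results = driver.run_under_load(
    "peak_hour",
    {"login": lambda: True, "checkout": lambda: True},
)
print(results)   # {'login': 'PASSED', 'checkout': 'PASSED'}
print(calls[0])  # ('POST', '/tests/start?scenario=peak_hour')
```

The point of the pattern is the `try`/`finally`: the load is torn down whatever the regression results are, so the functional verdict (PASSED/FAILED per scenario) is always produced under load, which is exactly the combined coverage argued for above.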
To learn more about the integration, check out this joint Neotys and Ranorex webinar.