Integrating NeoLoad into the Accor Software Factory

In a classic development cycle, everything starts with an idea. Someone in the business passes it to the delivery team, who investigate and develop it before moving it along to production. But when the business generates many ideas to test, the delivery team produces many, many lines of code. This volume can become problematic when it reaches the production stage, and can result in extremely expensive deployments.
To optimize development, Accor has adopted a more digitally focused approach through the creation of the Software Factory. In this process, when the business communicates an idea, it is developed, tested and sent to the production team piece by piece.

Presented by:

Michael Djahel, Methods and Tools Manager, Accor Group
Frank Jarre, Testing Practice Representative, Itecor Paris

Michael Djahel presents how the Accor Software Factory operates and how NeoLoad is integrated into it.

In short: how does the Software Factory work?


The Software Factory follows a precise process to move ideas forward:

  1. Code development
  2. Unit tests
  3. Qualimetry tests
  4. Packaging
  5. Automated tests
  6. Validation
  7. Security and performance checks

After all steps of the Software Factory are validated, the app can be delivered in the cloud or to on-premise environments. As part of a DevOps approach, this cycle is constantly repeated, with developments delivered on an agile basis.
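The stages above map naturally onto a CI pipeline definition. The following is a minimal, hypothetical GitLab CI sketch of that shape — the stage names and the placeholder job are illustrative, not Accor's actual configuration:

```yaml
# Hypothetical .gitlab-ci.yml mirroring the Software Factory stages
stages:
  - build                  # 1. code development / compilation
  - unit-test              # 2. unit tests
  - qualimetry             # 3. static quality analysis (e.g. SonarQube)
  - package                # 4. packaging the artifact
  - automated-test         # 5. automated functional tests
  - validation             # 6. validation gate
  - security-performance   # 7. security and performance checks

performance:
  stage: security-performance
  script:
    - echo "Run NeoLoad scenario here"   # placeholder for the load-test command
```

A pipeline laid out this way fails fast: an idea that breaks a unit test never reaches packaging, let alone production.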

The history of performance testing at Accor


The beginnings of performance tests

Accor’s performance testing dates back to 1999. At that time, LoadRunner was one of the few products on the market to cover Web, Sybase and Oracle technologies. Performance tests were carried out by security teams who wanted to validate the quality of apps. Development, production and security teams were all involved in the process.

This method required a lot of coordination, as well as a lot of scripting work, and only identified rather general indicators with little detail.

With advancements in the technologies themselves, Accor used the Sybase and Oracle protocols less and less – instead choosing to focus performance tests mainly on the Web and on APIs.


Advent of digital at Accor

In 2016-2017, a digital transformation was mandated – with a push for investments in new tools. This transformation focused on the virtualization of environments, agility in deployment and cloud technologies as well as the transformation of apps into APIs.

For the purposes of performance and monitoring, this transformation encouraged the acquisition of several products:

  • NeoLoad for performance management
  • Dynatrace for application monitoring
  • Splunk to trace application logs in production

The transition

In 2017-2018, Accor was supported by Itecor during a three-month transition period. During this time, for each LoadRunner project the same script was duplicated in NeoLoad. In this way, the solution could be qualified, and teams realized that the scripting process could become much more efficient this way.

As a result, projects were switched to NeoLoad, and Dynatrace was integrated to allow test campaigns to provide much more detailed information during performance sprints in the validation environment.

NeoLoad’s Web Publishing feature was also used to provide live statistics.

How are performance tests managed at Accor today?


Performance tests at Accor operate in two streams:

  1. Users: Accor has created a performance testing community that groups everyone who has requested at least one performance test. Members can use NeoLoad and begin taking ownership of the tool.
  2. Automation: Accor follows a shift-left performance testing strategy by integrating dedicated scenarios in GitLab. Environments are managed directly from the Software Factory via Kubernetes. Results are published in NeoLoad, and to ensure an element of quality control in the pipeline, teams are measured against SLAs.

What adjustments can further optimize performance tests?


Create a dedicated database

It’s worthwhile setting up a dedicated database to act as the base of the testing system. This enables you to use the test data without disturbing anyone else’s work.


Generate your own data set

Unlike the scripting carried out during traditional testing, teams need to be able to generate their own data sets in this new phase of performance testing. Once testing becomes continuous, the data set cannot be perishable – or, if it is, it must be possible to regenerate all required data before each test.


Use capping tools

If there is no need to use data sets, capping tools can be used to limit the impact of testing and avoid disrupting any services the app might call.


Containerization of the application

This is no longer an essential step because it’s now possible to install the application on premise and conduct tests from the Software Factory (in the cloud).

Specifically, how does it all happen?

The objective

When we think about performance, we think of an app that already exists. Therefore, end-of-line performance tests are put in place to confirm all automated testing has taken place. But when an app does not work at the end of the production line, there’s little to no time to make adjustments because the functionality is required quickly. This is why it’s good practice to integrate performance testing as early as possible into the development cycle.

The goal is to focus on key features and establish baseline figures that can be monitored throughout the development lifecycle. This enables you to spot trends and differences between different build lifecycles and highlight any regressions.


The method

  • Support project teams: this process was facilitated by NeoLoad Web, as it puts a raft of testing data at their fingertips – including results of completed tests and in-progress updates. There are currently more than 50 users of NeoLoad Web at Accor, with about 80% of them connecting to the system around fifteen times a week.
  • Make testing less demanding: reducing the number of required users helps ensure the platform won’t encounter problems during execution. To do so, teams must focus on the most critical user stories and associated important features.


The results

  • Track and monitor early: the integration of Dynatrace into NeoLoad makes this possible, providing access to system metrics to support the analysis of changes in metrics over time.
  • Share: NeoLoad Web facilitates the sharing of different findings between a variety of different stakeholders, including Dev, Ops and security.
  • Standardize quality KPIs (Key Performance Indicators): for effective performance testing, initiate discussions around KPIs to collectively agree on quality targets with project teams.

What individual tools make up the Accor Software Factory?


The Software Factory is a complete structure that makes use of different tools:

  • Amazon AWS for architecture – with development, qualification, and production environments for performance testing
  • GitLab for scenarios written in YAML
  • Nexus for artifact repositories
  • NeoLoad for load testing
  • CheckMarx, SonarQube and Selenium for automated testing

The ideal scenario for testing is a fully automated end-to-end pipeline, but currently Accor projects are not ready for this level of testing maturity. Human intervention is usually required at some stage to validate the operation and move on to the next stage.

How does it work?


  1. Test code is created for NeoLoad in YAML
  2. With GitLab, the scenario to be launched is scheduled. For this, AWS and the NeoLoad Docker image are used
  3. The license is retrieved from the Team Server
  4. The load test against the app begins.

As soon as the test starts, the results are sent to NeoLoad Web and notifications are sent to the project team to inform them that the performance test is either in progress or is now complete. This lets project teams monitor the progress of tests and immediately know the results.
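The four steps above might look roughly like the following GitLab CI job. This is a hedged sketch only: the image name, variable names, and command shape are assumptions based on publicly documented NeoLoad Docker and CLI usage, not Accor's actual pipeline.

```yaml
# Hypothetical GitLab CI job launching a NeoLoad test from a Docker image
neoload-test:
  image: neotys/neoload-controller          # NeoLoad controller image (name assumed)
  variables:
    NTS_URL: "https://nts.example.com"      # Team Server that holds the license (placeholder)
  script:
    # The as-code YAML project is already checked out by GitLab;
    # the license is leased from the Team Server, then the scenario runs.
    - neoload run --scenario nightly-regression   # command shape is illustrative
  after_script:
    - echo "Results pushed to NeoLoad Web; project team notified"  # notification placeholder
```

Because results stream to NeoLoad Web as soon as the run starts, the job itself only needs to launch the scenario and report the exit status.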

What is in the YAML NeoLoad file?


The NeoLoad YAML file contains the full test description, including the project name, the environment name, Docker information with the parameters used (login, password, license), the duration of the test, and the final result.

Tests should have an established SLA to define the quality levels required. At the end of the test, if the SLAs are met, the team receives a notification to inform them that the test has been successfully validated. Otherwise, an error report is sent to the teams, with detailed data on what went wrong.
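As an illustration, a project file with an SLA gate might look like the sketch below, in the spirit of NeoLoad's as-code YAML format. All names and threshold values are invented for the example; the exact schema should be checked against the NeoLoad as-code documentation.

```yaml
# Illustrative NeoLoad as-code project file (all values are examples only)
name: booking-api                # project name
sla_profiles:
- name: release-gate
  thresholds:
  # Warn above 1 s average response time, fail the run above 2 s
  - avg-request-resp-time warn >= 1s fail >= 2s per test
scenarios:
- name: nightly-regression
  populations:
  - name: api_users
    constant_load:
      users: 50                  # virtual users
      duration: 10m              # test duration
```

With a threshold like this in place, the pipeline can turn the SLA verdict directly into a pass/fail notification for the team.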

A simplified model for project teams


Any project team wanting to use NeoLoad in its pipeline can simply use the YAML file (in which everything is already registered by the Software Factory).

A simplified file is available, enabling the project team to launch a test by adjusting only three variables:

  • The project name
  • The scenario used
  • The environment in which to test.
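In practice, the simplified file could reduce to something like the following, where only three values change from one project to the next (the variable names and values here are hypothetical):

```yaml
# Hypothetical simplified launch file: the only three variables a team adjusts
variables:
  PROJECT_NAME: "booking-api"         # the project name
  SCENARIO: "nightly-regression"      # the scenario used
  TEST_ENVIRONMENT: "qualification"   # the environment in which to test
```

Everything else – license retrieval, Docker setup, result publication – stays owned by the Software Factory, so a project team never has to touch it.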
