Test summary

The Test Summary tab contains all the main statistics for the test.

The Tools button allows you to generate a report for the selected test, among other functions. For more information, see Publish a test report.

Select a summary

To display a test result summary, select a test in the Results drop-down list.

Content

When a test is selected in the Results drop-down list, all the tabs in the Results section are updated.

Results summary

This tab displays basic details of the test: project and scenario name, test duration, load policy, and so on.

Statistics summary

The statistics summary displays the following global metrics:

An icon next to each statistic indicates the severity of the statistic discrepancy with regard to the SLA profile linked to the scenario.
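The severity mapping can be sketched as follows. This is a minimal illustration; the threshold names and the "higher is worse" assumption are not NeoLoad's actual SLA model, which is defined by the SLA profile linked to the scenario.

```python
def sla_severity(value, warning_threshold, critical_threshold):
    """Classify a statistic against hypothetical SLA thresholds.

    Assumes higher values are worse (e.g. a response time in seconds).
    The threshold names are illustrative assumptions.
    """
    if value >= critical_threshold:
        return "critical"
    if value >= warning_threshold:
        return "warning"
    return "ok"

print(sla_severity(5.0, warning_threshold=2.0, critical_threshold=4.0))  # critical
```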

Hot spots

This section focuses on significant results. The times are expressed in seconds.

Top 5 errors

This section lists the five pages that produced the most errors.

Top 5 alerts

This section lists the five longest alerts, sorted by alert importance (critical alerts first, followed by warning alerts). The duration is the cumulative duration of all occurrences of an alert, expressed as a percentage of the total test duration.
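The ranking described above can be sketched as follows. The alert records and field names here are assumptions for illustration; NeoLoad's internal data model is not exposed in this document.

```python
# Hypothetical alert records (names and fields are illustrative assumptions).
alerts = [
    {"name": "CPU high", "severity": "warning", "duration_s": 120.0},
    {"name": "Error rate", "severity": "critical", "duration_s": 30.0},
    {"name": "Memory", "severity": "critical", "duration_s": 90.0},
]
test_duration_s = 600.0

# Critical alerts first, then warnings; within a severity, longest first.
severity_rank = {"critical": 0, "warning": 1}
top5 = sorted(alerts, key=lambda a: (severity_rank[a["severity"]], -a["duration_s"]))[:5]

for a in top5:
    pct = 100.0 * a["duration_s"] / test_duration_s
    print(f'{a["name"]}: {a["severity"]}, {pct:.1f}% of test duration')
```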

First critical alerts

This section presents the first critical alerts occurring during the test run.

Top 5 average response times

This section lists the five pages with the longest average response times.

Top 5 maximum response times

This section lists the five pages with the longest maximum response times.

Top SQL requests

This section displays the results for indicators relating to SQL requests.

Errors

This section describes the errors encountered on requests and on logical actions that generate errors (JavaScript actions). For every type of error, it lists the error code, the number of failing requests or logical actions, and the error description label. For more information about NeoLoad error codes (beginning with NL-), see Neotys status codes. For more information about HTTP response codes, see HTTP status codes.

Alerts

This section displays the details of the alerts triggered. The section only displays elements with at least one alert threshold set. For each element, the following statistics are displayed:

An icon is shown to indicate the state of each type of monitor with counters that have alert thresholds set:

Cloud

Three views are available:

Map

Displays statistics by geographical area.

The map is displayed with a summary of the zones.

Zones are identified by an icon that changes color (green, black, or red) or size depending on the number of Load Generators.

There are four icon sizes, corresponding to each zone's percentage share of the Load Generators; the icon size increases in 25% increments.
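The 25% sizing can be sketched as follows. The exact bucket boundaries are an assumption; the document only states that there are four sizes in 25% increments.

```python
def icon_size_bucket(load_generator_pct):
    """Map a zone's share of Load Generators (0-100%) to one of four icon sizes.

    Buckets of 25%: 0-24% -> 1 (smallest), 25-49% -> 2, 50-74% -> 3,
    75-100% -> 4 (largest). This exact bucketing is an illustrative assumption.
    """
    return min(int(load_generator_pct // 25) + 1, 4)

print(icon_size_bucket(60))  # 3
```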

On the map, click the icon of a zone to display the zone statistics and a graph of:

Use the < and > arrows, on the left and right of the graph, to switch graphs.

The second graph illustrates:

Graphs

Two graphs are available:

Statistics

For each zone, the table displays the following information:

General statistics

This section displays the following aggregated statistics for all User Paths, pages and requests:

Statistics are allocated per User Path. They are displayed in a hierarchical way. When actions are shared among User Paths, the aggregated statistics are displayed at the beginning of each section.

Transaction statistics

This section displays the following aggregated statistics for all Transactions:

Statistics are allocated per User Path. They are displayed in a hierarchical way. When actions are shared among Virtual Users, the aggregated statistics are displayed at the beginning of each section.

Execution context

This section displays the settings applied during the execution of the selected test (not those in place when the report is generated).

Because runtime settings have a significant impact on the results, displaying the settings applied for a test makes it possible to compare settings between tests.

Servers

This section displays the following information for all servers used during the test:

Populations

This section displays statistics per population:

User Path, Transactions, push messages and pages

The following statistics are displayed for each User Path, Transaction, push message and web page:

Statistics are allocated per User Path. They are displayed in a hierarchical way. When actions are shared among User Paths, the aggregated statistics are displayed at the beginning of each section.

Media contents

For every media request, the following statistics are displayed:

The statistics are displayed per Virtual User.

Monitors

Monitors group together performance counters and indicators. The performance counters and indicators are grouped by monitored machine and by Monitor (Tomcat, MySQL, etc.). The following statistics are displayed for each performance counter:

External data
Overview

This section displays external data received during the execution of a Custom action.

NeoLoad sets up a data exchange server that receives external data.

For more information on the data exchange server configuration, refer to the Data Exchange API User Guide.
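As a sketch of the kind of entry a Custom action might send to the data exchange server, the example below builds a timestamped metric record. The field names and structure are illustrative assumptions, not NeoLoad's actual schema; see the Data Exchange API User Guide for the real client API.

```python
import json
import time

def build_entry(path, value, unit):
    """Build a hypothetical external-data entry for the data exchange server.

    The field names below are illustrative assumptions, not NeoLoad's
    actual Data Exchange API schema.
    """
    return {
        "path": path,  # hierarchical location of the metric in the results tree
        "value": value,
        "unit": unit,
        "timestamp": int(time.time() * 1000),  # milliseconds since epoch
    }

entry = build_entry(["Mobile", "Battery Level"], 87.0, "%")
print(json.dumps(entry))
```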

Result qualification

Some external data entries have a pass or fail status.

When a request is not subject to a Pass or Fail condition, it is displayed with the status Unknown (for example, Battery Level %).
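The qualification logic can be sketched as follows. The predicate-based condition is a hypothetical representation; the document only states that entries without a Pass or Fail condition are shown as Unknown.

```python
def qualify(value, condition=None):
    """Return "Passed", "Failed", or "Unknown" for an external data entry.

    `condition` is a hypothetical pass/fail predicate. Entries without one
    (e.g. a Battery Level % reading) have no pass or fail status.
    """
    if condition is None:
        return "Unknown"
    return "Passed" if condition(value) else "Failed"

print(qualify(0.5, lambda v: v < 1.0))  # Passed
print(qualify(87.0))                    # Unknown
```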

Display

NeoLoad displays the results in two tables.

Scenario

This section shows a summary of the test scenario configuration, with the following information:

Main graphs

This section includes a collection of predefined graphs: