Last month, we had the pleasure of gathering 12 performance testing industry experts for the third edition of the Performance Advisory Council, #NeotysPAC. The venue – a chalet in the French Alps!
During the Performance Advisory Council (PAC), much of the presentation content focused on how to help organizations deliver continuous performance testing. Opening remarks by Stijn Schepers (Accenture) addressed the Robotical Analytical Framework and the calculation of “performance scores” after test execution.
Andreas Grabner from Dynatrace referred to the Kayenta approach to release score definition, suggesting that standards need to be built whereby developers articulate the performance objectives, the technical metrics used to evaluate performance, and the test itself in a single file. Scoring is among the most discussed topics today because it helps organizations measure release quality and detect performance issues.
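To illustrate the idea, here is a minimal sketch of Kayenta-style release scoring – not Kayenta’s actual implementation. The metric names, thresholds, and weights below are hypothetical; the point is that developers declare objectives in one place and a weighted pass/fail score is computed after the test run.

```python
# Hypothetical Kayenta-style release scoring: each metric has an objective
# (threshold) and a weight; the release score is the weighted share of
# objectives that the measured values satisfy.

def release_score(objectives, measurements):
    """objectives: {metric: (threshold, weight)}; measurements: {metric: value}.
    A metric passes when its measured value is at or below its threshold."""
    total = sum(weight for _, weight in objectives.values())
    passed = sum(
        weight
        for metric, (threshold, weight) in objectives.items()
        if measurements.get(metric, float("inf")) <= threshold
    )
    return 100.0 * passed / total

# Example objectives, declared alongside the test (names/weights are assumed):
objectives = {
    "p95_response_ms": (500, 3),   # 95th percentile under 500 ms, weight 3
    "error_rate_pct": (1.0, 2),    # error rate under 1%, weight 2
    "cpu_pct": (80, 1),            # CPU utilization under 80%, weight 1
}
measurements = {"p95_response_ms": 420, "error_rate_pct": 2.5, "cpu_pct": 65}

print(round(release_score(objectives, measurements), 2))  # 66.67: 4 of 6 weight points pass
```

A score like this gives a release gate a single number to compare against a target, rather than a pile of per-metric verdicts.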
Leandro Melendez (Qualitest) shared an effective method, used on several projects, for understanding application context and thereby reducing asset-building time. The process is based on having developers add an instrumentation library to each component of the code. Once this is in place, the application workflow is more easily understood and performance issues are more easily isolated. Leandro also noted that, in addition to the custom instrumentation library approach, using an existing framework provides a secondary option.
Additional PAC topics included performance testing practices – what’s being employed today and how practitioners conduct and interpret data analysis. Stephen Townshend (IAG) shared an effective technique of his – analyzing performance test results from the raw data rather than from pre-aggregated summaries. As he suggested, building tornado graphs can help teams visualize performance issues.
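The raw-results idea can be sketched as follows – this is not Stephen’s actual tooling, just an assumed illustration: instead of relying only on averages, bucket every raw response time and draw one bar per bucket, which is the distribution view a tornado-style graph gives you.

```python
# Bucket raw response times and render a text-mode bar per bucket, so the
# full shape of the distribution (not just the mean) is visible.

from collections import Counter

def tornado(samples_ms, bucket_ms=100):
    """Group raw response times (ms) into fixed-width buckets, one bar each."""
    buckets = Counter((s // bucket_ms) * bucket_ms for s in samples_ms)
    lines = []
    for low in sorted(buckets):
        count = buckets[low]
        lines.append(f"{low:>5}-{low + bucket_ms - 1} ms | {'#' * count} ({count})")
    return "\n".join(lines)

# Hypothetical raw response times from a test run:
raw = [120, 130, 145, 210, 230, 250, 260, 310, 480, 950]
print(tornado(raw))
```

A long tail (the lone sample near 950 ms above) shows up immediately in this view, while an average of the same data would hide it.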
Performance testing requires that the tester understand the context of the application so that tests can be created efficiently. Srivalli Aparna (The Testing Consultancy) reminded us of this as a core principle of performance testing.
The conference concluded with Twan Koot’s (Sogeti) review of effective analysis methods based on USE (Utilization, Saturation, Errors), showcasing the latest tools for understanding impact on the OS: BPF/BCC. Twan’s presentation referred to Brendan Gregg’s work on monitoring.
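The structure of the USE method can be shown in a few lines – a hedged sketch, not Twan’s material and not what the BPF/BCC tools themselves do: for each resource you check errors, saturation, and utilization, in that order of urgency. The 70% utilization threshold and the sample readings are assumptions for illustration.

```python
# USE-method checklist for one resource: report Errors first, then
# Saturation, then (high) Utilization. Threshold values are assumed.

def use_check(resource, utilization_pct, saturation, errors):
    """Return the findings the USE method would flag for one resource."""
    findings = []
    if errors > 0:
        findings.append(f"{resource}: {errors} errors (investigate first)")
    if saturation > 0:
        findings.append(f"{resource}: saturated (queue depth {saturation})")
    if utilization_pct > 70:  # assumed threshold for "high" utilization
        findings.append(f"{resource}: high utilization ({utilization_pct}%)")
    return findings

# Hypothetical readings for three resources:
for res, u, s, e in [("cpu", 85, 3, 0), ("disk", 40, 0, 2), ("network", 20, 0, 0)]:
    for finding in use_check(res, u, s, e):
        print(finding)
```

In practice the readings would come from OS-level tools (the BCC collection on Linux, for example); the value of the method is that the same three questions are asked of every resource.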
These are just a handful of the experts’ talks. For the full list of PAC presentations – notably, to learn what each expert considers the critical industry trends this year – click here.
We hope that you find the content worth the read and that you’ll consider sharing it for your team’s benefit.