Impressions from the First Neotys Performance Advisory Council

Neotys PAC Sets Sail with Maiden Voyage

Earlier this month, Neotys proudly hosted its first Performance Advisory Council (PAC) at Scotland’s Borthwick Castle. The invite-only event brought together 15 of the industry’s top load/performance testing experts from around the globe to explore topics relevant to performance testing today, such as DevOps, Shift Left/Right, test automation, and Artificial Intelligence.

Attendees represented some of the top experts from our partner ecosystem and network, yielding the perfect blend of best-practice sharing and brainstorming. One conversation that received significant airtime was the evolution of performance engineering and its future outlook.

Neotys is fortunate to have such a great network of technology experts and appreciates the value it brings to our product development and delivery. We thank all of the council’s participants: @Todd DeCapua, @Sriram Rajaram, @Bruno Da Silva, @Andreas Grabner, @Stijn Schepers, @Joerek van Gaalen, @Bruno Audoux, @Stephen Townshend, @Stuart Moncrieff, @Jonathon Wright, @Hani Chalouati, @Ian Molyneaux, and @Wilson Mar. Every presentation has been uploaded to the PAC section of our website.

As part of their involvement, each attendee was asked to produce content associated with a designated topic. Each has graciously allowed us to bring these unique, insightful perspectives directly to you. These individual content pieces will be delivered as part of an upcoming Neotys blog mini-series. In the meantime, we have compiled a teaser on the output from two of the topic areas: Artificial Intelligence (AI) and DevOps.

Artificial Intelligence

A hot topic wherever you go, AI is the backdrop of every exchange about the future of technology. Neotys understands the concerns AI raises for the industry and is equally focused on getting ahead of how it can help NeoLoad users with their load and performance testing efforts as they continue to evolve. During the PAC meeting, the AI topic was covered in fairly general terms by two presenters: Todd DeCapua and Andreas Grabner.

Todd DeCapua walked the group through his perspective on AI. His session fostered thoughtful dialogue around questions such as how to validate AI applications and what efficiency measurements should be applied to AI algorithms.

While the group didn’t claim to conclude with all the answers, the discussion provided valuable knowledge sharing on AI challenges, highlighted by what appears to be a growing need for formal standardization of performance datasets and test-results analysis.

Andreas Grabner presented the principles behind the Dynatrace AI engine. After he shared his point of view, it was clear the mystique of AI was a secret no more. Grabner clearly articulated the value of the Dynatrace approach as a critical method for exposing performance bottlenecks.

Although hotly debated, the future of AI remains, at best, subjective commentary based on one person’s interpretation of a crystal ball reading. If any of the session’s attendees came into the conversation harboring uneasiness over AI’s impact on their role tomorrow, they were quickly reminded that their role is not limited to performance validation; they also influence the next line of code – the process of pinpointing and resolving production issues.

DevOps

When performance testing employs Shift Left (executing tests as early as possible in the SDLC, e.g., component testing) and Shift Right principles, teams can take full advantage of available production data, enabling an optimal response and testing strategy. This is DevOps in action.

Stijn Schepers shared how his role has become more strategic, reinforcing the observation that organizations now expect and appreciate such thought leadership from performance testing task owners. Schepers underscored the benefit of the test early and often approach, in concert with APM incorporation, to replicate realistic user experiences through continuous testing.

Stephen Townshend reminded us of a common paradox in continuous performance testing today: no matter how continuous, component-only testing will not instill confidence in application behavior in production. He alluded to the role of the performance engineer as a risk manager, reiterating that true continuous performance testing incorporates system-wide testing and web performance testing.

Hani Chalouati described a current lab project in which his team is testing an approach that adapts its load-testing scenario design based on AppDynamics production data.

Wilson Mar highlighted field challenges inherent to DevOps, culminating in a recommended working model for short development cycles.

Enough About the Future, There’s Still Work to be Done Today!

As we wrap up a successful first installment of the Performance Advisory Council, it is important to recognize that talk of the future will always be predicated on what we do today. The role of tomorrow’s performance engineer relies on putting tactics like DevOps and Shift Left/Right into motion in the name of Continuous Integration and Continuous Delivery – more to come on this.

Discover all of the Neotys Performance Advisory Council attendee content delivered in Scotland, including Stuart Moncrieff’s Top 7 Performance Testing Mistakes, which reminded us there are still projects that:

  • Use no think time or pacing
  • Have no functional validations in their scripts
  • Have no monitoring

The basics of performance testing need to be respected before starting complex automation.
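To make those basics concrete, here is a minimal Python sketch of a single load-test iteration that applies think time, pacing, a functional validation, and basic response-time monitoring. The `send_request` stub and the URL are hypothetical placeholders; a real script (or a tool such as NeoLoad) would issue actual HTTP requests.

```python
import random
import time


def send_request(url):
    """Hypothetical stand-in for a real HTTP call.

    Returns (status_code, body, elapsed_seconds)."""
    started = time.perf_counter()
    # A real script would perform the request here; we simulate a response.
    status, body = 200, "<html>Welcome</html>"
    return status, body, time.perf_counter() - started


def run_iteration(url, pacing_s=2.0, think_min_s=0.2, think_max_s=0.5):
    iteration_start = time.perf_counter()

    status, body, elapsed = send_request(url)

    # Functional validation: check the content, not just the status code.
    assert status == 200, f"unexpected status {status}"
    assert "Welcome" in body, "expected text missing from response"

    # Monitoring: record the response time for later analysis.
    print(f"{url} responded in {elapsed * 1000:.1f} ms")

    # Think time: emulate a user pausing between actions.
    time.sleep(random.uniform(think_min_s, think_max_s))

    # Pacing: keep iterations on a fixed cadence regardless of response time.
    remaining = pacing_s - (time.perf_counter() - iteration_start)
    if remaining > 0:
        time.sleep(remaining)
    return elapsed


response_time = run_iteration("https://example.com/", pacing_s=0.5,
                              think_min_s=0.01, think_max_s=0.02)
```

Without the think time and pacing, every virtual user would hammer the server back-to-back, producing load patterns no real user population generates; without the validation, a test can "pass" while the application quietly returns error pages.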

For more information about the PAC, the expert panel, and/or upcoming meeting schedule, click here.
