Developing With Performance Testing in Mind

A tester friend of mine recently came to me with a complaint that I think is fairly common in the testing community. He said: “Every time there is a new release of the software for us to test, we have to rework our testing scripts.” I’ve heard this complaint throughout my career, not only in performance testing but in functional testing with automation tools as well.

This state of affairs arises from three fairly straightforward observations:

  1. Change is inevitable. Everything changes, and in no industry is this more apparent than software development. It makes no sense for testers to ask developers to stop changing the code, but it does make sense to encourage wise changes.
  2. Developers and testers don’t always communicate well. The proverbial wall between developers and testers is still quite formidable. When developers throw a new version over the wall to be tested, too often they’ve given little thought to how it will be tested.
  3. Not all testing tools are created equal. Some make it easier to identify and handle changes than others. If your testing tool is designed to handle change well, then your entire team is better positioned to embrace change rather than fear it.

Thinking Like a Tester

Most development organizations make a real effort to improve communication between developers and testers, but it’s not always enough. Beyond encouraging developers to talk with testers, I ask them to take it a step further and think like testers.

I find that it’s a good idea for developers to sit through some of the training that the test engineers complete. In my experience, the developers who do are more careful and avoid making arbitrary changes with little or no justification. They don’t, for example, change the name of a field in a form simply because they don’t agree with the name the initial developer gave it. When developers are aware of the kinds of changes that make a tester’s job harder and what kinds of changes make it easier, then from an organizational standpoint the entire process is more productive.

An Analogy From the Early Days of Functional Testing

Some of the earliest automated functional testing tools for GUIs would simply record the location of the mouse pointer on the screen during tests, and then play back those mouse clicks to execute the test script. If a developer moved the location of a button on the screen, the script would break. Other tools would record the label on the button, so the button could be moved around the UI without breaking the script, but changing the button text from “Submit” to “OK” would break the script. More advanced tools used the button’s ID to identify it in the script, so that the developer could change both the position and the label of the button without making the tester’s job more difficult.

One key lesson here is that the choice of testing tool makes a big difference in the productivity of the testing team when the software under test changes, even in relatively trivial ways.

The other key lesson is that developer awareness of testing tools and procedures goes a long way in facilitating a smooth testing operation. I saw this firsthand during a training session I gave years back. While describing how button label changes affected testing, a developer who happened to be sitting in on the training sat upright when he finally understood why his colleagues in testing were so frustrated by many of his changes. He never knew why they objected so much to his changing a button label from “Clear” to “Reset”. Going forward, that knowledge didn’t stop the developer from making necessary changes. It did, however, make him pause when he made such changes to consider whether they were really necessary.

Performance Testing Tools That Make It Easy to Handle Change

In performance testing, we are not concerned with the location of buttons, but we’re not immune to seemingly trivial changes.

For example, when a web form is submitted, the form fields are sent to the server as a series of name-value pairs. Changing the name of a form field, adding a field, or deleting a field can cause problems during performance testing. With a less capable testing tool, these problems can be hard to identify and diagnose, especially if there is poor communication between developers and testers.
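To make the failure mode concrete, here is a minimal sketch of how a renamed form field silently breaks a recorded script. The field names, the recorded body, and the new build’s expected fields are all hypothetical, invented for illustration:

```python
from urllib.parse import parse_qsl

# Hypothetical POST body recorded against an older build of the app.
recorded_body = "user_name=alice&dept=sales&submit=Submit"

# Hypothetical set of fields the new build's form actually expects
# (a developer renamed user_name -> username and dept -> department).
expected_fields = {"username", "department", "submit"}

# The fields the replayed script will actually send, straight from the recording.
sent_fields = {name for name, _ in parse_qsl(recorded_body)}

# Any mismatch here is exactly the kind of quiet break described above:
# the server ignores stale names and never receives the renamed ones.
missing = expected_fields - sent_fields  # fields the new build wants but never gets
stale = sent_fields - expected_fields    # fields the script still sends from the old recording
print(sorted(missing))
print(sorted(stale))
```

Nothing errors out loudly here, which is the point: the replay “works,” the server just never sees the data it needs, and the tester is left to hunt down why.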

File difference viewers (diff viewers) that enable the tester to compare multiple recordings against each other are particularly helpful in pinpointing the changed fields. When it is time to modify the script, an effective tool will enable you to add, delete, and update fields without programming. Just right-click and choose add, or simply drag-and-drop to update your load testing script.
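What such a diff viewer shows can be sketched in a few lines. This is not any particular tool’s output, just an illustration using Python’s standard `difflib` on two hypothetical recordings of the same form, captured before and after a release:

```python
import difflib
from urllib.parse import parse_qsl

# Two hypothetical recordings of the same form submission,
# captured before and after a new release of the software.
old_recording = "first=Jane&last=Doe&state=NY&action=Clear"
new_recording = "first=Jane&last=Doe&state=NY&action=Reset&csrf=abc123"

def as_lines(body):
    # One "name=value" per line, sorted, so the diff lines up field by field.
    return [f"{n}={v}" for n, v in sorted(parse_qsl(body))]

# A bare-bones version of the side-by-side view a diff viewer gives the tester:
# unchanged fields disappear, and only the renamed label and the new
# hidden field stand out.
for line in difflib.unified_diff(as_lines(old_recording),
                                 as_lines(new_recording),
                                 "old recording", "new recording",
                                 lineterm=""):
    print(line)
```

The changed button label and the newly added field jump out immediately, which is the whole value of comparing recordings rather than eyeballing raw scripts.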

Form fields are relatively easy for load testers to handle. Session-specific parameters are more difficult: these values change from session to session, but stay the same for the duration of each user session. By default, a load testing tool captures the hard-coded session values in each script, and a test engineer needs to parameterize them to make the script usable for load testing. Double-clicking a hard-coded value to turn it into a variable is far easier than diving into the script code. Here again, tools that help automate the process can reduce test creation time from many hours to a few minutes.
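Under the hood, that parameterization (often called correlation) amounts to extracting the live session value from an earlier response and substituting it for the recorded one. The request, response, and `session_id` parameter below are hypothetical, chosen only to show the shape of the technique:

```python
import re

# Hypothetical recorded request with a hard-coded session token.
# On replay, the server will reject this stale value.
recorded_request = "GET /cart?session_id=9f8e7d6c&item=42"

# Hypothetical response header from the login step of the *replayed* session.
login_response = "Set-Cookie: session_id=1a2b3c4d; Path=/"

# Step 1: extract the live session value from the earlier response.
match = re.search(r"session_id=([0-9a-f]+)", login_response)
live_session = match.group(1)

# Step 2: substitute it for the recorded, now-stale value,
# turning the hard-coded token into a per-session variable.
replayable = re.sub(r"session_id=[0-9a-f]+",
                    f"session_id={live_session}",
                    recorded_request)
print(replayable)  # GET /cart?session_id=1a2b3c4d&item=42
```

A good tool does exactly this behind the double-click: it records where the value first appeared, generates the extraction rule, and rewrites every later request that uses it.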

When a new script is needed or maintenance is required on an existing script, tools that are easier to use can make the task orders of magnitude faster.

Overcoming the Fear of Change

I know of development teams that gradually became more and more afraid to change their software, because of the difficulties the changes introduced in testing and elsewhere in the process. Needless to say, this had a negative effect on their ability to deliver new features and fixes. The root of the problem, it turned out, was the testing tool they were using, which made changes arduous and error-prone. Once they switched to a modern tool, the required script changes were easier to make. Performance testing times shrank from a week to less than a day, and development was once again free to make long-needed changes. Agile development shops in particular depend on this ability to rapidly implement changes in testing scripts and get the tests going in minutes or hours instead of days or weeks.

So, if your organization is starting to fear change, encourage your developers to think like testers, and encourage your testers to use tools that make inevitable change easier to handle.

