
We get it. As a tester you are working under pressure. It’s your job to point out what’s wrong with applications, which doesn’t make you the most likeable person on the team. Nobody wants to hear “Hey, can you go back and re-do this?”

It may be a thankless job, but it’s a necessary one. Because without you, the application your team has spent so much time and effort producing could utterly bomb when it gets in the hands of users.

We want you to be the best tester you can be: ready, prepared, and capable. So here are the top 6 mistakes testers are most likely to make when testing mobile application performance – and how to avoid them.

1. Intermittent or infrequent regression testing.

The problem:

During the development process, testers can easily fall into the trap of focusing too much on new code or new tests, at the expense of testing for regressions. This is an easy way for problems to develop and go unnoticed – especially performance problems. Regression tests are critical to show you bottlenecks so you can uncover the bugs in your code.

Best practices:

The best way to make sure you are conducting regression tests regularly is to automate regression testing – both the tests themselves, and the communication process with the team:

  • Build full regression testing into a continuous integration process.
  • Have automated regression testing occur at every check-in if possible, or on a daily basis at the very least.
  • Set up an automatic reporting system to provide the team with regression testing metrics and results on a regular basis, so everyone stays informed and aware of issues as they develop.
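The steps above can be sketched in a few lines. This is a minimal, illustrative runner and reporter, not a real framework: the test names and the plain-text report format are assumptions, standing in for whatever your CI system would actually execute and mail out.

```python
# Minimal sketch of an automated regression run plus a team-facing report.
# The suite below is fake; in CI these would be your real regression tests.

def run_regression(tests):
    """Run each named test callable and record PASS/FAIL per test."""
    results = {}
    for name, test in tests.items():
        try:
            test()
            results[name] = "PASS"
        except AssertionError:
            results[name] = "FAIL"
    return results

def format_report(results):
    """Build the plain-text summary a CI job could post or email."""
    failed = [name for name, r in results.items() if r == "FAIL"]
    lines = [f"Regression run: {len(results)} tests, {len(failed)} failed"]
    lines += [f"  FAIL: {name}" for name in failed]
    return "\n".join(lines)

# Two stand-in checks: one passing, one failing
def login_flow():
    assert True

def checkout_latency():
    assert False  # pretend a performance threshold was missed

suite = {"login_flow": login_flow, "checkout_latency": checkout_latency}
results = run_regression(suite)
print(format_report(results))
```

Hooking a script like this into your CI server at every check-in gives the team a fresh regression picture without anyone having to remember to run it.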

2. Using a device or emulator at the wrong time.

The problem:

Testers can sometimes favor either device testing or emulator testing too heavily when testing mobile applications. Favoring one strategy over the other means missing out on key information about performance, quality, and user experience. For example, overusing emulators will not show you how your application performs on specific devices and carrier networks, where bandwidth and connectivity issues are a daily reality. Meanwhile, over-dependence on physical devices will prevent you from getting an accurate performance picture as you scale the load on the application.

Best Practices:

You need a healthy mix of both physical device and emulation strategies to check the health of your mobile application performance.

  • Leverage emulators with simulated browsers and browser capabilities for your automated functional testing and regression testing.
  • Pair an emulator with a load testing tool to generate/scale load appropriate to test your application under stress.
  • Use real devices on real carrier networks to see the true mobile experience your users will have with your application.
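The second bullet, generating and scaling load, can be sketched as follows. This is a toy example under stated assumptions: `fake_request` is a stand-in for a real HTTP call to your application, and a real load tool would also track latency and error rates, not just success counts.

```python
# Sketch: scaling concurrent load against an application entry point.
# `fake_request` stands in for an actual HTTP request to the app under test.
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    # In a real load test this would call the application and return its status.
    return 200

def run_load(n_requests, concurrency):
    """Fire n_requests using a pool of `concurrency` workers."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        statuses = list(pool.map(fake_request, range(n_requests)))
    ok = sum(1 for s in statuses if s == 200)
    return ok, n_requests

ok, total = run_load(100, 10)
print(f"{ok}/{total} requests succeeded")
```

Rerunning with larger `n_requests` and `concurrency` values is how you scale the load appropriate to stress your application, while real devices on real networks cover the experience side.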

3. Not accounting for users’ geographical location.

The problem:

Let’s be realistic, your users aren’t all sitting together in one room. Technically speaking, your users are accessing your servers from many locations around the globe. There is a huge variation in mobile bandwidth – and therefore download speeds – when you travel from country to country, meaning your users will have different experiences based on where they are in the world. As a tester, this is something you need to be thinking about.

Best practices:

Ask yourself: how is the application experience affected in different regions, on different networks, and across distance? These answers will help you create the most realistic test scenarios – a key component of which is an accurate testing environment that simulates load from multiple geographies.

  • First, determine where your users will be accessing your application. Create a geographic profile of your user base based on historical data.
  • Talk to your marketing team to get a better understanding of where they plan on promoting the application – a predictor of where load will be coming from in the future.
  • Incorporate dispersed geographic testing centers into your environment to match the locations of your users.
  • If you want to understand the importance of geo-realistic testing scenarios, check out our blog post here.
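To see why geography matters, it helps to put rough numbers on it. The sketch below estimates download time per region from assumed bandwidth figures; the regions and Mb/s values are made up for illustration, not measurements.

```python
# Sketch: estimating per-region download time from assumed bandwidth.
# The bandwidth figures below are illustrative, not real measurements.
REGION_MBPS = {"us-east": 20.0, "eu-west": 15.0, "ap-south": 5.0}

def download_seconds(payload_mb, region):
    """Seconds to transfer payload_mb megabytes at the region's assumed rate."""
    mbps = REGION_MBPS[region]
    return payload_mb * 8 / mbps  # megabytes -> megabits, then divide by Mb/s

for region in REGION_MBPS:
    print(region, round(download_seconds(2.5, region), 2), "s")
```

Even with invented numbers, the spread is the point: the same 2.5 MB payload takes four times longer in the slowest region than in the fastest, which is exactly the variation a geo-realistic test environment needs to reproduce.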

4. Waiting too long to test the front end of the application.

The problem:

Testers tend to focus on functional testing while the application is being built and don’t begin testing performance until late in the application development process, largely because the front end of the application is typically not built until after the back-end is ready. So front-end performance testing gets squeezed, potentially leading to situations where problems are missed because there wasn’t enough time to build out appropriate test suites. And when problems are caught too late, it could spell disaster for the entire team.

Best practices:

Even though the front end of the application isn’t built yet, you can still simulate your front end by building tests that interface directly with the back end. This is a great way to get load testing started early. For example, an ecommerce application may have the back-end function to add an item to a shopping cart before the actual button is built – but you can simulate button presses en masse through a web-based API to begin testing performance.

  • For best results, build test scenarios that use web services and an API to test load and performance earlier in the process.
  • Write performance specifications right into your requirements at the very beginning of your development cycle, and treat them like the rest of your functional requirements.
  • Don’t wait; take initiative and work with what you can.
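The shopping-cart example above can be sketched like this. `CartService` is a hypothetical stand-in for the real back end; in an actual test, each `add_item` call would be a POST against the staging API rather than an in-process method.

```python
# Sketch: exercising a back-end "add to cart" path before the UI exists.
# CartService is a stand-in for a real back end reached over a web API.
import threading

class CartService:
    def __init__(self):
        self._lock = threading.Lock()
        self.items = 0

    def add_item(self, sku):
        # Real test: an API call such as POST /cart/items to the back end.
        with self._lock:
            self.items += 1

def simulate_button_presses(service, n):
    """Fire n concurrent 'add to cart' calls, like n users clicking at once."""
    threads = [threading.Thread(target=service.add_item, args=("SKU-1",))
               for _ in range(n)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

cart = CartService()
simulate_button_presses(cart, 50)
print(cart.items)  # prints 50: every concurrent add landed
```

The point is that none of this requires a rendered button: the load profile of the eventual front end can be rehearsed against the API months before the UI exists.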

5. Testing the front end in isolation.

The problem:

A common misstep is exercising the front end while neglecting the end-user experience. The end-user experience is a combination of front-end behavior, back-end behavior, and the communication patterns in between. As a result, you can have a well-functioning front end that still doesn’t make users happy because it feels slow and sluggish.

Best practices:

When testing an application you need to set up full-path tests that exercise the front and back ends of the application in conjunction. Yes, rendering time, packet loss and other metrics are important to consider but so is the round trip time of your application – especially when the back-end is under heavy load.

  • Measure round-trip times under various levels of load to see how the user experience is affected by what’s happening on the back end.
  • Conduct some amount of manual, physical device testing to make sure the end-user experience is real and tangible, evaluated by an actual person.
  • If you see a problem, fix it first before continuing on to new tests.
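Comparing round-trip times at different back-end loads can be sketched as below. This is a simulation under an assumed model – the back end is faked with a sleep that grows with load – but the comparison structure is the same one a full-path test would use against a real system.

```python
# Sketch: comparing round-trip time under light vs. heavy back-end load.
# The back end is simulated with a sleep proportional to load.
import time

def fake_round_trip(backend_load):
    """Time one simulated request whose service time grows with load."""
    start = time.perf_counter()
    time.sleep(0.001 * backend_load)  # stand-in for a real request/response
    return time.perf_counter() - start

light = fake_round_trip(backend_load=1)
heavy = fake_round_trip(backend_load=20)
print(f"light: {light * 1000:.1f} ms, heavy: {heavy * 1000:.1f} ms")
```

In a real full-path test, the light and heavy runs would be the same front-end action measured while a load generator varies the pressure on the back end.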

6. Working completely outside the codebase.

The problem:

While testers don’t have all the knowledge about the internals of the code, building a test suite without knowing how the code works means you may miss problems and bottlenecks that could be significant. Just take a look at what happened to Healthcare.gov.

Best practices:

Getting closer to a codebase can be a little scary for a tester, but there is really nothing to fear when you are working with a team. Here are some of our tips:

  • Work with your developers to identify parts of the application that should be tested based on the way the application is built.
  • Design tests in conjunction with your developers to fully understand the areas of the application that need to be tested.
  • By communicating with your development team regularly you will know when major sections of the application are being changed so you can test accordingly.

Catch Problems Before They Start

We know mistakes happen, but take the time to learn from them. These 6 mistakes are easily avoidable when you know what to look for. As a good tester you need to test early and often, and fix problems as they come. Don’t wait for problems to arise; take the initiative, and you will waste fewer resources in the long run.
