IoT Load Testing Best Practices

IoT’s Big Bang

The Internet of Things (IoT) is a vast network of connected, physical objects or “things.” These “things” can be devices, vehicles, wearables, home appliances, sensors, and actuators. Each has an assigned IP address or other unique identifier and can collect and transfer data over a network without human assistance or intervention.

Gartner indicates that “The Internet of Things installed base will grow to 26 billion units by 2020.” That’s a whole lot of devices that will connect to the applications that drive manufacturing, supply chain, asset management, and many other industries! Regarding scope, McKinsey Global Institute postulates that IoT technologies could have a total economic impact of $3.9 trillion to $11.1 trillion per year by 2025. IoT is a rapidly expanding technology that is quickly transforming how the information that powers business decisions is collected.

IoT Performance Testing Challenges

IoT has not only created opportunities for industries, companies, and devices to work together in new ways, but also for companies to commercialize the IoT data they gather. The business race to embrace IoT is driving an unprecedented level of complexity, data variety, and data volume within IoT ecosystems and a laser focus by business to monetize (or at least derive maximum value from) resulting data. Several factors conspire to make IoT performance testing hard.

Unbridled IoT Growth and Complexity

Consider that every IoT device has its own hardware and requires software to make it work. This software needs to integrate with whatever IoT application software will be issuing commands to the device, and the IoT application needs to understand and analyze the data gathered by the device. The ever-expanding number of “things” on the Internet and the variety of data they generate have increased the complexity of IoT ecosystems by orders of magnitude. A heart monitor collecting data about a patient uses different software/protocols and may communicate with its host application differently than an information-gathering sensor in a car.

Devices and Data Ubiquity

The plethora of “things” and the data they gather are driving the requirement for more/faster integration with IoT host applications, as well as support for the data analytics initiatives that derive value from that data. Businesses thirst for the benefits they can reap from real-time insights and analytics outcomes. This puts additional pressure on testers to validate the IoT ecosystem and the components under test.

IoT Uses New Protocols

As “things” hit the market, they frequently create the need for new protocols to manage device-to-device, device-to-server, and server-to-server communications. The state of the IoT ecosystem is already incredibly complex because there is little standardization regarding the IoT protocols used. At this writing, the most common protocol seems to be Message Queuing Telemetry Transport (MQTT), since it performs well in high-latency, low-bandwidth situations. However, Constrained Application Protocol (CoAP), Extensible Messaging and Presence Protocol (XMPP), Data Distribution Service (DDS), and REST APIs over HTTP are commonly used as well. These new protocols, and the diversity of IoT devices they support, demand that performance and load testing tools evolve to keep pace.
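
To make the protocol point concrete, here is a minimal sketch of a simulated MQTT device publishing telemetry with the open-source paho-mqtt Python client (1.x-style API assumed). The broker address, topic, and payload format are illustrative assumptions, not any particular vendor’s scheme:

```python
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt (1.x-style API assumed)

BROKER_HOST = "test.mosquitto.org"  # placeholder: a public test broker
TOPIC = "sensors/heart-monitor/42"  # hypothetical topic naming scheme

client = mqtt.Client(client_id="sim-device-42")
client.connect(BROKER_HOST, 1883)
client.loop_start()  # network I/O runs on a background thread

# Publish one reading per second, as a real sensor might.
for _ in range(10):
    payload = json.dumps({"bpm": 72, "ts": time.time()})
    client.publish(TOPIC, payload, qos=1)  # QoS 1 = at-least-once delivery
    time.sleep(1)

client.loop_stop()
client.disconnect()
```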

IoT Load Testing Considerations

In the IoT world, already complex architectures are further muddled by different versions of operating systems and firmware, the vast variety of things, and the volumes of data they generate. Together, these variables present testers with new, and often perplexing, performance test challenges, especially when load factors related to “the cloud” are incorporated.

Consider the following cloud-based use case example:

Weather forecast models indicate that a massive winter storm is descending on the Eastern seaboard. Smart thermostat vendors, such as Nest, anticipate that the intelligent thermostats within this region will communicate with and send computational updates to their IoT host servers (Nest’s private cloud) all at once. Vendors want to ensure that all relevant device data and “states” are captured by Nest servers and that appropriate communications and updates are sent back to devices.
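
How might a tester reproduce that synchronized burst? A minimal sketch follows, spawning a few hundred simulated thermostats that all publish a state update at the same instant. The broker host, topic scheme, and device count are assumptions for illustration:

```python
import json
import threading

import paho.mqtt.client as mqtt

BROKER_HOST = "ingest.example-vendor.com"  # placeholder ingestion host
DEVICE_COUNT = 500
start_gate = threading.Barrier(DEVICE_COUNT)  # release all devices together

def thermostat(device_id: int) -> None:
    client = mqtt.Client(client_id=f"thermostat-{device_id}")
    client.connect(BROKER_HOST, 1883)
    start_gate.wait()  # every simulated device publishes at the same moment
    state = {"device": device_id, "setpoint": 68, "mode": "heat"}
    client.publish(f"thermostats/{device_id}/state", json.dumps(state), qos=1)
    client.disconnect()

threads = [threading.Thread(target=thermostat, args=(i,)) for i in range(DEVICE_COUNT)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```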

As you can see, the daunting number of possible IoT combinations to stress test can vex even the most seasoned tester. Defining which subset of variants to test and which load factors to account for is a critical precursor to your overall test initiative. Success demands a thoughtful approach and deliberate action planning.

IoT Load Test Action Planning

First, gather information to understand which devices and software versions are present in your target IoT ecosystem. It may not be possible to test all variables, so your test scenarios need to focus on the most popular device/protocol combinations. Make sure that your strategy addresses the following:

  • IoT complexity means that QA must test all the new devices that communicate with the network and account for factors such as latency, packet loss, network bandwidth, load, etc. When not accounted for, the impact of these factors on the IoT ecosystem can be catastrophic – consider a car that fails to respond to a corrective command while the driver is behind the wheel.
  • Diverse usage conditions can impact the stability of the end user’s internet connection and play a significant role in determining test results. Therefore, test plans must ensure that data is detected and stored during service disruptions, as sketched below.
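
That store-and-forward requirement can be expressed as a small sketch like the one below, where send() is a hypothetical stand-in for the device’s real transport and randomly fails to mimic a flaky connection:

```python
import collections
import json
import random

buffer = collections.deque(maxlen=10_000)  # bounded on-device store

def send(payload: str) -> bool:
    """Hypothetical transport stand-in: fails ~30% of the time."""
    return random.random() > 0.3

def report(reading: dict) -> None:
    """Queue the reading, then flush oldest-first until a send fails."""
    buffer.append(json.dumps(reading))
    while buffer:
        if not send(buffer[0]):
            break  # connection is down; keep the data for the next attempt
        buffer.popleft()

# Example: readings survive intermittent failures and drain over time.
for i in range(100):
    report({"seq": i, "temp_c": 21.5})
print(f"{len(buffer)} readings still buffered")
```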

Next, choose the right testing tool – one that supports the underlying protocol(s) used by the IoT application under test. The platform needs to address things like sensors, devices, and actuators in the physical layer, and the IoT protocol(s), databases, analytics components, and the processing of computations in the system layer. It must also address the requirements of the host IoT application – for example, business IoT requirements for healthcare or manufacturing versus consumer IoT requirements for wearables and smart homes.

Whatever tool you decide to use, your IoT performance and load test strategy should encompass the following parameters:

  • Simulating the interactions of devices and sensors
  • Scaling from thousands to millions of devices (a ramp-up sketch follows this list)
  • Exercising the continuous sharing of data – because IoT devices connect to a server, performance testing must focus on the communication between devices and the network infrastructure
  • Supporting IoT protocols specific to the ecosystem under test
  • Ensuring that notifications, requests, and responses are sent and received in proper form
  • Providing appropriate data capture, integrations, and performance with analytics systems
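
As a sketch of the scaling bullet above, the ramp-up harness below grows the number of concurrent simulated devices step by step and reports sustained throughput. Here, simulate_device() is a hypothetical stand-in for the per-device logic shown earlier, and the step sizes are assumed:

```python
import time
from concurrent.futures import ThreadPoolExecutor

RAMP_STEPS = [100, 500, 1_000, 5_000]  # devices per step (assumed profile)

def simulate_device(device_id: int) -> int:
    """Stand-in for per-device connect/publish logic; returns messages sent."""
    # ... real MQTT logic would go here, as in the earlier sketches ...
    return 10

for step in RAMP_STEPS:
    started = time.perf_counter()
    with ThreadPoolExecutor(max_workers=step) as pool:
        sent = sum(pool.map(simulate_device, range(step)))
    elapsed = time.perf_counter() - started
    print(f"{step} devices: {sent} msgs in {elapsed:.2f}s "
          f"({sent / elapsed:.0f} msg/s)")
```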

Load Test Best Practices 

Align your test plan with the IoT platform: Understand the types of devices and protocols used on your IoT platform and design your test plan accordingly. You’ll also need to gather user requirements related to peak loads expected at given times. From there, begin defining test cases around typical and atypical use cases of the devices.
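
One way to capture those peak-load requirements is as an executable profile of typical and atypical scenarios. The names and figures below are illustrative assumptions; real values should come from the gathered requirements:

```python
# Hypothetical load profiles a test plan might encode; every figure here
# should come from user requirements, not from this sketch.
LOAD_PROFILES = {
    "typical_evening":  {"devices": 2_000,  "msgs_per_device_per_min": 1},
    "storm_burst":      {"devices": 50_000, "msgs_per_device_per_min": 12},
    "firmware_rollout": {"devices": 10_000, "msgs_per_device_per_min": 4},
}

for name, p in LOAD_PROFILES.items():
    total = p["devices"] * p["msgs_per_device_per_min"]
    print(f"{name}: {total:,} messages/minute at peak")
```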

Prioritize test cases: Identify key areas that require extensive test effort and time. Evaluate how specific scenarios, such as the loss of data, will affect the IoT application under test (think manufacturing production) and how messages get passed in real time. Other scenarios, such as analytics of sensor data across geographies and devices, require that applications support anticipated loads – testers need to measure response times and track device statistics.
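
One minimal way to take such a response-time measurement is to time how long a QoS 1 publish takes to be acknowledged by the broker. The broker and topic below are placeholders, and the paho-mqtt 1.x-style API is again assumed:

```python
import statistics
import time

import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="latency-probe")
client.connect("test.mosquitto.org", 1883)  # placeholder broker
client.loop_start()

samples = []
for i in range(50):
    started = time.perf_counter()
    info = client.publish("probe/latency", f"ping-{i}", qos=1)
    info.wait_for_publish()  # blocks until the broker acknowledges (PUBACK)
    samples.append((time.perf_counter() - started) * 1000)

client.loop_stop()
client.disconnect()
print(f"median ack latency: {statistics.median(samples):.1f} ms")
print(f"p95 ack latency: {statistics.quantiles(samples, n=20)[18]:.1f} ms")
```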

Learn More

Discover more load testing and performance testing content on the Neotys Resources pages, or download the latest version of NeoLoad and start testing today.


Deb Cobb
Deb Cobb has deep expertise in enterprise product management and product marketing. She provides fractional product marketing, go-to-market strategy, and content development services for software and technology companies of all sizes. To learn more about Deb, her passion for collaborative thought leadership delivery, review her portfolio, or to connect directly, click here.
