Even the smallest IoT device lives in a complex environment, which may not be fully understood at the time of development. In fact, we have already seen the security problems associated with devices being connected to the Internet for the first time. In a previous post, we discussed the benefits of service orientation for design, development, and testing. In this post, we’ll take service-based testing and service virtualization to the next step – virtual labs. Building a realistic physical testing lab environment is difficult, and even when complete, it becomes the main bottleneck in system testing. Virtual labs remove this bottleneck while providing new benefits to service-based IoT device testing.
A recent study found that 80% of IoT apps are not being tested for security flaws. The Barr Group found that 56% of embedded device developers don’t review source code for security vulnerabilities and 37% don’t have a written coding standard. These are not encouraging statistics, and it’s clear that IoT device manufacturers need to take quality, safety, and security more seriously. Test automation is one important step toward making testing more rigorous, consistent, and thorough. Testing, especially for security vulnerabilities, is often seen as too costly and complex, and is therefore rushed or overlooked altogether. But it’s an expensive mistake to let your customers (and attackers) test your IoT device security for you.
A real test lab requires the closest physical manifestation of the environment an IoT device is planned to work in, but even the most sophisticated lab is difficult to scale to a realistic environment. A virtual lab fixes this problem. Virtual labs evolve past the need for hard-to-find (or perhaps non-existent) hardware dependencies by combining sophisticated service virtualization with other key test automation tools.
The edge computing IoT ecosystem is shown below in Figure 1, depicting a typical environment in which embedded IoT devices are deployed. Sensors and control devices communicate information to the Edge, which is a series of appliances or applications that can receive information and use logic to communicate back to a device or up to the cloud. The cloud then has higher-level logic that allows it to act upon that information. The cloud is a set of services — microservices, connections to databases, additional logic, or third-party services — a complex web of functional building blocks, shown below to the right.
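The sensor-to-edge-to-cloud flow described above can be sketched in a few lines of code. This is a toy illustration of the layering, not any Parasoft API; all names and the threshold logic are invented for the example.

```python
# Minimal sketch of the sensor -> edge -> cloud flow (illustrative names only).

def sensor_reading(device_id: str, value: float) -> dict:
    """A sensor packages a measurement for the edge."""
    return {"device": device_id, "value": value}

def edge_gateway(message: dict):
    """Edge logic: forward only readings that cross a threshold to the cloud."""
    if message["value"] > 30.0:
        return {"alert": True, **message}
    return None  # handled locally; nothing is sent upstream

def cloud_service(event: dict) -> str:
    """Higher-level cloud logic acting on forwarded events."""
    return "ack:" + event["device"]

# A hot reading is escalated to the cloud; a normal one stays at the edge.
print(cloud_service(edge_gateway(sensor_reading("t-1", 42.0))))  # ack:t-1
print(edge_gateway(sensor_reading("t-2", 21.0)))                 # None
```

Each layer only knows its immediate neighbor, which is exactly what makes the layers individually testable when the neighbors are simulated.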
Figure 1: A typical IoT ecosystem in which embedded devices would be deployed
When it’s time to test in the IoT ecosystem, testing is required at many layers. To test new functionality introduced in the gateway, for example, you must validate that the gateway can receive information from sensors and can communicate that information according to the business logic you’ve built.
To validate all of this complexity, Parasoft Virtualize (which simulates required dependencies) and Parasoft SOAtest (which drives tests) are used to simulate those inputs. These tools provide simulations of realistic calls from the devices over the network (whether over protocols like REST/HTTP or popular IoT protocols like CoAP, XMPP, or MQTT), and SOAtest validates the responses that come back to verify that the device under test (the gateway in this example) is communicating with the cloud services appropriately. Figure 2 below shows an example of how a virtual lab environment can be created for edge devices under test.
If there are external ways of communicating information into that gateway, those calls can be simulated as well. Parasoft Virtualize is designed to stabilize the testing environment, creating predictable responses to requests and leveraging test data from SOAtest to fully test the gateway and its services.
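The core idea behind those predictable responses can be sketched as a stand-in service that answers every request from a fixed table. This is a conceptual illustration of service virtualization, not Parasoft Virtualize’s actual API; the routes and payloads are invented for the example.

```python
# Toy illustration of service virtualization: a virtual dependency that
# returns deterministic, canned responses so gateway tests are repeatable.

CANNED_RESPONSES = {
    ("GET", "/devices/sensor-1/status"): {"status": "online", "battery": 87},
    ("POST", "/telemetry"): {"accepted": True},
}

def virtual_service(method: str, path: str) -> dict:
    """Answer a request from the fixed table instead of a live backend."""
    return CANNED_RESPONSES.get((method, path), {"error": "not virtualized"})

# The gateway under test always sees the same dependency behavior,
# regardless of whether the real backend is reachable or even built yet.
print(virtual_service("GET", "/devices/sensor-1/status"))  # always "online"
print(virtual_service("POST", "/telemetry"))               # always accepted
```

Because the responses never vary, test failures point at the gateway itself rather than at a flaky or unavailable dependency.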
Finally, the top-level services might be communicating back to the edge, and on to other sensors and external actors, and it might be important to know that the flows from your inputs are making their way through the environment back to the back-end systems. Parasoft Virtualize is used to simulate the receiving of those calls down to the edge (down to the IoT devices) and then relay that information back to SOAtest to confirm that the call made the round trip and behaved as expected inside the IoT ecosystem. The combination of Parasoft Virtualize and SOAtest provides full control to test the whole environment, even within the complexities of an IoT ecosystem.
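The round-trip check described above amounts to having the virtual endpoint record whatever reaches it, so the test driver can assert on both the response and the recorded traffic. Again, this is an invented sketch of the pattern, not a Parasoft API.

```python
# Sketch of round-trip validation: the virtual downstream endpoint logs
# every call so the test driver can confirm the message got through.

class RecordingEndpoint:
    """Stands in for a downstream device/service and records every call."""
    def __init__(self):
        self.received = []

    def handle(self, payload: dict) -> dict:
        self.received.append(payload)
        return {"ok": True}

def gateway_forward(endpoint: RecordingEndpoint, payload: dict) -> dict:
    """The system under test relays a payload to its downstream dependency."""
    return endpoint.handle({"routed": True, **payload})

edge = RecordingEndpoint()
response = gateway_forward(edge, {"device": "t-1", "value": 42.0})

# The driver checks the response AND that the payload completed the trip.
print(response)       # {'ok': True}
print(edge.received)  # [{'routed': True, 'device': 't-1', 'value': 42.0}]
```

Asserting on the recorded calls, not just the response, is what distinguishes a round-trip test from a simple request/response check.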
Figure 2: How Parasoft Virtualize and SOAtest create a virtual lab environment for an edge device under test
Normal test environments are expensive, likely more so than most development managers budget for. A study by voke Research found that the average investment in a pre-production lab was $12 million, that the average time to provision the lab was 18 days, and that a further 12-14 days were spent on configuration. These labs take lots of time and money to set up, and even after that, they act as the bottleneck for testing due to limited access. Further, the day-to-day operational costs of physical labs are significant. In most cases, duplicating a physical lab to increase test throughput is cost prohibitive.
In another post, we boiled down the benefits of service virtualization to improving access to testing devices with better control of the behavior of virtualized dependencies, which reduces costs and increases test speed. The virtual IoT test lab delivers these same benefits: improved access to test environments, better control over dependency behavior, reduced costs, and faster testing.
Given the state of IoT device development, changes need to be made to development and testing processes. Test automation is a proven approach to reducing costs and risk. The next big step in quality and security improvement for IoT devices is using virtual labs that combine service virtualization, service-based testing, virtual lab management, and runtime monitoring. This greatly reduces provisioning and configuration costs while greatly increasing the quality of the testing being performed.
Parasoft’s industry-leading automated software testing tools support the entire software development process, from when the developer writes the first line of code all the way through unit and functional testing, to performance and security testing, leveraging simulated test environments along the way.