If you are responsible for testing an IoT system, you will typically face challenges like these:

  • Unit tests are not enough, since timing and dynamic behavior must be considered as well
  • For system-level tests, all system components must be available – IoT devices as well as backend applications, software as well as hardware
  • In case of faulty behavior, the error is hard to localize in the overall system
  • Even single components of your system might be rather complex, which complicates troubleshooting
  • Some failure scenarios are tricky to test, e.g., downtime of cloud services
  • Multiple teams working in parallel on components and sub-systems lead to complex integration tasks
  • How do you ensure that a component will work in the integrated sub-system before handing it over to the integration team?

We’ve put together some tips on how to set up your system-level tests for your IoT system.

Tip 1: Simulate your environment using models and mocks

  • Test every component of your IoT system in isolation first
  • Define one component as “system under test” (SUT) and integrate it into its “natural” environment by simulating that environment with models and mocks
  • Provide models and mocks for both the physical environment and the software environment
    • Models for the physical environment realize sensor and actuator functions, e.g., for simulating a temperature profile
    • Mocks for the software environment serve as stubs for missing software components, e.g., for mocking RPCs
  • Examples:
    • Test an IoT device by providing a mock for the backend application and models for sensors and actuators
    • Test the backend application by providing mocks for all IoT devices
  • Models and mocks can often be kept very simple; only the parts relevant to the test need to be modelled
  • You can realize models and mocks by scripting, MATLAB/Simulink, … and use standardized interfaces like FMI/FMU for co-simulation and model exchange (see the sketch below for a minimal scripted example)
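
To make this concrete, here is a minimal Python sketch of what such a model and mock could look like. The class and method names (TemperatureModel, BackendMock, handle_telemetry) are hypothetical and only illustrate the idea, not a specific tool or API.

```python
import math
import time


class TemperatureModel:
    """Physical-environment model: simulates a slowly oscillating temperature profile."""

    def __init__(self, base_temp=20.0, amplitude=5.0, period_s=60.0):
        self.base_temp = base_temp
        self.amplitude = amplitude
        self.period_s = period_s
        self._start = time.monotonic()

    def read(self):
        """Return the simulated temperature at the current point in time."""
        elapsed = time.monotonic() - self._start
        return self.base_temp + self.amplitude * math.sin(2 * math.pi * elapsed / self.period_s)


class BackendMock:
    """Software-environment mock: stands in for the real backend application."""

    def __init__(self):
        self.received = []            # telemetry messages sent by the device under test
        self.fail_next_call = False   # simple switch for fault injection

    def handle_telemetry(self, message):
        if self.fail_next_call:
            self.fail_next_call = False
            raise ConnectionError("simulated backend downtime")
        self.received.append(message)
        return {"status": "ok"}
```

With building blocks like these, the IoT device under test reads its “sensor” from the model and publishes to the mock instead of the real backend, while the test can flip fail_next_call to check how the device copes with backend downtime.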

This approach gives you the following advantages:

  • You can test your IoT system early, without waiting for the availability of other components’ software and hardware
  • Running your SUT in a virtualized environment will even allow you to test before the hardware of the component under test is available
  • Errors can be localized more easily since only one component is in the focus of the test
  • You can test critical scenarios by injecting failures into the models and mocks

Tip 2: Test interactively during development

  • Do exploratory testing of your IoT system before or in parallel with automated tests
  • Find out through interactive experiments how your SUT reacts to certain stimulation values
  • Stimulate your SUT with discrete values or over time, e.g., with waveforms such as sine, toggle, … (see the sketch after this list)
  • Observe dynamic aspects of your SUT by visualizing values in analysis windows, e.g., graphical representations, traces, state trackers, …
  • Visualize external inputs and outputs to/from your SUT in the same analysis windows as internal variables and debug expressions to easily spot (unwanted) correlations between them
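
As a minimal illustration of such stimulation, the following Python sketch generates sine and toggle waveforms that could be fed into an input of the SUT; the function names and parameters are assumptions for this example, not taken from a particular test tool.

```python
import math


def sine_wave(t, amplitude=1.0, frequency_hz=0.5, offset=0.0):
    """Sine stimulation value at time t (in seconds)."""
    return offset + amplitude * math.sin(2 * math.pi * frequency_hz * t)


def toggle_wave(t, low=0.0, high=1.0, period_s=2.0):
    """Toggle (square-wave) stimulation value at time t (in seconds)."""
    return high if (t % period_s) < (period_s / 2) else low


# Example: print stimulation values for the first five seconds in 0.5 s steps
for step in range(11):
    t = step * 0.5
    print(f"t={t:4.1f}s  sine={sine_wave(t):+.2f}  toggle={toggle_wave(t):.1f}")
```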

This way, you can already experiment with your software and test the application while it is being developed, without the higher up-front effort of automated tests.

Tip 3: Automate your IoT tests using hybrid test design

  • Complement exploratory testing with automated tests
  • Separate the sequential test scripts from the asynchronous models and mocks; we call this a “hybrid test design” (a minimal sketch follows this list)
  • Use the test scripts to
    • stimulate and check your SUT sequentially,
    • observe the SUT over time with parallel background checks, and
    • control the models and mocks, e.g., by parametrization, variant handling, and fault injection
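
A minimal Python sketch of this split could look as follows. The BackendMock, the background check, and the monitored value are hypothetical and only illustrate the structure – a sequential script, an asynchronous mock it controls, and a parallel check running in the background – not a specific test framework.

```python
import threading
import time


class BackendMock:
    """Asynchronous mock for the backend; controlled by the test script (fault injection)."""

    def __init__(self):
        self.inject_fault = False
        self.requests = []

    def handle_request(self, request):
        time.sleep(0.05)                      # simulated processing delay
        self.requests.append(request)
        if self.inject_fault:
            raise TimeoutError("simulated backend outage")
        return {"status": "ok"}


def background_check(get_value, lower, upper, stop_event, violations):
    """Parallel check: continuously verify that a monitored value stays within range."""
    while not stop_event.is_set():
        value = get_value()
        if not (lower <= value <= upper):
            violations.append(value)
        time.sleep(0.05)


def test_device_reports_temperature():
    """Sequential test script: stimulate, check, and control the mock."""
    mock = BackendMock()
    monitored = {"temperature": 21.0}         # stand-in for a value observed on the SUT

    stop = threading.Event()
    violations = []
    checker = threading.Thread(
        target=background_check,
        args=(lambda: monitored["temperature"], 15.0, 30.0, stop, violations),
    )
    checker.start()

    # Sequential stimulation and checking
    response = mock.handle_request({"temperature": monitored["temperature"]})
    assert response["status"] == "ok"

    # Fault injection, controlled from the test script
    mock.inject_fault = True
    try:
        mock.handle_request({"temperature": 22.0})
        assert False, "expected a simulated outage"
    except TimeoutError:
        pass

    # Stop the background check and evaluate its findings
    stop.set()
    checker.join()
    assert not violations, f"temperature left the allowed range: {violations}"
```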

This idea of hybrid test design

  • keeps your test scripts simple (since sequential),
  • gives you the ability to combine the same test script with different models, e.g., to deal with different variants of the SUT
  • allows the reuse of models with multiple test scripts, and even
  • facilitates “test-the-tester”: test the correctness of your test script by providing a model for the SUT as well

Tip 4: Combine sophisticated test design concepts

  • Use a test design environment that provides you with various test design capabilities
  • Use graphical test design, tabular test notation and/or different test programming languages – whatever is best suited
  • Combine approaches at different abstraction levels, e.g.,
    • diagrams for the high-level logic combined with
    • Python implementations for the detailed test step coding
  • Utilize dedicated concepts for variant handling, parameterization, and stimulation curves
  • Benefit from well-proven methodologies such as the classification tree method
  • Ensure traceability to your test management system, such as IBM DOORS NG/RQM, Siemens Polarion ALM, PTC Integrity, Jama Connect, Intland codeBeamer, …

Such an integrated test design environment provides a lot of advantages:

  • Different test design notations (graphical, programming, …) satisfy the different preferences and skills of test designers as well as the different requirements of specific test tasks – no need to commit yourself to one single test approach
  • Graphical notations for the high-level view of test sequences simplify reviews on an abstract level
  • The combination of this abstract graphical view with coding for the detailed test steps allows you to split up the test design by different roles/persons
  • Parameterization concepts support the idea of logical and concrete test cases, which makes it easy to extend both reuse and test coverage (see the sketch after this list)
  • Full traceability from requirements over test case design to execution results makes you look good in audits and reviews
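
To illustrate the idea of logical vs. concrete test cases, the following pytest-based sketch keeps one logical test case (“readings within limits are accepted, others are rejected”) and instantiates it as several concrete test cases via parameterization. The function accept_reading and its limits are made up for this example.

```python
import pytest


def accept_reading(value, lower=0.0, upper=50.0):
    """Hypothetical SUT function: accept a sensor reading only if it lies within limits."""
    return lower <= value <= upper


# One logical test case, instantiated as five concrete test cases via parameters
@pytest.mark.parametrize(
    "value, expected",
    [
        (25.0, True),    # nominal value
        (0.0, True),     # lower boundary
        (50.0, True),    # upper boundary
        (-1.0, False),   # below range
        (50.1, False),   # above range
    ],
)
def test_reading_acceptance(value, expected):
    assert accept_reading(value) == expected
```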

Tip 5: Divide and conquer your system under test

  • If even a single component of your system is rather complex, decompose it for a concrete test setup, i.e., split for example the software of your IoT device into several (software) components
  • Start by testing single software components in isolation
  • Substitute the other software components with mocks (see the sketch after this list)
  • Once you have confirmed that every single component works correctly, integrate several components and test at the sub-system level
  • Finally, once the correctness of all sub-systems is confirmed, bring the sub-systems together to test the fully integrated software SUT
  • The higher the integration level, the more aspects are in the test focus (functional correctness, performance, robustness, …)
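
For the first step – testing one software component in isolation – Python's unittest.mock can stand in for the neighboring components. The component and method names (TelemetryComponent, transport.send) are hypothetical and only illustrate the structure.

```python
from unittest.mock import Mock


class TelemetryComponent:
    """Hypothetical software component of the IoT device: packages and forwards readings."""

    def __init__(self, transport):
        self.transport = transport   # neighboring component, injected so it can be mocked

    def publish(self, sensor_id, value):
        payload = {"sensor": sensor_id, "value": round(value, 2)}
        return self.transport.send(payload)


def test_telemetry_component_in_isolation():
    # Substitute the neighboring transport component with a mock
    transport_mock = Mock()
    transport_mock.send.return_value = True

    component = TelemetryComponent(transport_mock)
    assert component.publish("temp-1", 21.456) is True

    # Verify the interaction with the mocked component
    transport_mock.send.assert_called_once_with({"sensor": "temp-1", "value": 21.46})
```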

So again, it is the same idea as before: focusing on a single “piece of the puzzle” first eases testing and the localization of errors.

Tip 6: Set up a continuous integration and test pipeline

  • Complement workstation tools for interactive and automated tests with a CI/CT toolchain
  • Choose CI-capable tools and utilities to support your development and integration teams in all phases of development and test
  • Set up DevOps pipelines, e.g., to run tests with every pull request (a minimal sketch follows this list)
  • Ensure short feedback loops and early detection of issues
  • Fail fast!
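
As one possible building block of such a pipeline, the following Python sketch shows a small gate script that a pipeline job could call for every pull request: it runs the test suite, stops at the first failure (fail fast), and propagates the exit code so that the pipeline job fails when tests fail. The test directory and report file name are assumptions for this example.

```python
import subprocess
import sys


def run_test_stage():
    """Run the automated test suite as one pipeline stage and fail fast."""
    result = subprocess.run(
        [
            sys.executable, "-m", "pytest",
            "tests/",                      # assumed location of the automated tests
            "-x",                          # stop at the first failure -> fast feedback
            "--junitxml=test-report.xml",  # report consumed by the CI server
        ]
    )
    return result.returncode


if __name__ == "__main__":
    # A non-zero exit code makes the pipeline job fail and blocks the pull request
    sys.exit(run_test_stage())
```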

A CI pipeline supports you in delivering high-quality products in time. It can address integration tasks of multiple teams working in parallel on the system under test – no matter to which organizational unit they belong or which development and test workflow they follow (e.g., one single team for development and test or separate development and test teams).

[Figure: Continuous integration and test pipeline for IoT testing]

Tip 7: Apply shift-left testing for your IoT system

  • Take full advantage of existing integration tests for sub-systems and the entire system
  • Re-use the integration teams’ tests in the development teams of the individual components as well – “shift left!”
  • Use mocks for the missing components so that the integration tests can run unchanged (see the sketch below)
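
One common way to achieve this is to select the real or the mocked counterpart via a test fixture, so the integration test itself stays untouched. The following pytest sketch illustrates the idea; the environment variable, class names, and message format are assumptions for this example, not part of a specific framework.

```python
import os

import pytest


class BackendMock:
    """Stands in for the real backend when it is not yet available in the developer's setup."""

    def send(self, payload):
        return {"status": "ok", "echo": payload}


class RealBackendClient:
    """Talks to the real backend in the integration environment (placeholder)."""

    def send(self, payload):
        raise NotImplementedError("connect to the real backend here")


@pytest.fixture
def backend():
    # The same integration test runs unchanged in both environments:
    # developers set USE_BACKEND_MOCK=1, the integration team uses the real backend.
    if os.environ.get("USE_BACKEND_MOCK") == "1":
        return BackendMock()
    return RealBackendClient()


def test_device_message_accepted(backend):
    response = backend.send({"device": "sensor-42", "temperature": 21.5})
    assert response["status"] == "ok"
```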

This enables you to test at system level and to discover integration issues really early. The integration team will thank you…

Conclusion

There are three main building blocks forming the basis for all those tips:

  1. Models and mocks
  2. System test design concepts
  3. Re-use of system tests in the V-cycle

You can utilize these building blocks in several setups and workflows:

  • Exploratory testing of your IoT system during development
  • Automated tests in CI/CT pipelines

This gives you the following benefits:

  • Fast feedback loops for developers
  • Smooth integration workflows in CI/CT environments

More details in our video: Development and test of IoT systems

  • 00:24 IoT System Example
  • 01:30 Layers of Test Interfaces – Unit Test, SIL, HIL
  • 02:40 Challenge: Managing Distributed Systems
  • 04:27 Concept 1: Environment Simulation by Models and Mocks
  • 05:43 Concept 2: Exploratory Testing During Development
  • 06:34 Concept 3: Automated Tests Using Hybrid Test Design
  • 07:38 Concept 4: Sophisticated Test Design Concepts
  • 09:37 Concept 5: Divide and Conquer
  • 12:25 Concept 6: Continuous Integration and Test
  • 14:28 Concept 7: Shift-Left Testing
  • 16:02 Conclusion
