With edge intelligence becoming more common, and more sophisticated machine learning being deployed in embedded IoT devices, hardware compute architectures are becoming more complex, and in turn, software development is becoming more challenging.
These embedded IoT devices are small, constrained systems, and code development needs to keep pace with continuous, and now more rapid, advances in the hardware.
An approach that overcomes some of this challenge is software portability. As the CEO of MicroEJ, Fred Rivard, said in a recent interview with EE Times, “To exploit hardware innovation fast, you need to leverage your software assets.”
Using “containers” is one such way of leveraging those assets. As “Why the future of embedded software lies in containers” explains, a container wraps up a program along with all of its dependencies into a single, isolated executable environment. In fact, containers have also been described as lightweight virtual machines.
Google Cloud adds that containers make it easy to share CPU, memory, storage and network resources at the operating-system (OS) level and offer a logical packaging mechanism that allows applications to be abstracted from the environment in which they actually run.
It cites three benefits of containers:
- They provide a clear separation of responsibility, enabling developers to focus on application logic and dependencies.
- They can run virtually anywhere, greatly easing development and deployment on Linux, as well as on virtual machines, physical servers and a developer’s machine.
- They enable application isolation by virtualizing CPU, memory, storage and network resources at the OS level, providing developers with a view of the OS logically isolated from other applications.
Hence, containers can provide the portability that Rivard suggests. The container allows an application to run independently of the host environment, resulting in consistent execution across a wide range of environments.
As outlined by embedded software consultant Jacob Beningo, “Containers help ensure consistency across several environments, reducing the issues caused by different configurations. For example, have you ever tried to get a new developer up and running with the build system you are using? It’s often a giant pain to ensure everyone has identical versions of tools, libraries and so forth. Containerizing the development environment allows the same environment to be deployed to any number of developers, no matter what the configuration of their local system is.”
Microservices also enable plug-and-play embedded IoT
In addition to containers, microservices are another way of enabling software plug-and-play capability for embedded IoT devices.
They break an application up into a collection of small autonomous services, with each microservice independently deployable and loosely coupled to the other microservices in the application.
“A well-defined interface is used to allow communication between the microservices so that they can work together to achieve the overarching goals of the applications,” Beningo said. “A microservice architecture is more flexible and scalable than a traditional monolithic architecture.”
One such approach is Luos, an open-source, lightweight containerization platform enabling a microservices architecture for embedded systems.
Luos works by containerizing embedded features into services on the devices, enabling a microcontroller (MCU) to host a series of services, such as data acquisition from sensors, actuators or specific pieces of behavior for the devices. These features are placed inside services, which can then be deployed anywhere in a Luos network and accessed directly—no matter where they are in the network. Services can also be dynamically connected and disconnected and can be detected and found by an application.
The Luos engine is an embedded lightweight and real-time C code library that can be included and used in firmware. This library provides a simple API to create, manage and interact with services. The engine is open-source under the Apache 2.0 license and available on GitHub.
Enabling distributed intelligence and software-defined products
This emergence of virtualization and software containers for low-cost, low-power devices supports the proliferation of intelligent edge IoT. It also supports the concept of distributed intelligence, with the ability to create a network of programmable devices with upgradable features—enabling software-defined products.
This “software defined” capability has been well known for years in areas like telecoms. More recently, the automotive sector is adjusting to the capability with the emergence of software-defined vehicles.
Software-defined products and services are enabled by the combination of hardware programmability and the ability to add or change functionality with over-the-air (OTA) updates. Virtualization and abstraction of workloads from the underlying hardware can enable more flexible and agile hardware platforms and delivery of software-defined or software-enabled services.
In software-defined products, functions become more independent from their hardware specification, enabling a broader feature set and faster evolution, as the functions are much easier to upgrade. By definition, the main product functions are software-driven and portable, able to take advantage of new hardware and easy to move to different hardware variations.
For a development organization, a software-defined approach reduces risks and costs. It allows the parallelization of hardware and software development, translating into greater new product output and a shorter time to market. It simplifies upgrades and maintenance and enables quick alignment to market needs, extending product reach and lifetime. Typically, a software-defined product will be able to change after shipment, as more usage data can be collected and more use cases defined.
This natural decoupling from hardware also increases silicon chip portability and reduces supply-chain risks.
Using the automotive environment as an example, carmakers have been evolving their designs such that a standard hardware platform can be used across a range of models. With the hardware comprising configurable devices, they are then able to use OTA updates to deliver services that users can purchase on a pay-as-you-go basis.
Industry is calling this the software-defined vehicle. Carmakers and analysts say this capability will transform the industry from vehicle “ownership” into one of vehicle “usership” as a result of multi-use, multi-environment deployment of cars. At a conference in London on the future of the car last summer, several CEOs said software offered the opportunity to differentiate a carmaker’s brand.
It’s not just in automotive; manufacturers in almost all sectors are expecting to be able to customize services to the consumer or enterprise—whether it’s cars, radios or networks.
All sectors understand the advantages of software-defined principles: starting with software makes everything agile and flexible, from product development to post-release upgrades. The backbone elements of the evolution of software development in the cloud, software-defined networking, mobile apps and IT include:
- Agile, continuous integration and DevOps processes
- Virtualization, to get more out of hardware platforms
- Adoption of standard platforms like Linux, Android and Microsoft for cloud-native development
- Microservices, Docker containers, and Kubernetes orchestration
This approach, originally defined with the cloud in mind, cannot be repurposed “as is” for much smaller targets, such as edge devices, according to MicroEJ. One needs to be able to build a standard platform for the edge that brings virtualization to the smallest targets, offering containers and containerized apps with app orchestration, and that brings microservices to the devices.
The approach taken by MicroEJ is based on its MicroEJ VEE virtual execution environment, a standard embedded software platform capable of running on any processor, including MCUs, microprocessors and systems-on-chip.
The VEE acts as a software container that runs on any OS/RTOS commonly used in embedded systems (FreeRTOS, QP/C, ucOS, ThreadX, Mbed OS, VxWorks, PikeOS, Integrity, Linux), as well as running without an RTOS (bare-metal) or on a proprietary RTOS.
Enabling commercial opportunities
Device companies have experimented with all kinds of business models to add recurring revenue once a product is in the field, with things like maintenance or service fees.
The ability to add software-defined services on those devices takes this to another level, enabling subscriptions and “as a service” business models.
Now hardware companies can experiment with and offer outcome-based or pay-per-use models as services evolve. Even chip manufacturers are already talking about offering embedded IoT hardware devices as a service.
Cloud and software companies can use software-defined devices to push their service model down to the customer touch point and create services as a combination of cloud, edge and device computing. With better tools enabling software portability, including features like containers and microservices, plug-and-play embedded IoT becomes easier to deploy.