As embedded systems move to center stage, real-time edge processing, artificial intelligence (AI) and AI accelerators, security, high-speed wireless bandwidth and scalability will play major roles, said participants in a panel discussion on embedded systems at Electronica 2022 last month in Munich. Nitin Dahad, editor-in-chief of Embedded.com, moderated the discussion.
To facilitate those real-time decisions, data from the edge won’t necessarily move to the cloud for processing but will instead be processed at the edge, said Christian Eder, cofounder and director of product marketing at Congatec, based in Deggendorf, Germany. Microchips, microprocessors and AI accelerators are important in this evolution, he said.
“It’s quite important to act in real-time,” he added. “Real-time is a strong thing, because (for example) an autonomous vehicle has to decide in real-time, go left or right or stop. And if you have to rely on communication (to the cloud) data set, to know what should I do now, it might be too late. Real-time and AI go along with edge devices. An edge device is not just this small device that handles the edge. We also talk about edge servers. We love data servers.”
“If you think about 20 years ago what you had … was some tiny chip … somewhere at the edge of the board doing a bit,” said Martin Kellerman, a Munich-based marketing manager at Microchip Technology. “But they really moved into the center. And this movement, this evolution, and also the possibility, what our clients can do with that in the embedded space, is really the center of the board — making the intelligence, bringing that out to the edge, bringing that into the factory floor.”
In addition to manufacturing, consumer demand is helping drive real-time performance and new functionality by embedding machine learning in small-footprint devices, said Mark Patrick, technical marketing manager for EMEA at Mansfield, Texas-based Mouser Electronics.
Scalable solutions that can grow over time are critical, allowing users to boost performance simply by scaling up to the next CPU generation, Eder said.
Getting a decision at the edge instead of transmitting edge data to the cloud for processing and then returning to the edge saves energy and makes “absolutely a class of difference,” said Mick McCarthy, director of advanced automation at Wilmington, Mass.-based Analog Devices.
Another growth trend, he said, is closing the control loop even more locally than currently: “What we see now is that micro- or nano-control loop actually becomes more and more important as we can drive efficiencies further.”
The now-familiar expectation of doing more with less also comes into play, Kellerman said. “More functionality, more features, more intelligence, quicker reaction time with less power, less space, less people,” he said. “There will be less engineers, also less time. So really, the problem of how do I do it?”
With the concept of doing more with less in mind, high-level abstraction will play an increasing role, he said, especially in the beginning.
Creating a highly optimized, detailed version of a product, application or software “would take forever,” he said. Meantime, it’s much faster to create “a high level of abstraction, just having the algorithm, the seat of the algorithm, being able to say yes, this is working and now I can go into my target …or an embedded controller.”
Such abstraction enables customers to get to market faster, Patrick said. “Saving people, saving that engineering time, accelerated time to market, is really important commercially, but also just enabling people to access the technology, to make use of it. It’s also putting out a lot of use case material which our friends at Edge Impulse have been really good at doing, picking a variety of hardware platforms, a variety of applications. So, if somebody wants to understand how to do a little bit of condition-based monitoring for a motor condition or something like that, the examples are out there. So, it saves this blank page syndrome as well. It gets people started.”
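The prototype-first workflow Kellerman and Patrick describe can be illustrated in a few lines of Python: sketch the algorithm, validate it on recorded data, and only then port it to the embedded target. As a hypothetical stand-in for the motor condition-monitoring example Patrick mentions, the sketch below flags vibration readings that stray too far from a learned baseline; the sample values and threshold are invented for illustration.

```python
import statistics

def fit_baseline(samples):
    """Learn the mean and standard deviation of vibration amplitude
    from readings taken while the motor is known to be healthy."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(reading, baseline, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations
    from the healthy baseline."""
    mean, stdev = baseline
    return abs(reading - mean) > threshold * stdev

# Prototype against recorded data on a workstation first; only once the
# algorithm is shown to work would it be ported to the target controller.
healthy = [0.51, 0.48, 0.50, 0.52, 0.49, 0.50, 0.51, 0.49]
baseline = fit_baseline(healthy)
print(is_anomalous(0.50, baseline))  # nominal reading
print(is_anomalous(1.40, baseline))  # bearing-wear-style spike
```

Working at this level of abstraction answers the "is this working?" question quickly; the move to a fixed-point C implementation on an embedded controller comes afterward.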
Security is another big issue to address and to future proof, Kellerman said. “The new challenge is, if I’m connected, I can’t trust anyone,” he said. “If I’m a connected device in a system, I can really do some damage if I’m the wrong thing. (So) don’t trust me. And really make sure that I’m the (device) that I say I am. Authenticate me.”
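Kellerman’s “authenticate me” point is, in essence, challenge-response authentication: the system proves a device holds a secret without the secret ever crossing the wire. The sketch below is a minimal illustration, not any vendor’s actual protocol; the pre-shared key is a hypothetical placeholder, and a real device would typically anchor its key in a secure element rather than in software.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"provisioned-per-device-secret"  # hypothetical pre-shared key

def issue_challenge():
    """System side: generate a fresh random nonce for the device to sign."""
    return os.urandom(16)

def device_response(challenge, key=SHARED_KEY):
    """Device side: prove possession of the key by MACing the challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge, response, key=SHARED_KEY):
    """System side: recompute the expected MAC and compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

nonce = issue_challenge()
print(verify(nonce, device_response(nonce)))   # genuine device: True
```

Because each challenge is a fresh nonce, replaying an old response fails, which is one way a connected system can "make sure that I'm the device that I say I am."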
Another aspect, Eder said, is that security needs to be considered from the beginning. “When developing an application, we have to think about security from scratch,” he said. “If you want to hook (security) on later on, you might get lost or you might not get the right results.”
One pragmatic approach is to define two levels of security: a basic level for devices not connected to the internet and a stricter level for anything that is, Eder said. In other cases, security might be more basic on a sensor or other low-level device, while higher-level devices would carry stricter protections, he added.
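Eder’s tiered idea could be expressed as a simple policy mapping a device’s exposure to a minimum set of protections. The tiers and protection names below are illustrative assumptions, not an actual standard or product feature list.

```python
def required_protections(internet_connected, device_class):
    """Map a device's exposure to a minimum set of protections.

    The tiers here are illustrative only: every device gets signed
    firmware; anything above a bare sensor adds secure boot and
    encrypted storage; anything internet-facing adds the strictest set.
    """
    protections = {"signed_firmware"}
    if device_class != "sensor":
        protections |= {"secure_boot", "encrypted_storage"}
    if internet_connected:
        protections |= {"tls", "device_authentication",
                        "ota_update_verification"}
    return protections

print(sorted(required_protections(False, "sensor")))   # basic tier
print(sorted(required_protections(True, "gateway")))   # internet-facing tier
```

Encoding the policy in one place keeps the "basic versus connected" distinction explicit, rather than leaving each device team to decide ad hoc.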
In terms of system architecture, Kellerman said, the approach is becoming to have one processor manage security, another processor manage safety and a third processor manage the interface between the two.
For example, you can’t trust facial recognition as security authentication, he said. “A colleague of mine did a video on facial recognition as security and … there was a really interesting discussion of: can you trust that? No, absolutely not. I can just take a picture and bring it up and (the system) will ‘recognize’ me. All the security …must be protected … continuously.”
Any security must be future-proofed, Kellerman said: “Security from today … will be broken tomorrow. You need to have the possibility to upgrade, to have this future-proofness.”