The autonomous vehicle gospel, abuzz on the Consumer Electronics Show (CES) floor in past years, has quietly passed the baton to an older and steadier technology: advanced driver assistance systems (ADAS). The automotive design ecosystem—mainly comprising car OEMs, tier 1s, semiconductor suppliers, and software developers—is now clearly aiming at ADAS applications, even though press releases still mention autonomous driving.
The announcements made at CES 2023 mainly focused on assisted and automated driving and presented these developments as part of the eventual roadmap to autonomous mobility. In other words, highly automated vehicles have become the key focus for the automotive industry, and eventually, they could pave the way for autonomous driving in the future.
Take the case of Harman, a Samsung subsidiary, which demonstrated new technologies aimed at drivers and pedestrians to help improve road safety. It showed how artificial intelligence (AI) and machine learning (ML) technologies can help classify a driver’s behavior as focused or distracted. Harman also demonstrated a personalized in-cabin response to help mitigate dangerous driving states such as stress, anxiety, distraction, and drowsiness.
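The idea of mapping in-cabin signals to a driver state can be illustrated with a toy rule-based classifier. The features, thresholds, and labels below are entirely hypothetical—Harman's actual system uses trained ML models on in-cabin sensor data, not hand-written rules:

```python
# Toy illustration of driver-state classification, loosely in the spirit of
# Harman's demo. All features and thresholds here are hypothetical; a real
# system would infer state with ML models trained on in-cabin camera data.
def classify_driver_state(eyes_off_road_s: float, blink_rate_hz: float) -> str:
    """Return a coarse driver state from two hypothetical in-cabin signals."""
    if eyes_off_road_s > 2.0:   # sustained gaze away from the road
        return "distracted"
    if blink_rate_hz < 0.1:     # abnormally low blink rate suggests drowsiness
        return "drowsy"
    return "focused"

print(classify_driver_state(0.5, 0.3))  # focused
print(classify_driver_state(3.0, 0.3))  # distracted
```

The point of the sketch is the shape of the pipeline—continuous sensor features in, a discrete driver state out—which the in-cabin response system can then act on.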
Below is a synopsis of two automotive design announcements that underscore how ADAS is now in the driver’s seat.
Radar SoC for ADAS
At CES 2023, NXP Semiconductors displayed its new automotive radar system-on-chip (SoC) that integrates radar transceivers with multi-core radar processors built on NXP’s S32R radar compute platform. The Dutch chipmaker is mainly targeting this radar-on-chip solution at ADAS applications like automated emergency braking, adaptive cruise control, blind-spot monitoring, cross-traffic alerts, and automated parking.
Figure 1 The 77-GHz radar SoC facilitates multimode and elevation sensing. Source: NXP Semiconductors
According to Matthias Feulner, senior director for ADAS at NXP, it’s the first 28-nm RFCMOS radar one-chip solution that enables advanced 4D sensing for safety-critical ADAS use cases and offers a 30% reduced sensor size compared to NXP’s previous-generation radar solution.
The one-chip solution integrates the RF front-end and a multi-core radar processor; it bolsters RF performance through enhanced MIMO waveforms while extending the range to 200 m for corner radars and 300 m for front radars. The 77-GHz radar SoC contains four transmitters and four receivers alongside a multi-core radar processor with hardware accelerators, a Gigabit Ethernet communication interface, and memory.
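The 4TX/4RX configuration matters because a MIMO radar forms one virtual channel per transmitter–receiver pair, which drives angular resolution for 4D sensing. The back-of-envelope math can be sketched as follows; only the 4TX/4RX channel count comes from the announcement, while the chirp bandwidth is an illustrative assumption, not a published NXP specification:

```python
# Back-of-envelope FMCW/MIMO radar math. The 1-GHz chirp bandwidth is an
# illustrative assumption; NXP has not published these chirp parameters.
C = 3e8  # speed of light, m/s

def virtual_channels(n_tx: int, n_rx: int) -> int:
    """A MIMO radar forms n_tx * n_rx virtual antenna channels."""
    return n_tx * n_rx

def range_resolution(bandwidth_hz: float) -> float:
    """FMCW range resolution: delta_R = c / (2 * B), in meters."""
    return C / (2 * bandwidth_hz)

print(virtual_channels(4, 4))     # 16 virtual channels from 4TX/4RX
print(range_resolution(1e9))      # 0.15 m with an assumed 1-GHz chirp
```

More virtual channels mean finer angle estimates from the same physical antenna count, which is why the TX/RX configuration is a headline spec for a radar SoC.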
On the compute side, this radar SoC uses hardware accelerators instead of standard cores—NXP claims a 64x performance gain—to minimize power dissipation and reduce chip footprint. It also employs proprietary radar algorithms for a further performance boost. In addition, the radar SoC complies with ASIL B requirements of the ISO 26262 functional safety standard.
At CES, NXP demonstrated this radar SoC alongside its power management and connectivity solutions. Tier 1 supplier Denso is among the early adopters of the SAF85xx one-chip radar solution for ADAS applications.
Full stack for ADAS
Another notable automotive design announcement at CES 2023—a collaboration between tier 1 Continental and AI chip developer Ambarella—also zeroed in on ADAS while mentioning autonomous driving as part of the future roadmap. The two companies will jointly develop an ADAS full-stack system encompassing hardware and software.
Continental will combine its software and hardware expertise with Ambarella’s computer vision know-how and software modules to develop full-stack systems for highly automated vehicles. The full-stack solution will adopt a multi-sensor approach while using Ambarella’s single-chip processing platform for multi-sensor perception.
Figure 2 New design solutions are aiming to offer upward scalability for ADAS full-stack offerings. Source: Ambarella
Ambarella claims that its CV3-AD chip is based on an algorithm-first architecture, making its camera-based perception solutions highly suitable for next-generation ADAS applications. It supports high-resolution cameras, radars, ultrasonic sensors, and lidars, as well as deep fusion of these sensors.
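To make the multi-sensor idea concrete, here is a minimal sketch of late fusion—combining independent range estimates from two sensors by inverse-variance weighting. This is deliberately not the "deep fusion" Ambarella describes, which fuses raw sensor data inside neural networks; the sensor values and variances below are hypothetical:

```python
# Minimal late-fusion sketch: combine two independent range estimates
# (e.g., camera and radar) with inverse-variance weighting. This is a toy
# stand-in for illustration, not Ambarella's deep-fusion approach, and the
# measurement variances are hypothetical.
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Return the inverse-variance weighted estimate and its variance."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Camera says 100 m (variance 4), radar says 102 m (variance 1):
est, var = fuse(100.0, 4.0, 102.0, 1.0)
print(est, var)  # 101.6 0.8 -- fused estimate leans toward the lower-variance radar
```

Note how the fused variance is smaller than either input variance—the basic payoff of combining sensors with complementary strengths.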
While ADAS applications were a prominent highlight at CES 2023, the lack of buzz around autonomous driving suggests it’s proving to be another revolutionary idea ahead of its time. The automotive industry seems to have finally come to terms with this reality and decided to take a gradual path toward this ambitious technology undertaking.
ADAS fits the bill for this gradual approach toward autonomous mobility, moving ahead one step at a time.