At its recent global AI conference, US technology corporation Nvidia revealed several new developments directly relating to its automotive and autonomous driving solutions.
The most notable announcements related to the introduction of Drive Map, described as a multimodal mapping platform, and Drive Hyperion 9, the company’s next-generation platform for software-defined autonomous vehicles.
Hyperion 9, the next generation of Nvidia's programmable Hyperion architecture and slated for 2026 production vehicles, is built on multiple Drive Atlan computers to provide both intelligent driving and in-cabin functionality.
The platform includes the compute architecture, sensor set and the full Drive Chauffeur and Drive Concierge applications. It is designed to be open and modular, so customers can select only the elements they need. Current-generation systems scale from NCAP-level safety features to Level 3 driving and Level 4 parking, with advanced AI cockpit capabilities.
Thanks to the Atlan SoC, Nvidia says the platform will deliver double the performance of its current Orin-based systems at the same power consumption. Combining Nvidia's GPU architecture, Arm CPU cores and dedicated deep learning and computer vision accelerators, it will run multiple deep neural networks concurrently, with headroom for further capabilities to be added over time.
The platform will make use of this added compute capacity by harnessing a greater range of sensors than current systems. Its upgraded sensor suite will include surround imaging radar, enhanced cameras with higher frame rates, two additional side-facing lidar sensors and improved undercarriage sensing through better camera and ultrasonic placement.
In total, the Hyperion 9 architecture includes 14 cameras, nine radars, three lidars and 20 ultrasonics for automated and autonomous driving, as well as three cameras and one radar for interior occupant sensing.
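To put those figures in context, the sketch below tallies the reported sensor counts as a simple data structure; the class and field names are purely illustrative and not part of any Nvidia API.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class SensorSuite:
    """Simple tally of a vehicle sensor configuration (counts only)."""
    cameras: int
    radars: int
    lidars: int
    ultrasonics: int


# Exterior (driving) and interior (occupant-sensing) suites as reported
# for the Hyperion 9 reference architecture.
exterior = SensorSuite(cameras=14, radars=9, lidars=3, ultrasonics=20)
interior = SensorSuite(cameras=3, radars=1, lidars=0, ultrasonics=0)

total_sensors = sum(
    getattr(suite, field)
    for suite in (exterior, interior)
    for field in ("cameras", "radars", "lidars", "ultrasonics")
)
print(f"Total sensors in the reference configuration: {total_sensors}")  # 50
```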