Which AI self-driving platform is the favorite of global automakers? KellyOnTech
Would you ride in a fully autonomous car at this stage?
An August 2021 U.S. poll found that only 23 percent of adults would ride in a fully autonomous vehicle. Let’s put autonomous driving aside for a moment and talk about driver assistance technologies. In the 1990s, adaptive cruise control began to appear as a feature of high-end cars. Then came blind-spot monitoring, lane-keeping assistance, and camera- and radar-based automatic emergency braking. In May 2022, the European Union made it mandatory for new cars to be equipped with a collision avoidance system that brakes the car to prevent rear-end collisions. This system is also standard on many new models in the United States.
Although assisted driving technology has matured, self-driving cars are still struggling and accidents are frequent. The world’s major car manufacturers have used various types of sensors as the eyes of self-driving cars.
What Is the Dilemma for Self-Driving Cars?
The sensor networks that act as the eyes of self-driving cars do their job well, but seeing what is going on in every direction around the car is not enough. The vehicle also needs a self-driving brain (AI systems that run on powerful computers connected to digital maps and other data archives) to analyze sensor data and other information: tracking pedestrians and other vehicles, predicting where they are going, and calculating the path the car should take to drive safely. Both collecting and processing this data are complex real-time operations.
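To make the "track and predict" step above concrete, here is a minimal sketch (my own illustration, not NVIDIA's implementation) of the simplest possible motion model: a constant-velocity tracker that blends each new sensor measurement into a track, then extrapolates the track a short time into the future. Production systems use far more sophisticated filters and learned predictors; the `Track`, `update`, and `predict` names are all hypothetical.

```python
# Hypothetical sketch of tracking and short-horizon prediction
# with a constant-velocity model (a crude stand-in for the
# Kalman-style filters real self-driving stacks use).
from dataclasses import dataclass

@dataclass
class Track:
    x: float   # position east (m)
    y: float   # position north (m)
    vx: float  # velocity east (m/s)
    vy: float  # velocity north (m/s)

def update(track: Track, mx: float, my: float, dt: float, alpha: float = 0.5) -> Track:
    """Blend a new measurement (mx, my) into the track.

    alpha controls how much the measurement is trusted over the
    prediction, a crude stand-in for a filter gain.
    """
    # Predict forward by dt, then correct toward the measurement.
    px, py = track.x + track.vx * dt, track.y + track.vy * dt
    nx, ny = px + alpha * (mx - px), py + alpha * (my - py)
    # Re-estimate velocity from the corrected positions.
    return Track(nx, ny, (nx - track.x) / dt, (ny - track.y) / dt)

def predict(track: Track, horizon: float) -> tuple[float, float]:
    """Extrapolate the track `horizon` seconds into the future."""
    return track.x + track.vx * horizon, track.y + track.vy * horizon

# A pedestrian walking east at about 1 m/s, observed every 0.1 s:
t = Track(0.0, 0.0, 1.0, 0.0)
t = update(t, 0.1, 0.0, dt=0.1)
future = predict(t, horizon=1.0)  # where the planner expects them in 1 s
```

The planner would then compute a path that keeps the vehicle clear of every track's predicted position, which is why both steps have to run in hard real time.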
Autonomous driving encompasses a myriad of different tasks in the computing and sensor domains, but accelerating individual algorithms is not enough. What ultimately matters is systematic understanding and optimization of the end-to-end system. About half of the R&D budgets of current autonomous-vehicle projects are spent on building and optimizing these computational systems.
Which AI Self-Driving Platform Is the Favorite of Global Automakers?
NVIDIA has designed specialized processors and software to act as the brains of self-driving cars, processing the signals collected by the sensors that serve as their eyes. Incidentally, NVIDIA’s digital-mapping capabilities have been further refined since its acquisition of DeepMap, a previously introduced company focused on creating maps and localization for autonomous vehicles.
NVIDIA introduced NVIDIA DRIVE Hyperion, an end-to-end modular development platform and reference architecture for designing autonomous vehicles (AVs). The latest generation includes the NVIDIA DRIVE AGX Orin, DRIVE AGX Pegasus, and DRIVE Hyperion 8.1 developer kits, all built on the NVIDIA DRIVE Orin system-on-chip (SoC).
These scalable platforms include the underlying NVIDIA DRIVE SDK (DRIVE OS and DriveWorks) and sensors to provide the highest levels of safety and efficiency in AV development. The high-performance NVIDIA DRIVE AGX modular computing platform provides rich automotive I/O and 254 TOPS of compute per SoC, and is built on automotive-grade chips.
- Algorithm Development — Safety-certified sensor sets with production-validated sensor drivers and tuning accelerate algorithm development.
- Data Productivity — Accurate sensor calibration, precise time synchronization, efficient data compression and integrated utilities increase the efficiency of the entire data processing pipeline.
- Optimization and Testing — Physically accurate ray-tracing-based virtual sensors are available in NVIDIA DRIVE Sim to virtually test and augment datasets with synthetic data.
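The time-synchronization point above matters because sensors report at different rates: camera frames and radar scans arrive on their own clocks, and fusing badly skewed data corrupts tracking. Here is a small illustrative sketch (my own, not DriveWorks code) of one common approach: pairing each camera frame with the nearest-in-time radar scan and dropping pairs whose timestamps are too far apart. The `align` and `nearest` helpers and the `max_skew` threshold are hypothetical names.

```python
# Hypothetical sketch of timestamp-based sensor alignment.
# Each sensor reports (timestamp_seconds, reading) tuples.
from bisect import bisect_left

def nearest(timestamps, t):
    """Index of the timestamp closest to t (timestamps sorted ascending)."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return 0
    if i == len(timestamps):
        return len(timestamps) - 1
    return i if timestamps[i] - t < t - timestamps[i - 1] else i - 1

def align(camera, radar, max_skew=0.02):
    """Pair each camera frame with the nearest radar scan.

    Pairs whose timestamps differ by more than max_skew seconds
    are dropped rather than fused.
    """
    radar_ts = [ts for ts, _ in radar]
    pairs = []
    for ts, frame in camera:
        j = nearest(radar_ts, ts)
        if abs(radar_ts[j] - ts) <= max_skew:
            pairs.append((frame, radar[j][1]))
    return pairs

camera = [(0.00, "f0"), (0.033, "f1"), (0.066, "f2")]
radar = [(0.005, "r0"), (0.055, "r1")]
print(align(camera, radar))  # → [('f0', 'r0'), ('f2', 'r1')]; f1 is too skewed
```

Production platforms avoid this after-the-fact matching where they can by triggering sensors from a shared hardware clock, which is what "precise time synchronization" in the bullet above refers to.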
NVIDIA’s latest DRIVE Orin self-driving in-vehicle computer is used by more than 25 automakers around the world. BYD, one of the world’s best-selling electric vehicle brands, will begin rolling out next-generation new energy vehicles based on the DRIVE Hyperion software-defined platform in early 2023.
In addition, luxury electric vehicle brand Lucid, NIO, Li Auto, Xpeng Motors, SAIC’s IM Motors, Extreme Auto, China Express, VinFast, and other new energy vehicle startups are all developing software-defined fleets on DRIVE.
What is the future of the automobile?
According to Jen-Hsun Huang, founder and CEO of Nvidia,
“The car of the future will be fully programmable, evolving from many embedded controllers to powerful, centralized computers — providing AI and AV capabilities through software updates that are enhanced throughout the car’s lifecycle.”
Following a series of design wins with global automakers, NVIDIA will launch software-defined vehicles based on a centralized artificial intelligence computing platform starting this year. The next-generation platform, DRIVE Hyperion 9, will feature a sensor suite of 14 cameras, nine radars, three LiDARs, and 20 ultrasonic sensors, improving sensor-processing performance and delivering the production-ready performance and scalability that next-generation software-defined fleets need, without compromising safety or quality.