PUBLISHER: ResearchInChina | PRODUCT CODE: 1070368
L2.5 and L2.9 systems have entered mass production in vehicles on the road, and mass production of L3/L4 in limited scenarios has become the next-stage goal for OEMs. In March 2022, the U.S. National Highway Traffic Safety Administration (NHTSA) issued final rules eliminating the requirement for manufacturers to equip fully autonomous vehicles with manual driving controls in order to meet crash standards. The United States is expected to introduce further significant autonomous driving policies to guide L3/L4 autonomous vehicles onto the road.
In this context, ADAS/autonomous driving chips have seen a wave of upgrades, and many chip makers have launched or announced plans for high-computing-power chips. In January 2022, Mobileye introduced the EyeQ® Ultra™, the company's most advanced, highest-performing system-on-chip (SoC) purpose-built for autonomous driving. As unveiled at CES 2022, EyeQ Ultra aims to maximize both effectiveness and efficiency at only 176 TOPS, built on a 5-nanometer process. Although it appears less potent than rival chips from Qualcomm and NVIDIA, the cost-effective, energy-efficient EyeQ® Ultra™ may still be favored by OEMs.
SoC chips, which mostly adopt heterogeneous designs, integrate different computing units such as CPU, GPU, acceleration cores, NPU, DPU, and ISP. Generally speaking, computing power cannot be evaluated from the chip's peak figure alone; chip bandwidth, peripherals, memory, energy efficiency ratio, and cost must also be taken into account. At the same time, the development toolchain of an SoC is very important: only by forming a developer ecosystem can a company build long-term sustainable competitiveness.
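The point that peak TOPS alone is an insufficient metric can be illustrated with a toy multi-factor comparison. The sketch below is illustrative only: the chip names and all numeric values are hypothetical placeholders, not figures from any real SoC.

```python
from dataclasses import dataclass

@dataclass
class SocProfile:
    """Toy profile of an SoC; all fields are hypothetical placeholders."""
    name: str
    tops: float          # peak compute, TOPS
    power_w: float       # typical power draw, watts
    mem_bw_gbps: float   # memory bandwidth, GB/s
    cost_usd: float      # unit cost, USD

    @property
    def tops_per_watt(self) -> float:
        # Energy efficiency ratio: compute delivered per watt consumed
        return self.tops / self.power_w

    @property
    def tops_per_dollar(self) -> float:
        # Cost efficiency: compute delivered per dollar of unit cost
        return self.tops / self.cost_usd

# Hypothetical figures for illustration only
chip_a = SocProfile("Chip A", tops=176, power_w=25, mem_bw_gbps=200, cost_usd=400)
chip_b = SocProfile("Chip B", tops=500, power_w=100, mem_bw_gbps=400, cost_usd=1500)

for c in (chip_a, chip_b):
    print(f"{c.name}: {c.tops_per_watt:.1f} TOPS/W, {c.tops_per_dollar:.2f} TOPS/$")
```

Under these placeholder numbers, the chip with the lower headline TOPS figure comes out ahead on both efficiency ratios, which mirrors the report's argument that a modest-TOPS but efficient chip can still be attractive to OEMs.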
In chip design, the configuration of heterogeneous IP is crucial, and autonomous driving SoC vendors are constantly strengthening the research and development of core IP to maintain their competitive edge. For example, NVIDIA upgraded its existing GPU-based product line to a three-chip (GPU+CPU+DPU) strategy.
Amid the evolution of the automotive E/E architecture from distributed architecture to domain centralized architecture, cross-domain fusion architecture, and ultimately a central computing platform, Tesla's latest Model X has achieved a degree of central cross-domain fusion computing. The Model X's central computing platform includes two FSD chips, an AMD Ryzen CPU and an AMD RDNA 2 GPU; the FSD chips and the AMD CPU/GPU communicate through a PCIe interface and are isolated from each other.
Integrating multiple chips such as CPU, GPU, and FSD into one SoC chip through Chiplet technology will further reduce the chip communication delay. Tesla has reportedly partnered with Samsung on a new 5nm chip for autonomous driving and cockpit SoC chip integration.
Industry giants such as NVIDIA and Qualcomm have begun to implement cross-domain integration of autonomous driving and cockpits. For example, NVIDIA has launched DRIVE Concierge and DRIVE Chauffeur for smart cockpits and autonomous driving respectively, while DRIVE IX enables the fusion of algorithms in the cockpit. Backed by this powerful software stack, NVIDIA's next-generation Atlan SoC is intended to control both autonomous driving and the intelligent cockpit with a single chip.
In February 2022, Chinese SoC company Horizon Robotics announced that it will cooperate with UAES to preinstall and mass-produce cross-domain integrated automotive computing platforms.
Autonomous driving datasets are critical for training deep learning models and improving algorithm reliability, so SoC vendors have launched self-developed AI training chips and supercomputing platforms. Tesla, for instance, has launched the D1 AI training chip and the "Dojo" supercomputing platform, which will be used to train Tesla's autonomous driving neural networks.
Besides, toolchains for training algorithm models are becoming more and more important, covering 2D annotation, 3D point cloud annotation, 2D/3D fusion annotation, semantic segmentation, target tracking, etc. Examples include the NVIDIA DRIVE Sim autonomous driving simulation platform and the Horizon Robotics "Eddie" data closed-loop training platform.