PUBLISHER: ResearchInChina | PRODUCT CODE: 1930698
In 2025, standard-fit NOA was popularized, refined and deepened in parallel. In 2026, new core variables will enter the competitive landscape.
The evolution of autonomous driving follows a clear and step-by-step path of technological advancement: highway NOA -> commute urban NOA -> mapless city NOA -> experience deepening + safety refinement -> L3 + universal autonomous driving.
With its combination of low complexity and high value, highway NOA has become the pioneer in the implementation of autonomous driving. First launched in 2021, the technology reached an explosive turning point in 2025: the monthly penetration rate of standard highway NOA soared from 5.8% in January 2025 to 19.6% in October 2025, more than tripling in nine months and marking the point at which highway NOA officially became a standard feature on domestic passenger cars.
After highway NOA laid the market foundation, commute NOA emerged in 2023. As a transitional solution on the way to urban NOA, commute NOA can be delivered quickly through repeated training on fixed routes, without waiting for HD maps to be fully rolled out: the function can be unlocked in about a week for simple routes and two to three weeks for complex ones, markedly lowering the implementation threshold for urban autonomous driving. From the second half of 2023 to the second half of 2024, competition in the urban NOA field turned fierce and OEMs accelerated their efforts. The arrival of end-to-end foundation models and mapless solutions then pushed the industry into a new mapless stage of nationwide, full-domain coverage.
In 2025, the development logic of urban NOA underwent a fundamental change. The industry shifted from extensive, coverage-driven competition to the all-scenario "D2D" closed loop, safety refinement in complex scenarios, and a deeper focus on the value of the driving experience.
Four Core Evolution Directions of Urban NOA in 2025
Evolution Direction 1: All-Scenario D2D
One of the core breakthroughs of urban NOA in 2025 is the transition from "road-only autonomous driving" to "all-scenario closed-loop autonomous driving", with "D2D" and point-to-point full-domain navigation becoming the mainstream upgrade path. The essence of this trend is to eliminate the fragmented experience across starting parking lots, public roads and destination parking lots in traditional autonomous driving, extend coverage to the "last mile", and ultimately deliver autonomous driving from "P gear to P gear" without breakpoints, reconstructing the user's end-to-end mobility experience.
Specifically, various brands have rolled out the relevant functions one after another. As a pioneer, XPeng launched a complete D2D autonomous driving link in January 2025, covering the full process from leaving the parking space, cruising through multi-storey parking lots and passing turnstiles automatically, to driving on public roads and parking precisely at the destination, truly enabling users to "get in and go". In June 2025, XPeng further upgraded the function to cover all routes and scenarios, adapting it to more vehicle models and parking lot types, completely breaking down the separation between parking lots and public roads and delivering "zero-breakpoint" full-link service. The function is built on the XNGP technical architecture and hardware platform, extending autonomous driving from highways and urban roads to terminal scenarios such as parking lots and parks, covering all scenarios.
Based on the Cedar system of NT3.0, NIO unveiled NOP+, a point-to-point full-domain navigation assist system, in June 2025; it emphasizes seamless hand-over between highway and urban scenarios and lays the foundation for a fully closed "D2D" loop. ZEEKR achieved a core breakthrough in the second quarter of 2025 with the launch of its "D2D" function, which realizes the entire closed loop of autonomous driving from P gear to P gear, starting parking space to destination parking space. Users enjoy a worry-free autonomous driving experience with one-time activation, accurately echoing the evolutionary trend of the all-scenario closed loop.
Evolution Direction 2: Shifting from extensive competition to safety redundancy and refined development in complex scenarios
The core of urban NOA's evolution shifted from the extensive competition of 2023-2024 to complex-scenario development and safety-system strengthening in 2025. The value proposition moved from "being able to drive" to "driving well, driving steadily, and driving in every scenario", directly addressing users' autonomous driving problems in unconventional road conditions.
Specifically, brands' technology iterations closely follow this logic. In November 2025, XPeng used VLA 2.0 to launch its "small road NGP" function, which handles narrow alleys and unmarked roads and raises Miles per Intervention (MPI) on complex small roads by a factor of 13. NIO upgraded functions such as General Obstacle Alert and Assist (GOA) and Rear Collision Warning (RCW), markedly improving recognition accuracy and braking response. Denza N7 raised the effective speed range of AEB to 120 km/h and added a dedicated special-obstacle sensing model, strengthening safety protection across scenarios.
Evolution Direction 3: The autonomous driving experience is refined and upgraded, and the core logic shifts from "available functions" to "good experience"
Differentiated competition now focuses on users' real driving experience, and autonomous driving is shifting from standardized function coverage to scenario-based, personalized adaptation. The rear anti-motion-sickness mode launched by NIO in November 2025 specifically addresses dizziness among rear-seat passengers: dynamic dots along the edge of the screen are synchronized with the body's motion in real time, using visual compensation to improve riding comfort. ZEEKR optimized full-domain NZP in the second quarter of 2025, adding a U-turn capability and a "smooth driving" function that improves traffic efficiency through intelligent planning of consecutive lane changes and adjusts the suspension and braking systems to pass speed bumps comfortably, noticeably reducing ride harshness. Denza N7 focuses on personalized adaptation, offering multi-level customization of intelligent driving settings such as parking speed and lane-changing style, while also reducing the number of maneuvers in valet parking and expanding the speed range of its autonomous driving functions so that mapless autonomous driving can accurately match different users' driving habits. Together these measures confirm that the focus of competition in high-level autonomous driving has shifted from technical parameters to in-depth care for users' real driving experience.
Evolution direction 4: Autonomous driving evolves from an "auxiliary tool" to an "intelligent partner"
In 2025, autonomous driving moved from passive execution to active interaction. The VLA-based command function launched by Li Auto in September 2025 makes language a new interface for driving control, upgrading the car from a tool for passively executing instructions to an autonomous driving partner that can understand, think and act, lowering the threshold for enjoying autonomous driving.
Currently, the function is available on Li Auto's MEGA and L-series vehicles equipped with the AD Max system; users need to turn it on manually in the CID settings. Its core capabilities cover multiple types of driving scenarios, including mobility planning, basic control commands (such as turning left, turning right and changing lanes), flexible adjustment of speed, following distance and lane (such as "drive faster"), and memory and management of driving preferences (such as "take the leftmost lane on this road from now on" or "check all speed memories and delete the third one"). It also has active communication attributes: for example, when NOA's roaming function is turned on in a park or underground garage, the system will proactively ask for the driver's intention and respond to scenario-appropriate instructions such as "park nearby" or "pull over", demonstrating an ability to think proactively rather than merely execute passively.
At the safety level, the VLA-based command function delivers an experience upgrade and a safety upgrade at the same time by reducing manual operation and strengthening emergency response; this is also key to autonomous driving moving from "available functions" to "good experience". Removing the need for screen operation lets drivers keep their attention on the road throughout the journey, reducing the risk of distraction. In emergencies, drivers can intervene quickly with simple commands such as "emergency stop", improving response efficiency. At the same time, the system can automatically identify blind spots and take measures such as slowing down and yielding to pedestrians in advance, further widening the safety margin.
At the technical level, the "driver agent" is the core "intelligent brain" that supports this capability. It first receives the user's voice instruction (whether complex or simple) while collecting real-time traffic conditions through multi-view cameras and LiDAR, combines them with navigation data, and encodes the information into machine-readable "token information blocks". A cloud-based 32B VLA model then decomposes the instruction into detailed operations. Finally, the on-vehicle 4B VLA model (4 billion parameters) integrates all of this information (the decomposed operations, the encoded road-condition and navigation data, and road-surface details), infers the driving intention and specific actions, generates a driving trajectory and guides the vehicle to execute it.
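As a concrete illustration of this cloud/on-vehicle split, the following is a minimal Python sketch of the pipeline as described above. All class and function names, token formats and return values are placeholders assumed for this report rather than Li Auto's actual interfaces, and the real models are replaced by stubs.

```python
from dataclasses import dataclass


@dataclass
class TokenBlock:
    """One encoded 'token information block' (sensor, navigation, or voice)."""
    source: str     # e.g. "voice", "camera", "lidar", "navigation"
    payload: bytes


def encode_inputs(voice_cmd: str, camera_frames: bytes,
                  lidar_points: bytes, nav_route: bytes) -> list[TokenBlock]:
    """Step 1: encode the voice instruction plus real-time sensing and
    navigation data into model-readable token blocks."""
    return [
        TokenBlock("voice", voice_cmd.encode()),
        TokenBlock("camera", camera_frames),
        TokenBlock("lidar", lidar_points),
        TokenBlock("navigation", nav_route),
    ]


def cloud_vla_decompose(instruction: str) -> list[str]:
    """Step 2: stand-in for the cloud-side ~32B VLA model, which breaks a
    high-level instruction into fine-grained operations."""
    # e.g. "take the next exit" -> ["move to rightmost lane", "slow down", "exit"]
    return [instruction]  # identity placeholder


def onboard_vla_plan(operations: list[str],
                     tokens: list[TokenBlock]) -> list[tuple[float, float]]:
    """Step 3: stand-in for the on-vehicle ~4B VLA model, which fuses the
    decomposed operations with the encoded road/navigation context and emits
    a driving trajectory (here a dummy list of (x, y) waypoints)."""
    return [(0.0, 0.0), (5.0, 0.2), (10.0, 0.8)]


def handle_voice_command(voice_cmd: str, camera_frames: bytes,
                         lidar_points: bytes, nav_route: bytes):
    tokens = encode_inputs(voice_cmd, camera_frames, lidar_points, nav_route)
    operations = cloud_vla_decompose(voice_cmd)
    return onboard_vla_plan(operations, tokens)


# Example: trajectory = handle_voice_command("drive faster", b"", b"", b"")
```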
The maturity of the technology and the broadening of scenarios have directly driven NOA's market penetration. NOA-related data on vehicle models newly launched from 2023 to October 2025 show that the configuration priority of highway NOA and urban NOA has gone through a phased transition of "trial -> development -> standard configuration" as the technology matured. In 2025, the penetration rate of urban NOA in new vehicles climbed to 28.1%, officially opening a new industry cycle of "full popularization + large-scale penetration". Urban NOA is expected to approach explosive growth in 2026-2027.
In China, urban NOA is no longer an exclusive label for high-end vehicle models; it now spans a wide range of price segments. In 2022-2023, urban NOA was concentrated in high-end models priced at RMB300,000-350,000 and reserved for a small number of early adopters. After 2024, its penetration into vehicles priced at RMB100,000-150,000 and RMB150,000-200,000 accelerated. From January to October 2025, 150,200 vehicles priced at RMB150,000-200,000 were sold with urban NOA, a penetration rate of 4.2% in that segment.
In the current competitive landscape of the urban NOA market, top OEMs still focus on in-house R&D, but in 2025-2026 third-party autonomous driving suppliers are becoming the main force behind incremental growth rather than playing a supporting role. As of October 2025, Huawei, Momenta, DeepRoute.ai, Zhuoyu Technology, Bosch and WeRide had achieved large-scale mass production and application of urban NOA. In 2026, open chip platforms represented by Horizon Robotics are forming alliances with top algorithm companies such as QCraft and SenseTime, creating a "third model" between OEMs' full-stack in-house R&D and full-stack outsourcing to suppliers, with the core mission of achieving technological inclusiveness and a cost revolution.
Reshuffling of urban NOA competitive landscape in 2026: Horizon Robotics emerges, QCraft becomes one of the core growth engines
Prediction 1: QCraft will rely on Li Auto's AD Pro and multiple OEMs to be among the first to put urban NOA into millions of vehicles
As a rising star among third-party autonomous driving suppliers, QCraft will leverage Li Auto's best-selling AD Pro to be among the first to put urban NOA into millions of vehicles. The trend is evident in NOA system installations from 2023 to October 2025 (as shown in the figure below): Li Auto's AD Pro ranked fourth among autonomous driving systems, with NOA installed in a cumulative 614,343 vehicles, or 6.8% of the total market. The core software algorithm of Li Auto's AD Pro is supplied by QCraft, so the high sales volume of AD Pro has become a key fulcrum for the rapid rollout of QCraft's urban NOA solution. On January 21, 2026, Li Auto pushed the urban NOA function to all of its vehicles fitted with AD Pro, putting QCraft's urban NOA technology into millions of vehicles. A fleet of this size not only generates massive data but also forms a positive "data - algorithm - experience" cycle that continuously drives system evolution, significantly reduces the marginal learning cost per vehicle, and provides a solid foundation for rapid iteration.
Entering 2026, as solutions from suppliers such as Horizon Robotics and QCraft are applied on a large scale, the urban NOA market structure will be reshuffled. QCraft is set to become one of the biggest dark horses in the third-party urban NOA market in 2026, backed by the combination of "J6M + cross-chip adaptation + Driven-by-QCraft 2.0 + mass-production design wins across brands and vehicle classes".
(1) Technical support: core competitiveness of single-J6M solution and end-to-end architecture
First, QCraft has demonstrated its technological level with the industry's first urban NOA solution running on a single J6M (Horizon Robotics Journey 6M, 128 TOPS), mass-produced for the intelligent refresh versions of Li Auto's L series and redefining what medium-computing-power chips can do. The solution, based on Horizon Robotics' Journey 6M, integrates interpretable one-model end-to-end technology with reinforcement learning and supports a mapless version of urban NOA. Through aggressive mining and optimization of computing-power efficiency, it pushes past the usual limits of medium-computing-power chips, allowing a 128 TOPS platform to deliver an autonomous driving experience comparable to that of a 256 TOPS platform while keeping the end-to-end system safe and interpretable; this is its core advantage. In engineering terms, the solution achieves the best possible experience under limited computing power through deep software-hardware co-optimization, squeezing the maximum value out of every TOPS.
QCraft's "safe and interpretable one-model end-to-end + reinforcement learning" high-level autonomous driving technology architecture can be adapted to multiple chip platforms such as Horizon Robotics' Journey 6M and NVIDIA Orin Y. The architecture first receives heterogeneous perception data from multiple sources, including multi-frame temporal images from cameras, LiDAR point clouds, navigation maps, and ego vehicle pose data. It then generates a unified BEV world representation through a 3D encoder and a spatiotemporal BEV fusion module, providing a globally unified environmental cognition foundation for subsequent decision-making and reasoning. Next, the multi-task decoder outputs a series of explicit and interpretable intermediate features, such as traffic participant states, road topology, drivable areas, occupancy grids, and traffic signs. This design not only supports the model's internal decision-making but also fundamentally solves the "black box decision-making" problem of traditional end-to-end technology. Subsequently, the unified world state latent encoder encodes the BEV world representation into latent space features, and combines the navigation route with the flow matching planner to generate the initial driving candidate trajectory. After multi-agent motion prediction and multi-modal trajectory sampling, the safety reinforcement learning module that integrates reward functions and rule constraints finally selects the optimal driving trajectory that complies with safety regulations, realizing a full-link end-to-end closed loop of perception - decision-making - planning - control. This architecture not only retains the link efficiency of end-to-end technology, but also accurately solves the core problems of safety and interpretability in the field of high-level autonomous driving through explicit intermediate representation output and regularized constraints of reinforcement learning.
(2) Core product: iteration and layout of Driven-by-QCraft
"Driven-by-QCraft" is an autonomous driving solution launched by QCraft. It was first released in November 2022. It uses self-developed innovative technology as the core engine and integrates QCraft's full-stack software algorithm to support point-to-point autonomous driving in multiple urban scenarios, highways and expressways. When it was first launched, it was available in high, medium and low configurations to meet the diverse needs of OEMs.
Since then, the solution has continued to iterate around chip platform upgrades and core technology implementation. In April 2024, a new autonomous driving solution was launched based on the Journey 6 from Horizon Robotics to solidify the performance foundation for hardware adaptation. In April 2025, the urban NOA solution based on a single Journey 6M chip and end-to-end technology was implemented, achieving a key breakthrough in core scenario functions. On January 23, 2026, "Driven-by-QCraft 2.0" was officially released, with comprehensively upgraded capabilities.
So far, "Driven-by-QCraft 2.0" has been finalized into three solutions: Driven-by-QCraft Air (ultimate highway NOA), Driven-by-QCraft Pro (standard inclusive urban NOA), and Driven-by-QCraft Max (advanced ultimate urban NOA). It has cross-chip platform compatibility and can be adapted to domestic and foreign mainstream autonomous driving chips such as Horizon Robotics Journey 6M and NVIDIA Orin Y. It can be flexibly deployed according to requirements of OEMs, greatly reducing the cooperation and adaptation threshold and accelerating the mass production and application. "Driven-by-QCraft 2.0" covers all vehicles ranging from the entry level (RMB100,000) to high end (RMB400,000).
(3) Large-scale application of urban NOA: from a single brand to multiple OEMs
The list of vehicle models scheduled to carry QCraft's urban NOA solution in 2026 shows that QCraft's cooperation matrix already covers mainstream OEMs such as Li Auto, SAIC, GAC, Geely and ROX. QCraft has not only leveraged Li Auto's ten-plus AD Pro models to build a large-scale delivery base, but has also entered the main product lines of SAIC Roewe, GAC Aion, Geely (including Galaxy, Xingyue and others) and ROX, spanning family cars through mid-to-high-end vehicles.
As these vehicles are mass-produced and delivered in 2026, QCraft's urban NOA solution will be applied at scale across multiple brands and vehicle classes. Combined with the millions of Li Auto vehicles already fitted with AD Pro, this will make QCraft a core growth engine in urban NOA in 2026, with both delivery scale and brand coverage.
Prediction 2: Horizon Robotics will play a significant role in building an inclusive ecosystem for autonomous driving through an open model
Horizon Robotics is evolving from an autonomous driving chip supplier into a core builder of an inclusive autonomous driving ecosystem. Its open cooperation model, dubbed "HSD Together", targets the core problem facing OEMs today: with high-computing-power chips in ample supply, most OEMs still lack the technical capability and development efficiency to turn that computing power into high-level autonomous driving functions with a strong user experience.
Based on this open cooperation model, Horizon Robotics opens its mass-production-proven high-level autonomous driving algorithms to ecosystem partners, helping them significantly reduce R&D costs and shorten project cycles. Under the new framework, partners can not only purchase chips but also obtain the full set of algorithm services in one stop. Critically, the core model algorithms support white-box delivery, leaving OEMs ample room for secondary development. At the end of 2025, Horizon Robotics officially began mass production of HSD, its high-level autonomous driving solution, first installed on the Deepal L06 and Chery ET5 and launched and delivered simultaneously. This milestone means that urban NOA with practical value has officially reached vehicles priced below RMB150,000 and is penetrating rapidly toward the RMB100,000 segment; the popularization of autonomous driving has entered the stage of substantive deployment.
Prediction 3: SenseAuto defines a new level of autonomous driving safety with an AI-native technology paradigm
SenseAuto positions itself as a strategic partner for accelerating vehicles into the artificial general intelligence (AGI) era. It has built a "cockpit-driving-cloud" trinity AGI technology architecture and formed a diversified product system spanning intelligent driving, intelligent cockpits and AI cloud. In autonomous driving, SenseAuto has unveiled AD Pro and AD Max based on Horizon Robotics J6E/J6M, and AD Ultra based on NVIDIA Orin/Thor, making NOA available across price ranges. Its multi-platform mass-production solutions are landing steadily: in March 2025, its vision-only autonomous driving solution based on Horizon Robotics J6M went into mass production on the GAC Trumpchi S7, and by the end of 2025 the company had completed mass production and delivery of urban NOA and end-to-end solutions on multiple chip platforms including J6M and NVIDIA Thor.
SenseAuto's technology paradigm has consistently centered on anticipating industry problems and accelerating the closed-loop deployment of technology, with each step landing on a key upgrade of autonomous driving. In 2022, when the industry still relied on multi-module split architectures, SenseAuto was the first to launch UniAD, a one-model end-to-end solution that integrates perception and decision-making and directly outputs vehicle control instructions, solving the information delay and data fragmentation of traditional architectures; it won the CVPR 2023 Best Paper Award, pushed autonomous driving officially into the end-to-end era, and went from technical prototype to mass-production verification in only 18 months. In 2024, in response to the industry-wide problem of scarce long-tail scenario data and high labeling costs for end-to-end models, SenseAuto launched SenseWorld, an autonomous driving world model that builds pre-training capability by generating high-fidelity virtual scenarios; within three months of launch it had built an "end-to-end data factory" with SAIC IM, achieving full-link simulation of more than 20 core hazard scenarios and significantly reducing testing costs. In 2025, to push the safety boundary from "driving" to "safe driving", SenseAuto introduced reinforcement learning and launched R-UniAD, which uses a five-step method of "cold start -> virtual enhancement -> real fine-tuning -> model distillation -> in-vehicle deployment" to let the model explore safety boundaries autonomously in virtual scenarios; within half a year the technology moved to mass production in cooperation with Dongfeng Motor, completing an efficient closed loop from academic breakthrough to industrial application. This evolutionary path of "one-model end-to-end -> generative world model -> multi-stage reinforcement learning" not only redefines the underlying logic of autonomous driving technology but also offers a Chinese paradigm, with a one-to-two-year technology lead, for the global industry's shift from "function-oriented" to "safety-oriented" development.
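The five-step R-UniAD recipe quoted above can be pictured as a simple pipeline; the hedged sketch below names one function per step. The signatures and bodies are placeholders invented for illustration and do not reflect SenseAuto's actual code or hyperparameters.

```python
def cold_start(base_model, demonstration_logs):
    """Step 1: imitation-learning warm start from human driving logs."""
    return base_model  # placeholder


def virtual_enhancement(model, world_model):
    """Step 2: reinforcement learning inside generated (world-model) scenarios,
    letting the policy probe safety boundaries without real-world risk."""
    return model  # placeholder


def real_finetune(model, fleet_data):
    """Step 3: fine-tune on real-vehicle data to close the sim-to-real gap."""
    return model  # placeholder


def distill(teacher_model):
    """Step 4: distill the large teacher into a smaller student that fits
    the vehicle's compute budget."""
    return teacher_model  # placeholder student


def deploy(student_model, target_chip: str = "vehicle SoC"):
    """Step 5: compile/quantize and ship the student model to the car."""
    return student_model  # placeholder


def build_policy(base_model, demonstration_logs, world_model, fleet_data):
    m = cold_start(base_model, demonstration_logs)
    m = virtual_enhancement(m, world_model)
    m = real_finetune(m, fleet_data)
    return deploy(distill(m))
```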
SenseAuto's self-developed SenseWorld is the core engine of its end-to-end autonomous driving solution. Its key value lies in generating high-fidelity, multi-view simulated driving videos that provide efficient training material for the end-to-end model, breaking the bottleneck of scarce real long-tail scenario data in autonomous driving R&D and building a technical moat around generative-AI-driven data efficiency.
In July 2025, SenseAuto officially launched the first generative world model platform and opened it to B-side and C-side users for trial at the same time, marking the technology's move from R&D to large-scale application. SenseAuto has also entered a deep strategic cooperation with SAIC IM: the two have jointly built an "end-to-end data factory" that uses generative AI to tackle three core problems in autonomous driving R&D, namely scarce long-tail scenario data, low annotation efficiency and high testing costs, and to accelerate the safe deployment and capability iteration of IM AD 3.0+. Out of this cooperation, WorldSim-Drive, a dataset customized exclusively for IM, has achieved key breakthroughs and covers full-link simulation of more than 20 core hazard scenarios such as cut-ins, collision warning, emergency braking for road-occupying obstacles, and roundabouts, providing high-value test samples for the autonomous driving system. Going forward, the two parties plan to build a library of tens of millions of generated scenarios and a full-dimensional test sample system covering all driving scenarios, continuously enhancing the scenario adaptability and safety performance of IM's autonomous driving system.
From the rollout of highway NOA to the maturation of mapless urban NOA, the evolution of autonomous driving has always been driven by both user value and underlying safety, and using AI to realize cockpit-driving integration and push autonomous driving to new heights is an important trend. In September 2025, the Ministry of Industry and Information Technology of China (MIIT) issued the mandatory national standard "Safety Requirements for Combined Autonomous Driving Systems of Intelligent Connected Vehicles", which clearly sets out the core rules for DMS (driver monitoring systems) and EOR: when the DMS detects that the driver's eyes are off the road, the system must trigger the EOR warning within 5 seconds.
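As a minimal sketch of how such a timing rule might be enforced in software, the snippet below tracks how long the driver's gaze has been off the road and flags when the warning is due; the class, state names and polling pattern are illustrative assumptions, not the standard's reference logic.

```python
from typing import Optional

EOR_DEADLINE_S = 5.0  # warning must fire no later than 5 s after eyes-off-road detection


class EyesOnRoadMonitor:
    """Tracks eyes-off-road duration and flags when the EOR warning is due."""

    def __init__(self) -> None:
        self._eyes_off_since: Optional[float] = None

    def update(self, eyes_on_road: bool, now_s: float) -> bool:
        """Return True once the EOR warning must be active."""
        if eyes_on_road:
            self._eyes_off_since = None   # gaze back on the road: reset the timer
            return False
        if self._eyes_off_since is None:
            self._eyes_off_since = now_s  # first frame with eyes off the road
        # A production system would typically warn earlier than this hard deadline.
        return (now_s - self._eyes_off_since) >= EOR_DEADLINE_S


# Example: m = EyesOnRoadMonitor(); m.update(False, 0.0) -> False; m.update(False, 5.0) -> True
```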
The standard mandates DMS as standard equipment on vehicles with L2 and above autonomous driving, directly pushing the technology from high-end vehicles down to the RMB100,000 mainstream of the market and setting OEM shipments up to roughly double in the near term. In this process, leading suppliers represented by SenseAuto (which ranked first in the domestic DMS supply market from January to November 2025 with a 27.7% share) hold a competitive advantage thanks to combined fatigue-and-distraction detection and adaptation to extreme conditions such as backlighting and drivers wearing sunglasses. Small and medium-sized vendors without sufficient technical barriers are being weeded out at an accelerating pace, and industry concentration is set to intensify.
In the future, with the continuous iteration of L3 and higher-level autonomous driving technology, the steady improvement of cross-scenario data closed loops, and the deep integration of a collaborative industry-chain ecosystem, autonomous driving will thoroughly reshape human mobility and drive the automotive industry's comprehensive transformation from "vehicle manufacturing" to "intelligent mobility services". A new era of safer, more efficient and more considerate mobility is rapidly approaching.