Nvidia expands DRIVE Hyperion ecosystem to speed full autonomy
Nvidia has used CES 2026 to broaden the supplier and sensor bench around its DRIVE Hyperion ecosystem, positioning the platform as a quicker path to Level 4-ready vehicle programmes for both passenger and commercial transport. In a CES blog post, the company said the expanded lineup now spans tier-1 suppliers, integrators and sensor partners, including Aeva, AUMOVIO, Astemo, Arbe, Bosch, Hesai, Magna, Omnivision, Quanta, Sony and ZF.
DRIVE Hyperion ecosystem: more tier-1 ECUs and validated sensors
The practical pitch is risk reduction: OEMs and developers can source electronic control units and sensor suites that have been qualified against a common reference architecture, rather than assembling an ad-hoc stack and discovering integration issues late. Nvidia said companies such as Astemo, AUMOVIO, Bosch, Magna, Quanta and ZF are building DRIVE Hyperion-based ECUs, while a separate set of partners is bringing camera, radar, lidar and ultrasonic sensor suites through qualification on the platform.
For eeNews Europe readers, this is the continuation of Nvidia's multi-year push to productise a reference platform rather than "just" sell compute. Nvidia has promoted production-ready reference architectures for several years, as eeNews Europe's earlier coverage of DRIVE Hyperion reference-platform work showed; the CES message is that more of the surrounding ecosystem is now being made plug-compatible.
What the DRIVE Hyperion ecosystem enables at Level 4
Nvidia is framing the latest Hyperion generation as a sensor-fusion and actuation platform that supports low-latency, cross-domain control (braking, steering and suspension) alongside perception. At the centre is a dual-SoC configuration based on NVIDIA DRIVE AGX Thor, which Nvidia says delivers more than 2,000 FP4 TFLOPS (around 1,000 INT8 TOPS) for real-time fusion of a 360-degree sensor view.
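The quoted figures are consistent with a simple precision-scaling rule of thumb: halving operand width roughly doubles peak throughput on the same datapath. A back-of-envelope sketch (our arithmetic, not Nvidia data; the per-SoC split is an assumption based on the stated dual-SoC configuration):

```python
# Rough sanity check of the quoted DRIVE AGX Thor compute figures.
# Assumption: 4-bit (FP4) lanes pack twice as many operations per cycle
# as 8-bit (INT8) lanes on the same hardware, hence the ~2:1 ratio.
fp4_tflops_platform = 2000          # quoted: >2,000 FP4 TFLOPS (dual SoC)
int8_tops_platform = fp4_tflops_platform / 2   # ~1,000 INT8 TOPS, as quoted

num_socs = 2                        # dual-SoC configuration
fp4_tflops_per_soc = fp4_tflops_platform / num_socs   # ~1,000 per SoC (assumed split)

print(int8_tops_platform, fp4_tflops_per_soc)
```

This is only a consistency check on the announced numbers, not a statement of how the silicon actually partitions its compute.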

An open DRIVE AGX Thor developer kit highlights the in-vehicle compute hardware used for automated-driving and sensor-fusion development. Source: Nvidia.
The company is also leaning hard into transformer-based perception and vision-language-action style models running on-vehicle, with the idea that a shared hardware baseline lets partners differentiate above it in software and services.
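To make "sensor fusion on a shared baseline" concrete, here is a minimal, hypothetical sketch of late fusion: detections from different sensors that fall within a distance gate are merged into one object, with positions averaged by confidence. All names (`Detection`, `fuse`, the gate value) are our illustration, not Nvidia's API.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str       # e.g. "camera", "radar", "lidar"
    x: float          # longitudinal position, metres
    y: float          # lateral position, metres
    confidence: float

def fuse(detections, gate=2.0):
    """Greedy nearest-neighbour late fusion: detections within `gate`
    metres of an existing object are merged into it, with position
    averaged by confidence; otherwise a new object is created."""
    objects = []
    for det in sorted(detections, key=lambda d: -d.confidence):
        for obj in objects:
            if abs(obj["x"] - det.x) < gate and abs(obj["y"] - det.y) < gate:
                w = obj["w"] + det.confidence
                obj["x"] = (obj["x"] * obj["w"] + det.x * det.confidence) / w
                obj["y"] = (obj["y"] * obj["w"] + det.y * det.confidence) / w
                obj["w"] = w
                obj["sensors"].add(det.sensor)
                break
        else:
            objects.append({"x": det.x, "y": det.y, "w": det.confidence,
                            "sensors": {det.sensor}})
    return objects

dets = [Detection("camera", 30.1, 1.9, 0.9),
        Detection("radar", 30.6, 2.2, 0.7),
        Detection("camera", 55.0, -3.0, 0.8)]
print(len(fuse(dets)))   # the two nearby detections merge: 2 objects
```

Production stacks on platforms like Hyperion do this (and far more) with learned, transformer-based models rather than hand-written gating, but the example shows the shape of the problem the shared hardware baseline is meant to standardise.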
Safety framework and new open models
Alongside the partner expansion, Nvidia is bundling the message under its Halos safety and cybersecurity umbrella, positioning it as an end-to-end framework from data centre workflows through to in-vehicle deployment. In parallel, Nvidia also launched the Alpamayo family of open models, tools and datasets aimed at “long tail” driving scenarios and reasoning-based autonomy, detailed in a separate announcement. Tier-1 participation is also moving beyond generalities: for example, Magna said it will offer integration services and Hyperion-compatible ECUs for OEM programmes, in its CES-timed release.
Net-net: Nvidia is trying to make “Level 4-ready” feel less like a science project and more like a supply-chain option. Whether that translates into quicker series programmes will depend on how much of the integration burden OEMs can genuinely offload to validated modules — and how regulators respond to the newer, more model-driven autonomy stacks now being proposed.