
Automotive trends at CES 2024

Feature articles | By Nick Flaherty



The Consumer Electronics Show in Las Vegas has become a key venue for the latest automotive technology demonstrations, from in-cabin sensing and infotainment to Level 3 and Level 4 autonomous driving.

Nvidia took the opportunity of CES 2024 to announce several car makers that will be using its Orin and Thor processors for ADAS safety and self-driving cars.

Li Auto will use the 2,000 TOPS Thor chip in its next-generation vehicles. At the same time, GWM (Great Wall Motor), Zeekr and Xiaomi are using the Orin platform for intelligent automated-driving systems.

“The transportation industry is embracing centralized compute for highly automated and autonomous driving,” said Xinzhou Wu, vice president of automotive at Nvidia. “The AI car computer of choice for today’s intelligent fleets is NVIDIA DRIVE Orin, with automakers increasingly looking to the advanced capabilities and AI performance of its successor, NVIDIA DRIVE Thor, for their future vehicle roadmaps.”

The Drive centralized car computer based on Thor integrates a wide range of intelligent functions into a single AI compute platform, delivering autonomous driving and parking capabilities, driver and passenger monitoring, and AI cockpit functionality.

Li Auto currently uses two Orin processors to power its assisted-driving system, AD Max, for its L-series models. The processors provide a combined 508 TOPS for real-time fusing and processing of sensor information, supporting assisted-driving features such as lane change control (LCC) and automated parking, along with automatic emergency braking (AEB) active safety.

The new AD Max 3.0 upgrade transitions the system to an end-to-end algorithmic architecture dominated by large AI models. It delivers a safer, more comfortable intelligent driving experience using an occupancy network and spatio-temporal trajectory planning and model-predictive control algorithms.
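Model-predictive control of the kind mentioned here works by repeatedly optimising a short predicted trajectory against a vehicle model and applying only the first action before re-planning. The sketch below illustrates that receding-horizon loop in Python; the point-mass lane-keeping model, cost weights and sampling-based optimiser are illustrative assumptions, not Li Auto's implementation.

```python
# A minimal receding-horizon (model-predictive) control sketch.
# The point-mass lane-keeping model, cost weights and sampling strategy are
# illustrative assumptions -- not any production ADAS stack.
import numpy as np

DT = 0.1          # control period in seconds
HORIZON = 15      # number of predicted steps per plan
N_SAMPLES = 256   # candidate control sequences evaluated each cycle

def rollout_cost(y, v, controls):
    """Predict lateral offset y and lateral velocity v under a control sequence
    and accumulate a quadratic cost on deviation, speed and control effort."""
    cost = 0.0
    for a in controls:
        v += a * DT
        y += v * DT
        cost += 4.0 * y**2 + 0.5 * v**2 + 0.1 * a**2
    return cost

def mpc_step(y, v, rng):
    """Sample candidate sequences and return the first action of the cheapest one."""
    candidates = rng.uniform(-2.0, 2.0, size=(N_SAMPLES, HORIZON))
    costs = [rollout_cost(y, v, seq) for seq in candidates]
    return candidates[int(np.argmin(costs))][0]

rng = np.random.default_rng(0)
y, v = 1.5, 0.0                     # start 1.5 m off the lane centre
for _ in range(50):                 # re-plan every cycle, apply only the first action
    a = mpc_step(y, v, rng)
    v += a * DT
    y += v * DT
print(f"lateral offset after 5 s: {y:.2f} m")
```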

GWM is basing its Coffee Pilot on the DRIVE Orin chip. The system supports parking, high-speed and urban scenarios, delivering full-scenario smart navigation and assisted-driving functions without high-precision maps. Advanced intelligent-driving features, such as Urban Navigate on Autopilot and cross-floor Memory Parking, will first roll out in GWM’s WEY models.

“LLM-driven AI technology will profoundly enhance future mobility as well as the entire automotive industry,” said a GWM spokesperson. “GWM is committed to working with NVIDIA and other industry-leading players to offer greener, smarter mobility for all.”

Zeekr, the premium EV subsidiary of Geely alongside Volvo, has launched its fourth model to be powered by Orin. The car features a new full-stack smart-driving system that uses two Orin chips for intelligent parking and automated operation on high-speed and urban roads.

Smartphone maker Xiaomi is using dual Orin chips for its first EV, the SU7 sedan shown at the show.

This is built using Xiaomi’s leading large language model for perception and decision-making to navigate through Chinese cities, regardless of locale, administrative divisions within the country or type of road.

Among the car makers, Mercedes-Benz showed its MB.OS in a range of cars, including the Concept CLA Class.

BMW is using Amazon’s Alexa AI framework for managing its vehicles. Instead of digging through a car manual, users will be able to ask BMW’s assistant for things like recommendations on different BMW drive modes and then have the assistant activate the chosen mode. They can also ask for instructions on how car features work—like the parking assistance system—and hear explanations in easy-to-understand terms through the BMW assistant’s customized voice.

The demo at CES 2024 follows Amazon’s previous announcement that BMW’s next-generation Intelligent Personal Assistant will be supported through its Alexa Custom Assistant (ACA) technology.
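As a rough illustration of that ask-explain-activate flow, the sketch below maps an utterance to a drive-mode intent and either explains the mode or switches it. Every name and interface here is hypothetical, invented for illustration; it does not use the actual Alexa Custom Assistant APIs.

```python
# Hypothetical sketch of the assistant flow described above. All names and
# interfaces are invented for illustration; this is not the ACA SDK.
from dataclasses import dataclass

DRIVE_MODES = {
    "sport": "Sport sharpens throttle response and firms up the steering.",
    "comfort": "Comfort softens the ride for relaxed cruising.",
    "eco": "Eco Pro prioritises efficiency and range.",
}

@dataclass
class VehicleInterface:
    """Stand-in for whatever vehicle API would actually switch modes."""
    active_mode: str = "comfort"

    def set_drive_mode(self, mode: str) -> None:
        self.active_mode = mode

def handle_utterance(text: str, car: VehicleInterface) -> str:
    """Tiny intent handler: explain a mode if asked a question, otherwise activate it."""
    text = text.lower()
    for mode, explanation in DRIVE_MODES.items():
        if mode in text:
            if any(w in text for w in ("what", "how", "explain")):
                return explanation
            car.set_drive_mode(mode)
            return f"Switching to {mode} mode. {explanation}"
    return "I can explain or activate the Sport, Comfort and Eco Pro drive modes."

car = VehicleInterface()
print(handle_utterance("What does eco mode do?", car))
print(handle_utterance("Put it in sport", car))
print(car.active_mode)   # -> sport
```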

Cerence is introducing CaLLM, an automotive-specific large language model that serves as the foundation for the company’s next-gen in-car computing platform. Cipia is also showcasing its embedded software version of Cabin Sense, which includes both driver and occupancy monitoring and is expected to go into serial production this year.

EyeLights is unveiling its new cockpit vision, enabled by generative AI and accelerated compute to turn the windshield into an augmented reality display.

Kodiak is exhibiting an autonomous truck, which relies on NVIDIA GPUs for high-performance compute, and is also moving to the Ambarella CV2 chip.

Lenovo showed its vehicle computing roadmap, featuring new products based on Nvidia Thor: the Lenovo XH1, a central compute unit for advanced driver-assistance systems and the smart cockpit; the Lenovo AH1, a Level 2++ ADAS domain controller unit; and the Lenovo AD1, a Level 4 autonomous-driving domain controller unit.

There were also more details on the Pebble semi-autonomous recreational trailer, which is heading for production later this year and uses the Nvidia Orin chip.

Radar has also been a key technology at the show, with new sensors from TI and NXP as well as a new system from Provizio in Ireland.

Nvidia was also highlighting the role of its Omniverse cloud software for automotive applications. Mercedes is using the technology for a digital twin of its latest factory, while car makers are using the technology in showrooms around the world.

Omniverse is a software platform for developing and deploying advanced 3D applications and pipelines based on the OpenUSD framework. This provides the ability to instantly visualize changes to a car’s colour or customize its interior with luxurious finishes.

Independent software vendors, or ISVs, can use the native OpenUSD platform as a foundation for creating scene construction tools, or to help develop tools for managing configuration variants. Omniverse Cloud taps GDN, Nvidia’s Graphics Delivery Network, a global cloud-streaming infrastructure that delivers high-fidelity 3D interactive experiences.
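OpenUSD's variant sets are the mechanism such configuration tools typically build on: alternative opinions (a paint colour, a trim level) are authored on the same asset and switched by selection. The sketch below shows the idea using the open-source pxr Python bindings; the prim and attribute names are made up for illustration, and this is generic OpenUSD rather than an Omniverse-specific API.

```python
# Minimal OpenUSD variant-set sketch using the open-source pxr Python bindings
# (e.g. `pip install usd-core`). Prim and attribute names are illustrative only.
from pxr import Usd, Sdf, Gf

stage = Usd.Stage.CreateInMemory()
car = stage.DefinePrim("/Car", "Xform")

# A variant set stores alternative opinions on the same prim -- here, two paints.
paint = car.GetVariantSets().AddVariantSet("paintColor")
for name, rgb in (("red", (0.8, 0.05, 0.05)), ("blue", (0.05, 0.1, 0.8))):
    paint.AddVariant(name)
    paint.SetVariantSelection(name)
    with paint.GetVariantEditContext():
        # Opinions authored inside this context only apply when the variant is selected.
        attr = car.CreateAttribute("paintColorValue", Sdf.ValueTypeNames.Color3f)
        attr.Set(Gf.Vec3f(*rgb))

# A configurator switches the visualised colour simply by flipping the selection.
paint.SetVariantSelection("blue")
print(car.GetAttribute("paintColorValue").Get())   # -> (0.05, 0.1, 0.8)
print(stage.GetRootLayer().ExportToString())       # the authored scene as USDA text
```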

Configurators, when run on GDN, can be easily published at scale using the same GPU architecture on which they were developed and streamed to nearly any device. All this means less redundancy in data prep, aggregated and accessible data, fewer manual pipeline updates and instant access for the entire intended audience.

Chiplets are also becoming a key area of technology for automotive. Intel is planning to be the first to deliver automotive chiplets, working with Belgian research group imec and leading European car makers. Dutch AI chip developer Axelera AI is also looking at automotive chiplets, alongside Renesas.

www.nvidia.com; www.intel.com; www.amd.com; www.ambarella.com; www.axelera.ai
