AI computing platform targets level 5 autonomy

Technology News
By Christoph Hammerschmidt

The Drive PX Pegasus computer platform is designed to control cars that drive fully automatically – i.e. vehicles without a steering wheel, accelerator or brake pedals. For this type of application, the platform must not only provide enough computing power to process the data from an entire armada of sensors in real time and derive driving decisions from it, but also meet the highest requirements for functional safety (ASIL D according to ISO 26262): the computer is designed to be resilient and fail-safe. It is equipped with multiple different processors that serve as back-ups for each other, explained Danny Shapiro, Senior Director of Automotive at Nvidia, at the event.
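The article does not describe how the redundant processors reach agreement; a common pattern in fail-safe systems is majority voting across independent compute paths. The sketch below is purely illustrative (the function name `vote` and the steering commands are invented for this example) and is not Nvidia's actual scheme:

```python
from collections import Counter

def vote(outputs):
    """Majority-vote across redundant processor outputs.

    Returns the agreed result, or raises if no majority exists --
    the point at which a fail-safe system would degrade gracefully,
    e.g. hand control to a backup path or execute a safe stop.
    """
    counts = Counter(outputs)
    result, n = counts.most_common(1)[0]
    if n > len(outputs) // 2:
        return result
    raise RuntimeError("no majority -- enter fail-safe state")

# Three redundant compute paths produce a steering command; one disagrees.
print(vote(["steer_left", "steer_left", "steer_right"]))  # steer_left
```

A disagreeing processor is simply outvoted; only when no majority exists does the system fall back to its safe state, which is what makes the redundancy useful for ASIL D certification arguments.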

Computing power for robot taxis: Nvidia’s Drive PX Pegasus

The high demands on reliability and redundancy mean that these computers require very high computing power. Compared to Level 4 vehicles (autonomous driving, but a driver must be able to intervene in an emergency), Level 5 vehicles require overlapping coverage from up to 16 high-resolution cameras plus lidar and radar sensors. The vehicle must constantly know its position to the centimeter and immediately recognize other vehicles and people in its vicinity. Because of these requirements, fully autonomous cars need 50 to 100 times more computing power than today’s vehicles, even those equipped with advanced driver assistance systems that reach Level 3 on the autonomy scale, said Nvidia CEO Jensen Huang in his keynote speech.
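A back-of-envelope calculation shows why 16 cameras alone push the computing requirement so high. The resolution, frame rate and pixel depth below are assumptions for illustration, not figures from the article:

```python
# Rough, illustrative estimate of the raw camera data a Level 5 stack must ingest.
# Assumed figures (not from the article): 16 cameras at 1920x1080,
# 3 bytes per pixel (24-bit RGB), 30 frames per second.
cameras = 16
width, height = 1920, 1080
bytes_per_pixel = 3
fps = 30

bytes_per_second = cameras * width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / 1e9:.1f} GB/s of raw pixels")  # ~3.0 GB/s
```

Roughly 3 GB of raw pixels per second – before any lidar or radar data, and before a single neural-network inference has been run on it.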

To achieve this computing power, the PX Pegasus is equipped with four processors – two SoCs from the new Xavier family and two next-generation GPUs, each designed specifically for deep learning and autonomous driving. The Pegasus delivers 320 TOPS of computing power, ten times as much as Nvidia’s current AI platform, the Drive PX 2. The platform supports over-the-air (OTA) updates to keep its software up to date at all times. The AI computer is due to ship in the second half of 2018; a collection of development tools and libraries is already available under the name DriveWorks.

The second innovation at Nvidia’s event was the Holodeck, a virtual reality platform aimed primarily at developers in the automotive industry, although it can also be used in other sectors. The system, presented by Nvidia as the “Design Lab of the Future”, generates a photorealistic, highly immersive representation of a vehicle from its CAD data in real time. Developers can collaborate interactively on the virtual object with a latency of less than 20 milliseconds, according to Nvidia – though only if the design data (converted to a special format for VR rendering) is stored locally on each participant’s machine. An alpha version is already available to interested parties through an early-access program.

Nvidia’s Holodeck gathers designers and developers around a virtual copy of their creation

At the event, Nvidia also announced a cooperation with automotive supplier ZF and Deutsche Post DHL (DPDHL). The logistics company will equip its electric delivery vehicles with artificial intelligence to cover the “last mile” of delivery with autonomous vehicles. The technical basis is ZF’s AI-capable ProAI control computer, which is built on Nvidia technology. DPDHL currently operates a fleet of 3,400 electric delivery vehicles from its subsidiary StreetScooter. These can be equipped with ZF camera, lidar and radar sensors to enable autonomous navigation in mostly complex inner-city traffic. According to DPDHL, artificial intelligence plays a key role in this process. From 2018, the logistics company intends to deploy a test fleet of such vehicles on its own test site near Aachen, Germany.
