Ansys shows autonomous driving tools ahead of Synopsys deal
Ansys is collaborating with Cognata and Microsoft on a web-based platform enabling users to test and validate ADAS/AV automotive sensors in a digital twin virtual environment that mimics real-world conditions.
The Ansys AVxcelerate Sensors simulation software will boost the Automated Driving Perception Hub (ADPH) platform, managed by Cognata and running on Microsoft Azure, with high-fidelity radar and electromagnetic (EM) wave propagation simulation capabilities.
The digital twin platform is powered by AMD EPYC central processing units (CPUs) and Radeon PRO graphics processing units (GPUs) for machine learning inference and visualization workloads, and hosts a library of manufacturer-certified virtual sensor models.
Ansys has also developed an integrated toolchain in collaboration with Kontrol, Microsoft, and TÜV SÜD to streamline safety, certification, and virtual homologation.
This comes as Ansys is extending its deals with Sony Semiconductor and Marelli Electronic Systems and its simulation tools are used by 94% of the top 100 automotive suppliers. Synopsys has spent the last year chasing a deal to acquire Ansys for $35bn.
The tools for developing ADAS and autonomous driving software are being shown at CES 2025 next week.
- Ansys taps Supermicro, Nvidia for turnkey AI hardware
- BMW co-develops simulation tools for autonomous driving
The company has developed a design and simulation platform called ConceptEV that accelerates EV powertrain system development and enables cross-functional teams to meet consumer and market requirements.
For the Cognata deal, AVxcelerate Sensors is accessible through Cognata’s Automated Driving Perception Hub, which functions as a high-fidelity simulation platform with virtual twin technology. The ADPH allows OEMs and sensor manufacturers to test and validate certified sensors against diverse industry standards, including those put forth by the National Highway Traffic Safety Administration (NHTSA) and the New Car Assessment Program (NCAP). The platform currently includes Cognata sensor models for thermal cameras, LiDAR, and RGB cameras with varying lens distortions.
The physics-based radar models reproduce EM wave propagation, accounting for material properties at high frequencies, to enhance signal strength and accuracy. The radar simulation provides raw data that can be used to test and improve the algorithms that process radar signals, such as small changes in frequency caused by moving objects (the Doppler effect). When connected to a virtual model from a radar supplier, AVxcelerate Sensors produces a virtual twin of the sensor, enabling OEMs to evaluate its performance with enhanced predictive accuracy.
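To illustrate the kind of signal physics such a simulation must reproduce: for a monostatic automotive radar, a target moving with radial velocity v shifts the returned carrier frequency by roughly f_d = 2·v·f_c/c. The sketch below is a minimal, standalone illustration of that relationship, not Ansys or Cognata code; the function name and values are hypothetical.

```python
def doppler_shift(radial_velocity_mps: float, carrier_hz: float) -> float:
    """Two-way Doppler shift f_d = 2 * v * f_c / c for a monostatic radar.

    Illustrative only -- not taken from AVxcelerate Sensors.
    Positive velocity means the target is closing on the radar.
    """
    c = 299_792_458.0  # speed of light in vacuum, m/s
    return 2.0 * radial_velocity_mps * carrier_hz / c


# A vehicle closing at 30 m/s seen by a typical 77 GHz automotive radar:
shift_hz = doppler_shift(30.0, 77e9)
print(f"Doppler shift: {shift_hz:.1f} Hz")  # on the order of 15 kHz
```

A perception algorithm recovering this shift from raw radar returns can estimate the target's closing speed, which is why faithful Doppler modeling matters for validating radar-based ADAS functions.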
“We are excited to integrate Ansys’ radar simulation technology into the ADPH platform, bringing OEMs and tier-one suppliers an unmatched level of accuracy in sensor validation,” said Danny Atsmon, founder and CEO at Cognata. “Ansys’ ability to simulate complex EM wave interactions enhances our platform’s ability to deliver precise, real-world insights for radar-based ADAS and AV systems. This collaboration significantly advances the industry’s ability to test and refine sensor performance under diverse conditions.”
Cognata’s generative AI transfer technology runs on the AMD Radeon PRO V710 GPUs and enhances the RGB camera simulation platform by accurately capturing and replicating the real-world behaviour of sensors within the simulation.
“Ansys’ AVxcelerate Sensors platform includes real-time radar capabilities for accurate modeling of radar interactions in complex environments,” said Shane Emswiler, vice president of products at Ansys. “As the industry works toward fully autonomous driving, safety validation is paramount, and the joint effort between Ansys and Cognata streamlines this typically long and complicated process.”
The Ansys AVxcelerate Sensors autonomous vehicle (AV) sensor simulation software provides real-time multispectral camera simulation for scenario-based perception testing of the high dynamic range (HDR) Image Sensor Model from Sony. This allows OEMs to test advanced driver assistance systems (ADAS) and AV functions, accounting for sensor behaviour in diverse driving scenarios including low light, nighttime, rain, snow, and fog.
The AVxcelerate Sensors platform generates a virtual environment with varied lighting, weather, and material conditions to simulate how light travels through the environment, the camera lens, and then hits the imager. Coupled with Sony’s sensor model, this simulation can reproduce pixel characteristics, signal processing functions, and system functions of Sony’s HDR imager with extreme predictive accuracy.
This simulation model enables users to conduct robust, scenario-based testing — with either pre-defined inputs or real-time feedback — for Sony’s HDR imager-based perception systems, improving accuracy, reliability, and safety in ADAS and AV applications.
To minimize on-road testing, combined simulations can inject images into advanced Software-on-Chip perception systems. Simultaneously, an electronic control unit, used to control functions like engine management and transmission, is integrated into this simulated environment to test its performance. This approach ensures the entire simulation process, from sensors to processing chips, is accurate and reliable.
“Achieving full autonomy involves OEMs working with leading technology providers like Ansys to enhance the accuracy of the integrated tools used to validate AV systems,” said Tomoki Seita, general manager, automotive business division, Sony Semiconductor Solutions Corporation. “Through this collaboration, customers can confidently verify their systems using highly reproducible, predictively accurate simulations. This is especially useful for OEMs and Tier 1 suppliers that run actual camera simulations to verify recognition algorithms and vehicle control software.”
“Using Ansys tools, we have realized a 25% reduction in product development cycle time, 15% to 20% savings on engineering development costs, and a 15% to 20% improvement in product performance — to the satisfaction of our customers,” said Luciano Saracino, head of the Mechanics and Optics Centre of Expertise, Marelli Electronic Systems.
The Ansys SimAI cloud-enabled generative AI platform provides prediction of physics behaviour and can be trained with existing simulation results for applications including fluid dynamics, thermal and electromagnetic performance, and structural deformation, while the Discovery 3D simulation software uses Nvidia’s Omniverse Blueprint digital twin platform for visualizing and accelerating large-scale computational fluid dynamics workflows.