Simulator supports multiple robotaxis

Interviews | By Nick Flaherty


The latest simulator technology from Budapest-based AImotive supports multi-node, multi-client operation to allow multiple autonomous vehicle designs to be simulated in the same virtual environment, says Szabolcs Jánky, aiSim Product Manager at AImotive.

Currently, only one vehicle design can be validated in a simulator, with other vehicles represented by simplified models. A vehicle with a small sensor set can be simulated on a single GPU, while a detailed simulation of a robotaxi with multiple sensors can take six, and aiSim 3.0 allows up to four 10-GPU racks to be combined. The software can also be used in cloud systems and is built on open APIs, such as the Vulkan ray tracing API from Khronos, to allow portability between GPUs.
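
As a back-of-the-envelope illustration of that capacity budget, a scheduler could pack vehicle simulations onto the available GPUs as sketched below. The types and figures are hypothetical, drawn only from the numbers in this article, not from aiSim's actual API.

```cpp
// Hypothetical sketch: assigning simulated vehicles to GPUs across racks.
// VehicleSpec and the fleet contents are illustrative, not aiSim's API.
#include <iostream>
#include <string>
#include <vector>

struct VehicleSpec {
    std::string name;
    int gpus_needed;  // e.g. 1 for a small sensor set, 6 for a full robotaxi
};

int main() {
    const int kRacks = 4;         // aiSim 3.0: up to four racks ...
    const int kGpusPerRack = 10;  // ... of 10 GPUs each
    const int capacity = kRacks * kGpusPerRack;

    std::vector<VehicleSpec> fleet = {
        {"robotaxi-A", 6}, {"robotaxi-B", 6}, {"shuttle", 1}};

    int used = 0;
    for (const auto& v : fleet) {
        if (used + v.gpus_needed > capacity) {
            std::cout << v.name << ": no capacity left\n";
            continue;
        }
        used += v.gpus_needed;
        std::cout << v.name << " -> " << v.gpus_needed << " GPU(s)\n";
    }
    std::cout << used << "/" << capacity << " GPUs allocated\n";
}
```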

“We don’t use proprietary libraries from hardware vendors for ray tracing or physics models, and with the recent support for ray tracing from Khronos we were able to switch to open APIs with Nvidia, AMD and Intel,” said Jánky.
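
That vendor portability comes from targeting the Khronos extensions rather than proprietary libraries. A minimal sketch of how an application can check each installed GPU for the standard VK_KHR_ray_tracing_pipeline extension using only core Vulkan calls (error handling trimmed for brevity):

```cpp
// Sketch: query every GPU for the Khronos ray tracing pipeline extension,
// which Nvidia, AMD and Intel drivers expose through the same API.
#include <cstring>
#include <iostream>
#include <vector>
#include <vulkan/vulkan.h>

int main() {
    VkApplicationInfo app{VK_STRUCTURE_TYPE_APPLICATION_INFO};
    app.apiVersion = VK_API_VERSION_1_2;
    VkInstanceCreateInfo ci{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    ci.pApplicationInfo = &app;
    VkInstance instance;
    vkCreateInstance(&ci, nullptr, &instance);

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);

        uint32_t n = 0;
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, nullptr);
        std::vector<VkExtensionProperties> exts(n);
        vkEnumerateDeviceExtensionProperties(gpu, nullptr, &n, exts.data());

        bool rt = false;
        for (const auto& e : exts)
            if (std::strcmp(e.extensionName,
                            VK_KHR_RAY_TRACING_PIPELINE_EXTENSION_NAME) == 0)
                rt = true;
        std::cout << props.deviceName << ": ray tracing "
                  << (rt ? "supported" : "not supported") << "\n";
    }
    vkDestroyInstance(instance, nullptr);
}
```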

“In the cloud we don’t have a software limit, the limit is the architecture of the cloud, the connection between the nodes,” he said.

“When it comes to server infrastructure there is a huge difference between AMD nodes and Nvidia nodes so we want to give the option to switch as necessary. In automotive everything is about standards and customers want to mitigate the risk of being locked in – this is the case in hardware but it is starting to be the case with the software infrastructure as well,” he said.

aiSim 3.0 is an ISO 26262-certified simulator for the development and validation of ADAS and AD systems. AImotive built its own engine to provide accurate physics for the sensors, particularly for effects such as fog and rain, rather than using the Unity or Unreal gaming engines.

The focus is on testing sensors, including camera, lidar and radar fusion, in a wide range of scenarios. Using simulation allows edge and corner cases to be validated, he said.

“Unreal and Unity have their limitations in simulating sensor setups and sensor models,” said Jánky. “Nvidia also realised that these are not enough and is going with its own engine; this is something we realised four years ago. Rain and snow are about visualisation for the game engines, and they use tricks to get the visual performance for games. For simulation you need to be completely deterministic all the time, as you need to calculate the same thing to ensure you are testing the system, not the simulator,” he said. “What you need is correlation between simulation testing and real world testing – this is what we are doing.”
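
A minimal sketch of what that determinism requirement implies, assuming nothing about aiSim's internals: every random source is explicitly seeded and the simulation advances on a fixed timestep rather than the wall clock, so two runs of the same scenario produce bit-identical traces.

```cpp
// Sketch of the determinism requirement: fixed seed plus fixed timestep
// means a regression points at the system under test, not the simulator.
#include <cassert>
#include <cstdint>
#include <iostream>
#include <random>
#include <vector>

std::vector<double> run_scenario(uint32_t seed) {
    std::mt19937 rng(seed);                      // fixed seed, never time-based
    std::normal_distribution<double> noise(0.0, 0.01);
    const double dt = 1.0 / 30.0;                // fixed timestep, no wall clock
    double position = 0.0, speed = 10.0;
    std::vector<double> trace;
    for (int step = 0; step < 300; ++step) {     // 10 simulated seconds
        position += speed * dt;
        trace.push_back(position + noise(rng));  // e.g. a noisy sensor reading
    }
    return trace;
}

int main() {
    // Same seed -> bit-identical traces on every run of the same build.
    assert(run_scenario(42) == run_scenario(42));
    std::cout << "deterministic: identical traces\n";
}
```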

For validation, the system can take real-world data from test cars on streets around the world and correlate it with the simulation performance to demonstrate the accuracy of the simulation.
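
One simple way to quantify that sim-to-real agreement is a correlation coefficient over matched sensor traces. The sketch below uses Pearson's r purely as an illustration; AImotive's actual correlation methodology is not described here.

```cpp
// Sketch: Pearson's r between a logged real-world sensor trace and its
// simulated counterpart for the same scenario. Illustrative only.
#include <cmath>
#include <iostream>
#include <vector>

double pearson(const std::vector<double>& x, const std::vector<double>& y) {
    const size_t n = x.size();
    double mx = 0, my = 0;
    for (size_t i = 0; i < n; ++i) { mx += x[i]; my += y[i]; }
    mx /= n; my /= n;
    double sxy = 0, sxx = 0, syy = 0;
    for (size_t i = 0; i < n; ++i) {
        sxy += (x[i] - mx) * (y[i] - my);
        sxx += (x[i] - mx) * (x[i] - mx);
        syy += (y[i] - my) * (y[i] - my);
    }
    return sxy / std::sqrt(sxx * syy);
}

int main() {
    std::vector<double> real = {1.0, 2.1, 2.9, 4.2, 5.0};  // logged on the road
    std::vector<double> sim  = {1.1, 2.0, 3.0, 4.0, 5.1};  // scenario replayed
    std::cout << "r = " << pearson(real, sim) << "\n";     // close to 1.0
}
```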

“Most simulators are used in R&D but not in verification and validation pipelines because of the missing correlation piece for end-to-end simulation including the sensors, and this allows aiSim to be the first ISO 26262-qualified simulator,” said Jánky.

The simulator is also designed to be used with hardware-in-the-loop (HIL) verification systems, and AImotive works with NI on its VeriStand systems.

aiSim 3.0 is optimised to fit into existing toolchains for HIL, and the open APIs allow it to connect to a test framework for a custom simulator. “We have all the tools for automated testing and we want to bring those features to customers’ local machines, such as running results locally. For example, when you are doing sensor simulation you can just connect to the large-scale system and stream the results back to a local machine,” he said.
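
A rough sketch of that remote-render, local-results pattern, assuming a hypothetical TCP endpoint and a length-prefixed framing that do not reflect aiSim's real transport:

```cpp
// Hypothetical sketch: a local client pulling rendered sensor frames from a
// remote simulation cluster over TCP (POSIX sockets, Linux). The address,
// port and framing are invented for illustration.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(9000);                     // assumed port
    inet_pton(AF_INET, "10.0.0.1", &addr.sin_addr);  // assumed cluster address
    if (connect(fd, (sockaddr*)&addr, sizeof addr) != 0) {
        perror("connect");
        return 1;
    }
    // Assumed framing: a 4-byte network-order length prefix, then the payload.
    uint32_t len = 0;
    while (recv(fd, &len, sizeof len, MSG_WAITALL) == (ssize_t)sizeof len) {
        std::vector<uint8_t> frame(ntohl(len));
        if (recv(fd, frame.data(), frame.size(), MSG_WAITALL) <= 0) break;
        std::printf("received frame: %zu bytes\n", frame.size());
    }
    close(fd);
}
```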

“HIL support tests very low-level things and is expensive to set up, but there are new demands for simulating a Level 3 stack with a lot of sensors in real time and for introducing scenario testing to HIL. Previously it was about fault injection, and it wasn’t testing the end-to-end system. The correlation is easiest if you use a consistent HIL environment and sensor environment.

“We built a HIL bench for our own needs and found there were limitations – right now HIL for a big ECU is just testing the sensors, they provide replay tests. We are actually able to drive these setups in a multi-node configuration with sensors on separate computers, and we can orchestrate the high-fidelity models with low latency, so you can test things in HIL that you weren’t able to before,” he said.

“We added a PC with lots of GPUs to the HIL and then we connect to the system with direct ingestion of data from the camera models running on the GPUs. This gives real purpose to the ECU testing,” he said.

“For aiSim 4.0 we have a good base to start with, but I think we need to cover more locations and more sensor models for validation in China or Singapore or Japan, with those specific traffic signs,” he said. “If people are using a mix of lidar and radar we need to have those models, and that will mean partnerships in the future.”

“We are also discussing how to make this available to every player – these tools are here, giving developers and engineers something to play with and accelerating the transition to simulation,” he added.

www.aimotive.com
