dSPACE Sensor Simulation provides users with models for the respective sensor environments via library menus. These models can be used to create virtual 3D worlds into which objects such as road users, traffic signs, or roadside structures can be inserted around the autonomous vehicle. Via the menus, users can also access a database that already contains more than 1,300 objects and 170 predefined materials, and which users can extend themselves. In addition, the software provides sensor models for radar, lidar, and camera systems and offers suitable models for testing perception, fusion, or application logic.
The development of autonomous driving functions is complex, not least because hardware components such as sensors from different manufacturers, the control units, and the driving algorithms must be perfectly coordinated. The new simulation environment therefore also allows the integration of customer-specific sensor front ends. This increases the degree of realism and lets the sensor setup be tailored to the specific application.
Sensor Simulation supports the reuse of models and test scenarios across platforms: tests created and run on a developer's PC can also be executed on a HIL or SIL simulator or in the cloud. This makes scaling easy, allowing developers to perform many tests in a short time.
Sensor simulation requires powerful PC hardware. For maximum performance, dSPACE offers the Sensor Simulation PC, which is equipped with a high-end graphics processing unit (GPU) that executes the complex, highly accurate sensor models.
The development tool vendor plans to demonstrate the potential of the new environment at the dSPACE World Conference in Munich on November 19 and 20.
More information: https://www.dspace.com/en/ltd/home.cfm