
Smart signal data processing slashes response time for cars
When a child runs onto the road, it takes an average of 1.6 seconds for a human driver to hit the brakes. Autonomous vehicles equipped with radar or lidar sensors and a camera system have a reaction time of about 0.5 seconds. At a speed of 50 km/h, however, that still means roughly 7 metres travelled without any braking. Together with various partners from industry (AVL, Jabil Optics, John Deere, InnoSenT, Silicon Radar) and research (DCAITI, Fraunhofer Institute for Open Communication Systems FOKUS), the Fraunhofer Institute for Reliability and Microintegration IZM is now developing a combined camera-radar module that registers changes in road traffic much faster: the module, the size of a mobile phone, will have a reaction time of less than 10 milliseconds. That makes it 50 times faster than conventional sensor systems and 160 times faster than a human driver. The car then moves only about 15 cm before the system reacts and sends the braking signal. This could prevent many accidents in urban traffic.
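The distances quoted above follow directly from speed multiplied by reaction time. A minimal calculation, assuming a constant speed of 50 km/h and the reaction times named in the text:

```python
# Distance travelled before braking begins, for the reaction times quoted above.
# Assumes a constant speed of 50 km/h; labels are illustrative.

SPEED_KMH = 50.0
SPEED_MS = SPEED_KMH / 3.6  # ~13.9 m/s

reaction_times_s = {
    "human driver": 1.6,
    "conventional sensor system": 0.5,
    "camera-radar module (target)": 0.010,
}

for label, t in reaction_times_s.items():
    distance_m = SPEED_MS * t  # metres covered while still unbraked
    print(f"{label:30s} {t * 1000:6.0f} ms -> {distance_m:5.2f} m unbraked")

# human driver                     1600 ms -> 22.22 m unbraked
# conventional sensor system        500 ms ->  6.94 m unbraked  (~7 m)
# camera-radar module (target)       10 ms ->  0.14 m unbraked  (~15 cm)
```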
The trick behind this is an innovative signal processing architecture: the data from the radar system and the stereo camera are processed and filtered directly in or on the module. Non-relevant information is recognized but not passed on to a downstream processing unit. Sensor fusion combines the data from the camera and the radar, and machine learning algorithms based on neural networks evaluate the fused data to assess the traffic situation.
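The following sketch illustrates the described flow in simplified form: pre-filter raw radar targets on the module, fuse them with camera object classes, then decide which detections matter. All names, thresholds and data structures are illustrative assumptions, not the project's actual interfaces.

```python
# Illustrative on-module pipeline: filter -> fuse -> evaluate. Hypothetical code only.
from dataclasses import dataclass

@dataclass
class FusedDetection:
    distance_m: float        # range from radar
    closing_speed_ms: float  # relative speed from radar
    object_class: str        # object class from the stereo camera

def filter_radar(targets, max_distance_m=50.0, min_speed_ms=0.3):
    """Pre-filtering on the module: drop distant or static targets (non-relevant data)."""
    return [t for t in targets
            if t["distance_m"] <= max_distance_m and abs(t["speed_ms"]) >= min_speed_ms]

def fuse(radar_targets, camera_labels):
    """Sensor fusion: pair each remaining radar target with a camera object class."""
    return [FusedDetection(t["distance_m"], t["speed_ms"], c)
            for t, c in zip(radar_targets, camera_labels)]

def is_critical(det):
    """Stand-in for the neural-network evaluation of the fused data."""
    return det.object_class == "pedestrian" and det.distance_m < 20.0

# Example: only the critical detection is acted on; everything else never leaves the module.
radar = [{"distance_m": 12.0, "speed_ms": -1.5}, {"distance_m": 80.0, "speed_ms": 0.0}]
labels = ["pedestrian"]
critical = [d for d in fuse(filter_radar(radar), labels) if is_critical(d)]
print(critical)
```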
As a result, the system does not transmit status information to the downstream processing unit (for instance, an autonomous driving computer or a driver assistance system), but only action instructions. This keeps the vehicle’s data bus free for important and urgent signals, such as a child suddenly running onto the road. “The integrated signal processing shortens the reaction time enormously,” says Christian Tschoban, group leader in the RF & Smart Sensor Systems department at Fraunhofer IZM, who is working with his team on the KameRad project. The functional demonstrator he developed looks like a grey box with one eye on the right and one on the left – the stereo cameras.
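To make the interface idea concrete, a module following this approach would place a compact action instruction on the bus instead of streaming raw sensor data. The message format and field names below are assumptions for illustration only, not the project's actual protocol:

```python
# Hypothetical action instruction: a few dozen bytes on the vehicle data bus
# instead of continuous camera frames and radar status data.
import json
import time

def emergency_brake_command(deceleration_ms2: float) -> str:
    """Build a small, self-contained action instruction for the downstream unit."""
    return json.dumps({
        "type": "action",
        "command": "brake",
        "deceleration_ms2": deceleration_ms2,
        "timestamp": time.time(),
    })

print(emergency_brake_command(deceleration_ms2=8.0))
```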
The project runs until 2020. Until then, the project partners AVL List GmbH and DCAITI will be testing the first prototype, among other places in Berlin’s city traffic.
More information: https://www.fraunhofer.de/en/press/research-news/2019/june/radar-sensor-module-to-bring-added-safety-to-autonomous-driving.html