
April 11, 2016 // By Hannes Estl
Sensor fusion: A critical step on the road to autonomous vehicles
Many cars on the road today, and even more new cars in showrooms, have some form of advanced driver assistance system (ADAS) based on sensors such as cameras, radar, ultrasound, or LIDAR. However, it is not just the number or type of sensors that matters, but how you use them.

Sensor fusion system examples
Sensor fusion can happen at different levels of complexity and with different types of data. Two basic examples are: a) a rear-view camera plus ultrasonic distance measuring; and b) a front camera plus multimode front radar – see figure 2. Both can be implemented today with minor changes to existing systems and/or by adding a separate sensor fusion control unit.

Figure 2: Fusing front radar with front camera for adaptive cruise control plus lane-keep assist or rear-view camera with ultrasonic distance warning for self-parking.
  • Rear view camera + ultrasonic distance measuring

Ultrasonic park assist, which has reached wide acceptance and maturity in the automotive market, gives an acoustic or visual warning of nearby objects while parking. As mentioned earlier, rear-view cameras will be legally required in all new cars in the U.S. by 2018. Combining information from both sensors enables advanced park-assist features that are not possible with either system alone. The camera gives the driver a clear view of what is behind the car, and machine vision algorithms can detect objects as well as the curb and markings on the street. Supplemented with the ultrasonic sensors, the distance to the identified objects can be determined accurately, and basic proximity warning works in low light or even full darkness.
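As a rough illustration of this pairing, camera detections (which carry a label but no reliable range) could be matched to ultrasonic range readings by angle. The data structures, field names, and the angular-matching rule below are simplified assumptions for illustration, not any production interface:

```python
from dataclasses import dataclass

@dataclass
class CameraObject:
    label: str           # e.g. "pedestrian", "curb" (from machine vision)
    bearing_deg: float   # horizontal angle from the camera centerline

@dataclass
class UltrasonicReading:
    bearing_deg: float   # mounting angle of the ultrasonic sensor
    distance_m: float    # measured range

def fuse(camera_objects, ultrasonic_readings, max_angle_gap_deg=15.0):
    """Attach the nearest ultrasonic range to each camera detection
    whose bearing falls within the angular gap; None if no sensor matches."""
    fused = []
    for obj in camera_objects:
        candidates = [r for r in ultrasonic_readings
                      if abs(r.bearing_deg - obj.bearing_deg) <= max_angle_gap_deg]
        distance = min((r.distance_m for r in candidates), default=None)
        fused.append((obj.label, distance))
    return fused
```

For example, `fuse([CameraObject("pedestrian", 10.0)], [UltrasonicReading(5.0, 1.2)])` pairs the detected pedestrian with the 1.2 m range reading, giving the proximity warning an identity the ultrasonic system alone cannot provide.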

  • Front camera + multimode front radar

Another powerful combination pairs a front camera with front radar. Front radar can measure the speed and distance of objects up to 150 meters away in all weather conditions. The camera excels at detecting and differentiating objects, including reading street signs and street markings. By using multiple camera sensors with different fields of view (FoV) and different optics, both pedestrians and bikes passing directly in front of the car and objects 150 meters or more ahead can be identified. Features such as automated emergency braking and city stop-and-go cruise control can then be implemented reliably.
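The same idea can be sketched for the camera-plus-radar case: the radar supplies range and closing speed, the camera supplies the object class, and the two are associated by azimuth. The classes, field names, and the angular gate are illustrative assumptions, not a real ADAS API:

```python
from dataclasses import dataclass

@dataclass
class RadarTrack:
    azimuth_deg: float       # angle of the track relative to the vehicle axis
    range_m: float           # distance measured by radar
    closing_speed_mps: float # relative speed (negative = approaching)

@dataclass
class CameraDetection:
    azimuth_deg: float
    label: str               # "car", "pedestrian", "sign", ...

def associate(radar_tracks, camera_detections, gate_deg=3.0):
    """Label each radar track with the class of the camera detection
    closest in azimuth, if one falls inside the angular gate."""
    fused = []
    for track in radar_tracks:
        best = min(camera_detections,
                   key=lambda d: abs(d.azimuth_deg - track.azimuth_deg),
                   default=None)
        label = None
        if best is not None and abs(best.azimuth_deg - track.azimuth_deg) <= gate_deg:
            label = best.label
        fused.append((label, track.range_m, track.closing_speed_mps))
    return fused
```

A fused track that is both classified (camera) and accurately ranged (radar) is what lets a feature like automated emergency braking decide whether an object ahead actually warrants a brake intervention.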

Performing ADAS functions under certain well-defined conditions can in many cases be achieved with a single sensor type or individual systems. However, this can be insufficient for reliable operation given the unpredictable conditions found on our streets. Sensor fusion, in addition to enabling more complex and autonomous features, can achieve fewer false positives and false negatives in existing features. This will be critical for convincing customers and lawmakers to trust "a machine" to drive a car autonomously.
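The false-positive benefit can be made concrete with a back-of-the-envelope calculation: if a warning fires only when two sensors agree, and their errors are assumed independent (a simplifying assumption that real correlated failure modes can violate), the combined false-positive rate is the product of the individual rates:

```python
def fused_false_positive_rate(p_camera, p_radar):
    """Probability that two independent sensors BOTH raise a false alarm,
    i.e. the false-positive rate when fusion requires agreement."""
    return p_camera * p_radar

# Illustrative numbers (not measured figures): a 1% camera rate and a
# 2% radar rate combine to 0.02% when both sensors must agree.
```

The trade-off is symmetric: requiring agreement lowers false positives but can raise false negatives, which is why practical fusion logic weighs sensor confidence rather than demanding strict unanimity.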
