
Researchers combine lidar and radar sensors in the headlamp

Technology News |
By Christoph Hammerschmidt


Autonomously driving cars rely on a large number of sensors for orientation. But these sensors require space, which often conflicts with the wishes of the designers. Researchers at the Fraunhofer-Gesellschaft have now found a way to integrate some of the sensors unobtrusively: they are building them into the front headlights – combining optical light, radar and lidar.

Five Fraunhofer institutes, including the Institute for High Frequency Physics and Radar Techniques FHR (Wachtberg, Germany), have joined forces in the “Smart Headlight” project to install the sensors in a space-saving way and as unobtrusively as possible – without compromising function and performance. The aim of the project is to develop a sensor-integrated headlight for driver assistance systems in which different sensory elements are combined with adaptive lighting systems. In this way, objects on the road – especially other road users such as pedestrians – are to be recognised even more reliably. The lidar sensor, for example, is used in electronic brake assistants and in distance-control systems.

“We integrate radar and lidar sensors into the headlights, which are there anyway and which guarantee optimum transmission for optical sensors and light sources as well as freedom from soiling,” says Tim Freialdenhoven, a scientist at Fraunhofer FHR.

First of all, the lidar system has to be designed for integration into automotive headlights. In addition, the light that falls from the headlights onto the road must not be affected by the two additional sensors. Nevertheless, the beams of both sensor systems should follow the identical path as the LED light. This is complicated by the fact that all the beams have different wavelengths: visible headlight light lies in the range of 400 to 750 nanometres, while the infrared lidar beams, at 860 to 1550 nanometres, sit just above the visible range. Radar beams, on the other hand, have a wavelength of about four millimetres. “These three wavelengths are to be coaxially combined, so we speak of a multispectral combiner,” Freialdenhoven emphasises. Coaxial beam guidance matters because it avoids a parallax error that would otherwise have to be computed out laboriously. In addition, arranging the sensors side by side would take up considerably more space than the coaxial arrangement.

The researchers solve this problem using so-called bi-combiners: a specially coated dichroic mirror combines the LED light and the lidar light, bringing both beams onto one axis via wavelength-specific reflection. The same happens at the second combiner, where LED light, lidar light and radar are merged – a less demanding step because the wavelengths differ so widely. Since radar sensors are already widespread in the automotive sector, the bi-combiner is to be designed so that manufacturers can continue to use existing sensors without adaptation.
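To illustrate the wavelength-selective combining described above, here is a minimal sketch – not project code; the 800 nm cut-off is a hypothetical assumption – of an idealised dichroic mirror that transmits the visible LED band and reflects the infrared lidar band onto the same optical axis:

```python
# Illustrative sketch (not project code): a dichroic bi-combiner modelled as a
# wavelength-dependent switch. The cut-off wavelength is a hypothetical value.

VISIBLE_NM = (400, 750)    # visible LED headlight band (from the article)
LIDAR_NM = (860, 1550)     # infrared lidar band (from the article)

def dichroic_combiner(wavelength_nm: float, cutoff_nm: float = 800.0) -> str:
    """Return what an idealised dichroic mirror does with a given wavelength.

    Below the cut-off the coating transmits (visible LED light passes through);
    above it, it reflects (lidar light is folded onto the same optical axis).
    """
    return "transmit" if wavelength_nm < cutoff_nm else "reflect"

# Both beams end up on one axis: LED light passes, lidar light is reflected in.
print(dichroic_combiner(550))   # green LED light -> "transmit"
print(dichroic_combiner(905))   # common lidar wavelength -> "reflect"
```

The second combiner works the same way in principle, only with a far larger spectral separation between the optical beams and the millimetre-wave radar.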

But why combine optical systems, lidar and radar at all? “Each individual system has its strengths, but also its weaknesses,” Freialdenhoven explains. Optical systems, for example, reach their limits in poor visibility conditions – fog, precipitation and dust. Radar systems, on the other hand, see almost unhindered through dense fog, but their classification capability is limited: radar can tell whether an object is a person or a tree, yet it comes nowhere near the classification capability of lidar. “We are working on fusing the data from radar and lidar – which offers extreme added value, especially in terms of reliability,” says Freialdenhoven. A patent has already been filed, and the team is currently building a prototype.
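The complementary strengths Freialdenhoven describes can be sketched with a toy fusion rule – a naive weighted average, not the patented method; the visibility-dependent weighting is an assumption made purely for illustration:

```python
# Illustrative sketch (not the patented method): naive confidence-weighted
# fusion of a lidar and a radar range estimate for the same detected object.
# The visibility-dependent weights are hypothetical assumptions.

def fuse_range(lidar_m: float, radar_m: float, visibility: float) -> float:
    """Weight lidar higher in clear conditions, radar higher in fog.

    visibility: 1.0 = clear air, 0.0 = dense fog. Lidar degrades in fog,
    while radar sees 'almost unhindered' through it, so the weight shifts.
    """
    w_lidar = visibility   # lidar trusted only when the air is clear
    w_radar = 1.0          # radar assumed usable in all conditions
    return (w_lidar * lidar_m + w_radar * radar_m) / (w_lidar + w_radar)

print(fuse_range(50.0, 52.0, visibility=1.0))  # clear air: 51.0 (equal trust)
print(fuse_range(50.0, 52.0, visibility=0.0))  # dense fog: 52.0 (radar only)
```

A real system would fuse full detection lists with classification labels, not single range values, but the principle – shifting trust toward the sensor that still sees – is the same.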

The technology significantly expands the scope for sensor integration in driver assistance systems. Smaller light modules, more compact lidar sensors and integrated radar sensors make multisensor concepts feasible even under the growing design requirements and limited installation space of autonomous driving. In this way, autonomous systems will in future not only recognise a human being, but also analyse their speed, their distance and their angle relative to the car.

https://www.fhr.fraunhofer.de/en.html  

Related articles:

Radar sensors integrated in car headlight

Headlamp glass cover doubles as radar antenna

Koito, Blickfeld plan to integrate lidar into headlamp

200,000 micro lenses to dynamically configure headlights

Lidar sensor combines compact design with high resolution

Radar, the car’s virtual eye

Lidar perception challenges

