For advanced driver-assistance systems (ADAS) and autonomous-vehicle (AV) platforms to become the future of driving, they must be prepared for every driving condition. The road is full of complex, unpredictable situations, so vehicles need effective, intelligent sensor systems that are not only affordable for mass production but also capable of collecting and interpreting as much information as possible, ensuring that the artificial intelligence controlling the vehicle makes the right decision every time.
However, the sensor technology currently deployed on test roads around the world doesn't fully meet the requirements for SAE automation level 3 and above. Truly safe ADAS and AV operation requires sensors that deliver scene data rich enough for detection and classification algorithms to navigate autonomously under all conditions at SAE automation level 5 (Fig. 1). This is a challenging requirement for engineers and developers to address.
Visible-light cameras, ultrasonic (sonar) sensors, and radar are already in use on production vehicles today at SAE automation level 2, and level 3 and 4 test platforms have added light detection and ranging (LiDAR) to their sensor suites, but that's not enough. These technologies cannot detect all important roadside data in all conditions, nor can they provide the data redundancy required to ensure total safety.
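The cross-sensor redundancy mentioned above can be illustrated with a minimal sketch: an object report is trusted only when independent sensor modalities agree on it. The class names, grid-cell representation, and two-modality threshold below are illustrative assumptions, not any particular vendor's fusion algorithm.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Detection:
    sensor: str            # assumed modality name, e.g. "camera", "radar", "lidar"
    label: str             # classified object type
    cell: tuple            # coarse spatial grid cell (x, y) where the object was seen

def confirmed_objects(detections, min_modalities=2):
    """Keep only objects reported by at least `min_modalities` distinct sensors.

    This captures the essence of sensor redundancy: no single sensor's
    report is acted on alone, so one failing or blinded sensor cannot
    by itself create (or suppress) a confirmed object.
    """
    seen = {}  # (label, cell) -> set of sensor modalities that saw it
    for d in detections:
        seen.setdefault((d.label, d.cell), set()).add(d.sensor)
    return {key for key, sensors in seen.items()
            if len(sensors) >= min_modalities}

# A pedestrian seen by both camera and LiDAR is confirmed;
# a car seen only by radar is not.
reports = [
    Detection("camera", "pedestrian", (3, 1)),
    Detection("lidar",  "pedestrian", (3, 1)),
    Detection("radar",  "car",        (5, 2)),
]
print(confirmed_objects(reports))
```

Real fusion stacks operate on calibrated coordinate frames, track objects over time, and weight sensors by condition (e.g. discounting cameras in fog); this sketch shows only the agreement-voting idea.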