The iDAR combines what the company says is the world's first agile MOEMS LiDAR with a low-light camera and embedded artificial intelligence, pre-fused into a single sensor. AEye's founder and CEO Luis Dussan is keen to highlight the fast, dynamic perception and path planning that the iDAR platform enables through intelligent, agile data collection. Interviewed by eeNews Europe, Dussan wouldn't reveal much about how the two sensors, camera and LiDAR, are stacked together.
"The CMOS camera and the MOEMS LiDAR are mechanically aligned, so they see the same thing. How we manage the light path is our own secret recipe, and that's why we have about 17 patents around the system," the CEO said. "The camera overlays its 2D colour images onto our 3D LiDAR scans without any registration post-processing required to correlate the two data sets; it is done mechanically."
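AEye does not disclose the light path, but the effect of that mechanical alignment can be pictured as each LiDAR return arriving already paired with the camera pixel that shares its line of sight. The sketch below illustrates the idea only; the names and data layout are hypothetical, not AEye's API:

```python
from dataclasses import dataclass

@dataclass
class FusedReturn:
    """A single pre-fused sample: 3D position plus co-located colour.

    Because the camera and the LiDAR are boresight-aligned, the colour
    comes straight from the pixel at the same scan index -- no
    projection or registration step is needed.
    """
    x: float  # metres, sensor frame
    y: float
    z: float
    r: int    # 8-bit colour from the co-located camera pixel
    g: int
    b: int

def fuse(scan_xyz, image_rgb):
    """Pair each LiDAR return with the camera pixel at the same index.

    scan_xyz:  list of (x, y, z) returns, one per scan position
    image_rgb: list of (r, g, b) pixels in the same scan order
    """
    return [FusedReturn(*p, *c) for p, c in zip(scan_xyz, image_rgb)]

points = fuse([(1.0, 0.2, 0.0), (4.5, -0.1, 0.3)],
              [(255, 0, 0), (40, 40, 40)])
print(points[0].r)  # the colour travels with the voxel: prints 255
```

In a real system the two streams would of course be full 2D arrays; the point is that index correspondence replaces the usual extrinsic-calibration and reprojection pipeline.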
The LiDAR part is based on a proprietary low-cost, solid-state beam-steering MOEMS operating in the 1550nm telecom waveband, which, according to Dussan, is the only sensible choice for the mass adoption of LiDARs.
"This waveband is retina-safe: not only is it invisible, it does not focus on the back of your retina; instead it is absorbed by the liquid between the cornea and the retina. It is the right choice if you are going to put millions of those devices on the streets, so that there is no chance of harming unsuspecting pedestrians. In contrast, 900nm is the most dangerous wavelength to stare at," the CEO noted.
Embedded AI algorithms then leverage computer vision on the 2D images to guide the LiDAR's scanning patterns in real time, keeping particular areas of interest within the perceived environment under scrutiny. This on-the-fly prioritization, together with the capability to analyze co-located pixels (2D) and voxels (3D) within the same frame, allows the system to target and identify objects in a scene 10 to 20 times faster than LiDAR-only products, AEye claims.
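The prioritization step can be pictured as 2D detections voting extra scan density into their regions, so that the next LiDAR frame spends more of its shot budget there. The toy sketch below conveys the idea only; the grid size, weights and detection format are illustrative assumptions, not AEye's implementation:

```python
def build_priority_map(width, height, detections, base=1.0, boost=8.0):
    """Return a per-cell scan-priority grid for the next LiDAR frame.

    Every cell gets a baseline weight; cells covered by a 2D
    computer-vision detection (x0, y0, x1, y1 in grid coordinates) are
    boosted so the agile scanner revisits them more often.
    """
    grid = [[base] * width for _ in range(height)]
    for (x0, y0, x1, y1) in detections:
        for y in range(max(0, y0), min(height, y1)):
            for x in range(max(0, x0), min(width, x1)):
                grid[y][x] += boost
    return grid

def allocate_shots(grid, budget):
    """Split a fixed per-frame shot budget across cells, in proportion
    to each cell's priority weight."""
    total = sum(sum(row) for row in grid)
    return [[round(budget * cell / total) for cell in row] for row in grid]

# A pedestrian detected in the right half of a 4x2 field of view:
grid = build_priority_map(4, 2, [(2, 0, 4, 2)])
shots = allocate_shots(grid, 1000)
print(shots[0])  # [25, 25, 225, 225] -- 9x the density on the detection
```

A fixed raster scanner cannot reshape its pattern this way; the claimed advantage of an agile MOEMS scanner is precisely that the shot distribution can change frame to frame.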
In essence, the 2D colour image overlay allows AEye to add computer-vision intelligence to the 3D point clouds, giving the iDAR sufficiently detailed information to interpret signage, emergency warning lights, brake versus reverse lights, and other scenarios that have historically been tricky for legacy LiDAR-based systems to navigate.