Dynamic Vixels are 3D volume pixels (voxels) from ranging lidar combined with pixels from 2D digital cameras to create a dynamically scalable sensor data type.
The data type is designed to be a more useful receptacle for data from AEye's iDAR (Intelligent Detection and Ranging) perception system. In so doing, it strengthens a biomimetic approach to visual perception, essentially enabling vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions, the company said.
Dynamic Vixels create content that inherits the ability to evaluate a scene using the entire existing library of 2D computer vision algorithms, while also capturing 3D and 4D data covering not only location and intensity but deeper insights such as the velocity of objects.
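As a rough illustration of what such a fused record might hold, the sketch below pairs a camera pixel's colour with a lidar voxel's position, return intensity, and a per-point velocity estimate. The field names and layout are hypothetical assumptions for illustration, not AEye's actual data format.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DynamicVixel:
    """Hypothetical fused sample: one camera pixel registered to one lidar voxel."""
    rgb: Tuple[int, int, int]             # 2D camera colour, usable by standard CV algorithms
    xyz: Tuple[float, float, float]       # 3D position of the lidar return (metres)
    intensity: float                      # lidar return intensity
    velocity: Tuple[float, float, float]  # per-point velocity estimate, the "4D" component (m/s)
    timestamp: float                      # capture time (seconds)

def speed(v: DynamicVixel) -> float:
    """Scalar speed derived from the vixel's velocity vector."""
    vx, vy, vz = v.velocity
    return (vx ** 2 + vy ** 2 + vz ** 2) ** 0.5

# Example: a point on an object moving at 3 m/s along the x axis
sample = DynamicVixel(rgb=(30, 30, 30), xyz=(12.0, -1.5, 0.4),
                      intensity=0.82, velocity=(3.0, 0.0, 0.0), timestamp=0.05)
print(speed(sample))  # 3.0
```

Because colour and range travel in one record, a 2D algorithm can read `rgb` while a 3D tracker reads `xyz` and `velocity` from the same sample, with no post-hoc registration step.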
AEye's iDAR uses Dynamic Vixels to maintain situational awareness while tracking objects of interest.
"We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing. The difference is significantly better performance," said Luis Dussan, founder and CEO of AEye, in a statement. "One nice consequence that comes out of the architecture is we give our customers the ability to add the equivalent of 'human reflexes' to their sensor stack," Dussan added.
Dynamic Vixels can also be encrypted. The technology enables each sensor pulse to deal appropriately with challenges such as interference, spoofing, and jamming, issues that will become increasingly important as millions of units are deployed worldwide.
As a result of early fusion of background and moving data, iDAR can more easily identify objects with minimal structure, such as bicycles, and objects of the same colour, such as a black tyre on black asphalt.
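The black-tyre-on-black-asphalt case can be sketched in a few lines. In this hypothetical example (the values are invented for illustration), two samples share the same colour, so a colour-only test cannot separate them, but the fused lidar range exposes a clear depth discontinuity:

```python
# Two hypothetical fused samples with identical colour but different range.
tyre    = {"rgb": (10, 10, 10), "range_m": 14.2}  # black tyre
asphalt = {"rgb": (10, 10, 10), "range_m": 19.8}  # black asphalt behind it

same_colour = tyre["rgb"] == asphalt["rgb"]
depth_gap_m = abs(tyre["range_m"] - asphalt["range_m"])

print(same_colour)        # True: indistinguishable by camera colour alone
print(depth_gap_m > 1.0)  # True: the lidar range separates the two surfaces
```

Fusing the two modalities at the sensor means this depth cue is available per sample, rather than being recovered later by aligning separate camera and lidar streams.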
AEye's first iDAR-based product, the AE100 artificial perception system, will be available this