Next-gen artificial perception better mimics human vision
Called “Dynamic Vixels,” the iDAR enhancement combines pixels from digital 2D cameras with voxels from AEye’s Agile 3D LiDAR (Light Detection and Ranging) sensor into a single super-resolution sensor data type. This advancement, says the company, enables vehicles to see and perceive more like humans to better evaluate potential driving hazards and adapt to changing conditions.
The company claims this is the first time that data captured in pixels and voxels has been integrated in real time into a single data type that can be dynamically controlled and optimized by artificial perception systems at the point of data acquisition. According to the company, Dynamic Vixels inherit both the ability to evaluate a scene using the entire existing library of 2D computer vision algorithms and the ability to capture 3D and 4D data covering not only location and intensity but also deeper insights such as the velocity of objects.
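AEye has not published the Dynamic Vixel format, but the idea of pairing a 2D camera sample with a 3D/4D LiDAR sample into one record at acquisition time can be sketched roughly as follows. All names, fields, and the trivial pairing logic here are hypothetical illustrations, not AEye's actual data structure:

```python
from dataclasses import dataclass

@dataclass
class Pixel:
    """A 2D camera sample (hypothetical layout)."""
    u: int            # image column
    v: int            # image row
    rgb: tuple        # (r, g, b) color values

@dataclass
class Voxel:
    """A 3D LiDAR sample (hypothetical layout)."""
    x: float          # position in meters
    y: float
    z: float
    intensity: float  # return intensity
    velocity: float   # radial velocity in m/s (the "4D" component)

@dataclass
class Vixel:
    """A fused camera + LiDAR sample, combined at acquisition time."""
    pixel: Pixel
    voxel: Voxel

def fuse(pixel: Pixel, voxel: Voxel) -> Vixel:
    # A real system would associate the pixel and voxel through the
    # sensors' shared calibration; here they are simply paired.
    return Vixel(pixel=pixel, voxel=voxel)

sample = fuse(Pixel(320, 240, (128, 64, 32)),
              Voxel(12.0, 1.5, 0.4, intensity=0.7, velocity=-3.2))
print(sample.voxel.velocity)  # prints -3.2
```

The point of the single record is that 2D vision algorithms can read the pixel side while 3D/4D logic reads the voxel side of the same object, with no post-hoc registration step.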
“There is an ongoing argument about whether camera-based vision systems or LiDAR-based sensor systems are better,” says Luis Dussan, Founder and CEO of AEye. “Our answer is that both are required – they complement each other and provide a more complete sensor array for artificial perception systems.”
“We know from experience that when you fuse a camera and LiDAR mechanically at the sensor, the integration delivers data faster, more efficiently and more accurately than trying to register and align pixels and voxels in post-processing,” says Dussan. “The difference is significantly better performance.”
The company’s iDAR perception system mimics how a human’s visual cortex evaluates a scene and calculates potential driving hazards. Using embedded artificial intelligence within a distributed architecture, iDAR employs Dynamic Vixels to actively assess general surroundings and maintain situational awareness while simultaneously tracking targets and objects of interest – enabling iDAR to act reflexively and deliver more accurate, longer-range and more intelligent information faster.
Dynamic Vixels can also be encrypted. The patented technology enables each sensor pulse to be protected against challenging issues such as interference, spoofing, and jamming.
This new way of collecting and inspecting data using the iDAR system’s at-the-edge processing, says the company, enables an autonomous vehicle to more intelligently assess and respond to situational changes within a frame, thereby increasing the safety and efficiency of the overall system. For example, iDAR can identify objects with minimal structure – such as a bike – and differentiate objects of the same color, such as a black tire on asphalt.
In addition, Dynamic Vixels can leverage the capabilities of agile LiDAR to detect changing weather and automatically increase power during fog, rain, or snow. Likewise, says the company, iDAR’s heightened sensory perception allows autonomous vehicles to detect contextual cues – such as a child’s facial direction – and use them to calculate the probability of the child stepping into the street, enabling the car to prepare for a sudden stop.
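The weather-adaptive power behavior described above can be sketched as a simple lookup from a detected condition to a transmit-power setting. The condition names, multiplier values, and function are illustrative assumptions only; AEye has not published how its agile LiDAR tunes power:

```python
# Hypothetical power multipliers, normalized to clear-weather baseline.
# Scattering media (fog, rain, snow) attenuate LiDAR returns, so the
# sketch boosts transmit power under those conditions.
WEATHER_POWER = {
    "clear": 1.0,
    "fog":   1.8,
    "rain":  1.6,
    "snow":  1.7,
}

def lidar_power_for(weather: str) -> float:
    """Return a transmit-power multiplier for the detected weather,
    falling back to the clear-weather baseline for unknown conditions."""
    return WEATHER_POWER.get(weather, 1.0)

print(lidar_power_for("fog"))  # prints 1.8
```

In practice such a decision would be closed-loop (driven by measured return quality rather than a fixed table), but the table makes the adaptive principle concrete.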
“There are three best practices we have adopted at AEye,” says Blair LaCorte, Chief of Staff. “First: never miss anything; second: not all objects are equal; and third: speed matters. Dynamic Vixels enables iDAR to acquire a target faster, assess a target more accurately and completely, and track a target more efficiently – at ranges of greater than 230 meters with 10% reflectivity.”
The company’s iDAR perception system includes 71 intellectual property claims on the definition, data structure, and evaluation methods of Dynamic Vixels. These patented inventions, says the company, contribute to significant performance benefits, including 16x greater coverage, a 10x faster frame rate, and 7-10x more relevant information that boosts object classification accuracy while using 8-10x less power.
AEye’s first iDAR-based product, the AE100 artificial perception system, announced earlier this year, will be available this summer to OEMs and Tier 1 companies launching autonomous vehicle initiatives.
Related articles:
Agile sensor technology may surpass lidar
Intel among robotic vision startup investors
LiDAR market set for 43% CAGR