
Stacked event-based vision sensor boasts highest HDR

Technology News |
By eeNews Europe



Announced at the International Solid-State Circuits Conference (ISSCC), the 1280×720 HD stacked event-based vision sensor detects changes in the luminance of each pixel asynchronously and outputs data, including coordinates and time, only for the pixels where a change is detected. This approach enables the vision sensor to achieve high resolution, high speed, and a high time resolution of 1μs despite its small size and low power consumption. There is no "typical" power consumption figure for such sensors, but each pixel draws 35nW and event detection for a given pixel draws only 137pJ. The companies say the sensor consumes 32mW when events are detected at a rate of 100,000 events/s and 73mW at 300,000 events/s.
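The event-based output model described above can be illustrated with a small sketch. The threshold value, data structure, and function names below are illustrative assumptions, not the sensor's actual pipeline; the point is simply that only changed pixels produce output, each tagged with coordinates and a microsecond timestamp.

```python
from dataclasses import dataclass

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    polarity: int   # +1 for a luminance increase, -1 for a decrease
    t_us: int       # timestamp with microsecond resolution

def detect_events(prev, curr, t_us, threshold=0.15):
    """Compare two luminance snapshots and emit events only for
    pixels whose relative change exceeds the threshold (hypothetical
    model of change detection, not the sensor's analog circuit)."""
    events = []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (lp, lc) in enumerate(zip(row_prev, row_curr)):
            change = (lc - lp) / max(lp, 1e-6)
            if abs(change) > threshold:
                events.append(Event(x, y, 1 if change > 0 else -1, t_us))
    return events

prev = [[100, 100], [100, 100]]
curr = [[100, 130], [100, 100]]   # one pixel brightened by 30 percent
print(detect_events(prev, curr, t_us=42))
# → [Event(x=1, y=0, polarity=1, t_us=42)]
```

Because unchanged pixels produce no data at all, the output volume scales with scene activity rather than with resolution times frame rate.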

This accomplishment was made possible by combining the technical features of Sony's stacked CMOS image sensor, whose Cu-Cu connection yields a small pixel size and excellent low-light performance, with Prophesee's Metavision event-based vision sensing technologies. The newly developed sensor is suitable for various machine vision applications, such as detecting fast-moving objects in a wide range of environments and conditions.

The back-illuminated CMOS image sensor section (top pixel chip) and the logic chip (bottom) incorporate signal processing circuits that detect changes in luminance based on an asynchronous delta modulation method. The pixels of the two chips are arrayed separately and electrically connected via a Cu-Cu connection, which provides electrical continuity through connected copper pads when the chips are stacked.
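The asynchronous delta modulation idea can be sketched as a simple per-pixel state machine: an event fires whenever the (log-domain) luminance drifts more than a contrast threshold from the last reference level, which is then re-armed. The class name and threshold below are assumptions for illustration; the real circuit does this continuously in analog hardware.

```python
import math

class DeltaPixel:
    """Simplified per-pixel asynchronous delta modulator:
    emits +1 (ON) or -1 (OFF) when log-luminance moves more
    than `theta` away from the stored reference level."""
    def __init__(self, lum0, theta=0.2):
        self.ref = math.log(lum0)   # reference log-luminance
        self.theta = theta          # contrast threshold

    def update(self, lum):
        delta = math.log(lum) - self.ref
        if delta >= self.theta:
            self.ref += self.theta  # re-arm at the new level
            return +1               # ON event
        if delta <= -self.theta:
            self.ref -= self.theta
            return -1               # OFF event
        return 0                    # sub-threshold: no output at all

px = DeltaPixel(lum0=100.0)
print([px.update(l) for l in (105, 115, 130, 130, 100)])
# → [0, 0, 1, 0, -1]
```

Working in the log domain is what makes the response proportional to relative contrast rather than absolute luminance, which is one reason such sensors tolerate a very wide dynamic range.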


The high HDR performance is made possible by placing only the back-illuminated pixels and part of the N-type MOS transistors on the pixel chip (top), raising the aperture ratio to 77%. High-sensitivity, low-noise technologies Sony has developed over many years of CMOS image sensor work enable event detection in low-light conditions down to 40mlx.

While a frame-based sensor outputs entire images at fixed intervals according to the frame rate, an event-based sensor selects pixel data asynchronously using a row selection arbiter circuit. By adding time information at 1μs precision to the address of each pixel where a change in luminance has occurred, event data readout is ensured with high time resolution. The companies say the chip supports an output rate of up to 1.066 Gevents/s by efficiently compressing the event data, i.e. the luminance change polarity, time, and x/y coordinates of each event.
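One common way to compress such event streams is to share fields between events that occur together, for example sending the timestamp and row address once per group so each individual event costs only its column and polarity. The packet layout below is a hypothetical illustration of that idea, not Sony's or Prophesee's actual wire format.

```python
from collections import defaultdict

def encode_events(events):
    """Group (x, y, polarity, t_us) tuples by shared (timestamp, row),
    so time and y are emitted once per group and each event inside a
    group carries only its column and polarity.
    Hypothetical packet layout for illustration only."""
    groups = defaultdict(list)
    for x, y, pol, t in events:
        groups[(t, y)].append((x, pol))
    packets = []
    for (t, y), cols in sorted(groups.items()):
        packets.append({"t_us": t, "y": y, "events": sorted(cols)})
    return packets

raw = [(3, 7, 1, 1000), (5, 7, -1, 1000), (3, 8, 1, 1000)]
print(encode_events(raw))
# → [{'t_us': 1000, 'y': 7, 'events': [(3, 1), (5, -1)]},
#    {'t_us': 1000, 'y': 8, 'events': [(3, 1)]}]
```

The payoff grows with event density: during bursts of activity many events share a row and a timestamp, so the per-event cost approaches just the column address plus one polarity bit.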

Prophesee – www.prophesee.ai

Related articles:

2D materials promise ultra-efficient neuromorphic computing

Sony acquires Swiss vision sensor firm

Prophesee: Metavision for machines
