
Silicon image sensor also computes
Researchers at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) say they have developed the first in-sensor processor that could be integrated into commercial silicon imaging sensor chips. Such chips – known as complementary metal-oxide-semiconductor (CMOS) image sensors – are used in nearly all commercial devices that need to capture visual information, including smartphones.
The ability to use in-sensor image processing, in which important features are extracted from raw image data by the image sensor itself instead of a separate microprocessor, can speed up visual processing and potentially could mean the difference between avoiding an obstacle and getting into a major accident, say the researchers.
“Our work,” says Donhee Ham, the Gordon McKay Professor of Electrical Engineering and Applied Physics at SEAS and senior author of a paper on the research, “harnesses the mainstream semiconductor electronics industry to rapidly bring in-sensor computing to a wide variety of real-world applications.”
The researchers developed a silicon photodiode array to capture images that differs from the arrays on commercially available image-sensing chips in one key respect: it is electrostatically doped, meaning the sensitivity of individual photodiodes, or pixels, to incoming light can be tuned by voltages. An array that connects multiple voltage-tunable photodiodes can perform an analog version of the multiplication and addition operations central to many image-processing pipelines, extracting the relevant visual information as soon as the image is captured.
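The analog multiply-and-add the array performs can be pictured as a weighted sum: each photodiode's tuned sensitivity acts as a multiplicative weight on the light hitting it, and wiring the outputs together sums the resulting currents. The following toy model (not the authors' device or data; the values are illustrative) sketches that idea with a 3×3 pixel patch:

```python
import numpy as np

# Toy model, not the actual device: each photodiode's responsivity is set by
# a bias voltage, so its photocurrent is roughly weight * incident light.
# A shared output line summing the currents then yields a weighted sum --
# the multiply-accumulate at the heart of image filtering.

rng = np.random.default_rng(0)
light = rng.random(9)             # incident intensity on a 3x3 patch of pixels
weights = np.array([-1, -1, -1,   # programmed sensitivities acting as a
                    -1,  8, -1,   # 3x3 high-pass (Laplacian-like) kernel
                    -1, -1, -1], dtype=float)

currents = weights * light        # each pixel "multiplies"
filtered_value = currents.sum()   # the shared line "adds"
print(filtered_value)
```

The result equals the dot product of the kernel with the image patch, which is exactly one output sample of a convolutional filter, computed as the image is read out rather than afterward on a microprocessor.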
“These dynamic photodiodes can concurrently filter images as they are captured,” says Houk Jang, a postdoctoral fellow at SEAS and first author of the paper, “allowing for the first stage of vision processing to be moved from the microprocessor to the sensor itself.”
The silicon photodiode array can be programmed into different image filters to remove unnecessary details or noise for various applications. An imaging system in an autonomous vehicle, for example, may call for a high-pass filter to track lane markings, while other applications may call for a blurring filter for noise reduction.
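The two filters mentioned above differ only in the kernel weights that would be programmed into the array. A minimal software sketch (the kernels and test image are illustrative choices, not from the paper) shows how a vertical-edge high-pass kernel picks out a lane-marking-like stripe while a box blur averages it away:

```python
import numpy as np

def sliding_filter(img, kernel):
    """Minimal 'valid'-mode sliding-window filter, for illustration only."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

# A bright vertical stripe on a dark background, standing in for a lane marking.
img = np.zeros((5, 7))
img[:, 3] = 1.0

high_pass = np.array([[-1.0, 0.0, 1.0]] * 3)  # responds to vertical edges
blur = np.ones((3, 3)) / 9.0                  # averages away pixel-level noise

edges = sliding_filter(img, high_pass)    # strong response beside the stripe
smoothed = sliding_filter(img, blur)      # stripe spread out and dimmed
```

On the programmable sensor, switching between these behaviors means reprogramming pixel voltages rather than running a different software kernel downstream.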
“Looking ahead,” says Henry Hinton, a graduate student at SEAS and co-first author of the paper, “we foresee the use of this silicon-based in-sensor processor not only in machine vision applications, but also in bio-inspired applications, wherein early information processing allows for the co-location of sensor and compute units, like in the brain.”
Next, the researchers say, they plan to increase the density of the photodiodes and integrate them with silicon integrated circuits.
“By replacing the standard non-programmable pixels in commercial silicon image sensors with the programmable ones developed here,” says Jang, “imaging devices can intelligently trim out unneeded data, and thus could be made more efficient in both energy and bandwidth to address the demands of the next generation of sensory applications.”
For more, see “In-sensor optoelectronic computing using electrostatically doped silicon.”
