Sub-THz chip could help driverless cars ‘see’ through fog, dust
Technology News |
By Rich Pell



Akin to the working principle of LiDAR imaging, forward-transmitted sub-terahertz waves bounce off objects, and the reflected signal can then be detected and processed for object detection and mapping. The difference is that sub-terahertz wavelengths (between microwave and infrared radiation) can be detected through fog and dust clouds with ease, whereas such conditions scatter the infrared signals emitted by LiDARs.

In a paper “A 32-Unit 240-GHz Heterodyne Receiver Array in 65-nm CMOS With Array-Wide Phase Locking” published in the IEEE Journal of Solid-State Circuits, the researchers report the integration of two interleaved 4×4 phase-locked dense heterodyne receiver arrays within a 1.2 mm² die area, enabling the concurrent steering of two independent beams.

Based on their measurements, the sensitivity of a single unit was 58 fW at a 1 kHz bandwidth, offering a 4,300× sensitivity improvement in phase-sensitive detection. The paper concludes that larger sensing arrays could be designed simply by tiling up more receiver units while still enabling a very compact design.

“A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones,” explained co-author Ruonan Han, an associate professor of electrical engineering and computer science, and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL). “Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.”


The key to the design is what the researchers call “decentralization.” In this design, each pixel — called a “heterodyne” pixel — generates both the frequency beat (the frequency difference between two incoming sub-terahertz signals) and the “local oscillation,” an electrical signal used to shift the frequency of an incoming signal. This “down-mixing” process produces a signal in the megahertz range that can be easily interpreted by a baseband processor.
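The down-mixing step can be illustrated numerically: multiplying a received tone by a local-oscillator tone produces sum and difference frequencies, and a low-pass stage keeps only the low-frequency “beat.” The sketch below uses arbitrary illustrative frequencies (all names and values are assumptions, not figures from the chip); the real design operates on 240 GHz signals, which cannot be sampled directly like this.

```python
import numpy as np

# Illustrative parameters (not from the article):
fs = 10e6      # sample rate, Hz
f_rf = 2.4e6   # stand-in for the incoming signal tone
f_lo = 2.0e6   # local-oscillation frequency

t = np.arange(0, 1e-3, 1 / fs)
rf = np.cos(2 * np.pi * f_rf * t)   # received signal
lo = np.cos(2 * np.pi * f_lo * t)   # local oscillation

# Mixing produces sum (4.4 MHz) and difference (0.4 MHz) components
mixed = rf * lo

# Crude low-pass filter via FFT masking: discard everything above 1 MHz
spectrum = np.fft.rfft(mixed)
freqs = np.fft.rfftfreq(len(mixed), 1 / fs)
spectrum[freqs > 1e6] = 0
beat = np.fft.irfft(spectrum, n=len(mixed))

# The dominant remaining tone sits at |f_rf - f_lo| = 400 kHz
peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(beat)))]
```

The surviving 400 kHz beat is the kind of megahertz-range output a baseband processor can handle directly, even though the original carriers are far too fast to digitize.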

The output signal can be used to calculate the distance of objects, similar to how LiDAR calculates the time it takes a laser to hit an object and rebound. In addition, combining the output signals of an array of pixels, and steering the pixels in a certain direction, can enable high-resolution images of a scene. This allows for not only the detection but also the recognition of objects, which is critical in autonomous vehicles and robots.

MIT – www.mit.edu
