Sub-terahertz sensor could help autonomous cars see through fog

A new sub-terahertz sensor developed at MIT could help autonomous vehicles and robots identify objects in foggy conditions where LiDAR fails.

Sub-terahertz wavelengths sit between microwave and infrared radiation on the electromagnetic spectrum and can be detected through fog and dust clouds. A sub-terahertz imaging system sends an initial signal through a transmitter; a receiver then measures the absorption and reflection of the rebounding wavelengths and passes those measurements to a processor, which reconstructs an image of the object.

The output signal can be used to calculate distance, similar to how LiDAR fires a laser at an object and measures the rebound. By combining the output signals of an array of pixels and steering the resulting beam in a chosen direction, the system can form high-resolution images. This allows not only object detection but also object recognition, which is crucial for higher-level autonomous driving and advanced robotics. The work appears in a paper published on IEEE Xplore.
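The distance calculation the article compares to LiDAR is ordinary round-trip time-of-flight. A minimal sketch (the function name and the 200 ns example are illustrative, not from the paper):

```python
# Illustrative sketch of round-trip time-of-flight ranging, the principle
# the sub-terahertz sensor shares with LiDAR. Names and numbers are hypothetical.
C = 299_792_458.0  # speed of light in a vacuum, m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to a reflecting object, given the signal's round-trip time."""
    # The signal travels to the object and back, so halve the path length.
    return C * t_seconds / 2.0

# A reflection arriving 200 ns after transmission puts the object ~30 m away.
print(distance_from_round_trip(200e-9))  # ~29.98
```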

“A big motivation for this work is having better ‘electric eyes’ for autonomous vehicles and drones,” said co-author Ruonan Han, an associate professor of electrical engineering and computer science, and director of the Terahertz Integrated Electronics Group in the MIT Microsystems Technology Laboratories (MTL). “Our low-cost, on-chip sub-terahertz sensors will play a complementary role to LiDAR for when the environment is rough.”

Key to the success of the system is a decentralised design where each pixel in the array generates its own local oscillation signal. According to Han, a good analogy for the design is an irrigation system. A traditional irrigation system has one pump that directs a powerful stream of water through a network of pipes that distribute water to many sprinklers. Each sprinkler spits out water that has a much weaker flow than the initial flow from the pump. If you want the sprinklers to pulse at the exact same rate, that would require another control system.

By comparison, the MIT design gives each site its own water pump, eliminating the need for connecting pipelines and giving each sprinkler its own powerful water output. Each sprinkler also communicates with its neighbours to synchronise their pulse rates.

“With our design, there’s essentially no boundary for scalability,” said Han. “You can have as many sites as you want, and each site still pumps out the same amount of water…and all pumps pulse together.”
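The sprinkler analogy, in which each site's pump locks its pulses to its neighbours', can be illustrated with a toy nearest-neighbour phase-coupling model. This is a generic coupled-oscillator sketch under assumed coupling values, not the chip's actual synchronisation circuit:

```python
# Toy model: each "pixel" runs its own oscillator and nudges its phase
# toward its two ring neighbours each step. The coupling constant and
# initial phases are illustrative assumptions, not values from the paper.
import math

def synchronise(phases, coupling=0.2, steps=200):
    """Iteratively pull each oscillator toward its neighbours' phases."""
    phases = list(phases)
    n = len(phases)
    for _ in range(steps):
        updated = []
        for i, p in enumerate(phases):
            left = phases[(i - 1) % n]
            right = phases[(i + 1) % n]
            # Nudge toward neighbouring phases (attractive sine coupling).
            updated.append(p + coupling * (math.sin(left - p) + math.sin(right - p)))
        phases = updated
    return phases

# Four oscillators starting out of phase end up pulsing together:
result = synchronise([0.0, 1.0, 2.0, 0.5])
print(max(result) - min(result))  # spread shrinks toward zero
```

Because every pixel only talks to its immediate neighbours, no central clock distribution is needed, which is what makes the design scalable to arbitrarily many sites.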

To maintain a relatively small footprint for each of the array’s 32 pixels, the team combined various functions of four traditionally separate components – antenna, downmixer, oscillator, and coupler – into a single “multitasking” component given to each pixel.

“We designed a multifunctional component for a [decentralised] design on a chip and combine a few discrete structures to shrink the size of each pixel,” explained first author Zhi Hu. “Even though each pixel performs complicated operations, it keeps its compactness, so we can still have a large-scale dense array.”