Bugs Taking Over Robot Guidance

One of Centeye's earlier obstacle-avoidance aircraft designs. Using standard hobbyist components to build the models, Centeye has constructed toy-like aircraft with complete vision systems that consume a small fraction of a watt of power, with each sensor weighing less than two-tenths of an ounce and the complete aircraft only around 3.5 ounces.

Courtesy of Centeye

Large UAVs that fly at high altitudes employ sensing mechanisms based on GPS or radar technologies, but those methods fail for scaled-down vehicles with smaller wingspans (ranging from an inch or so to about 3 feet). The smaller UAVs fly closer to the ground, navigate complex terrain and weigh only a few grams. Radar is hard to fit into that kind of payload, and GPS lacks the accuracy needed for low-level flight – it's not possible to program every single building and bush into the autopilot system.

And besides, GPS doesn't exist on other planets.

To create intelligent artificial-vision packages that weigh only a few grams and contain all the necessary optics, hardware and software, researchers have turned to creatures that manage it all with brains that weigh less than a milligram.

Although insects can see the entire visual sphere (humans are limited to seeing approximately 30 percent of it), their eyes are set too close together to provide reliable depth information. But that hasn't stopped the creatures – with their tiny brains and low-resolution eyes – from navigating through complex, cluttered environments.

The key lies in understanding how insects perceive their world – a concept called optic flow.

"The principle is simply that, if the insect flies along a straight line, objects that are near it appear to whiz by much more rapidly in the eye than objects that are far away," says Srinivasan. "Thus, the distance to an object can be inferred in terms of the velocity of its image in the eye – the greater the velocity, the nearer the object."
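Srinivasan's principle can be put in simple geometric terms: for an insect (or aircraft) translating at a known speed, an object at a given bearing off the flight direction sweeps across the eye at an angular rate inversely proportional to its distance. A minimal sketch of that relationship – the function name and numbers are illustrative, not from the article:

```python
import math

def distance_from_optic_flow(speed, bearing_deg, flow_rad_per_s):
    """Estimate range to an object from its optic flow.

    For pure translation at `speed` (m/s), an object at `bearing_deg`
    off the direction of flight sweeps across the eye at an angular
    rate omega = (speed / distance) * sin(bearing).  Inverting:
    distance = speed * sin(bearing) / omega.
    """
    bearing = math.radians(bearing_deg)
    return speed * math.sin(bearing) / flow_rad_per_s

# Flying at 2 m/s, an object 90 degrees off the heading whose image
# moves at 0.5 rad/s is 2 * 1.0 / 0.5 = 4 metres away.
print(distance_from_optic_flow(2.0, 90.0, 0.5))  # → 4.0
```

The same formula captures why nearby objects "whiz by" faster: halving the distance doubles the image velocity at the same flight speed.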

Present models of aircraft being developed by the team include a miniature video camera. The camera transmits video to a ground station, which analyses the incoming images to compute optic flow and then radios back the appropriate commands. Two demonstrations are scheduled to take place later this year at NASA to see if the technology can be included in future Martian aerial probes.
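The article doesn't describe the ground station's algorithm, but the core of any optic-flow computation is finding how far the image has shifted between successive frames. A simplified one-dimensional sketch, matching two scanlines by minimising squared differences (a hypothetical illustration, not Centeye's or Srinivasan's method):

```python
def optic_flow_1d(prev, curr, max_shift=3):
    """Estimate 1-D optic flow as the integer pixel shift that best
    aligns two successive scanlines (minimum mean squared difference).
    A positive result means the scene moved right between frames."""
    best_shift, best_err = 0, float("inf")
    n = len(prev)
    for s in range(-max_shift, max_shift + 1):
        # Compare the region where the two lines overlap under shift s.
        err = count = 0
        for i in range(n):
            j = i - s
            if 0 <= j < n:
                err += (curr[i] - prev[j]) ** 2
                count += 1
        err /= count
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

# A bright edge that moves two pixels to the right between frames.
a = [0, 0, 9, 9, 0, 0, 0, 0]
b = [0, 0, 0, 0, 9, 9, 0, 0]
print(optic_flow_1d(a, b))  # → 2
```

Dividing the recovered shift by the frame interval gives image velocity, which feeds the distance relationship quoted above.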

Geoffrey L. Barrows, CEO of Centeye, is using a different approach, developing optic flow sensors that acquire and process images simultaneously.

"One can say that our imaging chips are neural network chips, although highly specialized for a specific task," says Barrows. "They literally generate kilobytes per second, but each bit is much more valuable, allowing us to increase the frame rate to thousands of frames per second, capture events that would otherwise be missed and even use a simple 8-bit microcontroller rather than a Pentium-class CPU for the processor."

Using standard hobbyist components to build the models, Barrows has constructed toy-like aircraft with complete vision systems – imaging chips and all processors included – that consume a small fraction of a watt of power, with each sensor weighing less than two-tenths of an ounce and the complete aircraft only around 3.5 ounces.

The group has managed to make the airplane fly at a constant altitude, ascend or descend, and avoid collisions with trees and buildings. Its members are now working on enabling the craft to fly through a tunnel. Though the sensors currently are quite primitive, Barrows envisions a massively intelligent chip with circuitry allowing it to see in both infrared and ultraviolet wavelengths.

"Our artificial insect vision system is at present 1 percent to 5 percent as powerful as that of a fruit fly," says Barrows. "Instead of using Moore's law to provide more pixels per imager, we will ultimately use it to increase the amount of circuitry per pixel, allowing the imaging chip to capture more useful information. 'Tera-op on a chip' is not beyond current capabilities."

Besides military applications, the technology has enormous commercial spinoffs, including intelligent vehicle systems, autonomous robots, intelligent toys, panoramic imaging systems and sensors that aid the blind, just to name a few.

Professor Adrian Horridge, a neurobiologist who has worked with the Australian National University, says he believes that we will see unmanned planes flying major air-transport freight routes in the next five years.

Defence Science and Technology Organisation's Chahl sees other advances creating a buzz in the future.

"I would predict that within 15 years there will be extremely small aircraft that have tasks and behavior that are in some way similar to honeybees, such as gathering, cleaning, monitoring, etc.," says Chahl.