Robot bees fly and swim, soon they’ll have laser eyes

Robot insects may someday be used in agriculture and disaster relief situations. Credit: Microrobotics Lab, Harvard John A. Paulson School of Engineering and Applied Sciences and the Wyss Institute for Biologically Inspired Engineering.

The flying insects, designed for agriculture and disaster relief, will rely on technology used by driverless cars

BUFFALO, N.Y. – How do you teach robotic insects to
see?

By equipping them with tiny laser-powered sensors that act as
eyes, enabling the miniature machines to sense the size, shape and
distance of approaching objects.

“Essentially, it’s the same technology that
automakers are using to ensure that driverless cars don’t
crash into things,” says University at Buffalo computer
scientist Karthik Dantu. “Only we need to shrink that
technology so it works on robot bees that are no bigger than a
penny.”

The UB-led research project, funded by a $1.1 million National
Science Foundation grant, includes researchers from Harvard
University and the University of Florida. It is an offshoot of the
RoboBee initiative,
led by Harvard and Northeastern University, which aims to create
insect-inspired robots that someday may be used in agriculture and
disaster relief.

Researchers have shown that robot bees are capable of tethered
flight and of moving while submerged in water. One of their
limitations, however, is a lack of depth perception. For example, a
robot bee cannot sense what’s in front of it.

This is problematic if you want the bee to avoid flying into a
wall or to land on a flower, says Dantu, who worked on the
RoboBee project as a postdoctoral researcher at Harvard before
joining UB’s School of Engineering and Applied Sciences in
2013 as an assistant professor.

The UB-led research team will address the limitation by
outfitting the robot bee with remote sensing technology called
lidar, the same laser-based sensor system that is making driverless
cars possible.

Lidar (short for light detection and ranging) works like radar,
only it emits pulses of invisible laser light instead of
microwaves. The pulses bounce off distant objects, and sensors
measure how long the reflected light takes to return, using that
delay to calculate the distance and shape of the objects.
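
The underlying calculation is simple: because the pulse travels to
the object and back, the distance is half the round-trip time
multiplied by the speed of light. The sketch below is illustrative
only, not the project’s code.

```python
# Illustrative sketch (not project code): how a lidar return time
# translates into a distance estimate.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_return_time(round_trip_seconds: float) -> float:
    """Distance to an object from the round-trip time of a laser pulse.

    The pulse travels to the object and back, so the one-way
    distance is half the round trip.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 20 nanoseconds implies an object
# roughly 3 meters away.
print(distance_from_return_time(20e-9))  # ~3.0 meters
```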

The information is then analyzed by computer algorithms to form
a coherent image of the car’s path. This enables the car to
“see” its environment and follow traffic signs, avoid
obstacles and make other adjustments.

These systems, which are typically mounted on the car roof, are
about the size of a traditional camping lantern. The team Dantu
leads wants to make them much smaller, creating a version it
calls “micro-lidar.”

University of Florida researchers will develop the tiny sensor
that measures the light’s reflection, while Dantu will create
novel perception and navigation algorithms that enable the bee to
process and map the world around it. Harvard researchers will then
incorporate the technology into the bees.
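
The article doesn’t describe what those perception and mapping
algorithms will look like on the bee, but a common way robots turn
lidar returns into a map is to drop each measured range into a grid
of cells around the sensor. The sketch below is a generic,
hypothetical illustration of that idea, not the team’s method; the
function name and parameters are invented for the example.

```python
# Hypothetical illustration of one thing a perception algorithm can
# do with lidar returns: convert ranges measured at known beam
# angles into a small 2D map of occupied cells. This is a generic
# sketch, not the RoboBee team's algorithm.

import math

def ranges_to_occupancy(ranges, angles, cell_size=0.05, grid_dim=40):
    """Mark grid cells that contain a lidar return.

    ranges  -- measured distances in meters, one per beam
    angles  -- beam directions in radians, matching `ranges`
    returns -- a set of (row, col) cells flagged as occupied,
               with the sensor at the grid center
    """
    occupied = set()
    center = grid_dim // 2
    for r, a in zip(ranges, angles):
        x = r * math.cos(a)  # forward offset from the sensor
        y = r * math.sin(a)  # lateral offset from the sensor
        col = center + int(round(x / cell_size))
        row = center + int(round(y / cell_size))
        if 0 <= row < grid_dim and 0 <= col < grid_dim:
            occupied.add((row, col))
    return occupied

# Three beams sweeping across the front of the sensor, all hitting
# something about half a meter away -- roughly a wall ahead.
cells = ranges_to_occupancy([0.5, 0.5, 0.5], [-0.2, 0.0, 0.2])
print(cells)
```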

The technology the team develops likely won’t be limited
to robot insects. The sensors could be used, among other things, in
wearable technology; endoscopic tools; and smartphones, tablets and
other mobile devices.