By studying how different creatures navigate the world, we can design better drones and robots.

The birds and the bees and the drones

The bees were supposed to buzz in through a different entrance. But the cunning creatures had found a shortcut and streamed into the lab through a hole in the window. As Professor Srini Srinivasan watched the bees, he noticed something: they flew precisely through the middle of the hole, every time. How were they doing this without banging into the sides? ‘You cannot ask a bee, “What are you doing?” You have to design an experiment that tells you the answer,’ says Srinivasan.

It was a chance observation—but one that is now transforming the way we design robots and drones.

Emeritus Professor Mandyam Veerambudi Srinivasan, known simply as Srini to his colleagues and friends, has always been driven by curiosity. After an electrical engineering degree, Srinivasan shifted focus to modelling insect vision, eventually landing with bees as his primary research subject.

Emeritus Professor Srini Srinivasan in front of the Australian Academy of Science’s Shine Dome. Image: Australian Academy of Science

How a bee sees the world

Just by looking at a bee’s eyes, you can tell they must perceive the world very differently to us humans.

But aside from structural differences, one of the most important is how far apart the bee's two 'big eyes' are. Our own eyes sit six to eight centimetres apart, and this separation gives us stereoscopic vision: the ability to perceive depth, distance and three dimensions.

Here’s how it works: hold up a finger and close one eye. Now open that eye and close the other one. You’ll notice that the image of the finger jumps, because your two eyes see the finger from slightly different angles. ‘Your brain is measuring this jump, or disparity, and triangulating to figure out how far away your finger is,’ explains Srinivasan. But a bee’s eyes are so close together that it’s essentially like a cyclops: there’s no ‘jump’ in the image. ‘What is cool is that insects have evolved a completely different way to see the world in three dimensions and measure distances,’ he says.
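For readers who like to see the arithmetic, the triangulation trick can be sketched in a few lines of Python. This is an illustrative toy, not code from Srinivasan’s lab, and the focal length, baseline and disparity values below are made-up numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulation: distance Z = f * B / d, where f is the focal
    length (pixels), B the separation between the two eyes (metres)
    and d the 'jump' (disparity) of the image between the two views
    (pixels)."""
    return focal_px * baseline_m / disparity_px

# A human-like 6.5 cm eye separation, an 800-pixel focal length and
# a 20-pixel jump put the finger about 2.6 metres away:
distance = depth_from_disparity(800, 0.065, 20)
```

The smaller the baseline B, the smaller the jump d for any given distance, which is why a bee’s near-touching eyes make this method useless to it.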

Each of a bee’s ‘eyes’ is composed of thousands of ‘little eyes’ called ommatidia. Image adapted from: v2osk/Unsplash (CC0)

This brings us back to the sneaky shortcutting bees, which Srinivasan observed flying perfectly through the centre of the hole in the lab window.

To figure out how they might be achieving this feat of flight, Srinivasan filmed the bees as they traversed a corridor patterned with black and white stripes. As predicted, the bees flew (on average) straight down the middle of the tunnel. Next, he took one of the walls of the corridor and moved it in the same direction as the bees’ flight. As a result, the bees flew much closer to the moving wall. ‘The apparent speed of the image of the moving wall in the bee’s eye is much slower than the other wall, so she thinks the moving wall is further away and shifts closer to compensate,’ explains Srinivasan. The bees were measuring the optic flow—the speed of images moving past their eyes—and extracting three-dimensional and distance information from this.
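The centring behaviour is simple enough to simulate. The sketch below is my own toy model, not the lab’s software: the image speed of each wall is taken as relative speed divided by distance, and the simulated bee drifts away from whichever image moves faster — reproducing both the centred flight and the shift towards a moving wall.

```python
def apparent_flow(forward_speed, wall_speed, distance):
    """Angular speed of a wall's image: relative speed over distance."""
    return abs(forward_speed - wall_speed) / distance

def fly_tunnel(width=1.0, v=1.0, right_wall_speed=0.0,
               gain=0.01, steps=5000):
    """Balance left and right optic flow; return the final lateral
    position, measured in metres from the left wall."""
    y = 0.3  # start off-centre
    for _ in range(steps):
        flow_left = apparent_flow(v, 0.0, y)
        flow_right = apparent_flow(v, right_wall_speed, width - y)
        y += gain * (flow_left - flow_right)  # drift away from the faster image
        y = min(max(y, 0.01), width - 0.01)   # stay inside the tunnel
    return y

# With static walls the simulated bee settles in the middle; when the
# right wall moves along with her, she settles closer to that wall.
```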

Honeybees coming in for a landing. Studying how bees land can help us design better drones. Image adapted from: Eric Ward/Unsplash (CC0)

Making a beeline to better robots

It wasn’t just interesting—Srinivasan’s discovery also turned out to be useful in robotics. Designing a robot that uses stereoscopic vision, like humans, involves ‘cumbersome computation’. But using the bees’ method is much simpler. ‘This can be done with a simple wide-angle video camera, measuring the rate of motion of walls and controlling your trajectory based on the motion in the camera image, without any fancy computation,’ he explains.

Since then, Srinivasan has incorporated the brilliance of bees into his own robotic innovations. Bees control their flight speed based on image motion, that is, how fast the environment is whizzing by, and drones can use the same cue to regulate their speed. The ability of bees to make a smooth landing without GPS or fancy radar inspired ‘a wonderful biological autopilot we’ve now put into drones’, and a bee’s ability to navigate up to 10 km in search of food has inspired autonomous flight systems that need neither GPS nor radio control.
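Srinivasan and colleagues found that landing bees hold the apparent image speed of the ground roughly constant as they descend. Because that image speed is forward speed divided by height, forward speed shrinks automatically with altitude and touchdown is gentle. The loop below is a back-of-the-envelope sketch of the idea, with parameter values of my own choosing, not the drone autopilot itself:

```python
def landing(h0=5.0, target_flow=0.8, descent_ratio=0.3, dt=0.05):
    """Hold the ground-image speed constant: v / h = target_flow.
    Forward speed v then shrinks in proportion to height h, and the
    descent (taken as a fixed fraction of v) eases off by itself."""
    h = h0
    speeds = []
    while h > 0.01:
        v = target_flow * h          # speed that keeps the flow constant
        speeds.append(v)
        h -= descent_ratio * v * dt  # descend at a fixed fraction of v
    return speeds

# Speed starts at 4 m/s and decays smoothly towards zero at touchdown.
profile = landing()
```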

Free as a bird

Srinivasan’s passion lies with bees, but he has also looked to birds for inspiration. For example, he has found that budgies flying head-on towards one another will always veer to the right to avoid a collision. Interestingly, this same principle is taught to pilots—birds just figured it out a few million years ago.

But when it comes to choosing a gap to fly through, different individuals show different biases for left and right, independent (to some extent) of gap width. This might help prevent ‘traffic jams’ when a flock is flying through woodland—if all the birds always chose the widest gap, you’d end up with a budgie bottleneck! This same principle could be applied to a fleet of drones—simply program in a random collection of biases and your fleet of autonomous drones can safely and effectively navigate a dense environment.
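As a toy illustration of that last idea (entirely my own construction, not a published control scheme), each drone can score candidate gaps by width plus an individual left/right bias:

```python
def choose_gap(gaps, bias):
    """Pick a gap from (position, width) pairs, where position is
    negative for gaps to the left and positive for gaps to the right.
    A personal 'bias' nudges the score sideways, so drones with
    different biases spread out instead of all queuing at the widest gap."""
    return max(gaps, key=lambda pos_width: pos_width[1] + bias * pos_width[0])

gaps = [(-1.0, 1.0), (1.0, 0.9)]  # wide gap on the left, narrower on the right
choose_gap(gaps, 0.0)             # unbiased: takes the widest (left) gap
choose_gap(gaps, 0.2)             # right-biased: takes the right gap instead
```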

A flock of budgerigars (budgies). Srini is studying how budgies avoid mid-air collisions. Image adapted from: peterichman/Flickr (CC BY 2.0)

But for Srinivasan, these robotic applications have never been the end goal. It’s always been about satisfying an innate curiosity. ‘It’s mostly about trying to answer interesting questions … to find out what it is that makes these lovely creatures tick,’ he says. ‘If something sounds interesting, you follow your heart and you get into it.’

This is part of a series of articles featuring Australian science stories written for National Science Week, Australia’s annual celebration of science and technology. Funded by the Australian Government.

This article has been reviewed by the following experts: Emeritus Professor Srini Srinivasan FAA, Queensland Brain Institute, The University of Queensland; Professor Ian Reid, Australian Centre for Visual Technologies, University of Adelaide.