Inside Unmanned Systems

APR-MAY 2018

Inside Unmanned Systems provides actionable business intelligence to decision-makers and influencers operating within the global UAS community. Features include analysis of key technologies, policy/regulatory developments and new product design.

AIR SOFTWARE
incorporate other pieces of information, such as uncertainty in the data from the drone's depth sensors, Florence said. He and his colleagues will present their latest findings in May at the IEEE International Conference on Robotics and Automation in Brisbane, Australia.
MIMICKING CARS AND BICYCLES
Drones flying through cities have more than trees to contend with. There are also pedestrians, cyclists and automobiles that can all move unpredictably. Nevertheless, people routinely navigate such dynamic environments without the aid of complex, expensive devices.
"When we drive a car or ride a bicycle, we are already pretty good at this task, and we don't require any costly sensor, but just rely on what we see with our eyes," Loquercio said. "From there, we had the idea of imitating cars and bicycles to let a drone fly through cities."
Loquercio and his colleagues developed an artificial intelligence system known as DroNet, a kind of neural network. In such a system, components dubbed neurons are fed data and cooperate to solve a problem, such as recognizing an obstacle. The neural network then repeatedly adjusts the connections between its neurons and sees if the new patterns of connections are better at solving the problem. Over time, the neural net discovers which patterns are best at computing solutions and adopts them as its defaults, mimicking the process of learning in the human brain. The Swiss researchers previously used neural networks to help drones autonomously recognize and follow forest trails, which could help find injured or lost hikers.
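The learning loop described above can be sketched in miniature. The toy below is not DroNet itself, just a single logistic "neuron" whose connection weights are repeatedly nudged until its outputs match the training targets (here, the logical OR function):

```python
import numpy as np

# Toy illustration of the learning loop: a single logistic unit
# adjusts its connection weights over many rounds until its
# predictions match the targets. Sizes and data are made up for
# illustration; this is not DroNet's architecture.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])  # OR truth table

rng = np.random.default_rng(0)
w = rng.normal(size=2)  # connection weights, randomly initialized
b = 0.0                 # bias term

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):               # repeated adjustment rounds
    p = sigmoid(X @ w + b)          # current predictions
    grad = p - y                    # error signal per example
    w -= 0.5 * X.T @ grad / len(y)  # strengthen/weaken connections
    b -= 0.5 * grad.mean()

predictions = (sigmoid(X @ w + b) > 0.5).astype(int)
print(predictions)  # matches the OR targets after training
```

Real systems like DroNet apply the same adjust-and-check cycle to millions of weights in a deep convolutional network rather than two weights in one neuron.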
In their latest work, the scientists trained DroNet on images collected by cars and bicycles traveling in urban environments. The aim was to have DroNet analyze video collected from a drone's forward-facing camera and predict where to steer the drone "and, if something dangerous is happening, whether to stop," Loquercio said.
Specifically, DroNet learned what angles to steer at to avoid obstacles by analyzing more than 70,000 images captured from cars. The data was gathered by Udacity, a firm working on the world's first open-source self-driving car, a project since spun off into the company Voyage. DroNet also learned how to estimate the probability of a collision by analyzing roughly 32,000 images the scientists collected by mounting a GoPro camera on the handlebars of a bicycle and riding to different parts of Zurich.
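The article describes two outputs learned from two datasets: a steering angle and a collision probability. A minimal sketch of that two-headed structure, with made-up layer sizes and random weights standing in for a trained convolutional network, might look like this:

```python
import numpy as np

# Hedged sketch of DroNet's two-output idea: one shared feature
# extractor feeding two "heads" -- a steering angle (regression)
# and a collision probability (squashed into [0, 1]). All weights
# and layer sizes here are illustrative assumptions; the real
# DroNet is a trained convolutional network, not random matrices.
rng = np.random.default_rng(1)

n_pixels = 200 * 200  # a flattened 200x200 grayscale frame
W_shared = rng.normal(0.0, 0.01, (n_pixels, 16))  # shared features
w_steer = rng.normal(0.0, 0.1, 16)                # steering head
w_coll = rng.normal(0.0, 0.1, 16)                 # collision head

def forward(image):
    """Flattened grayscale frame -> (steering_angle, p_collision)."""
    features = np.tanh(image @ W_shared)
    steering = float(features @ w_steer)                      # unbounded
    p_coll = float(1.0 / (1.0 + np.exp(-(features @ w_coll))))  # in [0, 1]
    return steering, p_coll

frame = rng.random(n_pixels)  # stand-in for a camera frame
steering, p_coll = forward(frame)
print(steering, p_coll)
```

Sharing one feature extractor between both heads is what lets the car dataset (steering labels) and the bicycle dataset (collision labels) train a single network.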
The researchers tested DroNet using a commercial Parrot Bebop 2 quad-rotor drone. The system ran on an Intel Core i7 2.6-gigahertz CPU, receiving pictures from and sending commands to the drone through WiFi. The artificial intelligence relied only on grayscale images from the drone, 200 by 200 pixels in size.
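One plausible way to turn the network's two outputs into flight commands, consistent with the "slow down and stop" behavior described here, is to scale forward speed down as collision probability rises and smooth both commands between frames. The constants and filtering below are illustrative assumptions, not the researchers' exact implementation:

```python
# Illustrative command mapping (assumed, not the published code):
# forward speed shrinks toward zero as collision probability nears 1,
# and a simple low-pass filter keeps the flight smooth between frames.
V_MAX = 1.0   # assumed maximum forward speed, m/s
ALPHA = 0.7   # assumed smoothing factor, 0 < ALPHA <= 1

class CommandFilter:
    def __init__(self):
        self.speed = 0.0
        self.steer = 0.0

    def update(self, steering, p_collision):
        raw_speed = V_MAX * (1.0 - p_collision)  # stop as p -> 1
        self.speed = (1 - ALPHA) * self.speed + ALPHA * raw_speed
        self.steer = (1 - ALPHA) * self.steer + ALPHA * steering
        return self.speed, self.steer

f = CommandFilter()
# A frame with high collision risk should yield a low forward speed.
speed, steer = f.update(steering=0.2, p_collision=0.9)
print(speed, steer)
```

In the setup described above, this loop would run on the laptop's CPU, with each smoothed command sent back to the Bebop 2 over WiFi.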
By imitating cars and bicycles, DroNet automatically learned to respect traffic rules, such as how to move down a street without drifting off into the oncoming lane, or how to stop when obstacles such as pedestrians, other
Photo courtesy of University of Zurich.
By imitating cars and bicycles, the drone automatically learned to respect the safety rules.
"WORKING ON ROBOTS RACING THROUGH FORESTS IS JUST INTRINSICALLY SUPER FUN TO ME."
Peter Florence, lead researcher, NanoMap