The project

Quick description

The topic of our master thesis project is straightforward: we want to achieve the following.

An unmanned aerial vehicle (UAV) keeps track of a ground robot that leads a flock of similar robots. The UAV flies autonomously and uses its cameras to help the units move in a defined formation, while communicating with the leader through an interfacing device.

Hardware involved

AR.Drone by Parrot

This is an inexpensive UAV with impressive embedded technology, released in late August 2010. Its main features are:

One VGA front camera (640×480 pixels, 93° wide-angle lens)

One CMOS vertical camera

Ultrasound altimeter for vertical stabilization

Communication over WiFi connection for navigation data and video transmission

Inertial guidance (accelerometer, gyrometer) with security system

Autopilot maneuvers (take-off, landing, hovering)

Detection of 2D tags and 3D beacons for augmented reality

12-15 minutes flying time

This drone comes with an API that eases the programming of devices interfaced with it. So far, Parrot has not provided developers with the tools needed to access and modify the drone’s firmware to make a custom one. However, it is possible to control the drone remotely and get feedback from all of its sensors and components.
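To give an idea of how this remote control works: commands are plain-text "AT" strings sent over UDP. The short sketch below sends a take-off and a landing command; the port number and the AT*REF argument values follow Parrot's developer documentation as we understand it, while the class and method names are our own.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.nio.charset.StandardCharsets;

// Minimal sketch of sending AT commands to the AR.Drone over UDP.
// Port 5556 and the AT*REF values come from Parrot's developer guide;
// the class and helper names here are ours.
public class ArDroneCommands {
    static final int AT_PORT = 5556;
    static final int REF_BASE = 290717696;    // landing (mandatory bits only)
    static final int REF_TAKEOFF = 290718208; // base value + take-off bit

    // Every command carries an increasing sequence number and ends with '\r'.
    static String refCommand(int seq, boolean takeoff) {
        return "AT*REF=" + seq + "," + (takeoff ? REF_TAKEOFF : REF_BASE) + "\r";
    }

    static void send(DatagramSocket socket, InetAddress drone, String cmd) throws Exception {
        byte[] data = cmd.getBytes(StandardCharsets.US_ASCII);
        socket.send(new DatagramPacket(data, data.length, drone, AT_PORT));
    }

    public static void main(String[] args) throws Exception {
        InetAddress drone = InetAddress.getByName("192.168.1.1"); // drone's default address
        try (DatagramSocket socket = new DatagramSocket()) {
            send(socket, drone, refCommand(1, true));  // take off
            Thread.sleep(5000);                        // hover for a while
            send(socket, drone, refCommand(2, false)); // land
        }
    }
}
```

In practice a watchdog command has to be sent regularly to keep the connection alive, and the sequence number must never decrease, so a real client wraps this in a periodic sender loop.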

Interfacing device: laptop or Android phone

The drone can only communicate with a device over WiFi (or a USB connection, but that is of no use while flying). The robots we are using – see below – only support Bluetooth or USB. On top of that, the processing power of a single robot is not sufficient to handle the image analysis and all the communication protocols. So, in order to have the drone and the robots exchange data, we need another layer in our transmission process.

The most obvious choice for that purpose is a laptop. Since neither the drone nor the robots can carry one, a natural improvement is to have this job done by a smartphone carried by a robot, preferably the leader.

We plan to program this task in Java on an Android phone such as the HTC Desire or the Nexus One. Both support Bluetooth and WiFi, and developers have already done a lot of work on this platform. Besides, the AR.Drone was largely designed to be interfaced with smartphones, and existing applications to pilot or play with it show that a lot can indeed be done with them.
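The core of the phone's job is a relay loop between the drone's WiFi link and the leader's Bluetooth link. The sketch below keeps it transport-agnostic: on Android, one stream pair would come from the WiFi socket and the other from the Bluetooth socket (a `BluetoothSocket` also exposes plain `InputStream`/`OutputStream`s), but any stream pair works. The class and method names are ours.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Sketch of the relay loop the interfacing device would run between the
// drone (WiFi) and the leader robot (Bluetooth). One pump per direction,
// each on its own thread, gives a bidirectional relay.
public class Relay {
    // Copy everything arriving on 'in' to 'out' until the source closes;
    // returns the number of bytes relayed.
    static long pump(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[1024];
        long total = 0;
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
            out.flush(); // navigation data should not sit in a buffer
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // In-memory demonstration; real streams would come from sockets.
        InputStream fromDrone = new ByteArrayInputStream("navdata".getBytes());
        ByteArrayOutputStream toLeader = new ByteArrayOutputStream();
        long n = pump(fromDrone, toLeader);
        System.out.println(n + " bytes relayed: " + toLeader);
    }
}
```

Flushing after each chunk trades throughput for latency, which seems the right trade-off when the payload is time-sensitive positioning data rather than bulk transfer.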

LEGO Mindstorms NXT Robots

So far, we have essentially been taught robotics by means of LEGO robots. They are really handy for quickly building and programming prototypes. They are compatible with a wide range of sensors and motors, and combined with thousands of LEGO bricks it is possible to build almost anything.

The central element of the LEGO Mindstorms products is called the NXT. It is a brick-shaped computer that can connect up to three servomotors and four sensors. Communication with a computer or other devices is done over Bluetooth or USB. Many programming languages are supported, including Java with the LeJOS API.
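Driving the robots through LeJOS mostly comes down to setting the speeds of the two servomotors, which LeJOS expresses in degrees per second. The underlying differential-drive kinematics are plain Java and can be sketched as follows; the wheel radius and track width are illustrative assumptions, not measurements of our robots.

```java
// Differential-drive helper: converts a desired forward speed v (m/s) and
// turn rate omega (rad/s) into left/right wheel speeds in degrees per
// second, the unit LeJOS motor speeds are expressed in.
public class DiffDrive {
    static final double WHEEL_RADIUS = 0.028; // metres (assumed)
    static final double TRACK_WIDTH = 0.112;  // metres between wheels (assumed)

    // Returns {leftDegPerSec, rightDegPerSec}.
    static double[] wheelSpeeds(double v, double omega) {
        double left = (v - omega * TRACK_WIDTH / 2) / WHEEL_RADIUS;  // rad/s
        double right = (v + omega * TRACK_WIDTH / 2) / WHEEL_RADIUS; // rad/s
        return new double[] { Math.toDegrees(left), Math.toDegrees(right) };
    }

    public static void main(String[] args) {
        double[] s = wheelSpeeds(0.10, 0); // drive straight at 10 cm/s
        System.out.printf("left=%.1f deg/s, right=%.1f deg/s%n", s[0], s[1]);
    }
}
```

On the NXT itself, the two returned values would be fed to the corresponding LeJOS motor objects; the helper stays the same whichever ports the motors are wired to.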

The robots are all built to the same design: basic ground vehicles with two motors, carrying 2D tags on top so they can be tracked by the UAV.

Our task

Through the realization of our master thesis, we want to study the feasibility of this concept and aim to show its potential efficiency. The ultimate objective would be a sturdy system that could be exhibited at events and shows.

Our biggest challenge is first to establish a reliable communication interface between the AR.Drone and the leader robot, using a computer or an Android smartphone in between. The phone will have to help the drone fly autonomously (i.e. without human assistance) while the drone keeps track of all the robots, and especially follows the leader. Once this is done, the UAV should be able to report the coordinates of each robot in the flock.
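As a rough sketch of how such coordinates could be derived, assume a simple pinhole model for the vertical camera: knowing the altitude from the ultrasound altimeter and the camera's field of view, a tag's pixel position maps to a ground offset relative to the point directly below the drone. The field-of-view and frame-size values below are assumptions for illustration, not datasheet figures.

```java
// Rough sketch: map a tag's pixel position in the vertical camera image to
// a ground offset (in metres) below the drone, using the ultrasound
// altitude and a pinhole-camera model. FOV and frame size are assumed.
public class GroundMapper {
    static final double H_FOV = Math.toRadians(64); // horizontal FOV (assumed)
    static final int WIDTH = 176, HEIGHT = 144;     // frame size (assumed)

    // Returns {x, y} in metres: x toward the image right, y toward the image top.
    static double[] pixelToGround(int px, int py, double altitude) {
        double halfWidthM = altitude * Math.tan(H_FOV / 2); // metres covered by half a frame
        double metresPerPixel = 2 * halfWidthM / WIDTH;     // assuming square pixels
        double x = (px - WIDTH / 2.0) * metresPerPixel;
        double y = (HEIGHT / 2.0 - py) * metresPerPixel;    // image y axis points down
        return new double[] { x, y };
    }
}
```

The model ignores lens distortion and the drone's tilt, both of which a real implementation would have to compensate for with the inertial data.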

With this information in hand, we have to find an efficient way of moving the flock in a defined formation. All robots but the leader should run the exact same program, while the leading unit exchanges positioning, tracking and navigation data with the drone. It will be important to take into account likely obstacles, which could be detected by sensors embedded on the robots or with the help of the drone's vertical camera.
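One simple way to define such a formation is to give each follower a fixed offset expressed in the leader's own frame, so the whole formation rotates with the leader: rotating the offset by the leader's heading and adding the leader's position yields the follower's world-frame target. The names and conventions below are ours, for illustration.

```java
// Sketch of turning reported positions into per-robot targets: each
// follower keeps a fixed offset (dx, dy) relative to the leader, expressed
// in the leader's frame (dx forward, dy to the leader's left).
public class Formation {
    // Rotate the offset by the leader's heading (radians, world frame) and
    // add the leader's position to get the follower's world-frame target.
    static double[] target(double lx, double ly, double heading, double dx, double dy) {
        double cos = Math.cos(heading), sin = Math.sin(heading);
        return new double[] { lx + cos * dx - sin * dy,
                              ly + sin * dx + cos * dy };
    }

    public static void main(String[] args) {
        // Example "wedge" slot one metre behind and to the left of the leader.
        double[] t = target(0, 0, 0, -1, 1);
        System.out.printf("target = (%.2f, %.2f)%n", t[0], t[1]);
    }
}
```

Since all followers run the same program, the only per-robot difference is the (dx, dy) pair, which could be assigned once at start-up.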

A nice addition would be the ability to find a lost robot and guide it back to the formation with the help of the drone.

Applications

Even though we will mainly focus on proving that our concept works with the elements we plan to use, we can imagine real-life scenarios for the whole system:

A flock of ground robots could be sent to a location hazardous for humans and probe a specific area while being air-assisted by the drone, which would provide them with more global information and dispatch the robots to different places. Human operators could follow the operation by switching between the UAV's cameras and others carried by the robots. The leader might also be remote-controlled for accurate movement. A flock of drones could be added for better coverage.

A UAV could patrol over a city, looking for incidents. Once one is detected, it would contact a dispatch center, which would send robotic units to the ground. The drone would then guide them to the indicated location, where the robots would handle the situation (a car accident with injured people who need to be taken to the nearest hospital, a theft followed by a chase…).

Multiple scenarios could work for an event presentation, such as a game where a remote-controlled vehicle has to flee from a pursuing drone and robots, or a formation of robots simply running around among visitors while followed by the UAV.