Using UAVs for Search & Rescue

UAVs (or drones) are starting to be used for search & rescue operations, such as in the Philippines following Typhoon Yolanda a few months ago. They are also used to find missing people in the US, which may explain why members of the North Texas Drone User Group (NTDUG) are organizing the (first ever?) Search & Rescue challenge in a few days. The purpose of this challenge is to 1) encourage members to build better drones and 2) simulate a real-world positive application of civilian drones.

Nine teams have signed up to compete in Saturday’s challenge, which will be held in a wheat field near the Renaissance Fair in Waxahachie, Texas (satellite image below). The organizers have already sent these teams a simulated missing person’s report. The report includes a mock photo, age, height, hair color, ethnicity, clothing and where/when this simulated lost person was last seen. Each drone must have a return-to-home function and failsafe as well as live video streaming.

When the challenge launches, each team will need to submit a flight plan to the contest’s organizers before being allowed to search for the missing items (at set times). An item is considered found when said item’s color or shape can be described and its location can be pointed to on a Google Map. These found objects then count as points. Points are also awarded for finding tracks made by humans or animals, for example. Points will be deducted for major crashes and for flying above the 375-foot altitude limit; teams risk disqualification for flying over people.
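The scoring rules above can be sketched as a small function. Note that the rules describe *what* earns or loses points but not the exact amounts, so the point values below are hypothetical placeholders, not the organizers’ actual numbers:

```python
def score_flight(items_found, tracks_found, major_crashes,
                 max_altitude_ft, flew_over_people):
    """Score one team's search flight under the challenge rules.

    Point values are hypothetical -- the contest describes the
    categories but not the exact amounts.
    """
    ALTITUDE_LIMIT_FT = 375  # ceiling stated in the rules

    if flew_over_people:
        return None  # disqualified

    score = 0
    score += 10 * items_found    # item: color/shape described + pinned on a map
    score += 5 * tracks_found    # human or animal tracks
    score -= 15 * major_crashes  # deduction per major crash
    if max_altitude_ft > ALTITUDE_LIMIT_FT:
        score -= 20              # altitude-limit violation
    return score
```

For example, a team that finds three items and one set of tracks with no crashes at 350 feet would score `score_flight(3, 1, 0, 350, False)`, i.e., 35 points under these placeholder values.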

While I can’t make it to Waxahachie this weekend to observe the challenge first-hand, I’m thrilled that the DC Drones group (which I belong to) is preparing to host its own drone search & rescue challenge this Spring. So I hope to be closely involved with this event in the coming months.

Although search & rescue is typically thought of as searching for people, UAVs are also beginning to appear in conversations about anti-poaching operations. At the most recent DC Drones MeetUp, we heard a presentation on the first ever Wildlife Conservation UAV Challenge (wcUAVc). The team has partnered with Kruger National Park to support its anti-poaching efforts in the face of skyrocketing rhino poaching.

The challenge is to “design low cost UAVs that can be deployed over the rugged terrain of Kruger, equipped with sensors able to detect and locate poachers, and communications able to relay accurate and timely intelligence to Park Rangers.” In addition, the UAVs will have to “collect RFID tag data throughout the sector; detect, classify, and track all humans; regularly report on the location of all rhinos and humans; and receive commands to divert from general surveillance to support poacher engagement anywhere in the sector. They also need to be able to safely operate in the same air space with manned helicopters, assisting special helicopter borne rangers engage poachers.” All this for under $3,000.

Why RFID tag data? Because rangers and tourists in Kruger National Park all carry RFID tags so they can be easily located. If a UAV automatically detects a group of humans moving through the bush and does not find an RFID signature for them, the UAV will automatically conclude that they may be poachers. When I spoke with one of the team members following the presentation, he noted that they were also interested in having UAVs automatically detect whether humans are carrying weapons. This is no small challenge, which explains why the total cash prize is $65,000 plus an all-inclusive 10-day trip to Kruger National Park for the winning team.
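The cross-check described above — flag any detected group of humans with no matching RFID signature — is simple to sketch. This is only an illustration of the idea, not the wcUAVc system: the matching radius and the position format are my own assumptions.

```python
import math

def flag_possible_poachers(human_groups, rfid_pings, max_match_m=50.0):
    """Flag detected human groups with no RFID ping nearby.

    human_groups: list of (x, y) positions of detected groups, in meters.
    rfid_pings:   list of (x, y) positions where RFID tags were read.
    max_match_m:  hypothetical matching radius -- the real system's
                  tolerance isn't specified in the challenge brief.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    flagged = []
    for group in human_groups:
        # No tag read within the matching radius: possible poachers.
        if not any(dist(group, ping) <= max_match_m for ping in rfid_pings):
            flagged.append(group)
    return flagged
```

So a group detected at (500, 500) with the nearest tag read hundreds of meters away would be flagged, while a group standing a few meters from a tag read would not. The hard part, of course, is everything upstream of this function: reliably detecting “a group of humans” from the air in the first place.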

I think it would be particularly powerful if the team could open up the raw footage for public analysis via microtasking, i.e., include a citizen science component in this challenge to engage and educate people from around the world about the plight of rhinos in South Africa. Participants would be asked to tag imagery that shows rhinos and humans, for example. In so doing, they’d learn more about the problem, thus becoming better educated and possibly more engaged. Perhaps something along the lines of what we do for digital humanitarian response, as described here.

In any event, I’m a big proponent of using UAVs for positive social impact, which is precisely why I’m honored to be an advisor for the (first ever?) Drones Social Innovation Award. The award was set up by my colleague Timothy Reuter (founder of the Drone User Group Network, DUGN). Timothy is also launching a startup, AirDroids, to further democratize the use of micro-copters. Unlike similar copters out there, these heavy-lift AirDroids are easier to use, cheaper and far more portable.

As more UAVs like AirDroids hit the market, we will undoubtedly see more and more aerial photo- and videography uploaded to sites like Flickr and YouTube. Like social media, I expect such user-generated imagery to become increasingly useful in humanitarian response operations. If users can simply slip their smartphones into their pocket UAV, they could provide valuable aerial footage for rapid disaster damage assessment purposes, for example. Why smartphones? Because people already use their smartphones to snap pictures during disasters. In addition, relatively cheap hardware add-ons can easily equip smartphones for LiDAR sensing and thermal imaging.

All this may eventually result in an overflow of potentially useful aerial imagery, which is where MicroMappers would come in. Digital volunteers could easily use MicroMappers to quickly tag UAV footage in support of humanitarian relief efforts. Of course, UAV footage from official sources will also continue to play a more important role in the future (as happened following Hurricane Sandy). But professional UAV teams are already outnumbered by DIY UAV users. They simply can’t be everywhere at the same time. But the crowd can. And in time, a bird’s eye view may become less important than a flock’s eye view, especially for search & rescue and rapid disaster assessments.

That’s why the SaR challenge will take place regardless of the weather (a point made explicit by the organizers). That’s also why the team for the WC challenge carried out a fact-finding mission that included an assessment of weather, environmental conditions, etc., in Kruger National Park and why they require winning solutions to be able to fly in relatively bad weather. I would also contend that most SaR missions do not in fact have to take place in bad weather, but rather are carried out immediately following a disaster to assess the overall impact, e.g., Typhoon Yolanda (2013), Haiti Earthquake (2010), etc. In addition, UAVs are already being used to find missing people in situations not related to weather-related disasters. Finally, some UAVs can handle relatively bad weather. My own experience during a major snow storm: https://www.youtube.com/watch?v=CsG7IgOCJ5c

I’m curious what the detection algorithm for “humans but not RFID-carrying” in Kruger would be. Thermal or mid-wave (MW) infrared might be part of it, but you’d have to distinguish between “human” and “large fauna,” which is, without a human at the sensor screen, rather a tall order.

The Fire Chief in our community wants my commercial UAV (COA) service to photograph exercises. I’m told that sometimes a fire will occur and the firemen can’t readily see what’s happening to a building from a certain angle; the Skycam would have to fly over that area very high to avoid drafts. If there’s precipitation or crowds of people without protective head gear, then all bets are off.

He said there are some exercises that don’t involve burning buildings.