How GPUs and Deep Learning Are Helping Protect Endangered Species

Finding a cheetah across a dozen square miles of African bush can be difficult.

But that same animal might lay down 200,000 footprints in a day. So finding its tracks is a lot easier.

And it turns out that, beyond being useful for locating animals, tracks can tell you a whole lot more.

This insight has led to a potentially revolutionary approach to wildlife conservation that’s about to be jump-started by deep learning and NVIDIA GPUs.

Zoe Jewell and Sky Alibhai are co-founders of WildTrack, a nonprofit devoted to monitoring endangered species. In time for World Environment Day, they’ve launched a program called ConservationFIT, where FIT stands for “footprint identification technology.”

The program seeks to crowdsource photos of animal footprints, and then use those images to build algorithms that can identify the species, individual, sex and age-class of the animal that made them.

The pair got their start in this endeavor during a two-year sabbatical in the 1990s. Their task was to monitor black rhinos for the Zimbabwean and Namibian governments during the height of a poaching crisis. They discovered their calling and never returned.

The black rhino population was being decimated because of the illegal trade in their horns. Governments had been using a combination of radio-collaring and de-horning as their primary protection methods.

After 10 years of wandering the bushlands, Jewell and Alibhai determined that those methods were failing. This was not only because the collars failed regularly, but also because the repeated immobilization of females for re-collaring had the unintended, and devastating, effect of lengthening their calving interval from one calf every three years to one every 10 years.

Follow the Footprints

Over time, the game scouts who worked with them repeatedly asked why they didn’t just follow the animals’ footprints. It turned out the scouts could learn a lot more from the footprints than simply where the animals were going.

“To our amazement, these indigenous experts were able to identify not only species, but also individuals, just from their footprints,” said Alibhai. “Time and again they would find a footprint, tell us the name of the rhino, and then track that animal down to prove the point.”

The pair spent years on painstaking attempts to track footprints with techniques such as tracing them onto acetate or developing roll after roll of film. Two developments in the mid-1990s changed everything: the advent of digital cameras, and their discovery of JMP, statistical analysis software from SAS Institute, which enabled them to develop sophisticated statistical models.

Those technologies led to the development of FIT, which today is able to use photos to classify footprints by species, individual, sex and age-class.

“It does everything from image manipulation, to the analytics, to mapping distributions,” said Alibhai.
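The article doesn't detail FIT's internals, but it does say the technique builds statistical models on footprint measurements (morphometrics). As a rough, purely illustrative sketch of that idea: reduce each footprint photo to a few landmark points, turn the pairwise distances between landmarks into a feature vector, and match a new print against each known individual's average vector. All names, landmarks and data below are hypothetical, and FIT's real models are far more sophisticated.

```python
# Toy morphometric matcher -- an illustrative sketch only, not FIT itself.
# Assumes each footprint has already been reduced to (x, y) landmark points.
from itertools import combinations
import math

def footprint_features(landmarks):
    """Pairwise distances between landmarks: a simple shape descriptor."""
    return [math.dist(a, b) for a, b in combinations(landmarks, 2)]

def nearest_individual(known_prints, query_landmarks):
    """Return the ID whose mean feature vector is closest to the query print."""
    query = footprint_features(query_landmarks)

    def distance_to(prints):
        feats = [footprint_features(p) for p in prints]
        centroid = [sum(col) / len(col) for col in zip(*feats)]
        return math.dist(centroid, query)

    return min(known_prints, key=lambda ident: distance_to(known_prints[ident]))

# Hypothetical landmark sets for two known rhinos (two prints each).
known = {
    "rhino_A": [[(0, 0), (4.0, 0), (2, 5.0)], [(0, 0), (4.1, 0), (2, 5.2)]],
    "rhino_B": [[(0, 0), (6.0, 0), (3, 8.0)], [(0, 0), (5.9, 0), (3, 7.8)]],
}
print(nearest_individual(known, [(0, 0), (4, 0.1), (2, 5.1)]))  # prints rhino_A
```

A nearest-centroid match like this stands in for the discriminant-style statistical models the pair built in JMP; the deep learning work with SAS and NVIDIA GPUs described below explores learning such features directly from images instead.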

More Algorithms Needed to Meet Demand

So far, WildTrack has developed FIT algorithms for 15 species (including the black rhino that started it all), and demand for FIT has skyrocketed in the past year as field projects around the world look to ramp up monitoring of endangered species.

ConservationFIT was launched to help build algorithms faster by using photos uploaded from field biologists, trackers and citizen scientists armed with smartphones.

As a result, Jewell and Alibhai have been working with SAS, using NVIDIA GPUs, to explore deep learning options for keeping up with the expected flood of photos and the metadata they contain.

“We’re in the early stages, exploring footprints, animal coat patterns, hair sample structures and other morphometric traits, but the initial results have greatly exceeded our expectations,” said Jewell. “We anticipate that NVIDIA GPUs will increase the speed at which we’re able to process data by several orders of magnitude.”

Eventually, Jewell and Alibhai want to take full advantage of the 8 billion recreational visits currently made to protected areas around the world.

“Imagine if just one percent of those visitors carried smartphones and collected some footprint data,” said Jewell. “That would provide 80 million data points for deep learning and FIT to determine species distribution, and identify individuals.”

Watch Jewell and Alibhai’s interview with Duke University for more about WildTrack’s conservation work.