Global warming and pollution are causing severe stress to coral reefs across the world.

Researchers from the University of California, Berkeley, and the University of Queensland developed a deep learning system that automatically analyzes reef photos, helping scientists measure reef health and track changes over time. Reefs provide food and shelter for more than a quarter of all marine species, support fish stocks that feed more than a billion people, and provide jobs for millions in coastal areas.

The new technology “will allow the world’s scientists to more quickly assess the health of coral reefs at scales never dreamed of before,” said Ove Hoegh-Guldberg, chief scientist of the XL Catlin Global Reef Record and a professor at the University of Queensland. With that information, scientists can more effectively take steps to protect and preserve reefs.

Oscar Beijbom, a postdoctoral scholar at Berkeley, used TITAN X GPUs and the Caffe deep learning framework to train the team’s image recognition system to identify 40 categories of corals, sponges, algae and other elements, achieving a 900x speedup over previous methods.
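Coral reef surveys are commonly analyzed by labeling small patches at sampled points in each photo and then estimating the percent cover of each category from those point labels; automating the per-point classification is what yields the speedup. The sketch below illustrates the cover-estimation step only. It is a minimal illustration, not the team’s actual code, and the category names are hypothetical stand-ins for the real 40-class taxonomy.

```python
from collections import Counter

def estimate_cover(point_labels):
    """Return the fraction of survey points assigned to each category.

    point_labels: labels a patch classifier might emit, one per
    sampled point in a reef photo.
    """
    counts = Counter(point_labels)
    total = len(point_labels)
    return {label: n / total for label, n in counts.items()}

# Example: hypothetical classifier output for 10 sampled points.
labels = ["hard_coral", "hard_coral", "algae", "sand", "hard_coral",
          "algae", "sponge", "hard_coral", "sand", "hard_coral"]
cover = estimate_cover(labels)
print(cover["hard_coral"])  # 0.5
```

Repeating this over thousands of photos gives ecologists the time series of coral, algae, and sand cover used to track reef health.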

Photo of a coral reef automatically analyzed by deep learning. The program is able to identify as many as 40 categories of corals, sponges, algae and other organisms. Image courtesy of the XL Catlin Global Reef Record.

All of the coral reef images, nearly 225,000, are hosted on the CoralNet web portal, and Beijbom will soon add the deep learning system to the platform, enabling coral reef ecologists to take advantage of automated analysis.

About Brad Nemire

Brad Nemire is on the Developer Marketing team and loves reading about all of the fascinating research being done by developers using NVIDIA GPUs. Reach out to Brad on Twitter @BradNemire and let him know how you’re using GPUs to accelerate your research. Brad graduated from San Diego State University and currently resides in San Jose, CA.