OCEANS:

Biologists try anti-terror technology to spy on corals

LA JOLLA, Calif. -- Coral ecologists are taking a page out of the national security playbook: facial recognition technology.

Techniques that can pick a terrorist out of a crowd are being adapted for underwater images, producing data on reefs in record time.

"It's a technology domain whose time has arrived," said Greg Mitchell, an optical ecologist at Scripps Institution of Oceanography. "There is going to be a huge expansion of our ability to understand what's going on in these systems."

Coral ecologists are seeking to document changes to reefs, like this one near Yanuca, Fiji, by taking underwater photos and having a computer evaluate the images automatically. Photo courtesy of David Kline.

Mitchell and his colleagues teamed up with University of California, San Diego, computer vision scientists, who aim to automatically extract information from photos or videos. Over the past several years, the group has developed software that analyzes a coral reef picture in seconds.

The software not only saves hours of "mind-numbing" work, but also opens the door to much larger and more frequent surveys. If a camera attached to an underwater vehicle patrolling protected areas takes thousands of pictures, the information can be processed in a matter of days and given to resource managers.

"The more data you have, the better ecological insight you have into success or failure of a management plan," said David Kline, a Scripps coral ecologist working on the project.

Monitoring on that scale wouldn't be possible without advances in digital cameras, more efficient batteries, faster computers and ever-cheaper data storage, Mitchell noted.

"It's big data applied to ecology," he said.

The team has made the software available for free online, and researchers from around the world laud it as a powerful new tool.

Rusty Brainard, chief of the Coral Reef Ecosystem Division at the National Oceanic and Atmospheric Administration, said the program will help his team analyze images more efficiently.

However, he noted that faster analysis is not necessarily required by reef managers because "most management decisions aren't being made tomorrow."

But there are other potential benefits to automation, Brainard added. Sometimes human-collected survey data are questioned by fishermen or developers whose activities may be restricted as part of reef management, he said. If the automatic analysis can prove it's more accurate than humans, he said, it might assuage those concerns and smooth the management process.

'The algorithm does better'

To document changes to coral reefs, researchers and managers first need baseline data about the quantity and type of corals in each area.

Historically, divers swam along predetermined lines and counted what they saw, taking notes on an underwater clipboard. These days, researchers swim or are towed behind boats along those same lines, but take pictures and later count corals in the lab.

Digital cameras allow them to cover more ground, but identifying 200 points in a single image can take up to 20 minutes. If a researcher took 1,000 pictures on just one dive, much of the data could go unprocessed simply for lack of time.

The computer vision program helps solve that problem. But can managers trust the technology to be as good as a human expert?

The computer correctly identified corals versus other materials like rock, sand or algae between 92 and 95 percent of the time, the researchers said. Within corals, it correctly assigned the coral's genus -- the biological classification level above species -- 97 percent of the time.

"The algorithm does better [than humans] distinguishing genus level of corals," said Oscar Beijbom, a computer vision science Ph.D. student working on the project.

But the computer had some trouble specifically classifying the full spectrum of reef components: rock, algae, microalgae, sand and several classes of corals. It correctly identified randomly selected points in the picture between 74 and 83 percent of the time.

While the computer makes mistakes, so do humans. Even coral experts disagree over classification, or make mistakes because of the repetitive nature of the task, the researchers noted. The computer "turns out to be mostly within the error bars of what people do," said David Kriegman, a computer science professor at UCSD developing the software.

How it works

Once found only in science fiction movies, facial recognition technology is everywhere -- it's how a camera phone picks out people to focus on, and how Facebook suggests friends to tag in a picture.

But while faces are easier to differentiate, it's difficult to pick out any one object in a coral reef because the materials are jumbled together. So Beijbom and Kriegman designed software to decode images using texture and color.

Team member Tali Treibitz photographs a reef around Totoya island in Fiji using fluorescent light. Corals glow in bright colors under fluorescent light, making it easier for the computer to identify them in the pictures. Photo by Keith Ellenbogen.

Examined pixel by pixel, corals, algae, rocks and even sand have unique patterns -- a bit like fingerprints, Beijbom said. The researchers decided to identify the patterns associated with each element and build a reference database with thousands of examples.

They started with 2,000 images from a National Science Foundation long-term research site, a coral reef by Moorea Island in French Polynesia in the South Pacific. Coral ecologists had manually identified 400,000 points within the images, providing a "gold mine" from which to build the reference database.

Now, when presented with a new coral reef image, the computer extracts information about color and texture and compares it to the reference database. It labels hundreds of points within each picture based on what they most closely match. The whole process takes all of 20 seconds, compared to a person's typical five to 20 minutes.
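That pipeline -- extract color and texture features around each sampled point, then label the point by its closest matches in an expert-labeled reference database -- can be sketched roughly as follows. This is a hypothetical illustration using scikit-learn, simple color histograms and gradient statistics, and made-up genus/substrate names; CoralNet's actual features and classifier are more sophisticated.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def point_features(image, y, x, patch=15):
    """Crude descriptor for the patch around one survey point:
    per-channel color histograms (color) plus a gradient-magnitude
    histogram (texture)."""
    half = patch // 2
    p = image[max(0, y - half):y + half + 1, max(0, x - half):x + half + 1]
    feats = []
    for c in range(p.shape[2]):  # color: one 8-bin histogram per channel
        hist, _ = np.histogram(p[:, :, c], bins=8, range=(0, 256))
        feats.append(hist / hist.sum())
    gray = p.mean(axis=2)
    gy, gx = np.gradient(gray)   # texture: distribution of local gradients
    mag = np.hypot(gy, gx)
    tex, _ = np.histogram(mag, bins=8, range=(0, 128))
    feats.append(tex / max(tex.sum(), 1))
    return np.concatenate(feats)

# Reference database: features at expert-labeled points in training images
rng = np.random.default_rng(0)
train_imgs = rng.integers(0, 256, size=(20, 64, 64, 3))
labels = rng.choice(["Porites", "Acropora", "sand", "algae"], size=20)
X = np.array([point_features(im, 32, 32) for im in train_imgs])

clf = KNeighborsClassifier(n_neighbors=3).fit(X, labels)

# A new survey photo: label sampled points by their closest reference matches
new_img = rng.integers(0, 256, size=(64, 64, 3))
pts = [(16, 16), (32, 32), (48, 48)]
preds = clf.predict(np.array([point_features(new_img, y, x) for (y, x) in pts]))
print(preds)  # one predicted label per sampled point
```

Because each point is scored independently against the database, labeling hundreds of points per image is fast once the features are computed -- consistent with the seconds-per-image speed the team reports.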

While ecologists would love to simply upload images into the computer and have results spit out in a half hour, it's unlikely they will achieve complete automation, Beijbom said.

The same coral species can take different shapes at different depths and in different parts of the world. Even in the same location, different water and light conditions could skew the analysis, so researchers have to manually label a small part of each new data set to calibrate the computer before letting it loose on the rest of their images.

"I don't think it's a matter of being fully automated," Beijbom said. "It's needing less and less annotations."

Fewer annotations would be welcomed by Vincent Moriarty, who works on the Moorea Coral Reef Long Term Ecological Research site, conducting coral reef surveys and annotating the thousand images taken each year.

"I am looking forward to handing this part of my job over to a computer at some point," he said.

Improved accuracy

Accuracy improves as the database grows with each new processed image; this is how the computer "learns." That's in part why the team freely provided the software online at CoralNet. Researchers from around the globe can upload their images for automatic analysis, which helps the software continue to learn and improve with more examples from different locations.
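The "learning" here is supervised: each newly annotated point becomes another training example, and more examples generally mean better accuracy. A hypothetical demonstration on synthetic stand-in data (not coral features) shows the effect of growing the reference set:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for point features across four substrate classes
X, y = make_classification(n_samples=2000, n_features=10, n_informative=6,
                           n_classes=4, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(
    X, y, test_size=500, random_state=0)

# Held-out accuracy as the labeled "reference database" grows
scores = []
for n in (50, 200, 1500):
    clf = KNeighborsClassifier(n_neighbors=5).fit(X_pool[:n], y_pool[:n])
    scores.append(clf.score(X_test, y_test))
print(scores)  # accuracy tends to rise as more labeled points accumulate
```

This is the incentive behind opening the tool up: every lab that uploads and corrects annotations enlarges the pool everyone's classifier can learn from.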

One CoralNet user is the Catlin Seaview Survey, which is on a mission to systematically document the world's coral reefs in 360-degree, high-resolution images. The group, which has taken more than 100,000 images of the Great Barrier Reef and is now working in the Caribbean, praised CoralNet for extracting useful information from its images 50 times faster than a human expert could.

"CoralNet is a very powerful tool for automated image annotation," said Manuel González-Rivero, lead shallow-reef survey scientist for the Catlin Seaview Survey. "It is flexible to our needs and user-friendly."

The Scripps and UCSD team continues to tweak the program to improve accuracy. For example, they are investigating fluorescent imaging of coral reefs.

The bright colors make it much easier for humans -- and the computer -- to identify corals in images. Fluorescence could also improve the accuracy of the automatic analysis by flagging dead corals, which have the same texture pattern as live corals but should not be logged as such.

Another benefit of the fluorescent images: Baby corals that are almost impossible to see in regular photographs become bright beacons.

"We are trying to improve it," Kline said, "so we can quantify how many baby corals are being produced, which is really important for the recovery of a reef."

E&E is the leading source for comprehensive, daily coverage of environmental and energy politics and policy.