Study Finds Racial Bias in Amazon's Facial Recognition Tech

By Michael Kan | Jan. 26, 2019, 6:36 a.m.

'In light of this research, it is irresponsible for the company to continue selling this technology to law enforcement or government agencies,' wrote one of the researchers behind the study. However, Amazon is disputing the results.

Amazon's facial recognition product, which the company has been marketing to police departments, may have some serious bias against women of color.

A new study published on Thursday looked at whether the company's Rekognition product can accurately identify a person's gender. On photos of men, Amazon's system was nearly flawless. But not so much when it came to women with dark skin tones: The Rekognition product misclassified their gender 31 percent of the time.

The study came from Joy Buolamwini of the MIT Media Lab and Deborah Raji of the University of Toronto. A year ago, the two researchers tested the facial recognition products from Microsoft and IBM, and found the systems also struggled to accurately identify the gender of darker-skinned women. In response, both Microsoft and IBM updated their facial recognition technology to reduce the error rates.

Buolamwini is now calling on Amazon to address the bias of the company's Rekognition product, amid growing worries the technology is both error-prone and ripe for abuse. "In light of this research, it is irresponsible for the company to continue selling this technology to law enforcement or government agencies," Buolamwini wrote in a separate blog post.

However, Amazon is dismissing her study, which was conducted back in August, calling the results inaccurate. This is because the researchers based their testing on "facial analysis" as opposed to true "facial recognition," according to Matt Wood, the general manager of artificial intelligence at Amazon Web Services.

In facial analysis, the computer system is trying to assign generic attributes to a picture, such as whether the person shown is wearing glasses, has a mustache, or may be female. Recognition is different; it focuses on trying to find matching photos of a particular face.

"It's not possible to draw a conclusion on the accuracy of facial recognition for any use case —including law enforcement— based on results obtained using facial analysis," Wood said in a statement.

"The results in the paper also do not use the latest version of Rekognition and do not represent how a customer would use the service today," he added.

But Buolamwini is pushing back against Amazon's claims. "If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free," she wrote in her 3,000-word blog post, which responds to the company's criticism of her study.

The study's co-author, Deborah Raji, also told PCMag that their tests of the Rekognition system occurred under favorable conditions, with pictures of the subjects that were clear and easy to view. "This is a demonstration of how badly the technology fails in even incredibly easy cases," she said in an email.

Their study arrives as dozens of civil rights groups and even Amazon employees have spoken out against the company's plans to sell the facial recognition software to government agencies. A major worry is that the same technology will one day power mass surveillance. Later this spring, a group of Amazon investors plans to protest by calling for a shareholder vote demanding the e-commerce giant halt the sales until the technology has been proven safe.

In response to the concerns, Amazon has said, "This technology is being implemented in ways that materially benefit society, and we have received no indications of misuse." If abuse is ever detected, the company has pledged to ban the offender from using Amazon's software again.

Editor's Note: This story has been updated with comment from Deborah Raji.