Amazon’s cheap facial recognition technology, Rekognition, ended up with egg on its face when it misidentified 28 members of the US Congress ‒ many of them people of color ‒ as criminal suspects in an experiment conducted by the ACLU.

The software, which Amazon is hawking to US law enforcement agencies, "incorrectly matched 28 members of Congress, identifying them as other people who have been arrested for a crime," the civil rights group wrote Thursday.

Sputnik News reported on Rekognition in May, at a time when Amazon was selling access to the tech to at least one sheriff's department for between $6 and $12 a month, on the condition that the department promote the service to other potential customers, including body camera manufacturers.

According to the ACLU then, Amazon is "marketing Rekognition for government surveillance."

The technology is reportedly capable of identifying up to a hundred people in a single image. "Unlike anything else, it handles real-time video," Amazon Web Services CEO Andy Jassy said in a video published November 30, 2017. That would make it well suited for pairing with city surveillance camera systems, as Orlando, Florida, did for a time.

Later in May, Sputnik News reported that two congressional Democrats, Keith Ellison and Emanuel Cleaver, had penned a letter to Amazon CEO Jeff Bezos demanding a list of police departments using the tech in addition to other requests.

"A series of studies have shown that face recognition technology is consistently less accurate in identifying the faces of African-Americans and women as compared to Caucasians and men. The disproportionally high arrest rates for members of the black community make the use of facial recognition technology by law enforcement problematic, because it could serve to reinforce this trend," the letter says.

Now it seems the ACLU is trying to drum up more opposition to the program in Congress. The group's findings also validated the fears of Cleaver and Ellison. In the ACLU's test, Rekognition turned up false matches for six members of the Congressional Black Caucus. Forty percent of the total false matches from the US Congress were people of color, even though people of color make up only 20 percent of Congress.

The ACLU notes that they purchased Rekognition for just $12.33, "less than a large pizza."

Their methodology followed Rekognition orthodoxy to a tee, although Amazon now disputes it. First, they built a database of photographs of people who have been arrested (which doesn't make them de facto criminals, as mugshots are taken prior to conviction at trial), using 25,000 publicly available photos. The Washington County Sheriff's Office in Oregon, which was the first reported police purchaser of Rekognition, built a database of 30,000, Sputnik News reported.

The ACLU "used the default match settings that Amazon sets for Rekognition." That, according to Amazon, is an "80 percent confidence" threshold for the software to categorize a comparison as a match.

Amazon rebutted the ACLU's findings in a lengthy statement issued to Business Insider, arguing that the civil rights group failed to follow best practices for the software. "While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals or other social media use cases, it wouldn't be appropriate for identifying individuals with a reasonable level of certainty," the company said.

The company said it recommends law enforcement customers set the threshold to 95 percent or higher, and it emphasized that Rekognition is not supposed to be used autonomously, but with a human partner.
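The dispute comes down to how a confidence threshold filters candidate matches. A minimal sketch of that filtering step, with invented names and scores purely for illustration (a real Rekognition response reports a per-match "Similarity" score; this is not Amazon's API):

```python
# Illustrative only: how raising a confidence threshold from the 80 percent
# default to the 95 percent Amazon recommends for law enforcement changes
# which candidates count as "matches". Scores below are made up.

candidates = [
    {"person": "record_A", "confidence": 99.1},
    {"person": "record_B", "confidence": 93.4},
    {"person": "record_C", "confidence": 81.7},
    {"person": "record_D", "confidence": 62.0},
]

def matches(candidates, threshold):
    """Return the people whose match confidence meets the threshold."""
    return [c["person"] for c in candidates if c["confidence"] >= threshold]

print(matches(candidates, 80))  # three candidates clear the default threshold
print(matches(candidates, 95))  # only one clears the recommended setting
```

At the 80 percent setting, the two middling scores above are reported as matches; at 95 percent they are discarded, which is the crux of Amazon's objection to the ACLU's test.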

Nonetheless, the ACLU is concerned with dangerous — potentially deadly — scenarios the technology could foment. "If law enforcement is using Amazon Rekognition, it's not hard to imagine a police officer getting a 'match' indicating that a person has a previous concealed-weapon arrest, biasing the officer before an encounter even begins," they wrote. The Democrats' letter to Bezos raised the same fear.

"A recent incident in San Francisco provides a disturbing illustration of that risk," the ACLU continued. "Police stopped a car, handcuffed an elderly Black woman and forced her to kneel at gunpoint — all because an automatic license plate reader improperly identified her car as a stolen vehicle."
