Facial ID programs carry same bias as enforcers checking...

The ACLU and other organizations have asked Amazon to stop selling its facial-recognition tool, called Rekognition, to law enforcement agencies.

Photo: Carline Jean / Sun-Sentinel


A U.S. Customs and Border Protection officer helps a passenger navigate one of the new facial recognition kiosks at an airport gate.

Photo: David J. Phillip, Associated Press


An Apple employee demonstrates the facial recognition feature of the new iPhone X.

Photo: Eric Risberg, Associated Press

Back in the days of my dissipated youth, I was in need of a fake ID.

This was before Real ID and rampant identity theft, so it wasn’t a high-risk process. Any enterprising Kinko’s employee could knock together a fake ID for a minor, and half of the students in my college dorm had found at least one who was willing to do so. But I was unsatisfied with the quality of the results I’d seen.

A Trinidadian friend of mine came to the rescue with a gift: her elder sister’s old ID. She’d used it for years, she explained, and now that she was 21, she was happy to pass it on.

Her sister — I’m still sad that I never met her — and I had some features in common. We both had light brown skin, dark curly hair and almond-shaped eyes. If I saw the two of us on the street together, I’d believe we were cousins.

But we didn’t look like the same person. The elements of our faces didn’t match. Our jawlines, foreheads and noses were different. Our smiles had little in common.

She was right. I breezed past every single bouncer and bartender with that ID. I was amused by this but also slightly unnerved.

The memory came back to me this week when I learned about Amazon’s facial recognition service, which has been around since 2016. Amazon’s program, called Rekognition, is in the news again because the ACLU of Northern California obtained documents showing the company had been marketing it to local law enforcement agencies in states including Florida and Oregon.

Law enforcement agencies in California, Arizona and other places all indicated interest in Rekognition, but we don’t yet know how many of them are using it.

That’s problem No. 1.

Amazon has said it expects a common use of the Rekognition service will be tracking people. So marginalized communities, already under threat in this political climate, can expect even greater surveillance.

The company has also encouraged agencies to use the service to scan footage from police body cameras. Communities across the country have agitated for these devices in the hope that they’ll be tools for police accountability. Amazon’s service will turn them into nothing more than another surveillance device.

Amazon’s response to the issue hasn’t been convincing.

“Amazon requires that customers comply with the law and be responsible when they use (our) services,” said an Amazon spokesperson in a statement. “When we find that AWS services are being abused by a customer, we suspend that customer’s right to use our services.”

Problem No. 2: This is a lot more serious than shutting down someone’s Amazon Prime account.

Facial recognition systems are not neutral technology. In February, researchers from MIT and Stanford released a study of three commercially released facial analysis programs and found striking error disparities by race and gender. The three programs had only a 0.8 percent error rate in determining the gender of light-skinned men, but their error rates for darker-skinned women were off the charts: 20 percent for one program and 34.7 percent for the other two.

In every way that matters, tech companies create products that reflect their workforces, not the world. That’s a problem for a lot of reasons.

But when it comes to matching facial recognition systems with law enforcement agencies, this is a problem with the biggest possible dimensions. What’s at stake is freedom versus imprisonment, and even life versus death.

The ACLU, along with dozens of civil rights organizations, has called on Amazon to stop providing Rekognition to governments. That’s a good cause, especially since Amazon hasn’t released any of its data to the public. But it’s a limited one.

All brown people look alike here.

The vast expansion of facial recognition software and the expansion of artificial intelligence to videos and image-scanning means that we’re on the cusp of a new era in surveillance.

If we waltz into it with the same old values, the results will be inaccurate, dangerous and unjust. This is the path we’re already heading down. It’s not too late to change course.

Caille Millner is an editorial writer and Datebook columnist for the San Francisco Chronicle. She has worked at the paper since 2006. On the editorial board, she covers a wide range of topics including business, finance, technology, education and local politics. For Datebook, she writes a weekly column on culture. She is the recipient of the Scripps-Howard Foundation’s Walker Stone Award in Editorial Writing and the Society of Professional Journalists’ Editorial Writing Award.