A broad coalition of over 50 civil liberties groups delivered a letter to the Justice Department’s civil rights division Tuesday calling for an investigation into the expanding use of face recognition technology by police. “Safeguards to ensure this technology is being used fairly and responsibly appear to be virtually nonexistent,” the letter stated. The routine unsupervised use of face recognition systems, according to the dozens of signatories, threatens the privacy and civil liberties of millions — especially those of immigrants and people of color.

These civil rights groups were provided with advance copies of a watershed 150-page report detailing — in many cases for the first time — how local police departments across the country have been using facial recognition technology. Titled “The Perpetual Lineup,” the report, published Tuesday morning by the Georgetown Center on Privacy & Technology, reveals that police deploy face recognition technology in ways that are more widespread, advanced, and unregulated than anyone has previously reported.

“Face recognition is a powerful technology that requires strict oversight. But those controls by and large don’t exist today,” said Clare Garvie, one of the report’s co-authors. “With only a few exceptions, there are no laws governing police use of the technology, no standards ensuring its accuracy, and no systems checking for bias. It’s a wild west.”

Of the 52 agencies that acknowledged using face recognition in response to 106 records requests, the authors found that only one had obtained legislative approval before doing so. Government reports have long confirmed that millions of images of citizens are collected and stored in federal face recognition databases. Since at least 2002, civil liberties advocates have raised concerns that millions of driver’s license photos of Americans who have never been arrested are being subjected to face searches — a practice that amounts to a perpetual digital lineup. The report substantiates such fears, demonstrating that at least one in four state or local law enforcement agencies have access to face recognition systems.

Among its findings, the report provides the most fine-grained detail to date on how exactly these face recognition systems might disproportionately impact African-Americans. “Face recognition systems are powerful — but they can also be biased,” the coalition’s letter explains. While one in two American adults have face images stored in at least one database, African-Americans are more likely than others to have their images captured and searched by face recognition systems.

In Virginia, for instance, the report shows how state police can search a mug shot database disproportionately populated with African-Americans, who are twice as likely to be arrested in the state. Not only are African-Americans more likely to be subject to searches, according to the report, but this overrepresentation puts them at greatest risk for a false match.

These errors could be compounded by the fact that some face recognition algorithms have been shown to misidentify African-Americans, women, and young people at unusually high rates. In a 2012 study co-authored by FBI experts, three algorithms that were tested performed between 5 and 10 percent worse on black faces than on white faces. And the overall accuracy of these systems has been shown to decrease as a dataset expands. The Georgetown researchers interviewed two major face recognition vendors, which said that they did not test for racial bias, despite the fact that systems have been shown to be far from “race-blind.”
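The report’s observation that accuracy decreases as a dataset expands follows from basic probability: every additional gallery image is another chance for a false match. A minimal sketch, assuming a hypothetical, independent per-comparison false-match rate (real systems are more complex, but the trend is the same):

```python
def p_any_false_match(fmr: float, gallery_size: int) -> float:
    """Probability that a single probe search returns at least one
    false match, assuming each gallery comparison independently
    produces a false match with probability `fmr` (a simplification)."""
    return 1 - (1 - fmr) ** gallery_size

# With a per-comparison false-match rate of 1 in 100,000, the risk of
# at least one false match grows quickly with database size:
for size in (1_000, 100_000, 10_000_000):
    print(f"{size:>10,} images: {p_any_false_match(1e-5, size):.1%}")
```

Under these illustrative assumptions, a search against a 10-million-image license database yields a false match almost certainly, which is why overrepresentation in the underlying database translates directly into elevated false-match risk.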

A slideshow on San Diego’s privacy policy obtained by the researchers reveals that people of color in the county are between 1.5 and 2.5 times more likely to be targeted by its surveillance systems. San Diego County uses a mugshot-only system, and repeated studies have shown that African-Americans are twice as likely as white people to be arrested and searched by police.

New York Police Department officers watch demonstrators as they stage a die-in on the floor of Grand Central Station in New York on December 6, 2014.

Photo: Eduardo Munoz Alvarez/AFP/Getty Images

First Amendment Concerns

The Georgetown report shows for the first time that at least five major police departments have “run real-time face recognition off of street cameras, bought technology that can do so, or expressed a written interest in buying it.” The authors warn that such real-time surveillance tracking could have serious implications for the right to associate privately.

“This is the ability to conduct a real-time digital manhunt on the street by putting people on a watchlist,” explained Alvaro Bedoya, the executive director of the Georgetown Center and one of the report’s co-authors. “Now suddenly everyone is a suspect.” Real-time recognition, he added, could have a chilling effect on people engaging in civil conduct. “It would be totally legal to take a picture of people obstructing traffic and identify them.”

Indeed, as the ACLU revealed last week, face recognition systems were used to track Black Lives Matter protesters in Baltimore. “There’s a question of who is being subjected to this kind of facial recognition search in the first place,” David Rocah, a staff attorney at the ACLU of Maryland, told the Baltimore Sun. “Is it only Black Lives Matter demonstrators who get this treatment? Are they drawing those circles only in certain neighborhoods? The context in which it’s described here seems quintessentially improper.”

Bedoya pointed out that in Baltimore, police uploaded social media photographs of protesters into these systems to conduct real-time street surveillance. “It turns the premise of the Fourth Amendment on its head,” he added.

The Georgetown report shows that some departmental policies allow face recognition algorithms to be used in the absence of individualized suspicion, which means the technology could conceivably be used to identify anyone. At least three agencies, according to the report, allow face recognition searches to identify witnesses to a crime in addition to criminal suspects.

As privacy organizations have previously noted, the FBI’s federal database includes and simultaneously searches photographic images of U.S. citizens who are neither criminals nor suspects. The Georgetown report likewise shows that some state databases include only mug shots, while others include both mug shots and driver’s license photos.

In a landmark Supreme Court decision on privacy, in which the justices unanimously concluded that the prolonged use of a warrantless GPS device violated the Fourth Amendment, Justice Sotomayor wondered whether “people reasonably expect that their movements will be recorded and aggregated in a manner that enables the government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.”

Of the 52 agencies found by the report to have used face recognition, however, only one department’s policy explicitly prohibited officers from “using face recognition to track individuals engaging in political, religious, or other protected free speech.”

Apart from some news stories focusing on the policies of specific departments, most notably those of San Diego County, reporting on law enforcement’s use of face recognition technology has been scarce. Departments themselves have not been forthcoming about their use of the technology to identify suspects on the streets and to secure convictions. And many of the documents obtained by privacy organizations about face recognition programs largely date to 2011, prior to the federal face program’s full implementation.

No Oversight, Little Data

This is partly due to how little information is available. There is no national accounting of which departments use these programs, how they work, what policies govern them, who can access them, or how information is passively collected and queried. The Georgetown report, compiling tens of thousands of records produced in response to Freedom of Information requests sent to 50 of the largest police departments across the country, provides the most comprehensive snapshot to date of how and on whom face recognition systems are used, and what policies, if any, constrain their use. But even this picture remains partial, given the continued lack of transparency of several large law enforcement agencies with some of the most advanced systems.

The researchers state that despite several news articles and descriptions of the New York Police Department’s face recognition program, the NYPD denied their records request entirely, arguing that the records fell under a “non-routine techniques and procedures” exemption. Likewise, while the Los Angeles Police Department has claimed to use real-time, continuous face recognition and has made decades of public statements about the technology, the department found “no records responsive to [their] request” for information about this or any other face recognition system. “We followed up with a number of emails and calls inquiring what that meant,” Garvie said. “The final word was that they found no records responsive.”

Of the 52 agencies that did provide responsive records to the researchers, at least 24 did not provide a face recognition use policy. Four of those two dozen agencies expressly admitted that they lacked any policy whatsoever to govern their face recognition systems.

Civil rights groups have long described the difficulties of calling for greater oversight for a system whose contours, uses, and abuses are unknown. The amount of up-to-date public records collected by the Georgetown researchers has the potential to change this and spark a national conversation on oversight, Bedoya said.

“I genuinely hope that more and more of the American public has a chance to see what’s at stake here,” Bedoya said, describing face recognition as “an extraordinarily powerful tool.” “It doesn’t just track our phones or computers. It tracks our flesh and our bones. This is a tracking technology unlike anything our society has ever seen. You don’t even need to touch anything.”

No national guidelines, laws, or policies currently regulate law enforcement’s use of face recognition technology. To fill this gap, the Georgetown report proposes legislation protective of civil liberties, limits on the amount and types of data stored, and a push for independent oversight and public notice procedures.

Among their recommendations, the Georgetown researchers advise that mug shots, rather than driver’s license and ID photos, be used to populate face recognition databases, and that those images be “periodically scrubbed to eliminate the innocent.” They also suggest that financing for police face recognition systems be contingent “on public reporting, accuracy and bias tests, legislative approval—and public posting—of a face recognition use policy.”

In Seattle, where a face recognition program was funded by a $1.64 million grant from the Department of Homeland Security, some of these model guidelines are already in place. Only specially trained officers use the software, real-time use is banned, and the software’s use is limited to scanning suspicious subjects only.

The ACLU, when it first investigated nascent uses of face recognition technology back in 2002, presciently warned that the “worst-case scenario … would be if police continue to utilize facial recognition systems despite their ineffectiveness because they become invested in them, attached to government or industry grants that support them, or begin to discover additional, even more frightening uses for the technology.”

The Georgetown report offers a glimpse into this worst-case scenario, but Bedoya is hopeful that the Model Face Recognition Act proposed by the report and endorsed by the letter’s signatories provides a “deeply reasonable” solution. He pointed to the fact that state legislatures have previously passed laws limiting police use of geolocation tracking, automatic license plate readers, drones, wiretaps, and other surveillance tools. “This is very feasible. It’s not about protecting criminals. It’s about protecting our values.”

Top photo: A facial recognition program is demonstrated during the Biometrics 2004 exhibition and conference on October 14, 2004, in London.


Here we have yet another misapplication of technology to the solution of what is essentially a social problem. Even more frightening than that is the fact that the people who employ it have no comprehension of its limitations. How many police, do you think, have studied probability and statistics to a sufficient degree to understand the meaning, much less the ubiquity, of Type I and Type II errors? How many of them can articulate the limitations of machine learning, and explain the implications of out-of-sample errors for the results of their facial recognition software? None of them! And nobody in their organizations! But unscrupulous vendors are always happy to make a buck by selling their wares, and none of the above are concerned in the least about sending innocent people to jail.
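The base-rate effect this commenter alludes to can be made concrete with Bayes’ rule. Using purely hypothetical numbers (not figures from the report), even an accurate system searching a largely innocent population will produce mostly Type I errors:

```python
def match_precision(tpr: float, fpr: float, prevalence: float) -> float:
    """Fraction of flagged matches that are true matches (Bayes' rule).

    tpr:        true-positive rate (sensitivity)
    fpr:        false-positive rate (Type I error rate)
    prevalence: fraction of the searched population that is the true target
    """
    true_positives = tpr * prevalence
    false_positives = fpr * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Hypothetical: a 99%-sensitive system with a 0.1% false-positive rate,
# hunting one true suspect among 100,000 people.
p = match_precision(tpr=0.99, fpr=0.001, prevalence=1 / 100_000)
# p comes out below 1%: the overwhelming majority of "hits" are innocent.
```

The point of the sketch: when the true suspect is rare in the searched population, the false-positive rate, not the headline accuracy, dominates who actually gets flagged.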

It is concerning that facial recognition software doesn’t work equally well on all ethnic groups. If people are going to lose their privacy, African Americans should too. It is maddening to think of them moving around in relative privacy, while everyone else’s movements are tracked, placed in databases and analyzed.

As Mrs. Clinton would say, we need a new Manhattan project to ensure that facial recognition software works for everybody. Taking away the rights of all groups, equally, is the American way. I would like to thank the author of this article for reminding us that although much progress has been achieved, there is much work left to do.

If you have a Colorado-issued ID, you are automatically entered into a facial recognition database. How do I know? I got my license renewed a few years ago, and after taking the photo, the clerk remarked “Wow, you have a really symmetrical face!” I asked him what he was talking about, and he proudly described the DMV’s new facial recognition system, which includes everyone who gets their picture taken for an ID. I looked it up online, and sure enough, there was nothing on the DMV site, but there were a few small articles in the local news about the new system, assuring us it would only be used for “internal” use.

Five years later, if you look it up online, there’s a snarky article about the same thing. Only this time, they are assuming that facial recognition is a “given”, and they are making fun of the fact that you can’t smile in your picture, because it messes up the results. And five years later, there’s nothing about it on the DMV site, and there’s no way of opting out.

The author just wrote an article on 10/13 about Steve Talley, a man who was wrongfully arrested for a robbery he couldn’t have committed, but who was supposedly identified based on facial recognition software. The police claim that after pictures of the robber were posted, three “friends” turned him in, and then they used facial recognition to match up current pictures of Talley with the images from the robbery.

Funny thing, though. The robber was described as being 5′ 8″ and slender, with a profile picture that shows a small nose. But supposedly three “friends” turned in Talley – who is almost 6′ 4″ and a large man with a large nose – as the robber.

Bullshit. It seems likely to me that Denver Police took a front-facing photo of the robber, ran a facial-recognition algorithm on it, and then compared it to the Colorado ID database. If you look at front-on photos, and that alone, Talley actually looks fairly similar to the robber. But if you look at the video, you can see that the robber is significantly smaller than 6′ 4″, and in a profile shot, the robber has a small, almost delicate nose, while Talley has a rather large nose with a pronounced bridge. There’s just no way, if you look at the video and non-frontal-facing stills, that you would think he was “that guy”.

I do not for one minute think that “burqa bans” have anything to do with Islamophobia, nor does the “creepy clown hysteria” have anything to do with prankster clowns. These fuckers think ahead, and they are already setting out their position to keep people from evading their spying.

And I think they think further ahead than this, because there’s only so much law enforcement you can do. I mean, having facial recognition of everybody just to jail criminals only gets you as far as there are criminals, and the population is already pretty cowed. No, I think it has more to do with, say, watching to see whether somebody goes for a soda can in a machine that costs a dollar. If the poor fool will buy soda at that rate, then the next time he walks up to a supermarket rack there will be little electronic numbers for the price that spin and go way up when they see the mark arrive. For him to cover his face and deny the soda machine operator the right to this valuable intellectual property, why, that would be THEFT. And corporate America cannot abide thieves, oh no!

If you like the old East Germany, North Korea, etc., you are gonna luv the new Amerika and the new world where everyone is surveilled all the time. The excuse is, it will make you safer from the chance of being killed by a terrorist…riiiight. You are more likely to get hit by lightning and much more likely to be killed by a texter, maybe yourself. Perhaps we should all wear the niqab. Please quit seeing racism everywhere; this is not about controlling petty crime done by blacks and it certainly ain’t about controlling white collar crime, especially at the top; this is about controlling YOU!

I would like to see facial recognition software made widely available along with feeds from all the cameras watching the public, so that we can watch the government and the enforcers equally as they watch us.

Ironic, isn’t it, that the very people who rush to adopt new technology that subjects us all to unreasonable searches can be so negative about wearing body cams, and threaten or even arrest people who photograph them. We do in fact live in an Animal Farm society, in which everyone is equal, except that some (the pigs) are more equal than the others.

The use of this software sounds like a good thing. Unlike the author, I’m not too convinced that the “mug shot database [is] disproportionately populated with African-Americans, who are twice as likely to be arrested in the state”, because African Americans commit disproportionate amounts of crime.

Justice should be certain and swift. It makes the system more effective, most criminologists argue.

In an ideal world, anyone with an arrest warrant out for them would be swiftly arrested. Most criminologists think that the ideal criminal justice system is swift and certain. This technology will help us move further toward that vision.

The Intercept has pretty much lost all credibility with me because every single issue has to have some racial element to it now. A straight forward article about facial recognition would have been great, but EVERYTHING now is about race.

Oh for fucks sake. Spare us. Meanwhile, why not brush up on some skillsets to improve your understanding of the world..like..reading. For starters..try Tying Shoes for Dummies. Rumor has it you keep tripping over your shoelaces. At least, once you’ve mastered tying your shoes, you will have eliminated one more thing people laugh at you about. Then..try reading the 12 Step System to Stop Looking Stupid.

Nice…name calling and insults..the usual tactic. My skillset includes Army Ranger for 4 years, inner city school science teacher for 9 years and now a first responder at a California Correctional Facility. Two bachelors and a Masters. Does that qualify me to go head to head with you on a commentary section? I’ll go back to school and get my PhD in Keyboard Warriorhood. Until then, I’ll defer to your superior intellect.

Crimes are defined by those in power, to ensure the stability of the system and protect their own lives, assets, and way of life. This means systemic racism (and sexism, etc.) is inevitable in a system created and largely run by rich white men.
So yeah, there are more black “criminals,” because of how crime is defined and policed. That doesn’t make it right. The press’s job is to watch the system and report on problems, which in the US case very much includes racism.

Are you arguing that “rich white men” are responsible for defining armed robbery and murder as crimes? Because AAs are disproportionately represented in prisons for committing those crimes. I think that no matter who is in charge those acts would be considered to be crimes. You might disagree.

African-Americans are no more likely to use drugs than whites – but they get stopped and searched for drugs a lot more. That leads to more being caught with drugs, not because they use drugs more, but because they’re looked at harder. When whites put down African-Americans for breaking the law, it usually involves forgetting just how many times you pulled a fast one and got away with it — or got involved with something that you know a black guy would have gone to jail for. I mean, like you’re in a car and the guys who picked you up start sharing a joint – you didn’t ask for it, but you’re still going to jail if a cop decides you failed to signal and searches the car.

Now as for violent crime, there are specific reasons for that – some of the reasons you have to trace back to the CIA-Contra conspiracy in the 1980s. It’s not really the race, but what’s been done to the race. But for this particular purpose I think the difference in violent crime is of little importance because the violent criminals are only a small proportion of the total mugshot database, so it doesn’t change the numbers much.

Wnt, the war on drugs accounts for less than 20% of the prison population. And in any case, African Americans are overrepresented in violent crime and property convictions as well. We’re not going to stop putting people in jail for that.

If you really want to believe that the CIA is responsible for persistently high rates of African American criminality, you would be advised to keep in mind that those crime rates started spiking in the ’60s, though they have always been higher per capita than whites’.