The growing use of facial-recognition systems has led to a high-tech form of racial profiling, with African Americans more likely than others to have their images captured, analyzed and reviewed during computerized searches for crime suspects, according to a new report based on records from dozens of police departments.

The report, released Tuesday by the Center for Privacy & Technology at Georgetown University’s law school, found that half of all American adults have their images stored in at least one facial-recognition database that police can search, typically with few restrictions.

The steady expansion of these systems has led to a disproportionate racial impact because African Americans are more likely to be arrested and have mug shots taken, one of the main ways that images end up in police databases. The report also found that criminal databases are rarely “scrubbed” to remove the images of innocent people, nor are facial-recognition systems routinely tested for accuracy, even though some struggle to distinguish among darker-skinned faces.

The combination of these factors means that African Americans are more likely to be singled out as possible suspects in crimes — including ones they did not commit, the report says.

“This is a serious problem, and no one is working to fix it,” said Alvaro M. Bedoya, executive director of the Georgetown Law center that produced the report on facial-recognition technology. “Police departments are talking about it as if it’s race-blind, and it’s just not true.”

The 150-page report, called “The Perpetual Line-Up,” found a rapidly growing patchwork of facial-recognition systems at the federal, state and local level with little regulation and few legal standards. Some databases include mug shots, others driver’s-license photos. Some states, such as Maryland and Pennsylvania, use both as they analyze crime-scene images in search of potential suspects.

At least 117 million Americans have images of their faces in one or more police databases, meaning their resemblance to images taken from crime scenes can become the basis for follow-up by investigators. The FBI ran a pilot program this year in which it could search the State Department’s passport and visa databases for leads in criminal cases. Overall, the Government Accountability Office reported in May, the FBI has had access to 412 million facial images for searches; the faces of some Americans appear several times in these databases.

Several law enforcement agencies also have expressed interest in real-time facial-recognition technology, potentially giving police the ability to instantly identify people walking by a camera posted on a city street — or attending a political protest that authorities are recording, the report says.

A coalition of civil rights and civil liberties groups, provided with advance copies of the Georgetown report, plans to deliver a sharply worded letter to the Justice Department’s civil rights division on Tuesday calling for investigation into the use — and possible abuse — of facial-recognition technology.

The groups note the technology’s deployment during protests after the police-involved death of a black man in Baltimore and warn that unregulated use could make African Americans reluctant to attend events where facial images might be captured, chilling their rights to free speech and assembly. The report said protesters arrested for minor crimes such as trespassing can end up in facial-recognition databases for the rest of their lives, exposing them to an enhanced level of police scrutiny even if the charges were later dropped.

“Face recognition technology is rapidly being interconnected with everyday police activities, impacting virtually every jurisdiction in America. Yet, the safeguards to ensure this technology is being used fairly and responsibly appear to be virtually nonexistent,” wrote the American Civil Liberties Union and the Leadership Conference on Civil and Human Rights, an umbrella group with more than 200 member organizations. More than 40 civil rights and other groups signed the letter.

The FBI said facial-recognition technology is a valuable tool for fighting crime. The technology provides leads that investigators can pursue but does not by itself single out people for arrest. The bureau also said that concerns about racial disparities are misplaced.

“Facial recognition algorithms are developed in the computer vision field, based solely on pattern matching techniques,” the FBI said in a statement. “ ‘Facial recognition’ algorithms do not actually compare ‘faces’ and they do not consider skin color, sex, age, or any other biographic.”

Yet the uneven impact of surveillance tools is a source of growing concern for civil liberties and civil rights groups. Experts say that a wide range of technologies — body cameras, cellphone tracking, facial recognition and others — is creating what amounts to feedback loops that reinforce and amplify traditional forms of police bias.

Virginia State Police, for example, can search a mug-shot database with 1.2 million images that skews toward African Americans because they are twice as likely to be arrested as members of the overall population in that state, the report said. Maricopa County, Ariz., uploaded the entire driver’s-license and mug-shot database from the government of Honduras, a major source of immigration to Arizona.

In Pinellas County, Fla., a pioneer among police forces using facial-recognition systems, Sheriff Bob Gualtieri called allegations of disparate racial or ethnic impact “a bunch of nonsense.”

Florida’s database of 11 million law enforcement photos, Gualtieri said, helps Pinellas County deputies identify robbers captured in bank videos, uncooperative people in routine police stops and even victims of fatal accidents or killings. All of the images are public record; the facial-recognition technology only helps investigators compare existing images more efficiently, he said.

When asked about racial disparities in arrest rates, Gualtieri replied, “That’s a whole different discussion.” His county’s system, he said, “is not capturing any [people] other than those who are charged with crimes.”

Yet he acknowledged that the database had not been audited for accuracy or racial disparities. People represented in some databases can seek court action to have their images removed, but the process typically is arduous.

Research from experts increasingly has found disproportionate racial impact in advanced surveillance technologies. The problem is not the technologies themselves but the underlying bias in how and where they are deployed, said Sakira Cook, counsel at the Leadership Conference on Civil and Human Rights. That is especially true when arrests — as opposed to successful prosecutions — are key factors in determining whose images appear in databases.

A higher arrest rate, Cook said, “doesn’t necessarily mean that African Americans are committing crimes at higher rates. It just means that they are being policed, surveilled and arrested at higher rates. It compounds the issue of racial profiling.”

The stakes may be highest for freedom of political expression. One activist, Chris Wilson, was arrested for trespassing after blocking an entrance to the Florida State Fair to protest the death of a black teenager who died on a dangerous roadway shortly after being released from police custody. The charges against Wilson were dropped in exchange for community service and a fine, but the booking photos made their way into the Pinellas County face database — and potentially into the FBI’s national database as well.

That is worrying, said Wilson, who is African American and was in North Dakota protesting a pipeline construction project. “It worries me so much that I cover my face when I go out and protest,” Wilson said. “It really is scary to think that they’re tracking you.”


Craig Timberg is a national technology reporter for The Washington Post. Since joining The Post in 1998, he has been a reporter, editor and foreign correspondent, and he contributed to The Post’s Pulitzer Prize-winning coverage of the National Security Agency.