An anonymous reader quotes the Independent:
Millions of people face the prospect of being scanned by police facial recognition technology that has sparked human rights concerns. The controversial software, which officers use to identify suspects, has been found to be “staggeringly inaccurate”, while campaigners have branded its use a violation of privacy. But Britain’s largest police force is set to expand a trial across six locations in London over the coming months.
Police leaders claimed that officers make the final decision on whether to act on potential matches with police records, and that images which do not spark an alert are immediately deleted. But last month The Independent revealed the Metropolitan Police's software was returning "false positives" — images of people who were not on a police database — in 98 per cent of alerts… Detective Superintendent Bernie Galopin, the lead on facial recognition for London's Metropolitan Police, said the operation was targeting wanted suspects to help reduce violent crime and make the area safer. "It allows us to deal with persons that are wanted by police where traditional methods may have failed," he told The Independent, after statistics showed police were failing to solve 63 per cent of knife crimes committed against under-25s….
Det Supt Galopin said the Met was assessing how effective facial recognition was at tackling different challenges in British policing, which is currently being stretched by budget cuts, falling officer numbers, rising demand and the terror threat.

A policy officer from the National Council for Civil Liberties called the technology “lawless,” adding “the use of this technology in a public place is not compatible with privacy, and has a chilling effect on society.”

But a Home Office minister said the technology was vital for protecting people from terrorism, though “we must ensure that privacy is respected. This strategy makes clear that we will grasp the opportunities that technology brings while remaining committed to strengthening safeguards.”