If you are not already concerned by the spread of electronic surveillance technology, you should be. The only saving grace of the Trump administration is that it is so incompetent that the likelihood of its availing itself of this gadgetry to suppress its enemies is, for the moment, marginal. And the technology itself has not yet matured.

Big Brother Watch issues a report today on UK police use of facial recognition technology. The Guardian reports that so far, it's a failure:

It says the technology, whereby computer databases of faces are linked to CCTV and other cameras, was used by the Metropolitan police to spot people on a mental health watch list at the 2017 Remembrance Sunday event in London. It was also used by South Wales police at protests against an arms fair. Police plan to use it at music festivals and other events.

Some in policing see facial recognition as the next big leap in law enforcement, akin to the revolution brought about by advances in DNA analysis. Privacy campaigners see it as the next big battleground for civil liberties, as the state effectively asks for a degree of privacy to be surrendered in return for a promise of greater security.

The goal is to turn your face into a walking ID card in real time. For now, tests of the technology show it delivering false positives in over 90 percent of cases.
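The arithmetic behind that failure rate is worth spelling out: even a reasonably accurate matcher, scanned across huge crowds of mostly innocent people, generates far more false alarms than true hits. A minimal sketch of the base-rate problem (all numbers below are illustrative assumptions, not figures from the police trials):

```python
# Illustrative base-rate arithmetic for watchlist face matching.
# Every number here is an assumption for the sake of the example,
# not a figure from the Met or South Wales police deployments.

faces_scanned = 100_000        # crowd members passing the cameras
watchlist_present = 20         # people in the crowd actually on the watchlist
true_positive_rate = 0.80      # matcher flags 80% of real watchlist members
false_positive_rate = 0.001    # matcher wrongly flags 0.1% of everyone else

true_alerts = watchlist_present * true_positive_rate                      # 16
false_alerts = (faces_scanned - watchlist_present) * false_positive_rate  # ~100

share_false = false_alerts / (true_alerts + false_alerts)
print(f"{share_false:.0%} of alerts are false positives")  # → 86% of alerts are false positives
```

Even with these generous assumed accuracy figures, most alerts point at innocent people, because the watchlist population is a tiny fraction of the crowd. Worse accuracy or bigger crowds push the share of false alarms past the 90 percent the trials reported.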

Silkie Carlo, the director of Big Brother Watch, said: “Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified – or misidentified – everywhere they go.

“We’re seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

“It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for and that poses a major risk to our freedoms.”

National Public Radio reports that the technology promises to allow its masters to track residents of an entire city — or country. Police departments have been early adopters of the still crude technology, largely without oversight. But vendors are pushing hard, and the incremental advances mean the capabilities may be widespread before the public is really aware of it:

Jonathan Turley, a civil libertarian and law professor at George Washington University, worries that this kind of incrementalism will eventually lead to a "fishbowl society," in which it will be impossible to walk down the street without being identified.

"Unfortunately, it could happen in the United States. There's not a lot standing between instantaneous facial recognition technology and its ubiquitous use by police departments or cities," Turley says.

He says people shouldn't assume the courts will limit police use of facial recognition, especially if the real-time ID systems are first "normalized" in private settings. (NEC says it has already sold real-time facial recognition to private customers in the U.S., though it won't name them.)

"Policymaking by procurement"

As with the surplus military hardware distributed to police departments after September 11, there will be a strong impulse to use up whatever equipment is sitting around the station house. Cities such as Oakland are finally taking first steps to rein in surveillance technology run amok, Slate reports:

After Sept. 11, thanks in part to massive federal grants with few strings attached, local law enforcement agencies all over the United States began steadily acquiring and deploying powerful new policing tech. These surveillance technologies, often acquired and deployed unbeknownst to residents or city councils and usually without court approval or oversight, include cell-site simulators for tracking cellphone-call details (often referred to as stingrays), automatic license plate readers for tracking cars, drones for conducting aerial surveillance, gunshot-location technology that relies on citywide networks of high-powered microphones, and predictive-policing algorithms that tend to push police to focus even more on already overpoliced communities. This trend of unrestrained acquisition and use of surveillance tools has been dubbed by some critics as “policymaking by procurement,” with important decisions being made about police power based simply on the fact that the feds were willing to cut a check for the tech, rather than being based on careful consideration by local elected officials.

As communities catch up with what is happening, cities and civil rights groups are supporting local ordinances requiring transparency and accountability. For all the good they will do to slow the spread of the tech.

The URSA program will explore situations and behaviors that will enable identification and discrimination between innocent civilians and individuals with hostile intent. Although the development of these probing behaviors will be an output of the program, a simple example of an URSA engagement may help clarify the program’s intended end-state and related technical challenges. For example: a static sensor located near an overseas military installation detects an individual moving across an urban intersection and towards the installation outside of normal pedestrian pathways. An unmanned aerial system (UAS) equipped with a loudspeaker delivers a warning message. The person is then observed running into a neighboring building. Later, URSA detects an individual emerging from a different door at the opposite end of the building, but confirms it is the same person and sends a different UAS to investigate. This second UAS determines that the individual has resumed movement toward a restricted area. It releases a nonlethal flash-bang device at a safe distance to ensure the individual attends to the second message and delivers a sterner warning. This second UAS takes video of the subject and determines that the person’s gait and direction are unchanged even when a third UAS flies directly in front of the person and illuminates him with an eye-safe laser dot. URSA then alerts the human supervisor and provides a summary of these observations, warning actions, and the person’s responses and current location.
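The engagement described above is, at bottom, an escalation ladder driven by the subject's observed response: warn, check whether the person changes course, escalate, and finally hand off to a human. A hypothetical sketch of that control flow (the step names and the behavior check are invented for illustration; DARPA's actual design is not public):

```python
# Hypothetical escalation ladder modeled on the URSA engagement narrative
# above. The step names and the "still advancing" check are invented for
# illustration; they are not DARPA's actual implementation.

ESCALATION_STEPS = [
    "loudspeaker_warning",     # first UAS delivers a verbal warning message
    "flash_bang_and_warning",  # second UAS: nonlethal flash-bang, sterner message
    "laser_dot_designation",   # third UAS: eye-safe laser dot to force attention
]

def run_engagement(subject_keeps_advancing):
    """Walk the ladder, stopping early if the subject changes behavior.

    subject_keeps_advancing: callable(step_name) -> bool, standing in for
    the gait-and-direction analysis the program description mentions.
    """
    actions_taken = []
    for step in ESCALATION_STEPS:
        actions_taken.append(step)
        if not subject_keeps_advancing(step):
            return actions_taken, "subject_responded"
    # Ladder exhausted: alert the human supervisor with a summary.
    return actions_taken, "alert_human_supervisor"

# In the scenario above, the subject ignores every warning:
actions, outcome = run_engagement(lambda step: True)
print(actions, outcome)
```

The notable design point, as the program description makes clear, is that the machine runs the entire ladder of "probing behaviors" autonomously and only brings in a human supervisor at the end, after the warnings, the flash-bang, and the laser dot have already been delivered.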