There is growing awareness of police use of facial recognition software, and that is a good thing. Greater awareness should lead to more debate and discussion about the appropriate use of this technology and its implications for privacy, data protection, civil liberties and, to a degree, personal autonomy.

Recently, an office worker from Cardiff took South Wales Police to court over its use of facial recognition technology, claiming his privacy had been unlawfully violated. Ed Bridges told a three-day hearing at Cardiff Civil Justice and Family Centre that the use of the technology breaches data protection and equality laws and left him distressed. He explained in an opinion piece that, without his consent or knowledge, his facial features had been scanned on two occasions – once on a high street and again at a protest – despite him not being a suspect in any crime or on a criminal watchlist.

Bridges’ concerns are that there has been no public consultation or parliamentary debate about the rollout and no warning to the public, that automated facial recognition technology has been shown to be inaccurate for certain demographics, and that the police are violating our right to privacy. The police, government and technology companies cannot brush these issues aside.

Last week also saw a surge of interest in the story of a facial recognition trial run by the Metropolitan Police in Romford at the beginning of the year. Once he became aware of the cameras, a passer-by covered his face and was promptly issued with a £90 fine for acting suspiciously. The story was initially covered only by one local newspaper, the Romford Recorder, and one national paper, but on 16th May other nationals ran it, probably due to the surfacing of a BBC video of the incident. The video sparked a lot of debate on social media.

In the Romford case, the Metropolitan Police said that it had informed passers-by of the facial recognition trial with large posters.

In the US, Democratic congresswoman Alexandria Ocasio-Cortez exposed the flaws of facial recognition technology, and how it can exacerbate racial bias in the criminal justice system, at a congressional hearing. With Joy Buolamwini, founder of the Algorithmic Justice League, as an expert witness, it was explained that facial recognition algorithms are least effective on women and people of colour, and exclude people of different gender expressions.

Ocasio-Cortez asked: “So we have a technology that was created and designed on one demographic that is only most effective on that demographic, and they’re trying to sell it and impose it on the entirety of the country?”

The hearing took place on the same day it was announced that Amazon shareholders had chosen not to ban the use of its facial recognition technology by government and law enforcement. The ramifications of this are alarming.

Public consultation around facial recognition technology is overdue. And trust is priceless. Trust is built by openness, honesty, transparency and keeping one’s word. Police forces, governments and big tech companies are not bastions of these principles, but they could and should be.

Public and political awareness appears to be growing, but the development and rollout of this technology is racing ahead. Debate, discussion and regulation covering its use will need to catch up.
