US police discontinue Amazon's facial recognition technology

The Orlando police department’s controversial pilot of facial recognition technology has ended following public outcry.

Jun 27, 2018

By Serena Lander

The department trialled Amazon’s Rekognition by comparing photographs to a large database of images. It reportedly has not used any images of the public during this period.

The technology – which is powered by artificial intelligence – can track and analyse up to 100 people in a single image, identifying attributes including their emotions and age ranges.
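
For readers curious what such per-face output looks like, the sketch below shows how attributes like age range and emotions might be read from a DetectFaces-style response. The `sample_response` dictionary is invented for illustration (no real images or AWS calls are involved); the `AgeRange` and `Emotions` field names follow the shape of the published boto3 Rekognition response, and the commented-out call shows how a request is typically made.

```python
# Illustrative sketch only. A real call would use boto3, e.g.:
#   client = boto3.client("rekognition")
#   response = client.detect_faces(Image={"Bytes": image_bytes},
#                                  Attributes=["ALL"])
# Here we parse an invented sample response with the same field layout.

sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 37},
            "Emotions": [
                {"Type": "CALM", "Confidence": 91.2},
                {"Type": "HAPPY", "Confidence": 6.4},
            ],
        },
    ]
}

def summarise_faces(response):
    """Return (age_low, age_high, top_emotion) for each detected face."""
    summaries = []
    for face in response["FaceDetails"]:
        age = face["AgeRange"]
        # Pick the emotion label the service is most confident about.
        top = max(face["Emotions"], key=lambda e: e["Confidence"])
        summaries.append((age["Low"], age["High"], top["Type"]))
    return summaries

print(summarise_faces(sample_response))  # → [(25, 37, 'CALM')]
```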

The pilot was criticised by civil rights group the American Civil Liberties Union (ACLU), which leaked emails showing the technology had been sold to US law enforcement agencies in Washington, Oregon and Orlando, Florida.

Nicole Ozer, Technology and Civil Liberties director for the ACLU of California, said: “Rekognition marketing materials read like a user manual for authoritarian surveillance”.

Amazon staff and shareholders have now contacted the company’s chief executive, Jeff Bezos, stating that they “refuse to contribute to tools that violate human rights”.

The letter also called for greater “transparency and accountability measures, that include enumerating which law enforcement agencies and companies supporting law enforcement agencies are using Amazon services, and how”.

An Amazon spokesperson told Police Professional: “We did a professional services engagement with the City of Orlando that was a pilot and had a discernible end date. That this engagement ended was expected and is not news.”

In a similar case, the Metropolitan Police Service (MPS) confirmed it would not be using facial recognition technology at this year’s Notting Hill Carnival. The system, which is ostensibly still on trial, has been used at the event each year since 2016 and was expected to be used at the carnival along with seven further deployments over the coming months.

The decision was taken following an investigation by campaigning group Big Brother Watch, which found that the equipment used by the MPS had a 98 per cent false positive rate.

There is currently no legal guidance covering police use of automated facial recognition to identify people in crowds and from CCTV footage, despite fears that this amounts to illegal mass surveillance.