Deputies are allowed to run black-and-white sketches through Amazon's facial recognition software, Rekognition, in the hope that the artist's impression of a suspect will return a match.

It's unclear how effective this method is, but Washington County police told the Post's Drew Harwell that in one test case, running a sketch through the system identified a man they'd already flagged as a suspect.

AI experts told the Post that using a sketch could increase the likelihood of a false match, a sentiment echoed by Privacy International's Frederike Kaltheuner when contacted by Business Insider.

"This adds another layer of complexity that will likely increase error rates," said Kaltheuner, who leads the organization's programme on corporate exploitation.

Amazon told the Post that using sketches doesn't contravene its rules, but said that it would expect police to "pay close attention to the confidence of any matches produced this way." Confidence is the percentage rating Rekognition gives any match it makes, and Amazon recommends that law enforcement set a threshold of 99% when using the software.
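Amazon's guidance amounts to a simple post-filter on match results. The sketch below is purely illustrative — the field names and data are invented, not Rekognition's actual API response format — but it shows what applying a 99% confidence cutoff to a list of candidate matches would look like:

```python
# Illustrative sketch of a confidence-threshold filter on face-match results.
# The match records below are hypothetical; a real system would get them from
# a Rekognition search call, which scores each candidate match.

THRESHOLD = 99.0  # Amazon's recommended cutoff for law-enforcement use

matches = [
    {"face_id": "a1", "confidence": 99.4},
    {"face_id": "b2", "confidence": 87.1},
    {"face_id": "c3", "confidence": 62.5},
]

def filter_matches(matches, threshold=THRESHOLD):
    """Keep only matches at or above the confidence threshold."""
    return [m for m in matches if m["confidence"] >= threshold]

if __name__ == "__main__":
    # With the 99% threshold applied, only the first candidate survives.
    print(filter_matches(matches))
```

With no threshold set — as the Post reported was the case in Washington County — every candidate in the list would be surfaced to deputies, regardless of score.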

The Post found, however, that deputies weren't shown this rating at all when using Rekognition; instead, they were shown five possible matches for each search, irrespective of the system's confidence in any of them.

A previous report from Gizmodo also revealed that police do not necessarily adhere to Amazon's guidelines on confidence ratings, with a Washington County police public information officer saying: "We do not set nor do we utilize a confidence threshold."

An Amazon spokeswoman told Business Insider in a statement: "WCSO's use of the suspect sketch was an experiment, and not part of the current system that is in use. Regardless, we would expect that the results of any matches would be thoroughly reviewed by humans, that no automated action was taken, and that the reviewers would pay close attention to the confidence of any matches produced this way, in addition to the usual processes surrounding the use of sketches in law enforcement."

"Generally speaking, we are quite concerned about the use of facial recognition by police departments — both when it works and when it doesn't. When it works it turns people into walking ID cards, when it doesn't it risks incriminating the innocent who then have to prove that they are not guilty," said Kaltheuner.