Flaws in facial recognition software are fueling discrimination against people with darker skin, according to a new study.

The Gender Shades project, conducted by the MIT Media Lab, tested facial recognition software from Microsoft, IBM and Face++ on more than 1,200 images of people from six African and European countries. While the researchers found that the software could guess a person’s gender up to 99 percent of the time, its error rates were as much as 35 percentage points higher when it studied the faces of people with darker skin.

Images of men produced an error rate of around 8 percent, while pictures of women returned an error rate of more than 20 percent. The products made more errors when identifying subjects with darker skin, with error rates ranging between 11 and 20 percent. “We found that all classifiers performed best for lighter individuals and males overall. The classifiers performed worst for darker females,” the report concluded.

The study notes that, while facial recognition software is increasingly used by law enforcement in the US, none of these products has been publicly tested for accuracy. “This kind of technology is being built on machine-learning techniques. And machine-learning techniques are based on data. So if you have biased data in the input and it’s not addressed, you’re going to have biased outcomes,” Joy Buolamwini, the study’s lead author, told NPR.

The MIT report said that these failures require urgent attention if companies are to build fair and transparent facial analysis systems, especially as machine-learning software is now being used to decide loan, job and university applications.

According to a report from Georgetown University’s Center on Privacy and Technology, some 16 states have let the FBI compare driver’s license photos with the faces of suspected criminals. More than 117 million Americans are thought to be in facial recognition databases.

The software is being rolled out by police forces around the world. In August, it was revealed that police in the UK have catalogued more than 20 million facial images – the equivalent of one third of Britain’s population. Earlier this month, Chinese police began testing facial recognition glasses to identify ‘fugitives.’ The glasses, which were being used by officers at a train station, are connected to a tablet-like device that allows police to take mugshots and compare them with a police database.