Artificial intelligence is slowly becoming the first round of human resources departments everywhere. To increase efficiency in the recruitment process, an increasing number of employers are using robot recruiters to assess resumes, screen candidates, and match them with the right roles within the company before passing them along—or not—for human assessment. AI is good at that sort of task, but it has also been shown to be susceptible to bias.


That appears to have been the case with Amazon’s AI recruitment tool, which was discriminating against women before it was shut down, Reuters reports.

According to the story, which cites five sources familiar with the resume vetting program, the AI was trained on resumes collected by the company. However, women are still underrepresented in the tech industry, and most of the resumes submitted to Amazon over the preceding 10 years came from men. The algorithm learned to identify patterns in men's resumes, developing a preference for words like "executed" and "captured," which are more commonly used by male applicants, while weeding out resumes containing words like "women's."
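The failure mode described here is generic to models trained on skewed historical data: words that happen to co-occur with past rejections get penalized regardless of relevance. Here is a minimal sketch of that dynamic using made-up toy resumes and a naive word-ratio scorer; none of this reflects Amazon's actual data or model, which were not disclosed.

```python
from collections import Counter

# Hypothetical toy data: historical resumes labeled by outcome.
# The skew (the "hired" pile is all stereotypically male-coded text)
# is the assumption driving the example.
hired = [
    "executed project roadmap captured market share",
    "executed migration led team captured requirements",
    "captured metrics executed launch plan",
]
rejected = [
    "women's chess club captain organized events",
    "volunteer women's coding group mentor",
]

def word_scores(positive, negative):
    """Score each word by how much more often it appears in hired
    resumes than in rejected ones (with add-one smoothing)."""
    pos = Counter(w for doc in positive for w in doc.split())
    neg = Counter(w for doc in negative for w in doc.split())
    vocab = set(pos) | set(neg)
    return {w: (pos[w] + 1) / (neg[w] + 1) for w in vocab}

scores = word_scores(hired, rejected)

def rate(resume):
    # A resume's score is the product of its words' hired/rejected
    # ratios; unseen words are neutral (1.0).
    s = 1.0
    for w in resume.split():
        s *= scores.get(w, 1.0)
    return s

# Words that only ever appeared in the rejected pile, like "women's",
# drag the score down even though they say nothing about qualification.
print(rate("executed data pipeline"))      # boosted by "executed"
print(rate("women's robotics team lead"))  # penalized by "women's"
```

The point of the sketch is that the scorer never sees gender as a feature; it learns "women's" as a proxy purely from the imbalance in the training outcomes, which is why editing out individual terms (as described below) cannot guarantee a fix.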

From Reuters:

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

Amazon edited the programs to make them neutral to these particular terms. But that was no guarantee that the machines would not devise other ways of sorting candidates that could prove discriminatory, the people said.

The algorithm’s gender discrimination became apparent about a year into the project, and despite the engineers’ best efforts to weed out the discriminatory behavior, the project was eventually abandoned last year for not returning strong enough candidates (’cause no women, amirite?).

Luckily, the project was only ever used in a trial phase, was never rolled out to a larger group, and was never relied upon as the sole basis for hiring decisions. Still, it’s easy to imagine that some very qualified female candidates were discriminated against for going to a women’s college or never using the word “executed.”

Updated to add: An Amazon spokesperson reached out to reiterate that, “This was never used by Amazon recruiters to evaluate candidates.”