The team decided to train the system on the previous 10 years of resumes submitted to Amazon, most of which came from men.
When the algorithm reached its conclusions about what made an applicant good or bad, it mirrored the bias toward men that Amazon's hiring had shown in the past.

Honestly, I don't think this algorithm is bad.
It successfully reflected the bias of the organization.
While this is useless as a predictive tool, it's actually profoundly important as an auditing tool.
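To make the auditing idea concrete, here is a minimal sketch of one common audit check: comparing a model's selection rates across groups and computing a disparate-impact ratio. The data below is entirely synthetic and the 80% threshold is the EEOC's "four-fifths rule" of thumb; nothing here is from Amazon's actual system.

```python
# Hypothetical audit sketch: if a model reproduces historical hiring bias,
# comparing its selection rates across groups surfaces that bias.
# All numbers below are synthetic and purely illustrative.

def selection_rate(decisions):
    """Fraction of applicants the model recommends hiring (1 = hire, 0 = reject)."""
    return sum(decisions) / len(decisions)

# Synthetic model outputs for two applicant groups.
male_scores   = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # 8 of 10 selected
female_scores = [1, 0, 0, 0, 1, 0, 0, 1, 0, 0]   # 3 of 10 selected

male_rate = selection_rate(male_scores)
female_rate = selection_rate(female_scores)

# Disparate-impact ratio; the "four-fifths rule" flags values below 0.8.
impact_ratio = female_rate / male_rate
print(f"male: {male_rate:.0%}, female: {female_rate:.0%}, ratio: {impact_ratio:.2f}")
```

A ratio well under 0.8, as in this toy data, is exactly the kind of signal that turns a useless predictor into a useful audit of the organization's past decisions.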