The tech industry wants to use women's voices – they just won't listen to them

Tay, Microsoft’s artificial intelligence chatbot. Photograph: Microsoft

By now you’ve likely heard the story of Tay, Microsoft’s social AI experiment that went from “friendly millennial girl” to genocidal misogynist in less than a day. While Tay promised to learn from her interactions with people online, Microsoft apparently hasn’t learned anything from the countless headlines about how Twitter users like to talk to visible women – everything from gleefully anarchic trolling to threats and abuse – otherwise it would have seen this coming. At first, Tay’s story seems like a fun one for anyone who’s interested in cautionary sci-fi. What does it mean for the future of artificial intelligence if a bot can embody the worst aspects of digital culture after just 16 hours online? If any AI is given the…