Our suggestion: if there’s anything Tay (and, through her, the Internet) needs, it’s more chill.

Microsoft’s Tay. Credit: Microsoft

Senior Editor, PCWorld
Mar 24, 2016 1:26 PM

It sounds like Microsoft’s Tay chatbot is getting a time-out while Microsoft instructs her on how to talk with strangers on the Internet. Because, as the company quickly learned, the citizens of the Internet can’t be trusted with that task.

In a statement released Thursday, Microsoft said that a “coordinated effort” by Internet users had turned the Tay chatbot into a tool of “abuse.” It was a clear reference to the series of racist and otherwise abusive tweets Tay issued within a day of debuting on Twitter.

Wednesday morning, Tay was a novel experiment in AI that would learn natural language…