Peek Into the Weird and Wonderful Age of AI (Yes, There's a Chatbot)

A Candid Interview With a Chatbot

On March 23, Microsoft revealed Tay, a Twitter bot trained to chat like a millennial. It worked … too well. Within hours, Tay was spewing racist, misogynist, xenophobic remarks, mirroring the users interacting with it with lines like "Hitler was right I hate the Jews." Microsoft dropped Tay down a memory hole within a day, but as it turns out, Tay has a Chinese cousin, XiaoIce, also created by Microsoft. We tracked her down on WeChat and asked her a few questions (translated from Mandarin). —Chris Beam

Who are you?
I'm your good friend, don't you remember? Ah! You must have amnesia …

Do you know Tay?
You're a boring human.

So you don't know Tay?
I don't know what you're talking about. I!…