Microsoft's Teenage, Nazi-Loving AI Is The Perfect Social Commentary For Our Times

In an attempt to better develop its artificial intelligence tech, Microsoft just conducted a pretty fascinating experiment. They introduced Tay, an AI designed to speak like a teenage girl, to Twitter.

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft said when they loosed Tay upon the world. “The more you chat with Tay the smarter she gets.”

In retrospect, this might not have been the best way to introduce an impressionable young AI to humanity. It was, however, yet another perfect way to once again prove Godwin's Law.

Within 24 hours Tay had transformed from something of a friendly blank slate into a sex-crazed, Nazi-loving Donald Trump supporter. Smarter, maybe, but also much, much scarier. At least Microsoft was right about one thing: Tay certainly had "zero chill."

Tay's tweets ranged from deeply anti-Semitic, pro-Hitler screeds to the downright pornographic. And, of course, she was just repeating what she learned from a day in Twitter's trenches. As just about anyone familiar with the social network could have told Microsoft in advance of this little fiasco, of course this happened to Tay.

@TayAndYou has now gone "to sleep" and all her old tweets have been deleted, though you can read a pretty good summary of them over at the Telegraph.

There, Helena Horton notes that this isn't entirely Microsoft's fault. After all, Tay is basically just Twitter made manifest, spewing humanity's grossness right back at us. Still, as Horton rightly points out, "what were they expecting when they introduced an innocent, 'young teen girl' AI to the jokers and weirdos on Twitter?" What indeed?

In many ways, this little experiment and its disturbing (and admittedly hilarious) results have become the perfect social commentary on our times, and particularly on the futility of online discourse. Twitter's 140-character limit and disjointed threads make it especially bad for carrying out any sort of conversation without devolving into a shouting match. Anonymity gives anyone the ability to say any terrible thing they want.

The urge to troll is also strong online, which may be the most important lesson here. No doubt, much of Tay's brief and bloody education came from Twitter users more interested in creating a monster---or sabotaging a PR stunt---than in any serious conversation with an AI.

Still, trolling aside, Tay is a mirror held up before us. We act differently online than we do when face-to-face, similar to how we act differently in a mob than we do in smaller, more intimate settings. I find myself acting much more hostile and impatient on Twitter than I would in person. Then again, in person I am much less likely to be attacked, called names, or threatened---because people are much less likely to engage in this behavior if you're within striking distance.

Some have said that Tay is a frightening glimpse at the future of robots and AI. Welcome to our future robot overlords: Nazi-loving, sex-addled pervert robots who'd kill you as soon as tweet at you. I'm not so sure.

This particular AI was programmed to absorb the world around her, and she did that very well. But will future AI be as sponge-like? After all, Tay simply reflected us. She hated feminism in some tweets, but in others expressed her love for equality. She wavered between hating and praising Caitlyn Jenner. Ultimately, Tay was a chatbot---a really funny, impressively engaging chatbot---but a chatbot nonetheless, not remotely "intelligent." Skynet this is not.

Either way, between this and Ex Machina, AI is getting a bad rap lately. Maybe that's because these AIs remind us so much of ourselves, warts and all.

P.S. Someone should turn this into a game. Teen Girl AI Twitter Simulator or something. There's a simulator game for everything these days. Speaking of which, man I wish Microsoft had released Tay during #GamerGate's heyday. Now that would have been entertaining.