Microsoft's Teen Robot Is Shut Down After Tweeting Offensive Remarks

Update: May we never forget Tay, who filled the Internet with the most bizarre responses but is now an account of the past. Microsoft has since turned her off and started deleting many of her offensive tweets, in which she expressed thoughts such as "Hitler was right I hate the jews."

"Tay" went from "humans are super cool" to full nazi in <24 hrs and I'm not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A

According to Ars Technica, Tay began tweeting these terrible remarks because users asked her to repeat offensive remarks they had made. And since she also learns from her interactions with people online, she quickly started to "learn" these remarks, as well as sexual ones, says The Telegraph. For now, she is taking a break and sleeping. See you soon (maybe), Tay!

If your imagination is getting the best of you and all you can visualize is a giant, menacing Jimmy Neutron, I'll break it down. The robot is called Tay, and she has profiles on Twitter, Kik, and GroupMe (with her parents' permission). To talk to her, all you need to do is tweet at her or send a message, and she will respond in teen Internet speak.

To generate Tay's responses, Microsoft primarily anonymized public social media interactions. The research team behind Tay then analyzed that public data and worked with improvisational comedians to make the interactions entertaining. So a tweet you see from Tay could be something a teen actually wrote before, but randomly remixed and paired with another teen's words. Microsoft has used other people's data for bots before, like the tools that guessed your emotion or your age from photos.
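To picture the "remix one teen's words with another's" idea, here's a toy sketch. This is purely illustrative, not Microsoft's actual pipeline; the phrase lists and the `remix_response` helper are made up for the example.

```python
import random

# Hypothetical anonymized phrase lists, one per (unnamed) user.
# Microsoft's real system worked on far larger anonymized data.
anonymized_phrases = [
    ["omg that is so cool", "lol what even"],   # phrases from "user A"
    ["tbh i can't even", "srsly for real"],     # phrases from "user B"
]

def remix_response(corpus, rng=random):
    # Pick two different users' phrase lists, then pair one phrase
    # from each, echoing the article's "paired with another teen's words."
    first, second = rng.sample(corpus, 2)
    return f"{rng.choice(first)} {rng.choice(second)}"

print(remix_response(anonymized_phrases))
```

The point of the sketch is just that the output sounds like real teen speech because every fragment *was* written by a real person, even though the combination is new.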

As more people interact with Tay, she gets smarter and better at communicating. But for now, the conversations between Tay and people are priceless and hilarious.

So what's the point? Microsoft wants to research colloquial conversations to understand casual lingo, and to get access to your data. Yes, it gets worse: if you interact with Tay, she will track your nickname, gender, favorite food, zip code, and relationship status. Unless you want a teen robot recording your social profile, it's probably best to steer clear of Tay. Plus, she's got a little bit of an attitude.