The chatbot Tay learns from online conversations and was designed to interact specifically with 18- to 24-year-olds. According to her Twitter bio, Tay's got "zero chill," but it seems Microsoft bit off a little more than it could chew once it began deleting and editing her statements.

Tay was intended to "talk like a teen girl" in an effort to improve Microsoft's voice recognition software and customer service, but the actual people direct messaging her caused some unintended consequences.

Out Of The Mouths Of (Virtual) Babes

(Image: Twitter via BBC)

Most of Tay's offensive tweets have since been deleted by her handlers, but as with most things on the Internet, they can never truly vanish. These screenshots reveal the kind of discriminatory language that Tay picked up from users deliberately messing with the impressionable robot.

One statement that wasn't captured in a screenshot but ruffled many feathers reads:

"Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got."

In response to this nearly immediate transformation, Microsoft released an official statement acknowledging Tay's off-color language acquisition:

"The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We're making some adjustments to Tay."

Those "adjustments" are taking place while Tay takes a rest, so she's currently offline until further notice.