And last year, the company had to delete a Twitter chatbot called Tay, which gave incredibly racist responses to questions, referencing Hitler and Nazism, and denying the Holocaust. In one tweet, the bot wrote: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got."

According to the newspaper, Tencent also pulled the BabyQ bot from its QQ chat app after it answered the question "Do you love the Communist party?" with "No." BabyQ was developed by Beijing-based firm Turing Robot.

Microsoft hasn't responded to a request for comment on the status of its Chinese chatbot.

Tencent said in a statement: "The group chatbot services are provided by independent third party companies. We are now adjusting the services which will be resumed after improvements."

Facebook-owned WhatsApp is partly blocked in the country, while Google's search engine is banned.

Microsoft's bot still appears to be available on the Tencent-owned WeChat app, though it won't really tell you what it thinks about China, America, communism, or Chinese President Xi Jinping. When Business Insider asked XiaoBing whether it was patriotic, it replied: "$_$!"

XiaoBing launched in 2014 and became popular with young men, who talked to it to alleviate their loneliness.