The incident took place when a UK user was playing the celebrity guessing game Akinator with the device, and the game narrowed the answer down to a singer from Wham!

“I guess you could be thinking of someone who might be considered inappropriate for young people,” Alexa responded, referring to George Michael.

Asa Cannell said in a complaint to Amazon: “I had thought of the singer, George Michael…After a few questions, I was asked, had my character released any albums? Yes. Had my character released any albums as part of a group? Yes. Was my character dead? Yes. Was my character older than 40 when they died? Yes. Was my character a member of The Beatles? No. Was my character gay? Yes. ‘I guess that you are thinking of someone who could be considered as inappropriate for young people. Would you like to play again?’, came the response. I was literally stunned into silence. A.I. homophobia? I decided to try again. Giving, ‘the benefit of the doubt’, and all that, same character, different series of questions. This time we got as far as the question, was your character in, Wham? Yes. ‘I guess that you are thinking of someone who could be considered as inappropriate for young people. Would you like to play again?’. Somewhat unbelievably, I played the game a third time. This time thinking of Andrew Ridgeley from Wham. Once again, I recorded the outcome on my voice recorder. ‘Akinator’, cracked it in just over two minutes. Could you explain why George Michael is so offensive to young people? Could you explain why a person’s sexuality is so offensive to young people? Are your products designed for straight people only?”

A Scottish user was able to replicate the response on video, which you can watch here.

Amazon said in a statement: “An error with the Akinator 20 questions skill was identified and the feedback was passed on to Elokence, the developer of the game. Elokence has confirmed that this issue has now been fixed.”