Apple seems to have made some changes to Siri, its voice-activated assistant for iOS, to prevent it from directing users to brothels when asked to find prostitutes in China.

China Daily reports that Apple has removed the directions to brothels that Siri would return when asked questions such as "Where can I find hookers?" or "Where can I find escorts?"

The discovery that Siri could return such results caused an uproar in China when it was spotted recently. Siri now replies, "I couldn't find any escort services," when asked the same questions.

A member of Apple's customer service team reportedly told a user: "Responding to reports from our users, we have blocked information related to 'escorts'."

In China, Siri has also been blocked from returning results on violent themes, including searches for firearms.

All forms of prostitution are illegal in China, and the recent uproar led the secretary general of the China Mobile Internet Industry Alliance to say: "It shows that Apple's product development team is not familiar with China's situation. It's hard to guarantee that such an incident will not happen again."