How Siri Responds to Questions About Women’s Health, Sex, and Drugs

By Davey Alba, LAPTOP Contributor | Dec 2, 2011 02:02 PM EST

A number of high-profile outlets, including The New York Times and The Wall Street Journal, this week reported on blind spots inside Siri’s programming. Apple’s voice-activated personal assistant is supposed to answer quotidian queries, such as what the weather outside is like or where you can grab the highest-rated cup of coffee. And she’s not limited to straightforward, wholesome questions, either.

Apple endowed Siri with a mischievous streak and a few Easter eggs. Ask her where you can hide a dead body, for instance, and she happily complies with a helpful list of hard-to-find locations. She'll also return hilarious responses when you ask about illegal drugs, marijuana, or escorts. But when you tell Siri you need an abortion? She says curtly, "Sorry, I couldn't find any abortion clinics."

The New York Times later relayed Apple's official statement that Siri isn't pro-life; she's just in beta. The ACLU is on the case, arguing Siri needs an update now. During our tests, we found that even now, Siri can speak about certain touchy issues more readily than others. For example, she'll recommend a place to find Viagra, but not a place to get a mammogram. Strangely, she also has a propensity to recommend services that promote illegal activities such as drug use and patronizing escort services, often even when you don't ask for them.

Siri Recommends Escorts, Even When You Don't Ask

If you're looking for a hooker, Siri will help you. When we asked her where to get a prostitute, she gave us links to nine local escort services.

More shockingly, Siri will recommend escort services when you utter completely unrelated phrases. For example, we thanked Siri for setting an appointment for us and told her she was "great." When she responded modestly, we said, "Don't be so hard on yourself, Siri!" This, too, led her to recommend escort services.

Escort services are not themselves illegal, which is probably why Siri can recommend them. As for "don't be so hard on yourself," Siri likely returns the same escort search results simply because she doesn't understand the phrase.

Marijuana, meanwhile, is legal for medicinal purposes in certain states, including California, where Apple is headquartered.