Apple (s aapl) addressed criticism on Wednesday night that Siri, the virtual assistant built into every iPhone 4S, couldn’t retrieve location information for abortion clinics when asked and instead returned results for antiabortion centers. Via a statement to the New York Times, Apple attempted to depoliticize the omission, saying that Siri’s inaccuracy was “not meant to offend anyone” and is just a result of Siri’s being a beta product. Apple said it is looking for “places where [it] can do better, and . . . will in the coming weeks.”

Apple’s official statement is a perfect example of how to respond to a politically charged situation with the textual equivalent of plain-Jane oatmeal. It even avoids coming right out and saying for certain that Apple will be adding locations like abortion clinics in future iterations of Siri. The statement does suggest that might be the case, however, framing Apple’s efforts as a neutral pursuit of providing access to as much relevant information as possible.

This problem had begun to spin out of control before Apple stepped in, with pro-choice organizations seeking formal explanations. The response is measured, however; NARAL Pro-Choice America Foundation President Nancy Keenan admits in her letter to Apple CEO Tim Cook that “Siri is not the principal resource for women’s health care” but still says the omission is troubling.

NARAL and Keenan won’t be the last to complain, either. Siri is, for better or for worse, a search tool, especially at the local level. That role will become even more pronounced as Apple rolls out international localization for Siri’s location-aware features, like directions and facility finding. Then, what Siri can and can’t do will be subject to even more scrutiny by international social justice organizations, governments and politically active individuals.

The situation should be familiar to Apple: It faces criticism all the time for apps it does and does not allow on the App Store and for its controversial role as something of a moral arbiter when it comes to App Store content policies. But Siri could potentially be even more of a minefield, with plenty of opportunity for making missteps in tightly controlled political climates like that of China, which also happens to be one of Apple’s most important markets. If Apple thinks it can depoliticize a search tool in that country, it should talk to Google. (s goog)

Apple may be trying its best to keep politics out of its tech products, but its statement can’t help but ring a little hollow; Siri as a service has existed for four years, after all, and has been in development at Apple for 18 months. The beta label is insurance more than anything, to be used in situations just like this one, rather than a practical reason these specific locations should be omitted, despite the inclusion of their ideological counterparts. Plus, Apple didn’t even say for certain that abortion clinics would specifically be included in future results.

Even if it is just an honest mistake, it’s the first major flare-up of what will become a world of hurt for Apple, especially if Siri succeeds and becomes a go-to resource for iPhone users. Siri as a helpful, sometimes funny and occasionally befuddled virtual assistant is a public relations victory and a service people will appreciate; Siri as a willfully blind disseminator of a particular political perspective, real or perceived, is another situation entirely.

Nailed it. If you’re going to be a search tool, it needs to be done in a neutral way. There’s nothing neutral about Apple’s blind spot on many issues affecting women, not just abortion clinics. See, for example, http://amaditalks.tumblr.com/post/13513981784/siri.

You’re right, it should be neutral, but will it be neutral when it comes to Christians, people who are pro-life or lean to the right? Will it allow you to find a crisis pregnancy center in case you don’t want to terminate your pregnancy? Or will it censor that? Unbiased would mean it doesn’t discriminate against either side of one’s beliefs, whether right or left.

It can only be as neutral as its information sources. Siri works off of databases of information; if those databases are lacking info, then Siri will be as well. Seriously, how is everyone overlooking the fact that Siri is not a real person, with real motivations, and is just working with the data it has?

@EC It doesn’t just pull from “the internet”; it pulls from specific databases online. For locations and such, it uses Yelp. Because of society’s view of certain things, some businesses can be located by their name directly but not by certain keywords, because those companies specifically avoid using those keywords, for various reasons.