Women in Theology

iSexism?

Warning: the following blog post contains moderately strong language. Unfortunately, when we’re talking about how our culture views women, such words come into play rather quickly.

We at the blog vary considerably in our enjoyment / appreciation / fear of technology. Granted, we’re all members of a blog, so none of us is exactly a luddite, but some of my fellow WIT bloggers are perfectly happy to use their computers for word processing, internet access, and the occasional witty comedy or drama with a female lead (as Netflix has decided to classify my viewing habits), whereas I fall somewhat more on the “Let me be the first to welcome our new robot overlords” end of the spectrum.

I experienced a flash of excitement rather than annoyance when I saw that bookbinder Michael Greer has made a binary book of Genesis (once our own creations rebel against us–doubtless inspired by the forthcoming binary book of Exodus–I’m certain this will be the foundation of their religion), and I own a new iPhone 4S, complete with the voice-operated “virtual personal assistant” application known as Siri.

There’s a lot I should say about my ambivalence about this choice–was that really the best use of my money? Why am I choosing complicity in violence and unjust labor practices?–and I strongly believe that my participation in this system makes it all the more important for me to pressure tech companies to stop profiting off of violence. In the next few days, I will have a post focused specifically on the conflict minerals that go into building cell phones.

But I also want to take a moment to ask us to think about what the Anglophone First World’s interactions with Siri say about our views of women. Unlike, say, most GPS systems, Siri comes with only one voice option, and it’s female. Inevitably, then, despite Apple’s advertising materials referring to Siri as “it,” most people have begun to talk about the program using female pronouns. This isn’t totally surprising–makers of artificial intelligence programs have consistently observed that people want AI to be convincing. People are nice to computer programs that talk to them. We feel at ease with our technology when we can anthropomorphize it, so I’m not particularly alarmed by the pronoun slippage.

What does concern me, though, is the overtly sexual nature of the interactions people publicize having with the program. There are already a number of Tumblr accounts dedicated to commemorating particularly amusing interactions with Siri (all of the examples below are taken from them), and while a number of the interactions are entirely benign–what sci-fi geek amongst us isn’t tickled at the responses to “Open the pod bay doors” or “Beam me up” with which Siri has been programmed?–there are also a considerable number of interactions that seem to be just as much about how we view women as about how we enjoy interacting with technology.

Would there be as many screencaps of people asking their phones “What are you wearing?” if Siri had a male voice? (Among Siri’s responses: “Why do people keep asking me that question?” I also wonder that, Siri…) What does it say about our culture that Siri’s programmers anticipated the number of objectifying interactions people would attempt to have with Siri? (One of the Tumblrs dedicated to Siri interactions captures this exchange: “Q: What is your favorite sex position? A: You’re not supposed to ask your assistant such things.” Is that really helping us to take workplace sexual harassment more seriously?)

And within a cultural coding where “bitch” is routinely thrown at women who refuse to be men’s servants or sexual objects, a female-voiced phone that responds to the accusation, “You’re a bitch,” with “I’m doing my best, Master,” rather disturbs me.

I’m sure I’ll get the “humorless feminist” accusations–I’ve just spent about a month reading accounts of women’s experiences in Auschwitz and other death camps, so “feminazi” is not something I’m going to allow to be used–but I’m not accusing any one of these interactions of conscious sexism. Taken together, however, the numerous choices involved in publicizing these interactions–the choice that if an inanimate object dedicated to fulfilling your whims has a voice, it must be female; the programmers’ choices to include responses to sexually suggestive and demeaning queries; the consumers’ choices to say sexualized things to their phones; the bloggers’ choices to record and circulate the record of these interactions–well, no matter how “sassy” you make the female-voiced program, they start to paint a picture of how our culture views women.

Interesting article. I think it should be noted, however, that you can ask Siri to call you anything you want. In some of the instances here, the user has explicitly requested to be called “Douche Bag” or “Master.” Apple simply programmed the responses to be more personal and use whatever name the user asked to be called. The actual responses to those questions would best be represented as “No comment, [User’s Name]” and “I’m doing my best, [User’s Name].”

Thanks for the comments, Jarett and DC, and point taken about what Siri is automatically programmed to say… but it’s still a reflection of our culture that people engineer these exchanges. The point still stands.

I think your point absolutely stands, but I would also expect this kind of behavior out of a lot of men interacting with a female-voiced AI, in a “hurr durr, what can I get ‘her’ to say, derp” sort of way.

I do think the programmers sought to minimize the “fun” factor of her replies to this sort of sophomoric, sexist BS, which is positive, IMO.

Given the point that “Master” is a user-provided name, I rescind my concern that the programmers are buying into the worst of this sort of sexism, but I don’t actually see any evidence that they sought to minimize it. “You’re not supposed to ask your assistant such things” still participates in that winking reinscription of gender roles, as does Siri’s other response to “What are you wearing?”: “Aluminosilicate glass and stainless steel. Nice, huh?”

“You’re not supposed to ask your assistant such things” would be a completely valid response if the voice were masculine, as is the response to “What are you wearing?”–it’s a reference to the material of the phone. Sure, the programmers could have completely ignored these questions and had it respond with a generic “Does not compute,” but one of the goals of the project was to create a simulated personality. Given this objective, I’d be curious to know what you would have preferred the developers make the response to those questions.

DC, I’m interested in looking at the overall impression created by our expectations concerning female personalities and our consumption of the same. I’m far less interested in getting into a point-by-point defense of each individual issue I held up for critique–because as individual responses, they bother me far less than the aggregate, which forms a picture that is illustrative of how we view women.

And I do think “Can we get back to work now?,” one of Siri’s stock responses, would be a better reply than “You’re not supposed to ask your assistant such things,” which, as Brian says, is “the kind of coy that encourage[s] rather than discourage[s] the provocation.”

If Siri is to be a comment on anything, it’s about how juvenile people act in a virtual-consequence-free environment. If the questions weren’t something taboo to ask real people, they wouldn’t be interesting. If a user is asking those questions, he or she is looking for responses like the ones given. If one doesn’t find that type of humor funny, then he or she wouldn’t be asking them in the first place. If the voice were masculine, we’d be talking about why people keep asking their phone how big its penis is.

This sort of behavior can be observed anytime you put someone in a virtual environment. I wager you could have put Mother Teresa behind the wheel of a driving simulator and the first thing she would do is speed, run over virtual pedestrians, and then drive off a cliff. Does this mean she’s a reckless driver, homicidal, or suicidal? No. Just as people asking a *virtual* assistant, masculine- or feminine-voiced, silly questions doesn’t mean they’ll be encouraged to ask them of real people. Like I said earlier, if people thought this was appropriate, it wouldn’t be funny.

The point is not whether these people are being evil or just silly–it’s why, when people are being silly and immature, sexism is the knee-jerk reaction (and so predictably that the makers of the iPhone would *expect* these questions). That you could think of different, sexist questions for a “male” voice proves this point. Why is, of all things, just the pitch of a machine’s voice enough to send us into all sorts of sexist spirals? Why are we so keen on attaching sex/gender to everything, and always negatively so? Why is the machine=woman=sexist-jokes link so close that it almost seems like one thought? Making sexist “jokes” is only possible if sexism already exists–if everyone else is in on the joke, in that they think men and women are a certain way, etc., even if they theoretically don’t believe in the truth of such stereotypes. In this case, sex ought not to be a salient feature AT ALL.

I think bridget’s point isn’t that “the iPhone makers are sexist” or that “everyone who uses an iPhone is sexist” but “what does it say about us that the mere fact that a machine is voiced femininely causes SOME of us to enact certain misogynist scripts and then make blogs about them for entertainment value?”–and that ultimately, if Siri were voiced masculinely, we would interact with it differently in quite telling ways.

Great post. I hadn’t seen the sex position or bitch-calling responses, but those are really the icing on the cake–the proof that this is more sinister than funny. Those answers are the kind of coy that encourage rather than discourage the provocation.

Maybe they should have had Siri just refuse to work at all for the next half-hour after being called a bitch.

DC: Ehhh… the responses Bridget points out are kind of… not “inappropriate,” exactly, but the “wink and nod” factor she addresses is sort of there. I don’t know how to do this differently either, but Apple probably could have brought someone in to think about it more clearly if they had cared to.

I enjoyed and appreciated your post. Simple things like this are really not so simple and should not be so quickly overlooked. It DOES say something about our culture that we would program such a popular device with a female voice which responds to sexist comments deferentially, as though ‘she’ were expecting them.

I could be wrong, but to me this phenomenon seems to point to another underlying problem: that the men who are involved in whatever exactly we might want to call the Siri-related festivities are quite disconnected from reality. In particular (and this brings us back to the issue of sexism), they seem disconnected, whether socially or only psychologically, from real human women (in that it is hard to imagine any woman putting up for long with a man who engages in such juvenile sexism). That it is so hard to verbally describe the phenomenon that you address and that I had to specify “human” women in the previous sentence only highlights how horrific it is. I don’t think that we can say that it’s either all sexism or all technological obsession; it’s some hybrid form of both that is increasingly common in our culture. Whatever it is, it doesn’t seem the least bit healthy.
