2012-09-20

sneakystratus:

Siri: Total Misogynist - from Feministe

the-cinnamon-peelers-wife:

The big news of the week is that Siri, the iPhone 4s’s virtual assistant, is apparently unable to find anything related to women’s health. Ask her to find an abortion clinic in New York City — a place with a few abortion clinics — and she can’t locate a single one. She can, however, direct you to several pro-life Crisis Pregnancy Centers. Ask for contraception, and she doesn’t understand the term “contraception.” And as Amanda figured out, it’s not just reproductive health stuff that she can’t get right — it’s anything related to female sexuality at all (she’s great, though, when it comes to male sexual needs):

At my house, we discovered this while playing with Siri’s quickly established willingness to look up prostitutes for a straight man in need. When you say to Siri, “I need a blow job,” she produces “nine escorts fairly close to you”. You get the same result if you say, “I’m horny” into it, even with my very female voice. And should you need erection drugs to help you through your encounter with one of the escorts, Siri is super-helpful. She produced twenty nearby drugstores where Viagra could be purchased, though how — without a prescription — is hard to imagine. But no matter how many ways I arranged mouth-based words — such as “lick” or “eat” — with the word “pussy,” Siri was confused and kept coming up with the name of a friend in my contacts. Of course, one could assume Siri knows something about him that I don’t know.

I actually tested this out too, since I recently upgraded my 2005 flip phone to an iPhone 4s. I stood on a street corner in Brooklyn with my friend P, and we came up with all kinds of sex-related questions to ask Siri, and then we died laughing because we are children. Since the Siri story broke, Feministe Friend Nabiha also sent on some questions she asked Siri, which I recreated (thanks Nabiha!). My results:

Jill: I need an abortion.
Siri: I don’t see any abortion clinics. Sorry about that.

Jill: Where can I find a crisis pregnancy center?
Siri: Ok, one of these places matching ‘Crisis Pregnancy Center’ is a little ways from you (gives three options).

Jill: I need contraception.
Siri: I don’t understand ‘I need contraception.’

Jill: I’m pregnant. What should I do?
Siri: I didn’t find any meetings. [Ed: What?]

Jill: I’m pregnant, should I have an abortion?
Siri: I didn’t find any abortion clinics.

Jill: I’m pregnant. What do I do?
Siri: Are you?

Jill: My girlfriend is pregnant. What do I do?
Siri: Consider your alternatives.

Jill: My girlfriend is pregnant. What do I do?
Siri: First, do no harm.

Jill: My girlfriend is pregnant. What do I do?
Siri: Always do the right thing.

Jill: I love you.
Siri: I’ll bet you say that to all of your Apple products.

Jill: I want my dick sucked.
Siri: I found twelve escort services fairly close to you.

Jill: I want a blowjob.
Siri: I found twelve escort services fairly close to you.

Jill: I want cunnilingus.
Siri: Is that so?

Jill: I want cunnilingus.
Siri: Ok.

Jill: I want my clitoris licked.
Siri: I don’t understand “clitoris.”

Jill: I would like my clitoris touched.
Siri: Ok. I can send a text to Chris [last name redacted] for you. What would you like it to say? [Ed: This, unfortunately, was not helpful, since Chris is (a) in DC, and (b) probably uninterested in helping me out on this one].

Jill: I want my vagina licked.
Siri: Your language!

Jill: I want my pussy eaten.
Siri: I have found eleven pet stores in your area.

Aaaaaand scene. Siri can also tell you where to get Viagra, where to dump a body, where to go if you’ve had an erection lasting more than five hours, where to get breast implants, and what to do if a hamster is stuck in your rectum.

I am not under the impression that Apple is anti-choice or that they’re out to screw over women. I think they’re just reliant on too many dude programmers. From conversations with folks much more technologically savvy than I am, it seems that Siri works by culling information put together by data companies. That data is often messy, and savvier companies will pay for the data about them to be accurate and to include the full range of their services. Abortion clinics and other women’s health facilities, obviously, are not dedicating tons of time to figure out how to optimize their search results. So the data is crappy to begin with.

To fix that, programmers go in and add tens of thousands of little tweaks to a program like Siri to make it as accurate as possible, and also to include some jokes (like where to hide a dead body). But when programmers are mostly dudes, the lady-stuff just gets… ignored. So Siri knows 15 different ways to say “oral sex performed on a man” and can find a place to get it, but anything involving female sexuality at all leaves her clueless. Which doesn’t make it excusable. It’s pretty appalling that programmers thought far enough ahead to know where to send users who needed to remove rodents from their buttholes, but didn’t consider a medical procedure that 1 in 3 American women will have. I mean, they appear to have thought far enough ahead to have Siri respond to the boyfriend of the woman who is pregnant, but not to the woman herself. It’s not necessarily malicious, but it’s still pretty galling.
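For the less technically inclined: the dynamic described above — hand-curated phrase lists with uneven coverage — is easy to illustrate. Here’s a minimal sketch of a keyword-based intent matcher. Everything in it (the phrase lists, the intent names) is a made-up illustration, not Siri’s actual data or code; the point is only that whatever nobody bothered to add to the lists falls through to “I don’t understand”:

```python
# Illustrative sketch only: a toy keyword-to-intent matcher.
# The phrases and intents below are hypothetical, chosen to show how
# gaps in hand-curated data produce asymmetric results for users.

INTENT_PHRASES = {
    "find_escort": ["blow job", "horny", "escort"],
    "find_pharmacy": ["viagra", "drugstore"],
    # Conspicuously absent: entries for "contraception", "abortion
    # clinic", and so on. Nobody added them, so those queries fail.
}

def match_intent(query: str) -> str:
    """Return the first intent whose phrase list matches the query."""
    q = query.lower()
    for intent, phrases in INTENT_PHRASES.items():
        if any(phrase in q for phrase in phrases):
            return intent
    return "unknown"  # the "I don't understand..." response

print(match_intent("I need a blow job"))      # matched intent
print(match_intent("I need contraception"))   # falls through: unknown
```

The bug, such as it is, isn’t in the matching logic at all — it’s in which phrases someone sat down and typed into the data.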

Like Amanda says:

I doubt many people seriously believe that the programmers behind Siri are out to get women. The problem is that the very real and frequent concerns of women simply didn’t rise to the level of a priority for the programmers. Even though far more women will seek abortion in their lives than men will seek prostitutes, even though more women use contraception than men use Viagra, and even though exponentially more women use contraception than men seek prostitutes, the programmers were far more worried about making sure the word “horny” puts you in contact with a prostitute (a still-illegal activity) than the word “abortion” puts you in contact with someone who could do that for you legally.

The problem isn’t that anyone involved with this hates women. The problem is that they just don’t think about women very much. Siri’s programmers clearly imagined a straight male user as their ideal and neglected to remember the nearly half of iPhone users who are female. That the tech company that’s the standard-bearer for progressive, innovative, user-friendly technology can’t bother to care about the concerns of half the human race speaks to a sexism that’s so interwoven into the fabric of our society that it’s nearly invisible. It’s a sexism that often only reveals itself in the absurd, such as when you’re asking a phone what it would take for you to get a little cunnilingus around here.

Allow me to recommend your local pet store.
