
This article was published on September 6, 2019

Apple wants Siri to stay neutral on feminism

Story by Rachel Kaser, Internet Culture Writer

Rachel is a writer and former game critic from Central Texas. She enjoys gaming, writing mystery stories, streaming on Twitch, and horseback riding. Check her Twitter for curmudgeonly criticisms.

A new report from Apple contractors reveals that, allegedly, Apple deliberately set out to make Siri as neutral as possible with regard to recent events like MeToo — despite her being a focal point in the conversation about gender politics in voice assistants.

The Guardian reports Apple had an internal project to rewrite Siri’s responses to “sensitive topics,” which apparently included the recent MeToo movement, among other things. According to leaked documents, the plan was to make her respond in three ways: “don’t engage,” “deflect,” and “inform.” To wit, when asked about things like women’s rights or gender equality, Siri’s response is now carefully scripted to be positive without moving to one side of the fence.

TNW tested this by asking Siri what she thought about feminism. She responded, “It seems to me that all humans should be treated equally.” She gave an almost identical answer when asked about women’s rights. When I asked her about men’s rights or the MeToo movement, she responded, “It’s your opinion that counts.”

This wouldn’t be so ironic were it not for the fact that Siri is frequently used as an example in the debate over how voice assistants should respond to gender-coded harassment. In May, a UNESCO report stated that having female-voiced digital assistants give obsequious responses to sexual remarks was potentially hurting women by reinforcing bad behavior. Two years ago, Quartz reported on a petition from users asking Apple and Amazon to reprogram Siri and Alexa, respectively, to respond to harassment and nasty remarks with more proactive chastisement. Amazon has since reprogrammed Alexa to respond to sexual comments with “I’m not going to respond to that.”

On the upside, the contractors report Apple also rewrote Siri to respond more firmly to crude remarks from users. For example, Siri used to respond to being called a slut by saying, “I’d blush if I could.” (Yeah, how gross is that?) The contractors said part of the rewrite project included changing that response to “I’m not going to respond to that,” much like Alexa’s.

When we reached out to Apple for a response, a spokesperson told us, “Siri is a digital assistant designed to help users get things done. The team works hard to ensure Siri responses are relevant to all customers. Our approach is to be factual with inclusive responses rather than offer opinions.”

Also, to be perfectly fair, Siri often refuses to give a response to any kind of divisive question. When I asked if she was a Republican or a Democrat, she responded that she didn’t feel she could weigh in on my “Earth-based political system.”
