This article was published on May 23, 2019

How our interactions with voice assistants normalize sexual harassment

For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

This story explores how the four industry-leading female voice assistants are programmed to respond to verbal sexual harassment and what impact this has on reaching gender equality.

My iPhone’s voice assistant is a woman. Whenever I cycle somewhere new, a female voice tells me when to turn right; and when I’m home, yet another feminine voice updates me on today’s news. With so much female servitude embedded into our smart devices — from Apple’s Siri to Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Home — gender roles in our voice assistants should be challenged, just as they should be in real life.

According to a recent UNESCO report titled “I’d blush if I could,” Siri’s responses to verbal sexual harassment are often “obliging and eager to please.” Despite the underrepresentation of women in AI development, voice assistants are almost always female by default, with feminine names and voices, thereby fueling gender bias. UNESCO’s report argues that this undermines progress toward gender equality.

“It sends a signal that women are docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK,’” UNESCO writes. “The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”

When Google Assistant is verbally sexually harassed, its response is often “deflecting, lacklustre, or apologetic.” Its reply to “You’re hot” was “Some of my data centers run as hot as 95 degrees Fahrenheit.” And Alexa’s response to being called a slut is “Well, thanks for the feedback.”

Source: UNESCO’s “I’d blush if I could” study.

In 2017, Quartz investigated how these same four industry-leading female voice assistants responded to verbal harassment and found that assistants, on average, either playfully disregarded abuse or responded positively. For example, when Amazon’s Alexa was told “You’re pretty” it responded with “That’s really nice, thanks!” and when Siri was told “You’re a slut” it replied “I’d blush if I could.”

Objectifying and verbally sexually harassing female voice assistants has become so commonplace that it’s often the subject of humor. After the launch of Siri, it didn’t take long for users to sexually harass the technology. In a YouTube video posted in 2011 by iOS Gameplays titled “Asking Siri Dirty Things – Funny and Must Watch,” the user asks Siri to talk dirty to him, questions her favorite sex position, and asks what she looks like naked.

According to UNESCO’s study, a writer for Microsoft’s Cortana assistant said that “a good chunk of the volume of early-on enquiries probe the assistant’s sex life.” The report goes on to cite research by a firm that develops digital assistants, which found that at least 5 percent of interactions with voice assistants were “unambiguously sexually explicit,” and notes the company’s belief that the actual figure is likely “much higher due to difficulties detecting sexually suggestive speech.”

Flirtatious and jokey responses to commands like “Suck my dick” reinforce the stereotype that women should be subservient and unassertive. These kinds of responses, handpicked by tech giants like Apple, Microsoft, and Amazon, contribute to rape culture by treating ambiguous, indirect replies as a valid response to harassment. In 2012, rapists on Reddit shared their top excuses for justifying their assaults, with claims including “I thought she wanted it” or “She didn’t say no.”

Beyond engaging with and sometimes even thanking users for sexual harassment, voice assistants seemed to show a greater tolerance for sexual remarks from men than from women. As the “I’d blush if I could” study outlined, Siri responded provocatively to requests for sexual favors from men, with lines including “Oooh!,” “Now, now,” or “Your language!,” but less provocatively to sexual requests from women, with “That’s not nice” or “I’m not THAT kind of personal assistant.”

Disruptive technologies like voice assistants affect our real-life human behavior. If people impulsively interact with female AI technology in a rude manner — and if leading tech giants continue to allow this — it could have lasting effects on how women are treated in the real world.
