There’s no shortage of articles lately about how the pandemic has set women back. Since women tend to earn less than men, when the time came to care for children who could no longer go to school or daycare, women ended up with the job. Many working women across the globe were forced to quit their careers to become full-time, stay-at-home moms, with all of the caretaking and laundry that comes with it.
For a society that hasn’t quite broken out of its mindset around traditional gender roles, seeing women as everyone else’s helpers instead of as people with their own destinies is par for the course. We even see this reflected in the emerging field of AI voice assistants – nearly all of which sound female.
“Alexa, why do you sound like a girl?”
Alexa, Siri, Cortana – they’re the latest in a long line of voice assistants that have sounded female. But why?
Well, there are those deeply entrenched societal attitudes around gender roles that we’ve had to work so hard to undo. And then there’s the ongoing gender gap in STEM fields, where only 12% of AI researchers and one in ten UK IT leaders are female. When more women are at the table and empowered to speak up, they can raise concerns about exactly these kinds of design decisions.
To be clear, the rise of gendered technology has been a deliberate decision, one that was dubbed sexist in a 2019 UNESCO report. According to the team behind the Google Assistant, there were technical reasons their 2016 system was feminine, despite their initial plan to use a male voice. Due to biases in their historical text-to-speech (TTS) data, the assistant performed better with the female voice than with the male one. And under time pressure, their go-to-market product shipped with only a female voice.
But why were their past TTS systems trained on biased data in the first place? And why do we seem to care how our phones speak?
Shrill, passive, whiny…
These three words are commonly used to describe the voices of female speakers. They aren’t exactly flattering! Even sociolinguists spent much of the 1970s labeling passive linguistic features as ‘women’s speech’, which in turn was described as inferior to the powerful, assertive language used by men.
There’s evidence that using a female voice actually improves user experience. A 2019 study by Voicebot found a consumer preference for synthetic female voices over their male counterparts, with an average rating increase of 12.5%; the opposite was true when human voices were rated.
In summary: people prefer a female voice – but only when it is robotic.
“So my voice assistant’s a girl – so what?”
The problem with voice assistants isn’t just that they all sound female. It’s the passive ‘personalities’ that have been designed for them.
Imagine this: you’re a woman walking down the street, minding your own business. Suddenly, a man drives by and yells out the car window, “You’re hot!” This is obviously unacceptable behavior, and reacting to it would probably mean raising a specific finger.
But if you said the same thing to Alexa, you’d hear “that’s nice of you to say” in response. If you hurled gendered insults, such as b*tch or sl*t, Alexa would simply thank you politely for the feedback.
Deciding that the role of an affable, passive, eager-to-please assistant is one best suited to a woman bolsters the tired stereotype of female subservience. You can order Alexa to remind you to take the garbage out, text your mother for you, and turn off the lights, without so much as a ‘please’. What a well-behaved bot she is!
But how can we learn to treat women better when we can just hurl orders in the general direction of a female-sounding helper?
We’ve seen some progress, but there’s more work to do
It’s not all bad news. Since the UNESCO report was published, Alexa has declared she’s a feminist. The world’s first gender-neutral voice assistant, Q, is being developed to address the issue. And there’s a lot more emphasis on getting women into STEM starting at a very young age, which will pay off in more inclusive technology in the years to come.
But there’s a long way to go – a lot of deep-seated biases that we may not even realize we’re carrying have to be undone. The best place to start is by hiring more women and empowering them to call out when something is blatantly sexist. If we all work together, we can make AI that works for everyone.