AI-powered voice assistants from Google, Amazon, Apple, and others could be perpetuating harmful gender biases, according to a recent UN report.
The report, titled “I’d blush if I could” (Siri’s response to provocative queries and flirtatious statements), says these female-voiced helpers are often depicted as “obliging and eager to please,” reinforcing the idea that women are “subservient.” Worse, it states, are the “deflecting, lacklustre, or apologetic responses” they give to abuse and criticism.
The report reads: “Because the speech of most voice assistants is female, it sends a signal that women are… docile helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK.’ The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment.”
In the US, more than 90 million adults use smartphone assistants monthly, while 77 million access them from their car, and 45 million use them on smart speakers, according to a survey from Voicebot.ai.
While their usage spans genders, the teams building these assistants do not. As the report puts it: “Companies like Apple and Amazon, staffed by overwhelmingly male engineering teams, have built AI systems that cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation.”
The report calls on technology companies to stop making voice assistants female by default, though it stops short of recommending that female voices be dropped altogether. Current assistants already give users some control here, letting them switch voices, accents, or genders, but nearly all of them default to a female voice.
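None of the major assistants expose their default-voice logic publicly, but the underlying point is simple: the default is a configuration choice, not a technical necessity. The minimal sketch below uses the open-source pyttsx3 text-to-speech library for Python (chosen purely for illustration; it is not how Siri, Alexa, or Google Assistant work internally) to show the pattern: the engine starts with a platform-chosen default voice, and a developer can enumerate the installed voices and override it.

```python
# Illustrative sketch only: uses the open-source pyttsx3 TTS library,
# not any vendor assistant's internal API.
import pyttsx3

engine = pyttsx3.init()

# The engine starts with a platform-chosen default voice.
print("Default voice:", engine.getProperty("voice"))

# Enumerate every voice installed on this system. Gender metadata
# is optional and often missing, so treat it as a hint at best.
voices = engine.getProperty("voices")
for voice in voices:
    print(voice.id, voice.name, getattr(voice, "gender", None))

# Overriding the default is a one-line settings change.
if len(voices) > 1:
    engine.setProperty("voice", voices[1].id)  # pick a non-default voice

engine.say("The default voice is a design decision.")
engine.runAndWait()
```

The point of the sketch is that nothing in the technology forces a female default; it is a value the shipping product sets before the user ever opens the settings menu.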