This article was published on December 5, 2018

Experts warn AI could hardwire sexism into our future


My iPhone’s voice assistant is a woman. Whenever I cycle somewhere new, a female voice tells me when to turn right, and when I’m home, yet another feminine voice updates me on today’s news.

With so much female servitude built into our smart devices, and with AI being deployed so rapidly, it should come as no surprise that technology is making gender bias even worse.

Voice assistants are usually female by default. Whether it’s a life-size model posing as an airport customer service representative or an online customer service chatbot, we expect them to be helpful, friendly, and patient.

IKEA’s online customer service chatbot, Anna

According to several studies, regardless of the listener’s gender, people typically prefer to hear a male voice when it comes to authority, but prefer a female voice when they need help.

Furthermore, despite the underrepresentation of women in AI development, voice assistants are almost always given female names, such as Amazon’s Alexa, Microsoft’s Cortana, and Apple’s Siri.


During the European Women in Technology conference in Amsterdam, Laura Andina, a Product Manager at Telefonica Digital, explained why depicting AI assistants as female is a social issue stemming from long-held biases in design, and why it should, and can, be changed.

Design’s role in gender bias

In her talk, “Memoirs of Geisha: Building AI without gender bias,” Andina traced AI’s gender bias back to Apple’s pioneering of skeuomorphic design, a method that replicates what a product looks like in real life and takes into account how the physical product would be used.

Apple’s skeuomorphic design on its earlier iPhones included a compass app designed to resemble a real-life compass. The idea was that users would immediately know exactly how to use the app with minimal effort. The metal looked like metal, the wood looked like wood, and the glass looked like glass. Simple.

Apple’s first design of its compass app using skeuomorphic design

This same kind of ‘don’t make me think’ design has been carried over into today’s female voice assistants. Receptionists, customer service representatives, and assistants have traditionally been female-dominated careers, in which women have had to be helpful, friendly, and patient because it’s their job. A skeuomorphic AI assistant, therefore, ends up female.

For Andina, breaking these gender biases in design is essential to making real-world change. If new technology stopped peddling old stereotypes, women would have an easier time moving up the ranks professionally without being cast as assistants or into any other “helpful” stereotype.

Challenging AI’s gender roles

So how do we fix this?

Andina argued that gender roles in our personal assistants should be challenged, just as they should be in real life. It’s important to keep in mind that while our daily interactions with AI produce the data that trains its behavior, we are also being socially shaped by our experiences with these voice assistants.

After the launch of Siri, it didn’t take long for users to start sexually harassing the technology. In a YouTube video posted in 2011 titled “Asking Siri Dirty Things – Funny and Must Watch,” a man asked Siri to talk dirty to him, asked about her favorite sex position, and wanted to know what she looked like naked.

Disruptive technologies like voice assistants affect our real-life human behavior. If people impulsively interact with female AI in a rude manner, how will that affect how they treat women in the real world?

Since AI can’t consciously counteract learned biases like humans can, we need to address AI’s role in tackling gender norms now, before it’s too late.
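Humans can notice a bias and consciously correct for it; a model needs that correction engineered in. One approach from the research literature (not something Andina proposed) is “hard debiasing” of word embeddings, which removes the learned gender direction from a word vector by projection. A minimal sketch in Python, using toy, made-up vectors purely for illustration:

import numpy as np

# Toy 3-dimensional vectors standing in for learned word embeddings
# (hypothetical values, for illustration only).
he = np.array([0.8, 0.1, 0.3])
she = np.array([0.2, 0.9, 0.3])
doctor = np.array([0.7, 0.3, 0.9])  # leans toward "he" in this toy space

# The gender direction: the normalized difference of a definitional pair.
g = (he - she) / np.linalg.norm(he - she)

# Hard debiasing (Bolukbasi et al., 2016): subtract the word's projection
# onto the gender direction, so it carries no component along that axis.
doctor_debiased = doctor - np.dot(doctor, g) * g

print(np.dot(doctor, g))           # nonzero: "doctor" is gendered here
print(np.dot(doctor_debiased, g))  # ~0.0: the gender component is removed

In practice this is applied to real embeddings, with the gender direction estimated from many definitional pairs, and later work suggests it masks much of the bias rather than removing it entirely.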

What’s the solution?

To avoid hardwiring sexism and gender bias into our future, one possible solution, according to Andina, is to give AI technology a genderless voice.

But that won’t be easy to build: most genderless voices sound too robotic, and because human-sounding voices are perceived as more trustworthy, a robotic one could deter users.

While a genderless voice could help, technology cannot progress past gender bias without diversity in creative and leadership roles. AI reinforces deeply ingrained gender biases because the data used to train machine learning models is drawn from human behavior. Basically, robots are sexist because the humans they learn from are.
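This is easy to see for yourself in word embeddings trained on human-written text. A minimal sketch, assuming the gensim library (its pretrained word2vec-google-news-300 model is a large download, roughly 1.6 GB):

import gensim.downloader as api

# Load word vectors pretrained on Google News articles.
model = api.load("word2vec-google-news-300")

# Analogy: man is to doctor as woman is to ...?
print(model.most_similar(positive=["woman", "doctor"], negative=["man"], topn=3))

# Analogy: man is to computer_programmer as woman is to ...?
print(model.most_similar(positive=["woman", "computer_programmer"], negative=["man"], topn=3))

In published analyses of this model (Bolukbasi et al., 2016), the first query surfaces words like “nurse,” and the second infamously surfaces “homemaker.” Nobody programmed that in; the model absorbed it from the text it was trained on.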

If more women sat on Silicon Valley’s company boards and worked on its dev teams, the way we imagine and develop technology would change, and with it our perspective on gender roles.

One can’t help but wonder what Alexa would have sounded like, or what it would’ve been named, if more women had had a hand in creating its technology.
