This article was published on June 10, 2020

Sci-fi perpetuates a misogynistic view of AI — Here’s how we can fight it

Marginalized voices will make tech more inclusive


Image by: Steve Troughton

Fiction helps us imagine the future of AI and the impact that it will have on our lives. But it also perpetuates stereotypes for generations to come.

From Greek mythology to contemporary sci-fi, robots are constantly anthropomorphized. The female androids are typically portrayed as beautiful, subservient, and sexually passive — or deceitful killers on the rampage. Warrior machines, meanwhile, are normally gendered male, whether they’re protecting humans like RoboCop, or trying to wipe them out like the Terminator.

These stereotypes endure long after the story ends. They help foster Silicon Valley’s tech-bro culture and the products that it creates.



Take the tendency to give voice assistants female voices and names. We’ve been conditioned to prefer synthesized female voices because they sound “warmer.” Once that prejudice is embedded in the tech, it’s sustained by our interactions with it.

As AI researcher Kanta Dihal noted at the CogX conference this week:

Because people get used to a feminine, servile Alexa, they’ll continue to associate women with servile roles. And these roles relate not only to jobs, such as the servant in the case of women, or the soldier in the case of men, but also to social roles and roles within relationships and hierarchies.

It’s not only stereotypes of gender that fiction reinforces. The real experiences of people of color have also been sidelined — or sublimated.

Shifting sci-fi narratives

In Dihal’s research on depictions of intelligent machines, she’s uncovered misleading allegories of slavery in stories of AI rebelling against humans:

Those narratives also in a way perform a dehumanizing function. Because by drawing on existing narratives of black slaves and transposing them onto narratives of robots that are very often racialized as white, this is a way of reappropriating a literary history and erasing these black voices from that non-fictional history.

Excluding ethnic minority, trans, and female voices from these stories helps sustain biases in tech. But we can still challenge these narratives by diverting attention from tech-bro fantasies towards marginalized voices. From Nnedi Okorafor’s vision of childbirth in a future Nigeria to Cassandra Rose Clarke’s tale of a girl’s love affair with an android, there are already plenty of alternatives to try.

But the first step towards promoting diverse perspectives in fiction is acknowledging the homogeneity of the Western canon. As Dihal’s colleague Kate Devlin put it:

“If you examine the narratives, then you have a chance of disrupting the narratives.”

 
