
The Feminist Internet, a non-profit working to prevent biases creeping into AI, has created F'xa, a feminist voice assistant that teaches users about AI bias and suggests how they can avoid reinforcing harmful stereotypes. Judging by a recent UNESCO report, it's not a moment too soon.
In the report "I'd blush if I could" (titled after Siri's response to "Hey Siri, you're a bitch"), UNESCO notes that voice assistants often react to verbal sexual harassment in an "obliging and eager to please" manner.
That's a big issue because most industry-leading voice assistants are female by default: they typically have female names and markedly feminine voices. Just think of Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google's Google Home.
Having female voices react to harassment in a benign way can undercut progress toward gender equality. F'xa tries to combat this norm of female servitude in our smart devices by challenging gender roles in voice assistants, just as they should be challenged in real life.
This sounds terrific, but how does it actually go about making that happen?
How the feminist bot works
F'xa is built with feminist values in mind, and every response it gives holds up to feminist beliefs, avoiding the reinforcement of bias and stereotypes. F'xa was created by a diverse team using the Feminist Internet's Personal Intelligent Assistant Standards and Josie Young's Feminist Chatbot Design research.
In preparation for building F'xa, Young explored contemporary feminist techniques for designing technology, resulting in the Feminist Chatbot Design Process: a series of reflective questions incorporating feminist design, ethical AI principles, and research on de-biasing data.
Accessible from a smartphone, the bot educates users on how current voice assistants threaten gender equality, while the underlying design process helps ensure designers don't build gender inequalities into their chatbots.
For example, when F'xa was asked "How does bias creep into AI systems?" it replied: "Bias occurs in AI systems when they reflect human biases held by the people involved in coding, collecting, selecting, or using data to train the algorithms that power the AI."
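F'xa's point is that bias lives in the training data, not just the algorithm. A minimal sketch can make this concrete: below is a hypothetical toy "autocomplete" model, trained by simple counting on an invented, skewed corpus (the corpus, model, and variable names are illustrative assumptions, not anything F'xa or UNESCO published). The counting logic is entirely neutral, yet the model still reproduces the skew it was fed.

```python
from collections import Counter, defaultdict

# Hypothetical, deliberately skewed training corpus (illustrative only).
corpus = [
    "she is a nurse", "she is a nurse", "she is a teacher",
    "he is a doctor", "he is a doctor", "he is an engineer",
]

# "Train" by counting which occupation follows each pronoun.
# Nothing in this code mentions gender stereotypes explicitly.
occupations = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    occupations[words[0]][words[-1]] += 1

# The model's "predictions" simply mirror the skew in its data.
print(occupations["she"].most_common(1))  # [('nurse', 2)]
print(occupations["he"].most_common(1))   # [('doctor', 2)]
```

Real voice assistants are trained on corpora billions of times larger, but the mechanism is the same: if the data encodes a stereotype, a faithfully trained model will repeat it unless its creators actively de-bias the data or the outputs.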

Smart devices are colonizing the consumer electronics market: according to F'xa, researchers estimate that 24.5 million voice-driven devices will be in daily use in 2019, and predictions further suggest that 50 percent of searches will be made via voice command by 2050. So it's probably a good thing someone is trying to fix the problems of voice assistants before they become too entrenched.
Despite the underrepresentation of women in AI development, voice assistants are almost always female by default, with feminine names and voices. The most popular tasks given to voice assistants usually mirror jobs historically associated with women, such as setting kitchen timers, creating to-do lists, and putting together shopping lists, thereby fueling gender bias.
According to several studies, regardless of the listener's gender, people typically prefer a male voice when it conveys authority, but a female voice when they need help. F'xa, however, believes tech companies like Apple have a responsibility to challenge these kinds of market preferences rather than blindly follow them.
If more women sat on Silicon Valley's company boards and worked in their dev teams, the way we imagine and develop technology would change, and with it our perspective on gender roles. One can't help but wonder what Alexa would have sounded like, or what it would have been named, if more women had a hand in creating its technology.
If you're a developer or a user, you can try the feminist bot out here.