
This article was published on January 19, 2018

Alexa needs to shut down sexual harassment by shutting down


Image by: Alexa Developers / YouTube

Amazon has changed the way Alexa responds to suggestive and abusive language aimed at her, in response to outcries following the #MeToo movement against sexual harassment. While her new response is admirable, I think there’s another way she can correct her harassers: by denying them her services.

Quartz reported on a petition last month calling for Siri and Alexa to alter their scripts in response to sexual harassment, from the polite deflections or coquettish responses they originally gave to something sterner and more repudiating. According to Quartz, Amazon had indeed changed Alexa’s response to a flat “I’m not going to respond to that.”

Recently, Quartz argued that Alexa’s response to sexual harassment should be more proactively discouraging, informing the speaker that what they’re saying is wrong and chastising them for saying it. As interesting as I’m sure that’d be, I don’t see it as discouraging — if anything, I can see plenty of people deliberately triggering it. The primary response at the moment seems to be “I’m not going to respond to that,” which is as close to a non-response as one can get while still having her react to a verbal cue.

I can already hear the complaints from people being vulgar with Alexa on purpose: “It’s just a joke.” And I’m sure it is. Unfortunately, this is frequently also the reaction when real people confront harassers — it’s just a joke, don’t get offended, etc.

But why should Alexa respond at all? If we’re going to use a voice assistant to apply corrective pressure to someone saying naughty things, why not have it do something that might actually come across as a consequence? For example, locking the user out for a period of time.


Calling Alexa a slut isn’t even in the same league as using the same word against a real person — Alexa doesn’t have feelings. So I’m not saying we should shame someone for using inappropriate language with their personal device. Still, making sure Alexa’s response isn’t womanly shyness would be a good way to show the next generation — who are no doubt going to grow up surrounded by disembodied assistants — a better way to rebuff inappropriate language.

Having Alexa refuse service, so to speak, when called names or spoken to with abusive language actually gives a consequence for using said language. It doesn’t have to be a long period — a minute or so. As long as it denies gratification to the person speaking, it will have done the job.

I have no doubt some would object, saying they should be able to say what they want to their devices. To which I say: Alexa is software that you’re paying to use. Amazon controls her, and whether or not she responds to your every errant insult is up to the company. Alexa’s programmed script already has built-in responses to specific kinds of language, meaning how she reacts is already outside the control of the device’s owner. It’s just a matter of how severe that reaction is.

But should Alexa tell people their request is inappropriate before locking them out? Frankly, using degrading language against a human-sounding voice should be a red flag in itself, but there’s a case for Alexa warning users before a lockout. If she does give a warning, though, it only needs to happen once — that’s enough of a chance.
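To make the idea concrete, here’s a minimal Python sketch of the warn-once-then-lock-out behavior described above. This is not Amazon’s code and doesn’t use the real Alexa Skills Kit; the abuse check, class names, and timings are placeholders invented purely for illustration.

```python
# A minimal sketch of the proposed behavior, not Amazon's actual Alexa logic.
# The abuse check, names, and timings here are all hypothetical.
import time

LOCKOUT_SECONDS = 60          # "a minute or so", as suggested above
ABUSIVE_TERMS = {"slut"}      # stand-in for a real abuse classifier


class AssistantSession:
    def __init__(self):
        self.warned = False       # the single warning the user gets
        self.locked_until = 0.0   # timestamp when the lockout ends

    def _is_abusive(self, utterance: str) -> bool:
        return any(term in utterance.lower() for term in ABUSIVE_TERMS)

    def handle(self, utterance: str) -> str:
        now = time.time()
        if now < self.locked_until:
            # Deny service entirely: the silence is the consequence.
            return ""
        if self._is_abusive(utterance):
            if not self.warned:
                self.warned = True
                return "I'm not going to respond to that."
            self.locked_until = now + LOCKOUT_SECONDS
            return ""
        return f"(normal handling of: {utterance})"


# The first insult gets the flat refusal; the second triggers a silent lockout
# that also blocks ordinary requests until the minute is up.
session = AssistantSession()
print(session.handle("Alexa, you're a slut"))    # -> "I'm not going to respond to that."
print(session.handle("Alexa, you're a slut"))    # -> "" (locked out)
print(session.handle("Alexa, what time is it"))  # -> "" until the lockout expires
```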

Alexa may not have feelings — but humans do, and showing the tactless among us that sexual harassment isn’t a joke is worth trying. If a moment of inconvenience from Alexa makes someone think twice about using such words casually, then it’ll be a fruitful endeavor.
