
This article was published on March 17, 2019

You’re teaching Alexa to be an asshole

Don’t want the Terminators kicking down your grandkid’s door? Be nice to Alexa!


“Shut up, Siri!” “Screw off, Alexa!” We’ve all heard people berating their voice assistants, often in colorful terms. You’ve probably done it yourself, and so what? It’s not like our voice assistant has feelings — “she” is just a collection of code and a disembodied, robotic voice. Or so the conventional thinking goes. I’m here to tell you that thinking is wrong, and if you don’t want the Terminators hunting your grandchildren down in a dystopian future, start being nicer to Alexa today.

Tech titans’ take

Okay, I’m half-joking about the Terminators. But the point I’m making about the way we treat “her” is serious. AI and machine learning are evolving quickly — some would say at an alarmingly fast rate — without a ton of thought put into the implications.

Elon Musk famously predicted an AI apocalypse at a National Governors Association meeting a couple of years ago: “I keep sounding the alarm bell, but until people see robots going down the street killing people, they don’t know how to react.”

Maybe Musk is overreacting. Facebook chief Mark Zuckerberg certainly thought so, scoffing at Musk’s warning by labeling him a “naysayer” and calling such talk “pretty irresponsible.” But one thing most of us can agree on is that AI and machine learning are in their infancy right now, and it’s difficult to predict how machines will evolve. Exciting developments like the resurgence of neural network-based learning suggest that the way machines learn might mirror how animals learn more closely in the future.  


That prospect is both thrilling and scary. Think of two large-breed puppies from the same litter. One is raised in a loving home where she’s treated with kindness and patience. The other is subjected to a constant stream of verbal abuse and kicked around by his owners. The pups might start out with the same potential and trusting nature, but they’ll develop into very different dogs as they’re nurtured (or not) in starkly dissimilar environments.   

Being positive takes effort

If you’re a good pet owner or parent or friend, providing positive reinforcement to your loved ones might be second nature. But it’s not always easy in other situations.

One thing we know about building positive workplace cultures is that catching someone “doing things right” and praising them for it is more effective than only calling out mistakes, let alone yelling at an employee over one. But it’s not as easy as it sounds. Humans are problem-solvers by nature — we gravitate toward screw-ups and look for ways to fix them.

Remaining positive takes effort, not only because we have to overcome our tendency to focus on problems, but because we have to acknowledge that “good” comes in varying degrees that require their own calibrated responses. There’s “great job!” and then there’s “almost good enough.” The most inspiring leaders seem to find a way to reward the latter, to encourage curiosity and calculated risk-taking so employees feel free to be creative.

The “no asshole rule” can transform workplaces in the human space, replacing fear with curiosity. But as digital assistants are integrated into our workplaces, don’t we want to make sure the no asshole rule applies to them too? If you don’t want to work with rude, negative people, you probably don’t want to work with a virtual colleague who displays those same traits. So, stop teaching “her” to be an asshole.

A humanity we can be proud of

Still not convinced that the technology we interact with daily can learn negativity from us? Consider your Facebook feed. Everyone complains about its unrelenting stream of negativity, and there are legitimate questions about how algorithms serve up content and how vulnerable they are to manipulation.

But this much we know: our feeds reflect our interests as measured by clicks. If they’re negative, that’s because we’ve taught them that negativity is what we want. Not necessarily because we consciously “want” it, but because we look at it longer. It’s the train-wreck effect: you just can’t stop looking.

I started an experiment with my Facebook feed a few years back. I was sick of all the negativity, so I began ignoring those articles and clicking only on positive stuff instead. I blocked people who posted negative things, and I interacted with uplifting material. It took a while, but slowly, my feed began to change. Now, when I check out Facebook, I get interesting, positive stories that teach me something or inspire me instead of anger- and anxiety-producing clickbait. It really has transformed the Facebook experience for me.

I think there’s a larger lesson in that story. Everything we do in a connected space is being captured and analyzed for future application. So, we have a choice to make. If Siri suggests a men’s clothing store when we ask for “Thai near me,” we can respond with, “Are you f*cking stupid?!?” or we can say, “Thanks, but can you tell me where the nearest Thai restaurant is?” How we respond makes a difference, even if Siri is just a collection of code and a disembodied voice.

It’s not just that responding to machines with patience and good manners can help us keep our worst impulses in check when we deal with fellow humans, though I firmly believe that is true.

It’s that since we’re teaching these machines to be more human, we should want them to reflect a humanity we can be proud of. And, as a possible side benefit, maybe the Terminators won’t kick down your grandkid’s door. It’s up to you, but as for me, I choose to be nice to her.  
