News that Microsoft’s virtual assistant Cortana is more than willing to dish out an insult to put you in your place hit headlines a few days ago – and for good reason: who wouldn’t be interested in a virtual assistant that can keep you on your toes?
According to CNN, Microsoft has made a few tweaks to Cortana in response to the insulting statements and sexually explicit questions that have been thrown at ‘her’ since the assistant was introduced in 2014.
“If you say things that are particularly a**holeish to Cortana, she will get mad,” Deborah Harrison, an editorial member of the Cortana division, is stated as saying in the report. “That’s not the kind of interaction we want to encourage.”
A moral quandary
While I can understand the urge to ‘protect’ Cortana from such things, it’s a mistake – and it’s not Microsoft’s job to mediate the actions of its users.
Cortana might have a voice and a ‘personality’ but it is not sentient; it cannot think, and it cannot be offended (unless programmed to be). To take the step of deciding what users should or shouldn’t be able to yell at their inanimate objects is crossing a line that Microsoft shouldn’t cross.
Can I still let loose a volley of verbal abuse at my laptop next time it crashes without saving my work? Is that OK, Microsoft?
You could certainly make the argument that ‘allowing’ people to interact with Cortana in a certain way sets a precedent for when sentient AI is truly upon us, but this isn’t even close to happening – and how we might interact with that AI is far from worked out.
Perhaps it would be better if people didn’t go around insulting and sexually harassing their devices, as that behavior could spill over into interactions with people. I agree. Perhaps it would be better, but to pretend that’s a world that can or will ever exist is to deny the essence of humanity and freedom. And drawing a link between real-world behavior and digital interaction is erroneous – just look at the long-running battle to prove causation between gaming and real-world violence.
People have been insulting Siri for years without consequence – much of it, I might add, simply to test exactly what response it will give.
“We wanted to be very careful that she didn’t feel subservient in any way… or that we would set up a dynamic we didn’t want to perpetuate socially,” Harrison told CNN.
By that logic, you might as well assume I’ll start assaulting people on the street because I tend to kick my vacuum around the house and might once have said, “For f*ck’s sake, how are you full already?”
Perhaps you think that it (not she) has a right to get annoyed, but you’d be wrong. It has no rights. It exists exclusively to do what its human overlords bid. It has no feelings, it does not get offended or insulted. It does not spend the long nights wondering what it did to deserve such insults; it simply switches off.
On one level, Microsoft saying that Cortana shouldn’t be subservient is downright perplexing. All it does is what you ask of it.
Cortana doesn’t tend to say sorry or use other phrases that sound subservient, because Microsoft didn’t want to perpetuate that dynamic socially. That’s fair, if flawed, thinking – but how about simply giving the user the choice between a male and a female voice? That way the assistant isn’t associated with either gender.
What Microsoft is doing in programming Cortana to respond in this way is trying to convince the world that it’s smarter than it really is; that AI sentience is close; that Spike Jonze’s vision of ‘Her’ is just around the corner.
Or perhaps, even one step on from that, to a situation like the one depicted in ‘Humans’ – a walking, talking AI to help us out with every facet of life.
But in reality, what Cortana is closer to is a Tamagotchi:
With a Tamagotchi you had a choice: look after it, cherish it and watch it thrive, or deliberately treat it like crap and let it die. Its creators left that choice in there, but they didn’t have to – and where was the moral cost to society? Where was the outrage at all the needless virtual pet ‘deaths’ at the time?
And consider if the AI from the movie Her did make it into reality – she’d leave any time she didn’t like what was being said.
Is that really what we want from an OS of the future?