Update (August 22, 2019): In a story published on August 21, Microsoft told Vice it’s stopped listening to recordings from Xbox products:
“We stopped reviewing any voice content taken through Xbox for product improvement purposes a number of months ago, as we no longer felt it was necessary, and we have no plans to re-start those reviews,” the spokesperson wrote. “We occasionally review a low volume of voice recordings sent from one Xbox user to another when there are reports that a recording violated our terms of service and we need to investigate. This is done to keep the Xbox community safe and is clearly stated in our Xbox terms of service.”
If you haven’t had enough stories about contractors listening in on your conversations, good news: we’ve found even more of them. This time the contractors in question work for Microsoft and have been listening to you through your trusty Xbox One.
The new report from Motherboard reveals that contractors listened to commands that were supposed to be triggered by the words “Xbox” and “Hey Cortana,” but were sometimes captured by accident. That seems to be a running theme whenever I hear these stories about voice-activated assistants. Earlier this month, Microsoft admitted humans were listening in on Skype calls and Cortana commands, and now we know its contractors were listening to the Xbox as well (which ran on voice commands even before Cortana‘s inclusion in 2016).
The stories are mostly harmless — one of the workers says they often heard children saying things like “Xbox give me all the games for free,” a hack I’m sure we all tried at some point. Others heard users frantically trying to dismiss Cortana after accidentally summoning her during gameplay. At this point, it’s the same story for different devices: “X listening device has been picking up what you say, including stuff you didn’t want listened to, etc.” Sub “X” for any voice-enabled device on the market, evidently.
But it’s still kind of unnerving, especially given that this was exactly what everyone and their mom feared would happen when Microsoft made the Kinect part of the system back in the Xbox 360 days. We didn’t want the Kinect listening to random conversations — and apparently that’s exactly what happened. Microsoft, like several other companies that collect voice data to improve their AI software’s speech recognition, insists the data is stripped of identifying information as much as possible.
This is probably one of the reasons it’s a good thing Microsoft is excising Cortana from the Xbox One in a future update. The voice assistant will no longer be available on Xbox One starting this fall. Supposedly the move is prompted by Microsoft shifting toward “cloud-based assistant experiences,” whatever that may mean. But I do wonder whether the company noticed that few users besides little kids were still using it.
We’ve reached out to Microsoft for comment.