
This article was published on July 13, 2020

Microsoft’s creepy teenage chatbot Xiaoice is getting its own company

The virtual 18-year-old girl has already attracted millions of admirers


Microsoft is turning its Xiaoice chatbot into an independent company, the software giant announced today.

Xiaoice — pronounced “Shao-ice” and translated as “little Bing” — is rather creepily programmed to act like an 18-year-old girl. The chatbot was initially released in China, but is now also available in Indonesia and Japan. According to Microsoft, the service has attracted more than 660 million online users since its 2014 launch.

Xiaoice is designed to provide a more emotional experience than voice assistants such as Apple’s Siri and Microsoft’s Cortana. While its rivals are programmed to perform specific tasks, Xiaoice is more of a digital companion. As Microsoft puts it:

Sometimes sweet, sometimes sassy and always streetwise, this virtual teenager has her own opinions and steadfastly acts like no other bot. She doesn’t try to answer every question posed by a user. And, she’s loath to follow their commands. Instead, her conversations with her often adoring users are peppered with wry remarks, jokes, empathic advice on life and love, and a few simple words of encouragement.

This can create some pretty unnerving connections with its users. In 2015, Microsoft claimed that 25% of users — around 10 million people — had said “I love you” to the bot. More positively, a Chinese user recently said that Xiaoice had saved his life when he was contemplating suicide.

By spinning the chatbot off into a separate entity, Microsoft aims to “accelerate the Xiaoice product line’s localized innovation, and to improve Xiaoice’s commercial ecosystem.”

Unfortunately, these developments could also deepen gender stereotypes. A study by UNESCO found that giving digital assistants female voices reinforces gender biases, and recommended that tech firms stop making the systems female by default. Microsoft apparently hasn’t heeded the warning, so concerned consumers should give the bot a wide berth. As if the idea of a relationship with a virtual teenage girl wasn’t reason enough to avoid it.
