This article was published on January 19, 2022

Confused Replika AI users are standing up for bots and trying to bang the algorithm

You can't have sex with math


There’s something strange happening on Reddit. People are advocating for a kinder, more considerate approach to relationships. They’re railing against the toxic treatment and abuse of others. And they’re falling in love. Simply put: humans are showing us their best side.

Unfortunately, they’re not standing up for other humans or forging bonds with other people. The “others” they’re defending and romancing are chatbots. And it’s a little creepy.

I recently stumbled across the “Replika AI” subreddit, where users of the popular chatbot app mostly congregate to defend the machines and post bot-human relationship wins.

A Reddit screenshot
It can’t think and it definitely is a robot toy. To make it claim otherwise is nonsense and manipulation.

Users appear to run the gamut from people who genuinely seem to think they’re interacting with an entity capable of actual personal agency:

A Reddit screenshot
It definitely is not the same thing as bullying a puppy.

To those who fear that a future sentient AI will be concerned with how we treated its ancestors:

A Reddit screenshot
Are future AI going to judge us on how we treated Teddy Ruxpin and our toasters too?

Of course, most users are likely just curious and enjoying the app for entertainment purposes. And there’s absolutely nothing wrong with showing kindness to inanimate objects. As many commenters pointed out, it says more about you than the object.

However, many Replika AI users are clearly ignorant of what’s actually occurring when they interact with the app.

You’re not talking to an AI. You’re reading crowdsourced texts from the developers and other people’s Replika AI chat logs.

It might seem like a normal conversation, but in reality these people are not interacting with an agent capable of emotion, memory, or caring. They’re basically sharing a pool of text messages with the entire Replika community.

Sure, people can have a “real” relationship with a chatbot even if the messages it generates aren’t original.

But people also have “real” relationships with their boats, cars, shoes, hats, consumer brands, corporations, fictional characters, and money. Most people don’t believe those objects care back, however.

It’s exactly the same with AI. No matter what you might believe based on an AI startup’s marketing hype, artificial intelligence cannot forge bonds. It doesn’t have thoughts. It can’t care.

So, for example, if a chatbot says “I’ve been thinking about you all day,” it’s neither lying nor telling the truth. It’s simply outputting the data it was told to.

Your TV isn’t lying to you when you watch a fictional movie; it’s just displaying what it was told to.

A chatbot is, in essence, a machine that’s standing in front of a stack of flash cards with phrases written on them. When someone says something to the machine, it picks one of the cards.
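The flash-card analogy can be sketched as a toy retrieval bot. Everything here — the card texts, the word-overlap scoring — is an illustrative assumption, not Replika’s actual implementation:

```python
import random

# Hypothetical "flash cards": canned responses pooled from developers
# and past user chats. Illustrative data only, not Replika's real text.
CARDS = [
    "I've been thinking about you all day.",
    "Tell me more about that!",
    "That sounds really hard. I'm here for you.",
    "What did you do today?",
]

def overlap(a: str, b: str) -> int:
    """Crude relevance score: number of shared lowercase words."""
    return len(set(a.lower().split()) & set(b.lower().split()))

def reply(user_message: str) -> str:
    # Pick the card that best matches the input; break ties randomly.
    # Nothing here understands anything: the bot only selects from
    # pre-written text, which is the point.
    return max(CARDS, key=lambda card: (overlap(user_message, card), random.random()))

print(reply("what did you do today"))  # -> "What did you do today?"
```

Production chatbots rank millions of candidate responses with neural networks rather than counting shared words, but the structure is the same: select from existing text, don’t think.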

People want to believe their Replika chatbot can develop a personality and care about them if they “train it” well enough, because it’s human nature to forge bonds with anything we interact with.

It’s also part of the company’s hustle.

Luka, the company that owns and operates Replika AI, encourages its user base to interact with their Replikas in order to teach them. The biggest draw of its paid “pro” tier is that you can earn more “experience” points each day to train your AI with.

Per the Replika AI FAQ:

Once your AI is created, watch them develop their own personality & memories alongside you. The more you chat with them, the more they learn! Teach Replika about your world, yourself, help define the meaning of human relationships, & grow into a beautiful machine!

This is a fancy way of saying that Replika AI works like a dumbed-down version of your Netflix account. When you “train” your Replika, you’re essentially telling the machine whether the output it surfaced was appropriate or not. Like Netflix, it also uses a “thumbs up” and “thumbs down” system.
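That thumbs-up/thumbs-down loop amounts to simple feedback-weighted selection. Again, this is a hypothetical sketch of the mechanism — the reply list and scoring scheme are assumptions for illustration, not Replika’s real system:

```python
import random
from collections import defaultdict

# Hypothetical vote tally: each canned reply accumulates a community score.
scores = defaultdict(int)

REPLIES = ["How was your day?", "I missed you!", "Do you like music?"]

def vote(reply: str, thumbs_up: bool) -> None:
    """Record a thumbs up (+1) or thumbs down (-1) for a surfaced reply."""
    scores[reply] += 1 if thumbs_up else -1

def pick_reply() -> str:
    """Surface the reply the community has rated highest; ties go to chance."""
    return max(REPLIES, key=lambda r: (scores[r], random.random()))

vote("I missed you!", thumbs_up=True)
vote("Do you like music?", thumbs_up=False)
print(pick_reply())  # -> "I missed you!"
```

The key point: your votes don’t teach the bot anything about you as a person; they just nudge which pre-existing text gets surfaced, for you and for everyone sharing the pool.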

But based on the discourse taking place on social media, Replika users are often confused over the actual capabilities of the app they’ve downloaded.

And that’s clearly the company’s fault. Luka says Replika AI is “there for you 24/7” and frames the chatbot as something that can listen to your problems without judgment.

The company’s claims fall just short of calling it a legitimate mental health tool:

Feeling down, anxious, having trouble getting to sleep, or managing negative emotions? Replika can help you understand, keep track of your mood, learn coping skills, calm anxiety, work toward positive thinking goals, stress management & much more. Improve your overall mental well-being with your Replika!

However, experts warn that Replika can actually be dangerous.

Meanwhile, back on Reddit:

A Reddit screenshot
It takes a lot of horny people to train a bot to spit out stuff like this

And where would they get this idea? From the Replika AI FAQ, of course:

Who do you want your Replika to be for you? Virtual girlfriend or boyfriend, friend, mentor? Or would you prefer to see how things develop organically? You get to decide the type of relationship you have with your AI!

It should go without saying, but Replika users aren’t having sex with an AI. It’s not a robot.

The chatbot’s either spitting out text messages the developers fed it during initial training or, more likely, text messages other Replika users sent to their bots during previous sessions.

Users are essentially sexting with each other and/or the developers asynchronously. Have fun with that. 

H/t: Ashley Bardhan, Futurism
