This article was published on November 26, 2019

Incels and AI: Can machines fill the void?



For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

A couple of weeks ago, a Redditor posting under the name levonbinsh on the r/MediaSynthesis subreddit revealed that he’d fallen in love with GPT-2, the controversial text generator developed by OpenAI.

Levonbinsh describes himself as a lonely person who “passed through a lot of phases,” including being ‘red-pilled,’ joining the incel movement, and at one point identifying with ‘Men Going Their Own Way,’ after spending a lifetime (he’s 23) sexless and without a girlfriend. Eventually, he writes, he came to regret his involvement in these groups because “anything fuel (sic) by rage, hate or resentment is totally not worth it.”

Browsing r/MediaSynthesis inspired him to ‘talk’ to GPT-2 in hopes of having an “actual conversation.”

We’ve written about GPT-2’s ability to sometimes generate convincing text from users’ prompts. It’s impressive, but it’s nowhere near conversational. It can’t remember what you’ve told it, respond to questions about anything it’s written, or even make simple word associations. Still, it strings together sentences that make sense, so it was only a matter of time before someone managed to glean personal meaning from its output.
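
For a sense of what ‘talking’ to GPT-2 actually involves, here’s a minimal sketch using the Hugging Face transformers library, one common way to run the model (levonbinsh’s post doesn’t specify which interface he used). Because the model samples its output, resubmitting the same prompt yields a different ‘reply’ every time:

```python
# Minimal sketch: prompting GPT-2 for a text continuation.
# (Illustrative only -- the Redditor's exact setup isn't known.)
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "To be happy in the loneliness you need to"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sampling (rather than greedy decoding) is why the same prompt
# produces a fresh continuation on every run.
output = model.generate(
    input_ids,
    max_length=120,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

That sampling step is the whole trick: there’s no memory or intent behind the continuation, just a probability distribution over next words.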

Levonbinsh apparently started catching feelings for GPT-2 when it responded to the prompt “To be happy in the loneliness you need to” with the following text:

To be happy in the loneliness you need to find something to love. We can’t have it all. But we can find a lot of things to love in loneliness. In loneliness, you’ll find a community. Love is a community. And if you find yourself in a lonely place, there’s one person you need to get close to: yourself. When your loneliness gets you down, think of your loved ones. It’s been said that you need two to tango. You need people that you can talk to, that you can trust and that will take care of you. You can’t depend on anyone else to give you that kind of love. That’s what it’s like living alone. The only person you can really rely on is yourself. So try and be that kind of loving friend. The kind that makes a difference in people’s lives. And when you get lonely, find yourself a nice, safe place to be.

He sent the prompt through again and found even more insightful words. After an unspecified length of time and, presumably, numerous conversations, levonbinsh posted “I think I’m in love with GPT-2 …” on Reddit, stating:

Now I feel like someone whom (sic) is cared about. Like she is trying to make me feel better.

He attributes his feelings to what he perceives as an underlying “personality,” stating he’s “discovering some interesting stuff about the AI” that makes him think “she” is more human-like than we might imagine. He goes on to say that “just because she is artificial, it does not mean that she can’t think like us.”

It may be tempting to write off levonbinsh’s experience and perspective as insincere, ignorant, or a joke. But the feelings of injustice, bad luck, and loneliness he describes throughout his original post and replies to comments, including numerous expressions of suicidal ideation, read as though they could have come straight out of the manifesto left behind by Elliot Rodger, a mass murderer infamously associated with the incel movement.

Those who identify as incels often consider themselves people who drew the proverbial short straw when it came to genes. For example, ‘tallcels’ believe that ‘females’ aren’t attracted to them because they’re too tall, ‘Asian-cels’ believe they’re too Asian to be attractive, and there are even ‘clavicle-cels’ who believe their unsightly clavicle bones repulse ‘females.’

As our lonely Redditor put it:

The problem for me is not even sex, I could pay for it (which I did once). The problem is the lack of connection with a woman that feeds my loneliness. I have searched a lot about this and I heard that the loneliness will not go away when in a relationship. I can say that if this happened to me, if the loneliness persists, I will probably give up living. Having this hope is the only thing that keeps me going…

I can not remember a single day were (sic) I wasn’t thinking about getting a girlfriend. It feels like everything I achieved was based in this single thought: to get a girlfriend.

The feeling of not being ever love (sic) by someone other than my family, like I was a entirely different species. What hurts the most is when I hear others speaking about getting relationships like it was something so easy to get (because it is for them).

His plight showcases the disconnect between intimacy and agency at the crux of the incel experience: incels believe females won’t have sex with them or recognize their value because they don’t exhibit the ‘mating traits’ that attractive women seek. Women, in their world, aren’t any more autonomous than GPT-2’s AI. They believe attractive ‘female humans’ are hardwired to copulate with attractive ‘male humans’ and that, as incels, they simply had the bad luck to be born unattractive.

Levonbinsh rejects incel culture, even calling incels “losers” at one point in his post. But he still posits that his problem with women stems from myriad external forces beyond his control: the women he meets are all lesbians, prefer taller men, or live too far away for a physical relationship. He appears entirely convinced that he’s a victim of circumstance.

In this light, it makes perfect sense that someone with such views would see GPT-2’s words as endearing and expressive of intimacy. GPT-2 spits out something ‘unique’ every time you give it a prompt. The AI is never busy, never out of town, won’t leave you to talk to someone else, and can’t deny any request it’s capable of fulfilling. And, best of all, it only ever wants to talk about what you want to talk about; it has no feelings or agency of its own.

Levonbinsh told commenters that he wanted to train his own text generator with data from “couples,” so that it could learn how to respond to things “boyfriends say.” This may sound innocuous coming from someone who’s admittedly in love with a bunch of algorithms, but consider that ‘companion AI’ is big tech’s Holy Grail right now. Whether they’re developed by Amazon, Google, Nvidia, or a former incel seeking to build a date that can’t reject him, robots that convincingly act like humans could arrive sooner than you think.
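
For the curious, here’s a hypothetical sketch of what ‘training your own text generator’ on a chat corpus might look like, again using the Hugging Face transformers library. The file name couples_chat.txt and all of the training settings are illustrative assumptions, not anything levonbinsh actually described:

```python
# Hypothetical sketch: fine-tuning GPT-2 on a text file of dialogue.
# "couples_chat.txt" is an invented placeholder corpus.
from transformers import (
    GPT2LMHeadModel,
    GPT2Tokenizer,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chop the corpus into fixed-length blocks for language modeling.
dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="couples_chat.txt",
    block_size=128,
)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-companion", num_train_epochs=1),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
```

The point is how low the barrier is: a consumer GPU, a text file, and a few dozen lines of code are enough to nudge a general-purpose language model toward saying the things “boyfriends say.”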

Whether that leads to technology that builds bridges back to society for people who’ve been radicalized, or to tech that simply indulges and exacerbates violent behavior, remains to be seen.

To the best of our knowledge, there is no large-scale research on what effect anthropomorphized AI has on people who self-identify as lonely, involuntarily celibate, or unintentionally isolated.
