This article was published on January 31, 2018

Someone is finally purging the AI-made fake celebrity porn from Reddit


Several weeks back we wrote about a bizarre new trend: Redditors were harnessing the power of artificial intelligence to create fake celebrity porn to fap to. But now it seems that someone has finally begun purging the controversial content from some of the platforms where it was hosted, including Reddit and Gfycat.

Shortly after Motherboard brought this phenomenon to light back in December, the so-called “Deepfakes” trend picked up even more momentum, with numerous Redditors using the AI-powered software to create new footage. In fact, one user went as far as building an app, called FakeApp, that essentially makes it easy for anyone to contribute their own fake videos.

But this might all be coming to an end now – at least on some platforms.

Redditors have pointed out that many Deepfakes videos – especially ones posted in a subreddit specifically dedicated to pornographic Deepfakes – have suddenly disappeared from Reddit and the popular GIF-sharing service Gfycat.

While recently posted fakes continue to appear on Gfycat, most entries older than a day have been wiped.

“I just noticed that […] my upload yesterday was deleted, could be a copyright issue,” one user speculated. “I don’t think it’s copyright since random Japanese idols with little to minimal presence in the West have been removed,” another Redditor chimed in, “[e]ven the botched ones that have no resemblance to any human being are gone. That suggests someone from [G]fycat proactively removed all the gifs linked here.”

Indeed, some users have since begun re-uploading the missing content to other platforms.

In place of the footage, missing Gfycat entries now display an error message.

Similarly, users have pointed out that numerous Reddit threads and posts featuring Deepfake content have disappeared from the platform.

We have reached out to Gfycat for further comment and will update this piece accordingly should we hear back.

For what it’s worth, the Deepfakes trend has bred some rather benign – and occasionally funny – content too. Indeed, many Redditors used the same technology to put the face of Hollywood A-lister Nicolas Cage in a slew of movies he never appeared in.

It seems that even some of these harmless clips have now gone missing.

But as The Verge has astutely pointed out, things get exponentially more problematic when the same software is applied to splice the faces of real people into smut flicks.

What makes matters worse is that it remains unclear what measures such people can take to have the fake footage taken down. Speaking to The Verge, Santa Clara University law professor Eric Goldman noted that the legal situation is complicated, to say the least. Moreover, removing such content could possibly be a violation of the First Amendment.

“Although there are many laws that could apply, there is no single law that covers the creation of fake pornographic videos,” Megan Farokhmanesh wrote for The Verge, “and there are no legal remedies that fully ameliorate the damage that deepfakes can cause.”

Another issue, underscored by Wired, is that the bodies that appear in these fake videos technically do not belong to the celebrities whose faces we see in these clips, which makes it difficult to pursue such cases as privacy violations.

“You can’t sue someone for exposing the intimate details of your life when it’s not your life they’re exposing.”

Then there is the whole question of intent. One Redditor who spoke in support of Deepfakes said that “the work that we create here in this community is not with malicious intent.”

“Quite the opposite,” the user continued. “We are painting with revolutionary, experimental technology, one that could quite possibly shape the future of media and creative design. This technology is very new. And new scares people. A lot.”

But still, one can’t help but wonder why the community has focused so much of its attention on putting celebrities in fake porn videos.

Say what you will, but there must be more compelling applications for this technology than fapping.

Update: A spokesperson for Gfycat has since confirmed that the controversial fake celeb smut flicks have indeed been deliberately removed.

“Our terms of service allow us to remove content we find objectionable,” Gfycat told TNW over email. “We find this content objectionable and are actively removing it from our platform.”
