This article was published on October 7, 2019

A new study says nearly 96% of deepfake videos are porn


Image by: Flickr: Steve C

Deepfakes are one of the scariest phenomena to emerge from recent technology trends. From demeaning women to causing financial losses, these artificial videos are creating a lot of trouble for people.

Now Deeptrace, a Netherlands-based cybersecurity company, has published a new report stating that 96 percent of deepfake videos online are porn, and that they have received over 134 million views. What's important to note is that all of these porn videos feature female subjects.

Danielle Citron, a professor of law at Boston University and author of Hate Crimes in Cyberspace, told the company that deepfakes are being used as a weapon against women:

Deepfake technology is being weaponized against women by inserting their faces into porn. It is terrifying, embarrassing, demeaning, and silencing. Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe.

The report says the company has found 14,678 deepfake videos online – nearly double the 7,964 videos it counted in December 2018. It also notes that 96 percent of these videos are porn.

Deeptrace says that since February last year, a number of sites dedicated to deepfake porn have opened up, and the top four of them have racked up over 134 million views.

Spread of deepfakes by gender and nationality

Last year, Reddit banned r/Deepfakes, the premier subreddit for deepfakes, which was host to many creators. Deeptrace says it has found 20 creator communities with nearly 100,000 members in total, spread across deepfake porn sites, Reddit, 4chan, and 8chan.

The development of deepfakes has also been boosted by popular face swap algorithms hosted on GitHub and the rapid rise of Generative Adversarial Networks (GANs) in machine learning.
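For context, a GAN pits two neural networks against each other: a generator that fabricates samples and a discriminator that tries to tell them apart from real data. The snippet below is a minimal, generic sketch of that training loop on toy 2D data, assuming PyTorch is available; it is purely illustrative and is not the face-swap code referenced in the report.

```python
# Illustrative GAN training loop on a toy 2D dataset (assumes PyTorch).
# Not the face-swap software discussed in the report; just the core idea.
import torch
import torch.nn as nn

latent_dim = 8

# Generator: maps random noise to fake 2D samples.
G = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, 2))
# Discriminator: scores whether a 2D sample looks real (1) or fake (0).
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(2000):
    # "Real" data: a shifted Gaussian standing in for real images.
    real = torch.randn(64, 2) * 0.5 + torch.tensor([2.0, 2.0])
    noise = torch.randn(64, latent_dim)
    fake = G(noise)

    # Train the discriminator to tell real from fake.
    opt_d.zero_grad()
    loss_d = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    loss_d.backward()
    opt_d.step()

    # Train the generator to fool the discriminator.
    opt_g.zero_grad()
    loss_g = bce(D(fake), torch.ones(64, 1))
    loss_g.backward()
    opt_g.step()
```

Deepfake tools build on this adversarial idea, only with far larger models trained on images of faces rather than toy data.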

Thanks to these advances, we saw a repulsive app that generates nudes from clothed photos gain popularity quickly. While the app was soon shut down, many of its clones sprang up across the internet.

Deeptrace also says there are now several apps, online marketplaces, and services that help you create deepfakes in minutes. So even someone who isn't technically fluent can pay a couple of bucks and get a ready-made fake video.

While the majority of these videos are porn, there are also instances where deepfakes have caused controversies elsewhere. Deeptrace says deepfakes pose a major threat and provide cybercriminals with new, sophisticated capabilities to enhance social engineering and fraud.

You can read the full report here.
