
This article was published on July 2, 2019

Virginia outlaws deepfakes in its efforts to curb revenge porn


Image by: Flickr / Steve C

For the ongoing series, Code Word, we’re exploring if — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

Revenge porn isn’t exactly a new phenomenon, but with advancements in AI, “deepfakes,” an AI-based technology used to digitally produce or alter realistic-looking video content, are becoming increasingly difficult to distinguish from real videos. To combat this, Virginia just became one of the first states in the US to impose criminal penalties on the spread of non-consensual, computer-generated “deepfake” images and videos.

The amendment officially went into effect yesterday, The Verge reported, and means that anyone found guilty of distributing deepfake material will face a sentence of up to 12 months in prison and up to $2,500 in fines.

Virginia law has tackled revenge porn since 2014, when it banned the distribution of nude images or videos “with the intent to coerce, harass, or intimidate” someone. The new law clarifies that revenge porn also includes any “falsely created videographic or still image,” meaning photoshopped images, fake video footage, and deepfakes all fall under this law.

Deepfake tech has become easily accessible, and videos can be made via apps or on affordable consumer-grade equipment, which is partly why earlier this year the web was flooded with fake pornographic videos featuring high-profile female celebrities.

The erosive effects deepfakes could have on politics are also obvious, whether it’s a video of Barack Obama calling Trump a “dipshit” or doctored footage of Nancy Pelosi (D-CA) made to sound drunk.

However, we haven’t yet seen any governing bodies or politicians’ reputations destroyed by deepfakes. What we have seen is countless women’s likenesses superimposed onto sexual or pornographic videos without their consent, whether they’re celebrities or not.

Last week, a repulsive app made creating a deepfake even easier than you might have thought: it simply generated fake nudes from photos of clothed women (and, interestingly, not men). Thankfully, the app was short-lived and has been shut down, but some damage was arguably already done.

This isn’t the first crackdown on deepfakes. Earlier this year, deepfakes started disappearing from Gfycat, a popular platform for deepfake uploads. That was largely because of Project Angora, an AI-assisted system that combats fake pornographic content by searching the web for higher-resolution versions of the GIFs people were trying to upload.
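For a rough sense of how that kind of matching could work, here’s a minimal, purely illustrative sketch (not Gfycat’s actual Project Angora code) that compares a frame from an uploaded GIF against a known higher-resolution source image using perceptual hashing. The helper name looks_like_known_source and the use of the Pillow and imagehash libraries are assumptions made for this example.

```python
# Illustrative sketch only -- not Gfycat's actual Project Angora implementation.
# Compares an uploaded frame to a known higher-resolution source image using
# perceptual hashes, which survive resizing and re-encoding.
from PIL import Image
import imagehash

def looks_like_known_source(uploaded_frame_path: str,
                            source_image_path: str,
                            max_distance: int = 8) -> bool:
    """Return True if the uploaded frame perceptually matches the source image."""
    uploaded_hash = imagehash.phash(Image.open(uploaded_frame_path))
    source_hash = imagehash.phash(Image.open(source_image_path))
    # A small Hamming distance between the two hashes suggests the frames
    # come from the same original footage, even after compression or scaling.
    return uploaded_hash - source_hash <= max_distance
```

In a real pipeline, the "source image" side would come from a reverse image search across the web rather than a single local file.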

Alongside Project Angora, Project Maru battled deepfakes by recognizing individual faces in GIFs and tagging them. As Motherboard reported, Maru can see that a fake porn GIF of Emma Watson looks similar to Emma Watson, but not identical. The tech isn’t accurate enough yet, though, and as deepfakes become more and more realistic, it could miss fake porn GIFs entirely.
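To illustrate the “similar but not identical” idea, here’s a small hedged sketch (not Maru’s actual code) using the open-source face_recognition library: it flags a frame whose face sits close to a reference face in embedding space without matching it exactly. The function name, thresholds, and overall heuristic are assumptions for the example.

```python
# Illustrative sketch only -- not Gfycat's actual Project Maru implementation.
# Flags a face that resembles a reference person without matching them exactly,
# which can hint at a face swapped onto someone else's body.
import face_recognition

def similar_but_not_identical(frame_path: str, reference_path: str,
                              close: float = 0.6, exact: float = 0.3) -> bool:
    """Heuristic check: True if the face in the frame is near, but not an
    exact match for, the face in the reference image."""
    frame_faces = face_recognition.face_encodings(
        face_recognition.load_image_file(frame_path))
    reference_faces = face_recognition.face_encodings(
        face_recognition.load_image_file(reference_path))
    if not frame_faces or not reference_faces:
        return False  # no detectable face in one of the images
    # Smaller distance means more similar; typical "same person" cutoff is ~0.6.
    distance = face_recognition.face_distance([reference_faces[0]], frame_faces[0])[0]
    return exact < distance < close
```

The catch the article points to is exactly this gap: as generators improve, the distance shrinks toward an exact match and a detector tuned this way starts letting fakes through.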

We can’t solely rely on this technology to take care of the deepfake problem. Online platforms play a vital role in preventing and responding to fake porn — large platforms including Reddit, Twitter, and PornHub have banned deepfakes — but clips continue to exist on some of these sites.

It’s promising to see Virginia take further steps in tackling revenge porn, for both the women and the politicians who are targets of deepfake crimes. We can only hope other states follow.
