
This article was published on October 5, 2018

Deepfakes are being weaponized to silence women — but this woman is fighting back

For the ongoing series, Code Word, we’re exploring whether — and how — technology can protect individuals against sexual assault and harassment, and how it can help and support survivors.

Fake sex videos aren’t a new phenomenon, but advances in AI are worrying as ‘deepfakes’ become increasingly hard to distinguish from real videos.

Deepfake tech has become easily accessible: videos can be made with FakeApp on affordable consumer-grade hardware, which is partly why earlier this year the web was flooded with pornographic videos of high-profile female celebrities.

The erosive effects deepfakes could have on politics are also obvious — such as ‘Barack Obama’ calling Trump a “dipshit.” However, we haven’t yet seen a governing body’s or politician’s reputation destroyed by deepfakes. What we have seen is countless women’s likenesses non-consensually inserted into sexual or pornographic videos — whether they’re celebrities or not.

Being a victim of deepfake pornography

For 24-year-old Noelle Martin, her battle with deepfake pornography started six years ago. Anonymous predators stole non-sexual images of her from social media and posted them to porn sites and forum threads, accompanied by invasive and graphic commentary about her body.

The situation escalated further, Martin told TNW: “It then moved to doctoring images of me into graphic pornography, on the cover of pornographic DVDs to fake images of me being ejaculated on.”

Some of the images were taken when Martin was just 17, raising the issue of child pornography.

“They then doctored me into pornographic videos performing oral sex and having sexual intercourse,” Martin explained.

Although it’s been six years since the first deepfake of Martin, she still faces continued harassment today. Martin says this is in response to her speaking out publicly about the initial deepfake: “It was when I started speaking out publicly about my story of image-based abuse that the perpetrators wanted to assert their dominance over me and try to silence me.”

In addition to making sexually exploitative content of Martin without her consent — or the porn performers’ — harassers have on several occasions shared her location and full name publicly to intimidate her.

Despite this, Martin has continued to speak out publicly against deepfakes, and even gave a TED talk to share her story as a victim.

These deepfakes are still easily found by searching Martin’s name, raising questions about her future employability and online reputation. Not being a celebrity, Martin doesn’t have the benefit of people knowing that these images must be fake — as they would assume with famous actors.

Martin’s refusal to give up and determination to keep fighting for victims of deepfake revenge porn has led to greater awareness of the issue, especially in legislative bodies. Her fight resulted in tougher legislation on revenge porn in her home country of Australia, improving the legal status of victims. But the fight against deepfakes is far from over.

The fight against deepfakes

Earlier this year, deepfakes started disappearing from Gfycat, a popular platform for deepfake uploads. This was largely because of Project Angora, an AI-assisted system that combats fake pornographic content by searching the web for higher-resolution versions of the GIFs people were trying to upload.
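
Gfycat hasn’t published Angora’s code, but the underlying idea, finding a higher-resolution original of an uploaded clip, resembles reverse image search built on perceptual hashing. Here is a minimal sketch of that general technique, assuming the open-source Pillow and imagehash Python libraries; the file paths and threshold are hypothetical.

```python
# Illustrative sketch only, not Gfycat's actual code. Perceptual hashes
# survive resizing and re-encoding, so a low-res GIF frame can be matched
# against a higher-resolution source image found elsewhere on the web.
from PIL import Image, ImageSequence
import imagehash  # pip install imagehash pillow

HAMMING_THRESHOLD = 8  # max differing bits to treat two images as the same scene

def frames_match_source(gif_path: str, source_path: str) -> bool:
    """Return True if any frame of the GIF perceptually matches the source image."""
    source_hash = imagehash.phash(Image.open(source_path))
    with Image.open(gif_path) as gif:
        for frame in ImageSequence.Iterator(gif):
            frame_hash = imagehash.phash(frame.convert("RGB"))
            if frame_hash - source_hash <= HAMMING_THRESHOLD:  # Hamming distance
                return True
    return False

# Hypothetical usage: the upload versus a higher-resolution scene found online.
if frames_match_source("upload.gif", "original_scene.jpg"):
    print("Upload matches a known source; faces can now be cross-checked.")
```

Once a higher-resolution source is found, the faces in the upload can be compared against the faces in the original — which is where Project Maru comes in.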

Alongside Project Angora, Project Maru battled deepfakes by recognizing individual faces in GIFs and tagging them. As Motherboard reported, Maru can tell that a fake porn GIF of Emma Watson looks similar to Emma Watson, but not identical, a mismatch that flags the clip as doctored.
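
Maru’s implementation isn’t public either, but the ‘similar yet not identical’ comparison can be illustrated with face embeddings. The sketch below uses the open-source face_recognition library; the distance thresholds and file names are invented for illustration.

```python
# Illustrative sketch only. Face embeddings place similar faces close together,
# so a doctored face tends to land near, but not on top of, the real person's.
import face_recognition

CLOSE_MATCH = 0.40  # below this, the face closely matches the reference person
LOOKALIKE = 0.60    # between the two bounds, it resembles them without matching

def classify_face(frame_path: str, reference_path: str) -> str:
    """Compare the first face in a frame against a verified reference photo."""
    ref_encoding = face_recognition.face_encodings(
        face_recognition.load_image_file(reference_path))[0]
    frame_encodings = face_recognition.face_encodings(
        face_recognition.load_image_file(frame_path))
    if not frame_encodings:
        return "no face found"
    distance = face_recognition.face_distance([ref_encoding], frame_encodings[0])[0]
    if distance < CLOSE_MATCH:
        return "matches the reference: likely genuine footage of this person"
    if distance < LOOKALIKE:
        return "resembles the reference but isn't identical: possible deepfake"
    return "a different person"

# Hypothetical usage: one extracted GIF frame versus a verified photo.
print(classify_face("gif_frame.jpg", "verified_reference.jpg"))
```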

The main problem with this technology is that it’s not accurate enough, and as deepfakes become more realistic, it could miss fake porn GIFs.

We can’t solely rely on this technology to take care of the deepfake problem. Online platforms play a vital role in preventing and responding to fake porn — large platforms including Reddit, Twitter, and PornHub have banned deepfakes — but clips continue to exist on some of these sites, not to mention other sites that have re-posted the explicit content.

But technology will never solve the problem on its own; there will always be a need for coordinated government intervention in the form of clear laws.

Coordinated fight for the future

Since the internet is virtually borderless, perpetrators can be on the other side of the world, ruining the life of a person they have never met and never will.

“There needs to be stronger laws and penalties for this behavior and governments around the world, law enforcement, tech companies, websites, and social media sites need to work together if this is ever going to be dealt with effectively,” Martin explained.

New laws and defensive technologies will help tackle deepfakes, but there also needs to be a broader cultural shift. We need to change our attitudes toward victims of image-based sexual abuse, as deepfake pornography is just the latest iteration of women’s bodies being controlled and abused.

This is a big undertaking, but we can do better. We have to.
