

Who thought political ads featuring Deepfake Putin and Kim trashing the US were a good idea?

We don't need fake dictators claiming they aren't attacking our democracy when the real ones are


A not-for-profit called RepresentUs, working with creative media agency Mischief @ No Fixed Address, recently used deepfake AI technology to create a pair of political ads featuring actors digitally manipulated to look like Vladimir Putin and Kim Jong Un mocking the current state of US politics.

The ads were reportedly slated to air on Fox, CNN, and MSNBC in their DC markets but were “pulled at the last minute” for reasons unknown.

Allow me to clear up the mystery: they were probably pulled because this is a bad idea. But before we get into that, let’s take a moment to break down what’s happening in the ads.

Here’s Deepfake Vladimir Putin:

And here’s Deepfake Kim Jong Un:


RepresentUs, the not-for-profit behind the project, says on its website that it brings together “conservatives, progressives, and everyone in between to pass powerful state and local laws that fix our broken elections and stop political bribery. Our strategy is central to ending political corruption, extremism and gridlock.”

The creators claim the purpose of the Deepfake ads is to be shocking and to warn voters about the potential dangers our democracy faces. On the surface, this is a great message and it’s easy to get behind the campaign. Whether you’re politically red, blue, purple, or none of the above: if you’re eligible to vote, you should.

This ad campaign is a bad idea because the political battlefield is already rife with bots, bad actors, misinformation, disinformation, and organized chaos designed to disenfranchise as many people as possible.

Instead of clearing the air or cutting through the noise, these ads are just more signal distortion. Not only are they disingenuous on their face, they’re marketing fluff. There’s no revelatory information in the ads. It’s a fantasy that distracts from reality. The Putin and Kim in those videos are claiming they aren’t messing with our democracy because they don’t have to.

Yet, there’s ample evidence that they are engaged in massive interference campaigns. So what’s the real purpose of this ad campaign?

Who is the target audience for this faux-deception? People who think things are going so well that the only way they’d vote is if they were momentarily tricked into thinking Putin and Kim aren’t actively attempting to influence the 2020 election? It doesn’t add up.

The idea that Deepfakes, a form of intentional deception, can be used for political good is a silly one. You won’t find any renowned experts opining that US politics doesn’t have enough subterfuge.

It’s important to mention that the organizations behind the Deepfake Dictators campaign aren’t hiding the fact that these are fake videos. They run a disclaimer at the end of them.

But even this seems a bit skeevy to me, as the disclaimer should run throughout the entirety of the clips to ensure no reasonable person could believe they were real. The people making the videos don’t get to decide what bad actors do with them, but they shouldn’t make it ridiculously easy for their work to be abused.

There’s no good reason to try to “fool” anyone in politics anymore, especially not the voters. Ad campaigns like this are toxic, and the fact that this one was created by an outfit that professes no bias toward any particular candidate makes it all the more suspicious. Why muddy the already murky water surrounding the 2020 campaign when our democracy is already drowning in propaganda?

At best, it’s a misguided effort to be provocative for the sake of being provocative — “look at us, we’re doing something we’re not supposed to but it’s for a good cause,” it screams while using Deepfakes for political influence ads, the one thing every AI expert on the planet feared would happen.

But at its worst, it looks like a pointed attempt to add more noise to the election scene while simultaneously downplaying the threat of foreign election interference. And that’s a bad look no matter what your original intent was.
