
A not-for-profit called RepresentUs, working with creative media agency Mischief @ No Fixed Address, recently used deepfake AI technology to create a pair of political ads featuring actors digitally manipulated to look like Vladimir Putin and Kim Jong Un mocking the current state of US politics.
The ads were reportedly slated to air on Fox, CNN, and MSNBC in their DC markets but were “pulled at the last minute” for reasons unknown.
Allow me to clear up the mystery: they were probably pulled because this is a bad idea. But before we get into that, let’s take a moment to break down what’s happening in the ads.
Here’s Deepfake Vladimir Putin:
And here’s Deepfake Kim Jong Un:
RepresentUs, the not-for-profit behind the project, says on its website that it brings together “conservatives, progressives, and everyone in between to pass powerful state and local laws that fix our broken elections and stop political bribery. Our strategy is central to ending political corruption, extremism and gridlock.”
The creators claim the purpose of the deepfake ads is to be shocking and to warn voters about the potential dangers our democracy faces. On the surface, this is a great message and it’s easy to get behind the campaign. Whether you’re politically red, blue, purple, or none of the above: if you’re eligible to vote, you should.
This ad campaign is a bad idea because the political battlefield is already rife with bots, bad actors, misinformation, disinformation, and organized chaos designed to disenfranchise as many people as possible.
Instead of clearing the air or cutting through the noise, these ads are just more signal distortion. Not only are they disingenuous on their surface, but they’re marketing fluff. There’s no revelatory information in the ads. It’s a fantasy that distracts from reality. The Putin and Kim in those videos are claiming they aren’t messing with our democracy because they don’t have to.
Yet there’s ample evidence that they are engaged in massive interference campaigns. So what’s the real purpose of this ad campaign?
Who is the target audience for this faux-deception? People who think things are going so well that the only way they’d vote is if they were momentarily tricked into thinking Putin and Kim aren’t actively attempting to influence the 2020 election? It doesn’t add up.
The idea that deepfakes, intentional deception, can be used for political good is a silly one. You won’t find any renowned experts opining that US politics doesn’t have enough subterfuge.
It’s important to mention that the organizations behind the Deepfake Dictators campaign aren’t hiding the fact that these are fake videos. They run a disclaimer at the end of them.
But even this seems a bit skeevy to me, as the disclaimer should run throughout the entirety of the clips to ensure no reasonable person could believe they were real. The people making the videos don’t get to decide what bad actors do with them, but they shouldn’t make it ridiculously easy for their work to be abused.
There’s no good reason to try and “fool” anyone in politics anymore, especially not the voters. Ad campaigns like this are toxic, and the fact that this one was created by an outfit that professes no bias toward any particular candidate makes it suspicious. Why muddy the already murky water surrounding the 2020 campaign when our democracy is already drowning in propaganda?
At best, it’s a misguided effort to be provocative for the sake of being provocative. “Look at us, we’re doing something we’re not supposed to, but it’s for a good cause,” it screams, while using deepfakes for political influence ads, the one thing every AI expert on the planet feared would happen.
But at its worst, it looks like a pointed attempt to add more noise to the election scene while simultaneously downplaying the threat of foreign election interference. And that’s a bad look no matter what your original intent was.