A YouTuber this week created a convincing fake video of Donald Trump using the deepfakes algorithm.
We’ve written previously about the algorithm, specifically about its use in creating fake pornographic videos featuring the faces of celebrities and the bodies of passable porn actresses. Now that popular platforms like Reddit, Imgur, and Pornhub are cracking down on the practice, AI enthusiasts are turning their attention to newer, SFW applications.
This is good. But also not.
Take this example, which features Alec Baldwin on Saturday Night Live doing his famous impersonation of President Trump. YouTuber ‘derpfakes’ trained the AI face-swap tool to superimpose Trump’s face over Baldwin’s speech and mannerisms. The result is a convincing, if not quite ready-for-primetime, representation of the Commander in Chief.
While the algorithm isn’t quite there yet, the progress made in the past several weeks has been staggering. What began as pixelated, semi-recognizable representations of actresses in compromising positions now shows real potential for upending our understanding of truth.
Trump, in the above video, looks a little too perfect to be lifelike, and he’s backed by the speech and mannerisms of Baldwin, which are ever-so-slightly off, since it’s not actually Trump talking or moving.
But how long do you think it will take before the same users smooth out the kinks?
Users are already trying. Take this fake, a mashup of Bruno Ganz playing Adolf Hitler in the 2004 film ‘Downfall’ and the current president of Argentina, Mauricio Macri.
Or there’s this, a fake of Inés Arrimadas, a member of Catalonia’s parliament, with her face overlaid on a pornographic video from adult content producer Twisty’s.
The latter two examples are from weeks earlier, when subtle facial cues gave away the illusion.
Now, though, the fakes are more convincing than ever, and downright passable for lesser-known celebrities or public figures. In a year or two, as the algorithms continue improving, it’s unclear whether the average person will even be able to discern authentic videos from fakes.
At that point, even video evidence becomes questionable, and perhaps even unbelievable. In a world that already can’t agree on simple facts, the future looks pretty terrifying.