Human-centric AI news and analysis

This spooky deepfake AI mimics dozens of celebs and politicians

There's no celebrity it can't mimic


“I did not fuck my dog,” I can hear in a recording I’m currently listening to. “I did not cum on my dog,” the recording continues. “I did not put my dick anywhere near my dog. I’ve never done anything weird with my dogs.”

The voice sounds oddly familiar, like I’ve heard it a thousand times before — and I have. Indeed, it sounds just like Sir David Attenborough. But it’s not him. It’s not a person at all.

It’s simply a piece of AI software called Vocodes. The tool, which I can best describe as a deepfake generator, can mimic the voices of a slew of politicians and celebrities, including Donald Trump, Barack Obama, Bryan Cranston, Danny DeVito, and a dozen more.

All you need to do is write anything, choose a voice you’d like to mimic, and Vocodes takes care of the rest.

We ran the same sentence through the voices of Attenborough, TV personality Craig Ferguson, Y Combinator’s Sam Altman, and comedian Gilbert Gottfried.

(We used the same sentence when testing a voice generator trained on Jordan Peterson’s voice, to show how easy it is to put words, no matter how vulgar or vile, into someone else’s mouth.)

We’ve previously seen apps like this, but Vocodes impresses with the sheer volume of voices available to test out.

Still, not all of them are perfect. In fact, the app ranks the voices by quality, from high to terrible. Vocodes is best at mimicking the voices of Sir David Attenborough, Craig Ferguson, Sam Altman, and Gilbert Gottfried, but even those models struggle with unorthodox words and expressions.

Things only get worse when we move on to voices the AI is less familiar with, like those of Hillary Clinton, John Oliver, or Tupac Shakur.

Whenever the app encounters a word it can’t read, it simply skips over it. That’s what happened when I entered a sentence including the words “Tesla” and “Elon Musk,” for instance.
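Vocodes’ code isn’t public, so we can only guess at the mechanics, but the word-skipping behavior is consistent with a text-to-speech front end that checks each word against a pronunciation lexicon and silently drops anything it can’t find. A minimal sketch of that assumed behavior (the lexicon and function here are purely illustrative, not Vocodes’ actual implementation):

```python
# Toy pronunciation lexicon; real TTS systems use dictionaries
# with tens of thousands of entries. Names like "Tesla" would
# simply be absent.
LEXICON = {"i", "did", "not", "buy", "a", "car"}

def filter_utterance(text: str) -> str:
    """Keep only words the lexicon can pronounce; skip the rest."""
    words = text.lower().split()
    return " ".join(w for w in words if w in LEXICON)

# An out-of-vocabulary word vanishes from the spoken output:
print(filter_utterance("I did not buy a Tesla car"))
# -> "i did not buy a car"
```

A more robust front end would instead fall back to grapheme-to-phoneme rules for unknown words, which is why skipping them outright is a telltale sign of a simpler pipeline.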

The difference in quality between voices is most obvious in the way the AI stitches words and sentences together. Voices rated “high quality” mimic speech in a far more natural and balanced way, while lower-quality alternatives resemble the cadence of Windows 2000’s iconic text-to-speech voice: they lack rhythm and sound detached.

As with other similar apps, Vocodes raises some serious ethical questions about the future of deepfakes. When a developer released an app that almost perfectly imitated the voice of Jordan Peterson, the controversial psychologist penned a post voicing his concern that deepfakes “need to be stopped, using whatever legal means are necessary.”

While the psychologist was certainly right to ring the alarm over the potential for deepfakes to be used for nefarious ends, his take also drew criticism for seeking to stifle creative uses of the technology.

Eventually, the maker of the app — dubbed NotJordanPeterson — decided to shut down the service, citing Peterson’s post as the reason. “In light of Dr. Peterson’s response to the technology demonstrated by this site…and out of respect for Dr. Peterson, the functionality of the site will be disabled for the time being,” a message on the site read at the time.

“I’m not a lawyer, but I think we’re entering into a legal gray area,” Vocodes creator Brandon Thomas said, addressing the legal implications of the app. “There are the existing frameworks of copyright, parody, free speech, slander, libel, etc. that are all somewhat tangential to this.”

“I believe (I’m not certain) that celebrity voice impersonation is legal as long as it is not used to sell or endorse a product,” he added.

“I don’t think the legislature should be overly protective against machine learning. It seems obvious to me that neural networks will play a huge role in creating entirely virtual musicians and influencers,” Thomas further said. “At the same time, we don’t want these techniques used to commit fraud, slander, or have them be used to falsely accuse someone of committing some act. These are things we might need new legal protections for.”

Whether Vocodes will meet the same fate as NotJordanPeterson remains to be seen. In the meantime, you can try the app for yourself.

Published July 28, 2020 — 15:07 UTC