A team of researchers from the State University of New York (SUNY) recently developed a method for detecting whether the people in a video are AI-generated. It looks like DeepFakes may have met their match.
What it means: Fear over whether computers will soon be able to generate videos that are indistinguishable from real footage may be much ado about nothing, at least with the currently available methods.
The SUNY team observed that the AI models used to create fake videos are trained on still images – not video footage. Because people are rarely photographed with their eyes closed, certain human physiological quirks – like breathing and blinking – rarely show up in computer-generated videos. So the researchers built an AI that uses computer vision to detect blinking, flagging videos in which it is absent as likely fakes.
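The article doesn't detail the team's implementation, but a common computer-vision approach to blink detection is the eye aspect ratio (EAR): from six landmark points around each eye, compute the ratio of the eye's vertical openings to its width, which drops sharply whenever the eye closes. A minimal sketch of that idea (the landmark layout, threshold, and helper names here are illustrative assumptions, not the SUNY team's actual method):

```python
import math


def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) landmark
    points around one eye, ordered p1..p6 as in the common 68-point
    facial-landmark layout (p1/p4 are the eye corners)."""
    p1, p2, p3, p4, p5, p6 = eye

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Vertical openings divided by horizontal width: roughly constant
    # while the eye is open, and near zero when it closes.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


def count_blinks(ear_per_frame, threshold=0.2, min_frames=2):
    """Count blinks in a sequence of per-frame EAR values: a blink is
    a run of at least `min_frames` consecutive frames below the
    threshold. A genuine video of a resting adult should show several
    blinks per minute; a count of zero is suspicious."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:  # sequence ended mid-blink
        blinks += 1
    return blinks
```

In practice the landmark points would come from a face-landmark detector run on each frame; a real system would then compare the blink rate against typical human rates rather than just checking for zero.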
The big deal: DeepFake is a deep learning method that makes it possible to replace the face of a person in a video with someone else’s. Upon its release to the public last year, it was immediately used to exploit people through the creation of fake pornography featuring celebrity faces pasted over adult film actors’ bodies.
Until now, detecting these videos has been a matter of personal expertise. If you know what to look for, you can reasonably determine whether a video has been faked. But the potential for dangerous exploitation still exists, and the technology used to create fake videos continues to improve at an alarming rate. Experts believe we’ll reach a point where the only way to determine whether a video was AI-generated will be through the use of advanced detection tools.
What’s next: We’re certain that someone will make an AI that can generate fake videos with humans that blink. And then a research team is going to have to figure out how to beat that one.
Luckily, the SUNY team already has plans to build a more robust detector that will look for signals like pulse and breathing, along with more advanced forms of blink detection.