This article was published on November 29, 2018

AI-generated fingerprints could soon fool biometric systems

Research led by two top universities has shed doubt on whether biometric security systems, on their own, can protect our most sensitive data.

Humans are notoriously bad at creating secure passwords. But that’s okay; we’ve fixed the problem, at least somewhat, by introducing a slew of new devices that rely on biometric authentication, whether in the form of fingerprints, voice recognition, or facial scanning.

Researchers at New York University and Michigan State University, however, have their doubts about whether biometrics alone are enough. “Fingerprint-based authentication is still a strong way to protect a device or system, but at this point, most systems don’t verify whether a fingerprint or other biometric is coming from a real person or replica,” said Phillip Bontrager, lead author of the paper and doctoral student at NYU.

At issue is the way most fingerprint sensors work. Previous research by NYU professor Nasir Memon detailed a fatal flaw in some systems: rather than using a full fingerprint, most rely on partial fingerprints to confirm identity. Most devices allow users to submit a number of fingerprint images, and a match against any saved partial is often enough to confirm identity.

This led Memon and Professor Arun Ross of Michigan State University to coin the term “MasterPrint” for partial prints common enough to match the saved partials of many different users.
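For illustration only, here is a minimal Python sketch of the kind of partial-matching logic the researchers criticize. The feature sets, similarity score, and threshold are hypothetical stand-ins for a real sensor's matcher, not code from the paper.

```python
# Toy model of partial-fingerprint matching -- an illustration of the flaw
# described above, not the researchers' code. The feature sets, similarity
# score, and threshold are all hypothetical.

def similarity(probe, template):
    # Stand-in for a real minutiae matcher: fraction of template features
    # that also appear in the probe.
    return len(probe & template) / max(len(template), 1)

def is_accepted(probe, saved_partials, threshold=0.6):
    # Devices typically store several partial templates per enrolled finger
    # and unlock if the probe matches ANY one of them -- the weakness that
    # makes a "MasterPrint" possible.
    return any(similarity(probe, t) >= threshold for t in saved_partials)

# Partial scans saved when the user enrolled a finger.
saved_partials = [
    {"ridge_a", "ridge_b", "whorl_1"},
    {"ridge_c", "loop_2", "whorl_1"},
]

# A probe only has to resemble one stored partial to be accepted.
probe = {"ridge_a", "whorl_1", "loop_9"}
print(is_accepted(probe, saved_partials))  # True
```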


Recently, researchers built upon these findings to create a new machine-learning algorithm that generates synthetic fingerprints. These AI-generated fakes could be pitted against real devices in the near future by harvesting fingerprint images stored in fingerprint-accessible systems. And if that happens, the researchers suggest, these fingerprints could be used to launch a brute-force attack, testing one fingerprint after another against a system until it opens a device, or a door, as the case may be.
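That brute-force idea can be sketched in a few lines of Python. This is an illustration under stated assumptions, not the researchers' actual method: generate_synthetic_print() stands in for sampling from a trained generative model, and matcher_accepts() for a target device's matcher; both are hypothetical placeholders.

```python
# Illustrative brute-force loop -- a sketch of the attack scenario described
# above, not the researchers' actual method.

import random

def generate_synthetic_print(rng):
    # Stand-in for sampling a synthetic fingerprint from a trained
    # generative model; here it just draws a random set of features.
    return {f"feature_{rng.randint(0, 50)}" for _ in range(8)}

def matcher_accepts(candidate, enrolled_partials, threshold=0.5):
    # Stand-in for a device's matcher: accept if the candidate overlaps
    # enough with any enrolled partial template.
    return any(
        len(candidate & t) / max(len(t), 1) >= threshold
        for t in enrolled_partials
    )

def brute_force(enrolled_partials, max_attempts=10_000, seed=0):
    # Test one synthetic print after another until the matcher accepts one,
    # the way an attacker could test each fake against a system.
    rng = random.Random(seed)
    for attempt in range(1, max_attempts + 1):
        if matcher_accepts(generate_synthetic_print(rng), enrolled_partials):
            return attempt  # number of fakes tried before the "device" opened
    return None

enrolled = [{"feature_1", "feature_5", "feature_9", "feature_12", "feature_20"}]
print(brute_force(enrolled))
```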

Currently, the system hasn’t been tested on real devices. At this point, it’s purely hypothetical. But the research is undoubtedly important at a time when we’re trusting biometric devices to secure ever-increasing amounts of sensitive data.
