This article was published on November 29, 2018

    AI-generated fingerprints could soon fool biometric systems

    Story by Bryan Clark, Former Managing Editor, TNW

    Bryan is a freelance journalist.

    Research led by two top universities has cast doubt on whether biometric security systems, on their own, can protect our most sensitive data.

    Humans are notoriously bad at creating secure passwords. But that’s okay; we’ve fixed the problem, at least somewhat, by introducing a slew of new devices that rely on biometric authentication, whether in the form of fingerprints, voice recognition, or facial scanning.

    Researchers at New York University and Michigan State University, however, have their doubts about whether biometrics alone are enough. “Fingerprint-based authentication is still a strong way to protect a device or system, but at this point, most systems don’t verify whether a fingerprint or other biometric is coming from a real person or replica,” said Philip Bontrager, lead author of the paper and doctoral student at NYU.

    At issue is the way most fingerprint sensors work. Previous research by NYU professor Nasir Memon detailed a fatal flaw in some systems: rather than using a full fingerprint, they relied on partial fingerprints to confirm identity. Most devices allow users to enroll a number of fingerprint images, and a match for any saved partial is often enough to confirm identity.

    This led Memon and Professor Arun Ross, of Michigan State University, to coin the term “MasterPrint” for a partial print that, by chance, matches enough stored partials to unlock a device.

    Recently, researchers built upon these findings to create a new machine-learning algorithm that generates synthetic fingerprints. These AI-generated fakes could be pitted against real devices in the near future, seeded by fingerprint images harvested from fingerprint-accessible systems. If that happens, the researchers suggest, the synthetic prints could be used to launch a brute-force attack, testing each fingerprint against a system until it opens a device, or a door, as the case may be.
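    To see why matching against any of several saved partials makes this kind of brute-force attack plausible, here is a toy Python simulation. It is not the researchers' method: the bit-vector "templates," the loose similarity threshold, and the matcher are all invented for illustration. Real sensors compare minutiae features, and real MasterPrints are optimized rather than random, so they fare far better than this sketch suggests.

```python
import random

random.seed(0)

TEMPLATE_BITS = 64       # toy stand-in for a partial-fingerprint template
MATCH_THRESHOLD = 0.70   # deliberately loose matcher, for illustration only

def random_template():
    """A fake partial-print template: just a random bit vector."""
    return [random.randint(0, 1) for _ in range(TEMPLATE_BITS)]

def similarity(a, b):
    """Fraction of bit positions on which two templates agree."""
    return sum(x == y for x, y in zip(a, b)) / TEMPLATE_BITS

def device_accepts(enrolled_partials, candidate):
    # A match against ANY saved partial unlocks the device --
    # this "any-of-many" rule is the weakness the MasterPrint work exploits.
    return any(similarity(p, candidate) >= MATCH_THRESHOLD
               for p in enrolled_partials)

# Many devices, each storing several enrolled partial templates.
devices = [[random_template() for _ in range(8)] for _ in range(500)]

# A small "dictionary" of synthetic prints tried in sequence (brute force).
synthetic_prints = [random_template() for _ in range(20)]

cracked = sum(any(device_accepts(d, s) for s in synthetic_prints)
              for d in devices)
print(f"{cracked} of {len(devices)} toy devices accepted a synthetic print")
```

    Even purely random "prints" unlock a nontrivial fraction of the toy devices, because each attempt gets to match against every enrolled partial of every device; tightening the threshold or matching against a single full print would make the attack dramatically harder.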

    The system hasn’t yet been tested on real devices; at this point, it’s purely hypothetical. But the research is undoubtedly important at a time when we’re trusting biometric devices to secure ever-increasing amounts of sensitive data.