
AI-generated fake fingerprints endanger smartphone fingerprint sensors
The researchers used a neural network trained to synthesize human fingerprints to create a fake fingerprint that could potentially fool a touch-based authentication system for up to one in five people. Such a “DeepMasterPrint” – similar to a master key that can unlock every door in a building – uses artificial intelligence (AI) to match a large number of prints stored in fingerprint databases and can thus theoretically unlock a large number of devices.
The term “MasterPrint” – used to describe a set of real or synthetic fingerprints that can fortuitously match with a large number of other fingerprints – was coined by Nasir Memon, professor of computer science and engineering and associate dean for online learning at NYU Tandon, who has done previous research on fingerprint-based systems. Such systems use partial fingerprints, rather than full ones, to confirm identity.
Devices typically allow users to enroll several different finger images, and a match for any saved partial print is enough to confirm identity. As such partial fingerprints are less likely to be unique than full prints, Memon’s work demonstrated that enough similarities exist between partial prints to create MasterPrints capable of matching many stored partials in a database.
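The security impact of matching against "any saved partial print" can be illustrated with a back-of-the-envelope calculation. The numbers below are hypothetical, chosen only to show the effect, not drawn from the study:

```python
# Illustrative: why accepting a match against ANY of k stored partial
# templates weakens security. If a single partial-print comparison has
# false-match rate p, an impostor is accepted if any comparison matches.

def false_accept_probability(p: float, k: int) -> float:
    """Probability that at least one of k independent comparisons
    falsely matches, given per-comparison false-match rate p."""
    return 1 - (1 - p) ** k

# Hypothetical per-comparison false-match rate, for illustration only:
p = 0.001
for k in (1, 10, 30):
    print(f"{k} stored partials -> false-accept rate "
          f"{false_accept_probability(p, k):.4f}")
```

With these illustrative numbers, going from one stored partial to thirty raises the chance of a chance match roughly thirty-fold, which is the intuition behind MasterPrints.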
Taking this concept further, the latest research trained a machine-learning algorithm to generate synthetic fingerprints as MasterPrints. The researchers then created complete images of these synthetic fingerprints.
The process, say the researchers, is another step toward assessing the viability of MasterPrints against real devices, which they have yet to test. In addition, because these images replicate the quality of fingerprint images stored in fingerprint-accessible systems, they could potentially be used to launch a brute-force attack against a secure cache of such images.
“Fingerprint-based authentication is still a strong way to protect a device or a system, but at this point, most systems don’t verify whether a fingerprint or other biometric is coming from a real person or a replica,” says doctoral student Philip Bontrager, lead author of a paper on the study. “These experiments demonstrate the need for multi-factor authentication and should be a wake-up call for device manufacturers about the potential for artificial fingerprint attacks.”
The research has applications in fields beyond security, say the researchers. For example, the “Latent Variable Evolution” method used to generate the synthetic fingerprints can also be used to produce designs in other industries – notably game development. The technique has already been used to generate new levels in popular video games.
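At a high level, Latent Variable Evolution searches the latent space of a trained generative model with an evolutionary algorithm, scoring each candidate by how many enrolled templates it matches. The paper uses a GAN generator and a real fingerprint matcher; the sketch below substitutes a toy linear "generator", a cosine-similarity "matcher", and simple (1+1) hill-climbing so it runs stand-alone. All names and numbers here are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 8
# Toy stand-ins: in the paper, generate() is a trained GAN and
# score() is a commercial fingerprint matcher over enrolled prints.
W = rng.normal(size=(16, LATENT_DIM))   # fixed "generator" weights
templates = rng.normal(size=(50, 16))   # 50 enrolled "prints"

def generate(z: np.ndarray) -> np.ndarray:
    """Map a latent vector to a synthetic 'print' (toy linear generator)."""
    return np.tanh(W @ z)

def score(print_vec: np.ndarray, threshold: float = 0.5) -> int:
    """Count how many stored templates this synthetic print 'matches';
    cosine similarity above a threshold stands in for a matcher."""
    sims = templates @ print_vec / (
        np.linalg.norm(templates, axis=1) * np.linalg.norm(print_vec) + 1e-9)
    return int((sims > threshold).sum())

# (1+1) evolution over the latent vector: keep a mutation whenever it
# matches at least as many templates as the current best candidate.
z_best = rng.normal(size=LATENT_DIM)
best = score(generate(z_best))
initial_score = best
for _ in range(500):
    z_new = z_best + 0.3 * rng.normal(size=LATENT_DIM)
    s = score(generate(z_new))
    if s >= best:
        z_best, best = z_new, s

print("templates matched before evolution:", initial_score)
print("templates matched after evolution: ", best)
```

The key design point, preserved from the paper's approach, is that the search never edits pixels directly: it optimizes only the latent input, so every candidate remains a plausible output of the generative model. The paper itself uses CMA-ES rather than the simple hill-climber shown here.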
For more, see “DeepMasterPrints: Generating MasterPrints for Dictionary Attacks via Latent Variable Evolution” (PDF)