The research team presented their work in a paper titled "DeepMasterPrints: Generating MasterPrints for Dictionary Attacks via Latent Variable Evolution."
How Are Master Fingerprints Generated?
The fingerprints, dubbed "DeepMasterPrints" by the researchers, are generated artificially using a machine learning algorithm. They can be used to fool systems protected by fingerprint authentication without requiring any information about the target user's fingerprints. The artificially generated prints were able to match more than one in five real fingerprints in a database whose error rate should only be one in a thousand.

DeepMasterPrints take advantage of two flaws in fingerprint-based authentication systems. The first is that many fingerprint scanners do not read the entire finger at once; they match against partial prints. The second is that some fingerprint features are more common than others, which means that scanners reading only partial prints are more likely to be fooled by prints built from common characteristics.

The team trained a neural network to create artificial fingerprints and then used evolutionary optimization to search for their best DeepMasterPrints. The network was a generative adversarial network (GAN), a common machine learning technique, and the search aimed to produce synthetic prints that matched as many partial prints of other fingerprints as possible (a simplified sketch of this search loop appears at the end of this section). The team points out that an attack using their AI-driven method could be launched against random devices "with some probability of success." The researchers used a public NIST database containing 54,000 fingerprints and 8,640 finger scans as training input for their neural networks.

However, such attacks may not be able to break into your phone. "A similar setup to ours could be used for nefarious purposes, but it would likely not have the success rate we reported unless they optimized it for a smartphone system," lead researcher Philip Bontrager of the New York University engineering school told Gizmodo. "This would take a lot of work to try and reverse engineer a system like that." But if a hacker were able to run such an attack against many fingerprint-protected accounts or devices, the overall success rate of unlocking them would be much higher.

According to Bontrager, "the underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis." He and his team want their research to motivate companies to step up fingerprint-security efforts. "Without verifying that a biometric comes from a real person, a lot of these adversarial attacks become possible," Bontrager said. "The real hope of work like this is to push toward liveness detection in biometric sensors."
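To make the "latent variable evolution" idea concrete, here is a minimal Python sketch of the search loop described above: evolve latent vectors of a generator so that the synthetic print they produce falsely matches as many enrolled partial prints as possible. Everything here is an illustrative stand-in, not the researchers' method or code: `generate_print` is a fixed random linear map rather than a trained GAN generator, `match_score` is cosine similarity rather than a real fingerprint matcher, the gallery and threshold are invented, and a plain (mu + lambda) evolution strategy is substituted for the CMA-ES optimizer named in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative stand-ins (assumptions, not the paper's components) ---
LATENT_DIM = 32      # dimensionality of the generator's latent space
PRINT_DIM = 64       # a synthetic "fingerprint" is just a feature vector here

# Toy "generator": a fixed random linear map from latent vector to print features.
# In the actual work this would be a GAN generator trained on fingerprint images.
G = rng.standard_normal((PRINT_DIM, LATENT_DIM))

def generate_print(z):
    return np.tanh(G @ z)

# Toy "enrolled gallery": partial prints of many users, each a noisy feature vector.
gallery = [np.tanh(G @ rng.standard_normal(LATENT_DIM))
           + 0.3 * rng.standard_normal(PRINT_DIM)
           for _ in range(500)]

MATCH_THRESHOLD = 0.7  # illustrative verifier acceptance threshold

def match_score(candidate, enrolled):
    # Cosine similarity as a stand-in for a fingerprint matcher's score.
    return float(candidate @ enrolled /
                 (np.linalg.norm(candidate) * np.linalg.norm(enrolled) + 1e-9))

def fitness(z):
    # Objective of latent variable evolution: how many enrolled partial prints
    # does the synthetic print generated from latent vector z falsely match?
    p = generate_print(z)
    scores = np.array([match_score(p, e) for e in gallery])
    n_matches = int((scores >= MATCH_THRESHOLD).sum())
    # The small mean-score term only gives this toy search a signal before the
    # first match occurs; the real objective is simply the number of matches.
    return n_matches + 0.001 * scores.mean()

# --- Simple (mu + lambda) evolution strategy over the latent space ---
# The paper uses CMA-ES; this plain ES just illustrates the shape of the loop.
def evolve_masterprint(generations=50, pop_size=40, parents=8, sigma=0.5):
    population = [rng.standard_normal(LATENT_DIM) for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(population, key=fitness, reverse=True)[:parents]
        # Offspring: mutated copies of the best latent vectors.
        population = elite + [e + sigma * rng.standard_normal(LATENT_DIM)
                              for e in elite
                              for _ in range(pop_size // parents - 1)]
    best = max(population, key=fitness)
    best_print = generate_print(best)
    n_matched = sum(match_score(best_print, e) >= MATCH_THRESHOLD for e in gallery)
    return best, n_matched

if __name__ == "__main__":
    z_best, n_matched = evolve_masterprint()
    print(f"best latent vector matches {n_matched} of {len(gallery)} enrolled prints")
```

In the actual attack, the generator is a GAN trained on real fingerprint images and the scorer is a fingerprint verifier operating at a fixed false-match rate; only the outer evolutionary search loop has the same shape as this sketch.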