The pandemic has inspired new uses for facial recognition software, but may require tweaking of algorithms. Masks, which leave only the forehead, eyes and maybe the bridge of the nose exposed, can make it harder to identify people accurately – especially for law enforcement.
One key player in the facial recognition arena is the National Institute of Standards and Technology, which has been a consistent source of best practices, the latest technology developments and standards.
“We started maybe two or three different bits of work, one on quality assessment – can you look at an image and say it’s a good quality image? That is important to performance downstream. Another bit of work that we’ve been doing during the pandemic, obviously, is to look at – can face recognition work when somebody is wearing a protective face mask?” NIST computer scientist and biometrics expert Patrick Grother said on Federal Monthly Insights – Digital Transformation.
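As a rough illustration of what that kind of quality screening can look like, the sketch below applies two crude checks – sharpness and brightness – to a grayscale face crop. It is a toy built on assumed thresholds, not NIST’s quality-assessment method, which weighs many more factors.

```python
# Crude sketch of one ingredient of image quality assessment: very blurry or
# badly exposed face images tend to hurt recognition accuracy downstream, so a
# simple screen on sharpness and brightness can flag poor captures before they
# reach a matching algorithm. The thresholds here are illustrative guesses.
import numpy as np

def quality_check(image, min_sharpness=10.0, brightness_range=(40, 220)):
    """Return True if a grayscale image (0-255 array) passes the crude screens."""
    gy, gx = np.gradient(image.astype(float))
    sharpness = np.var(np.hypot(gx, gy))   # low gradient variance suggests blur
    brightness = float(image.mean())       # over- or under-exposure check
    return sharpness >= min_sharpness and brightness_range[0] <= brightness <= brightness_range[1]

rng = np.random.default_rng(2)
sharp_image = rng.integers(0, 256, size=(112, 112))
blurry_image = np.full((112, 112), 128)    # flat image: no detail at all
print(quality_check(sharp_image), quality_check(blurry_image))
```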
While some algorithms do not tolerate the reduced facial visibility at all and their error rates climb, he said others naturally prioritize the eye region of the face, leaving them usable in an operational context. The biometrics industry’s long-term goal is to make identification easier, which he said means more flexibility in how images are captured.
“So as you mentioned, this ‘frictionless travel’ idea, where we can throw away a boarding pass and replace it with just your face walking through an airport, from check-in to bag drop, to the TSA line, into maybe an aircraft lounge, and then boarding the plane – in principle, that can all be done with one-to-many face recognition,” Grother said on Federal Drive with Tom Temin. “So you just present your face, you’re expected at various touch points through the airport and you confirm your identity using just face recognition.”
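Conceptually, a one-to-many search compares a live camera capture against every enrolled traveler and returns the best match. The snippet below is a minimal sketch of that idea, not NIST’s or any airline’s actual system; the 512-number embeddings, the cosine-similarity score and the 0.6 threshold are illustrative assumptions standing in for what a trained face-recognition model would produce.

```python
# Minimal sketch of one-to-many face matching: a live "probe" embedding is
# compared against a gallery of enrolled travelers, and the best match above
# a threshold is returned. The embeddings here are random stand-ins for the
# vectors a real deep-learning face model would produce.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gallery: one embedding per enrolled traveler (e.g., from check-in).
gallery = {f"traveler_{i}": rng.normal(size=512) for i in range(1000)}

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe, gallery, threshold=0.6):
    """Return the best-matching identity, or None if no score clears the threshold."""
    best_id, best_score = None, -1.0
    for identity, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_id, best_score = identity, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)

# Simulate a camera capture of traveler_42: their enrolled embedding plus noise.
probe = gallery["traveler_42"] + rng.normal(scale=0.1, size=512)
print(identify(probe, gallery))
```

Raising the threshold makes false matches rarer at the cost of more missed identifications – the same trade-off that shows up in accuracy measurements later in this article.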
The technology may be limited for now but Grother warned not to bet against facial recognition. The accuracy gains in the last decade – not to mention higher resolution cameras on cellphones – have been substantial. Those gains are coming from artificial intelligence research, finding different ways to build neural networks and to do deep learning – a subset of machine learning wherein a model continually analyzes data with layered algorithms, similar to the neural networks of human brains.
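To make “layered algorithms” a bit more concrete, here is a toy sketch of the kind of stacked transformations a deep network applies to turn a face image into a compact embedding of the sort compared above. It is purely illustrative, with made-up layer sizes and untrained random weights, and is nothing like a production face-recognition model.

```python
# Toy illustration of a "layered" network: each layer applies a weighted
# transformation followed by a nonlinearity, and the stack turns a flattened
# face image into a compact embedding vector. Real face-recognition models are
# far deeper and are trained on millions of labeled images.
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(0.0, x)

# Layer sizes are illustrative: a 32x32 grayscale crop in, a 128-number embedding out.
layer_sizes = [32 * 32, 256, 128]
weights = [rng.normal(scale=0.05, size=(m, n)) for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def embed(image):
    """Run a flattened image through the stacked layers to get an embedding."""
    activation = image.reshape(-1)
    for w in weights:
        activation = relu(activation @ w)
    # Normalize so embeddings can be compared with cosine similarity.
    return activation / (np.linalg.norm(activation) + 1e-9)

fake_face = rng.random((32, 32))
print(embed(fake_face).shape)  # (128,)
```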
If AI is drawing on outdated, decade-old passport photos in databases to make a match with a present-day image, that does not amount to what Grother called a “future standard.” NIST is working on updated quality assessment standards. Reports should be accessible, explain the technology, document its capability and describe how well it works in certain settings.
“And then relative accuracy: Which algorithms are working better or which compression techniques are working better, or which cameras in principle, if we have that information? And so, if you don’t measure things then you don’t really know what the underlying situation is,” he said.
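Such measurements usually come down to error rates at a chosen decision threshold. The sketch below is a simplified illustration with made-up similarity scores, not NIST’s evaluation code: genuine pairs (same person) that score below the threshold count as false non-matches, and impostor pairs (different people) that score above it count as false matches.

```python
# Simplified sketch of how face-recognition accuracy is typically measured:
# given similarity scores for "genuine" pairs (same person) and "impostor"
# pairs (different people), count the errors on each side of a threshold.
# The score lists below are made-up numbers for illustration only.
genuine_scores = [0.91, 0.88, 0.74, 0.66, 0.95, 0.58]
impostor_scores = [0.12, 0.35, 0.41, 0.07, 0.63, 0.22]

def error_rates(genuine, impostor, threshold):
    fnmr = sum(s < threshold for s in genuine) / len(genuine)    # false non-match rate
    fmr = sum(s >= threshold for s in impostor) / len(impostor)  # false match rate
    return fnmr, fmr

for threshold in (0.5, 0.6, 0.7):
    fnmr, fmr = error_rates(genuine_scores, impostor_scores, threshold)
    print(f"threshold={threshold}: FNMR={fnmr:.2f}, FMR={fmr:.2f}")
```

Comparing those error rates across algorithms, compression techniques or cameras is what makes one option measurably “better” than another.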
Most biometrics algorithms are closed source and subject to intellectual property protections, which limits academic research on them. Those submitted to NIST’s ongoing benchmark, however, are publicly available; Grother said they serve as existence proofs, or proofs of concept, that a particular neural network can work.