Biometrics, privacy, security: Finding the happy medium

Biometric technology has been applied in many beneficial ways aimed at keeping people, workplaces and countries safe, from U.S. Customs and Border Protection employing a facial comparison system designed to detect people attempting to enter the country using falsified documents, to officials in India implementing face biometric technology to identify bad actors even if they’re wearing masks, sunglasses or other face coverings.

However, for some citizens, biometrics are perceived as having privacy and security issues. With an increasing number of government agencies looking to implement biometrics, it’s more important than ever to reach and maintain a balance between biometric data collection and individual privacy and security. Biometric systems must be implemented in an ethical manner that builds public trust. How can we best do this?

Implement proper data storage, processing and security

Some worry that collecting and holding biometric data in one central location would make our society vulnerable to a mega data breach should a bad actor hack into that location. But multiple data security options exist to protect against this possibility, and they have nothing to do with storing all data in one location. It’s very important that these options be communicated to the public as part of ongoing trust-building measures.

For example, organizations deploying biometrics may choose to delete data routinely, sometimes within milliseconds of a facial image or fingerprints being captured. Additionally, organizations should ensure this data is never shared with industry partners or third parties, and that any sharing request receives proper review and approval. Biometric data can also be stored independently from other personally identifiable information, meaning that if a hacker were somehow to access biometric data, it would hold no value because it would be without context.
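The separation described above can be sketched simply. The following is a minimal illustration, not any vendor’s actual architecture: the function names and the random-token linking scheme are assumptions, chosen to show how a template store breached on its own carries no identity context.

```python
import secrets

# Two separate stores: PII in one place, biometric templates in another,
# linked only by a random token that carries no personal meaning.
pii_store = {}        # token -> name, email, etc.
template_store = {}   # token -> raw biometric template bytes

def enroll(name: str, email: str, template: bytes) -> str:
    """Hypothetical enrollment that keeps PII and biometrics apart."""
    token = secrets.token_hex(16)  # random link, not derived from the person
    pii_store[token] = {"name": name, "email": email}
    template_store[token] = template
    return token

token = enroll("Ada Lovelace", "ada@example.com", b"\x01\x02\x03")
```

If only `template_store` is stolen, the attacker holds templates and random tokens — nothing that says whose biometrics they are.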

There are other innovative processing techniques available in the market, as well. “Cancellable biometrics,” or the use of a distorted biometric image derived from the original, has become increasingly popular. To use a common example, if someone enrolls in biometric security measures using their fingerprint, that fingerprint is intentionally distorted, and that new print is used. This way, if a fingerprint is “stolen,” a new version of that fingerprint can be used by simply changing the distortion parameters.
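One way to picture cancellable biometrics is as a keyed, repeatable distortion of the original features. The sketch below is illustrative only — real schemes are more sophisticated, and the feature vector, function names and random-projection transform here are assumptions — but it shows the core property: revoking the key yields a completely different stored template without the person re-enrolling a new finger.

```python
import hashlib
import numpy as np

def distort_template(features: np.ndarray, user_key: str) -> np.ndarray:
    """Apply a repeatable, revocable distortion derived from a per-user key.

    If the distorted template is ever compromised, issuing a new key
    produces an entirely different stored template from the same finger.
    """
    # Seed a PRNG deterministically from the key (hypothetical choice)
    seed = int.from_bytes(hashlib.sha256(user_key.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    # Random projection acts as the distortion transform
    projection = rng.standard_normal((features.size, features.size))
    return projection @ features

original = np.array([0.12, 0.53, 0.91, 0.33])  # stand-in fingerprint features
stored_v1 = distort_template(original, "key-v1")
stored_v2 = distort_template(original, "key-v2")  # after revoking key-v1
```

The same key always reproduces the same template for matching, while a new key — the changed “distortion parameters” — invalidates the stolen version.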

Finally, groundbreaking technologies are now allowing organizations to break up biometric templates into anonymized bits and store this data in different locations throughout a network. This makes it virtually impossible for a hacker to access complete biometric templates, heightening security and peace of mind in the process.
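The splitting idea can be demonstrated with XOR-based secret sharing — an assumption on my part, since commercial products use varied schemes, and the function names below are illustrative. Each share on its own is indistinguishable from random bytes; only recombining every share recovers the template.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split_template(template: bytes, n_shares: int = 3) -> list[bytes]:
    """Split a template into shares that each look like random noise."""
    shares = [secrets.token_bytes(len(template)) for _ in range(n_shares - 1)]
    last = template
    for share in shares:
        last = xor_bytes(last, share)
    return shares + [last]  # store each share in a different network location

def recombine(shares: list[bytes]) -> bytes:
    """Recover the template only when every share is present."""
    result = shares[0]
    for share in shares[1:]:
        result = xor_bytes(result, share)
    return result

shares = split_template(b"example-biometric-template")
```

A hacker who compromises one storage location gains only random-looking bytes; reconstructing the complete template would require breaching every location at once.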

Clearly outline all data collection areas and methods

When implementing biometric security or data collection measures, it is important to be clear with users (and prospective users) on exactly how that data is going to be collected and how and where it will be used. Returning to the earlier example, U.S. Customs and Border Protection is currently using biometrics to scan travelers at over 20 seaports and 150 land ports and airports nationwide. At every one of those locations, complete privacy notices are posted prominently, along with simple-to-understand instructions on how traveling American citizens can opt out of the screening. Several states, including Illinois, Texas and Washington, have passed their own laws requiring companies operating there to obtain opt-in consent for the collection of biometric data.

The International Air Transport Association recently found that 73% of travelers are willing to share their biometric data to improve airport processes. While many travelers choose to opt into such systems due to the convenience the two-second verification option offers, the point is that they have the option to do so. It should be the same across the board: by identifying the type of data you’re collecting and why, and by giving users the ability to opt in or opt out, you’re empowering each individual to be fully aware of their options and thus feel in control of who is using their data and how.

Organizations requesting the collection of biometric information should also remain transparent about the way their data is being stored, processed and secured. Whether it’s a fingerprint deleted seconds after use, or a facial authentication app where the facial image never leaves the personal device, everyone has a right to know how their data is being safeguarded — and that level of accountability keeps organizations honest and compliant as well.

Eliminate biases in your data

Biometric technology isn’t inherently biased — but the design of biometric technology could introduce bias. Specifically, facial recognition research has shown that some biometric algorithms may not be as accurate in distinguishing and responding to the facial structures of Black, Asian and Indigenous people, with the least accurate recognition being for women of color.

While keeping concerns of bias top of mind continues to be important, facial recognition has come an extremely long way in recent years. An algorithm’s accuracy depends on the data it’s trained on as it “learns” how to perform biometric security functions, and today’s leading algorithms are being trained on more diverse datasets than ever before. It has been demonstrated that the top 150 algorithms are over 99% accurate across a variety of demographics. That said, even at these performance levels, humans should always be involved in any final decisions in high-stakes situations or investigations.

In short, it would be a huge mistake to do away with biometric technologies — they are too powerful a tool and the benefits are too great. The good news is, we don’t need to go to extremes in either direction. By finding a middle ground between the power of biometrics and the right privacy safeguards around implementation, biometrics can continue to play a vital role in ensuring public safety while enshrining privacy and security.

Bob Eckel is the president and CEO of Aware.


Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.
