The security concerns that arise with biometrics and what to know

Buying groceries without pulling out your wallet isn’t that far-fetched. With the rise of biometrics, the convenience of using your body to authenticate your identity is poised to replace old habits, like paying with a credit card, with new ones, like scanning your face.

Research on biometric technology has ramped up in recent years, and mobile apps are building software that reads various unique-to-you body parts to help verify your identity. Despite the convenience, however, biometrics raises all kinds of security and privacy concerns.

“Biometrics are tricky,” Woodrow Hartzog, an Associate Professor of Law at Samford University told WIRED. “They can be great because they are really secure. It’s hard to fake someone’s ear, eye, gait, or other things that make an individual uniquely identifiable. But if a biometric is compromised, you’re done. You can’t get another ear.”

Databases get hacked frequently, from small businesses to the IRS to major corporations, so worrying about putting data about your body parts online is understandable and common today.

Since Apple introduced biometrics with the home-button fingerprint sensor on the iPhone in 2013, the appetite for biometrics has expanded rapidly. Now MasterCard wants to use your heartbeat to verify purchases, Google wants to use speech patterns to confirm it’s really you, and other apps are using vascular patterns in the eyes to authenticate identities. The use of biometrics isn’t new, but the security questions are.

Law enforcement agencies know how public your body parts are. If you have a drink at a bar, your fingerprints are left behind on the glass. If your ear is exposed, someone can take a high-resolution photo of it from afar. There are countless privacy concerns around the idea of using biometrics as a form of identity authentication. Over time, society may be willing to exchange privacy for convenience, but it’ll take some ironing out in the biometrics world before we take that plunge.