
Privacy Tech | Can the U.S. legal system adapt to biometric technology?

Fingerprint readers, eye scans, and voice recognition are no longer just the security methods of high-tech spy movies. Millions of mobile phone, bank, and investment customers now have these technologies at their fingertips. Schwab uses voice recognition, Apple uses fingerprints, Wells Fargo scans eyes, and other companies are developing heartbeat or grip technology to verify user identity.

Whether biometric technology will thrive or meet its demise depends not only on the security of the technology, but also on whether the U.S. legal system will adapt to provide the privacy protections necessary for consumers to use it and for companies to invest in its development. Currently, no federal law protects biometric information, and only one state has enacted such a law.

Biometric tech nuts & bolts

Biometric identity verification is primarily a security technology, but it is also viewed as a privacy-enhancing technology (PET). PETs are often part security, part privacy technology. Investment in new privacy technologies and tools is growing at a rapid pace: Forty percent of companies surveyed by the IAPP and TRUSTe report increased investment in privacy technologies. The nuance between pure security technologies and PETs is whether legal and regulatory regimes and privacy best practices are considered in the technological design. For example, in the U.S., the faster a video uploads, the better. However, in areas where governments suppress information, a video upload could trigger a government investigation. A PET would allow slower upload speeds that make the upload look like normal internet traffic so it doesn’t raise any red flags.

Biometric technology includes similar safeguards. Companies using biometrics do not store your actual fingerprints or other identifiers. Instead, they store authentication codes or templates created from those identifiers, coded as long, hard-to-predict numerical sequences. Additional controls, such as requiring the customer to blink, help ensure the eye or voice belongs to a living person and is not a recording. This technology is especially helpful for authenticating access to corporate accounts, which often involves carrying around a token that generates a new passcode every few seconds. It removes the possibility that an executive will lose the token or that it will otherwise be compromised.
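The template-matching idea above can be illustrated with a toy sketch. Everything here is an illustrative assumption, not any vendor's actual system: real products derive templates with proprietary algorithms and keep them in encrypted, secure hardware, and the vectors, threshold, and function names below are invented for the example. The key point the sketch captures is that two scans of the same finger are never bit-for-bit identical, so matching compares similarity against a threshold rather than testing exact equality.

```python
import math

# Hypothetical enrollment store: user ID -> stored template.
# Real systems encrypt templates and hold them in secure hardware;
# this toy version just keeps feature vectors in memory.
templates = {}

def enroll(user_id, features):
    """Store the numeric template derived from a biometric scan."""
    templates[user_id] = features

def verify(user_id, features, threshold=0.1):
    """Match a fresh scan against the stored template.

    Two readings of the same biometric are never identical,
    so we accept any scan within a small distance of the
    enrolled template instead of requiring an exact match.
    """
    stored = templates.get(user_id)
    if stored is None:
        return False
    return math.dist(stored, features) <= threshold

enroll("alice", [0.12, 0.87, 0.44])
print(verify("alice", [0.13, 0.86, 0.45]))  # near-identical scan -> True
print(verify("alice", [0.90, 0.10, 0.70]))  # very different scan -> False
```

Because only the template is stored, a breach exposes the derived sequence rather than an image of the fingerprint itself, though as the next section notes, even a template cannot be "changed" the way a password can.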

That’s not to say biometric security doesn’t have its privacy issues. Biometric information can provide more security than a password; however, once biometric information is compromised, an individual has no way of changing their fingerprint or voice the way they could change a password. Individuals, in addition to companies, will need to take more precautions to protect their biometric information, such as wiping the fingerprints they leave behind or wearing facial-recognition-obfuscating sunglasses. Nico Sell, founder of Wickr, may seem a little odd when she wears sunglasses while speaking publicly, but her efforts to reduce her digital footprint may become the new norm. Cameras on the street could be collecting facial-recognition information, hidden devices could be recording voices, and then there’s the concern over how easily a lifted fingerprint can be used to access a device.

The U.S. Legal Conundrum

There is no federal law protecting biometric information. Instead, plaintiffs are turning to the Illinois Biometric Information Privacy Act (BIPA). The law, passed in 2008, is still in its early phases, but the outcome of cases under BIPA will likely play an integral role in shaping the law in this area and serve as an example that other states may follow or improve upon.

With the protection of BIPA, biometric security can be privacy enhancing, but without it, it serves as an example of how security and privacy differ. Specifically, courts in California and Virginia have ruled that law enforcement can compel individuals to unlock a phone with their fingerprint but cannot compel them to reveal a password to do the same.

Often, new technology is analogized to existing things to determine how the law should be applied. In this case, the courts analogized biometric identifiers to a “key” to a lock, whereas passcodes are more like a combination to a lock. Generally, physical evidence like a key is non-testimonial, in comparison to personal knowledge, like a combination, which is testimonial. Testimonial evidence is protected from law enforcement compulsion by the Fifth Amendment right against self-incrimination. Thus, under those cases, while biometrics may provide more security for information, they provide less legal privacy protection than a password.

Passwords are increasingly considered insecure and outdated, and technologists should not be discouraged by the law from developing new, more secure verification methods. Future device access points could use two-factor authentication, combining fingerprint security with a quick second verification method such as a short code or a pre-determined image selected by the user. As technology evolves and biometric security becomes more common, mobile phone companies, banks, and other service providers may take these privacy issues and court rulings into consideration.
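The two-factor idea above can be sketched in a few lines. This is a minimal illustration, not any vendor's design: the function names, the 4-digit code, and the pass/fail inputs are all invented for the example. The point it shows is that access requires both factors, so compelling the biometric alone (the "key") is not enough without the memorized code (the "combination"), which courts have treated as testimonial.

```python
import hmac

def check_code(entered_code, expected_code):
    # Constant-time string comparison, to avoid leaking
    # information about the code through timing differences.
    return hmac.compare_digest(entered_code, expected_code)

def unlock_device(fingerprint_ok, entered_code, expected_code):
    """Grant access only when BOTH factors pass: the biometric
    match (something you are) and the user-chosen short code
    (something you know)."""
    return fingerprint_ok and check_code(entered_code, expected_code)

print(unlock_device(True, "4821", "4821"))   # both factors pass -> True
print(unlock_device(True, "0000", "4821"))   # wrong code -> False
print(unlock_device(False, "4821", "4821"))  # fingerprint mismatch -> False
```

Pairing the factors this way means a compromised fingerprint, or a compelled fingerprint press, cannot unlock the device by itself.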

Legal analysis often involves a balancing test; here we are once again balancing technological advancement and security with privacy rights.

Government and law enforcement agencies already collect fingerprints for myriad reasons, including as part of booking processes and for employment background checks. Individuals do not expect those agencies to use that information for the purpose of accessing locked devices that the Supreme Court ruled contain “the privacies of life.”  

Perhaps, then, it’s the Fourth Amendment, and not the Fifth Amendment, that protects biometric information. Logically, the court opinions from California and Virginia hold up, but they fail the gut check privacy professionals use to determine whether a practice is intrusive: If it feels wrong or creepy, it is probably intrusive. And here, this perversion of the law, which degrades privacy and smothers necessary innovation, feels wrong.

photo credit: 030420_1884_0077_x__s via photopin (license)