On the heels of the Federal Trade Commission’s roundtable on the consumer privacy implications of facial detection and recognition technology, the International Biometrics & Identification Association (IBIA) has released recommended best practices for safeguarding privacy when implementing the emerging biometric technologies.


A Washington, DC-based trade association, the IBIA seeks to promote “effective and appropriate use” of technologies that determine identity and improve security for individuals, organizations and government agencies.


The crux of Face Detection & Face Recognition Consumer Applications: Recommendations for Responsible Use resides in the assertion that a faceprint, “the unique digital code derived from a photograph and which can be matched against a database of known faceprints to establish identity,” is consequently “a biometric and should be considered as personally identifiable information (PII).”


Classified as PII, a faceprint, the IBIA argues, should carry with it “all the security and privacy protections bestowed upon other PIIs.”


In its release, the IBIA creates three application categories, each with its own set of best practice recommendations. “Memory-less Face Detection Applications” do not collect, store or process faceprints but simply detect the presence of a human face. Because these applications do not generate or store an individual’s faceprint, the IBIA “does not see any implications to privacy” but adds that consumers should be notified of the presence of the application.


Applications that generate and store faceprints, categorized as “Face Detection Applications with Memory: Tracking,” require more privacy protection, including obtaining “active consent” from the consumer. Though technologies that fall under this category do not connect a faceprint with traditional identifying data such as an individual’s name or phone number, other types of data, like location and time, could give rise to a potential privacy invasion, the IBIA says.


Technology that uses “Full-Fledged Face Recognition” should not only require active consumer consent; when used for surveillance, faceprints should also not be added to a database “without a warrant or documentation of probable cause.”


The IBIA’s recommendations come just weeks after the Center for Democracy & Technology (CDT) released comments on the technology. Entitled Seeing Is ID’ing: Facial Recognition & Privacy, the report lays out the advocacy group’s belief that “A mix of government regulation, industry self-regulation and privacy-enhancing technologies can give consumers a greater measure of control over how facial recognition is used without unduly limiting the benefits of the technology or burdening free expression.”


The CDT report reasons that current “federal laws—and nearly all state laws—do not provide American consumers with basic privacy protections when it comes to biometric information for commercial purposes online or offline.” With little government regulation, an industry-wide code of conduct that addresses facial recognition products is paramount, the CDT argues.


Two industry trade groups—the Digital Signage Federation (DSF) and Point of Purchase Advertising International (POPAI)—have released voluntary codes of conduct for industry in recent years.


With the aid of CDT Privacy Counsel Harley Geiger, the DSF released its “Digital Signage Privacy Standards” in early 2011. The DSF standards use Fair Information Practices (FIPs) as a basis to promote public trust in facial recognition and detection technology.


Additionally, POPAI released recommendations “to marketers on maintaining ethical boundaries with consumer data and suggestions on how consumer observations and marketing insights should be collected and used.”


When representatives from industry, government and advocacy met at last month’s FTC Roundtable, it was clear that what constitutes appropriate use of facial recognition technology has yet to be settled. Several industry representatives currently using the technology, however, already employ some form of baked-in privacy in their products.


“The only effective way to address privacy,” asserts the CDT, “is with a nuanced mix of baseline consumer privacy legislation, industry self-regulation and privacy by design.”


Embedding privacy into digital signage and facial recognition applications is a component of both the DSF standards and the IBIA best practices. As an example, the DSF standards highlight the importance of “incorporating privacy into digital signage business models and data management practices” as the “best way to prevent privacy risks before they arise.”


The IBIA recommendations also embrace a built-in privacy approach. The IBIA says that “elevating the standards of privacy by design as a methodology to be adopted by the various industries concerned is worthwhile since there are many elements around the new applications of face detection and face recognition that will be difficult to control or regulate at a detailed level.”


The FTC is accepting public comment on the consumer privacy implications of facial detection and recognition technology until January 31.