The recent launch of the iPhone X’s Face ID functionality sparked a flurry of conversation among consumers and mainstream commentators concerned with how Apple will protect the privacy of such sensitive biometric data. Even aside from the iPhone X, however, the collection of individuals’ biometric data is increasing, and the manner in which it is processed continues to grow more sophisticated.
Given these trends, it is unsurprising that regulators have taken notice. The General Data Protection Regulation is a prime example, representing a more active approach to the privacy of biometric data. Unlike its predecessor, the Data Protection Directive, the GDPR specifically singles out biometric data as a "sensitive" category of personal information warranting robust protection. The GDPR defines biometric data broadly, in many cases requires privacy impact assessments for its processing, and empowers Member States to pursue divergent protections for biometric data. As such, data controllers who process or may process biometric data should take note.
Defining biometric data under the GDPR
As mentioned above, in a shift from the Data Protection Directive, the GDPR specifically recognizes biometric data as a sensitive category of personal data. Specifically, the GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioral characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic [fingerprint] data.”
In defining biometric data under such broad terms, the GDPR appears to implicitly acknowledge that biometric technology is relatively nascent and will continue to evolve. As such, the definition seems well-positioned to encompass types of biometric data that may arise through the development of future technology.
The definition recognizes two categories of information that could be considered biometric data. The first is information pertaining to bodily characteristics — i.e., a person’s physical or physiological traits. This category is fairly straightforward and is consistent with what most people would think of as biometric data, such as facial information, fingerprints, iris scans, etc.
The second category of biometric data, behavioral information, is broader. Logically, any behavioral characteristic that could permit the unique identification of a person would be considered biometric data. However, it is unclear how narrowly regulatory authorities will interpret this category or what limiting principles, if any, will guide their analyses. Plausibly, information pertaining to someone’s habits, actions or personality could be considered behavioral information within the scope of the definition. This is a potentially expansive category, as it has no necessary nexus to the sort of bodily information typically thought of as biometric data. Given this uncertainty, privacy professionals should closely monitor guidance delineating the types of behavioral information deemed biometric data and should proactively identify any behavioral data their organizations may already be processing.
Biometric data as "sensitive" data
Because the GDPR considers biometric data to be a special category of sensitive personal data, processing and protecting it must proceed under the framework reserved for sensitive personal data generally. While the GDPR broadly prohibits the processing of sensitive personal data, it recognizes certain bases that justify such processing, chiefly the explicit consent of the data subject, the performance of specific contracts, or processing for certain specific purposes.
However, merely having a legal basis to process biometric data is not in itself sufficient, as the GDPR introduces a new requirement that data controllers conduct a privacy impact assessment when processing is likely to result in a high risk to the rights and freedoms of data subjects. This is especially true when the processing involves the use of new technologies. Privacy impact assessments are mandatory in the case of automated processing, large-scale processing, or systematic monitoring of a publicly accessible area on a large scale. Additionally, the GDPR requires data controllers to consult with supervisory authorities prior to processing when the privacy impact assessment indicates that the processing is likely to result in a high risk to individuals and the data controller has not taken measures to mitigate that risk. Practically speaking, this consultation requirement can likely be avoided by identifying the relevant risks and implementing measures tailored to mitigate them.
The GDPR’s practical impact: Privacy impact assessments and Member State divergence
One critical impact of the GDPR’s treatment of biometric data as sensitive personal data is that data controllers will need to conduct privacy impact assessments for many forms of biometric data processing. A key reason is that many forms of biometric data processing will necessarily involve the use of new technology, a factor the GDPR stipulates weighs in favor of conducting a privacy impact assessment.
Moreover, many forms of biometric data processing will trigger the GDPR’s mandatory privacy impact assessment requirement. This is because it is foreseeable that biometric data processing will increasingly be conducted on a large scale, employ automated processing, and, in some applications, systematically monitor publicly accessible areas (for example, using facial-recognition technology to monitor individuals in retail settings). Where privacy impact assessments are necessary to process biometric data, data controllers will need to identify the risks the processing presents to data subjects and implement measures tailored to mitigate those risks. Such risk mitigation efforts are important because they will permit data controllers to avoid prior consultation with supervisory authorities.
Another critical impact of the GDPR’s treatment of biometric data is that data controllers should expect to encounter divergent approaches from Member States regarding biometric data processing. This is because the GDPR expressly permits Member States to impose additional conditions and limitations on the processing of biometric data. As it is likely that at least some Member States will choose to exercise the power the GDPR grants them in this area, data controllers should be vigilant in monitoring Member States’ divergent treatment of biometric data.
The evolving nature of biometric technology, the inherent uncertainties associated with the GDPR’s treatment of biometric data, and the expected divergence of Member States’ approaches to biometric data all warrant the attention and caution of data controllers. Accordingly, data controllers who are processing biometric data, or contemplating doing so, should stay duly apprised of further developments in this fast-moving area.