The EU General Data Protection Regulation has unquestionably shaped the global data privacy discourse since it went into effect roughly two years ago — not just because of its extraterritorial jurisdiction, but because it has paved the way for similar regulations everywhere from Brazil to India to Nigeria. And the United States may not be far behind. While an omnibus privacy law has yet to materialize at the federal level in the U.S., the California Consumer Privacy Act has successfully imported some of the hallmark concepts of the GDPR, and several states appear poised to follow suit.
In this evolving regulatory landscape, U.S. companies should be particularly cautious when engaging in practices that historically would not have posed legal risk from a privacy perspective but may do so under the new regime. With roughly 275 GDPR enforcement actions now on the books, such risk is no longer theoretical. Numerous companies have incurred regulatory fines for activity that may not strike many in the U.S. as problematic. To illuminate these areas of unexpected risk, U.S. companies should take stock of the GDPR enforcement landscape (even if they themselves are not subject to the GDPR) and consider how practices that resulted in liability in the EU might have similar results under current or proposed U.S. law.
Over-retention of personal information
Many U.S. companies have mature processes in place for setting retention periods and ensuring data is retained for the required length of time but neglect to consistently delete or otherwise dispose of data on the back end. For such companies, a retention schedule is essentially a floor with no ceiling, and the deletion of any meaningful quantity of data may be a surprisingly rare occurrence. Certainly, sufficient data retention is crucial for companies to protect their interests, comply with the law and manage their business operations. But the conventional wisdom that unnecessary retention is less risky than premature deletion may not hold with respect to personal information, and companies may be exposing themselves to more liability than they realize.
A GDPR enforcement action out of Germany is instructive on this point. In October of last year, the Berlin Commissioner for Data Protection and Freedom of Information fined a property company 14.5 million euros (1% of its annual turnover) for storing the personal data of tenants for longer than necessary for the purpose for which it was collected. Regulators concluded that maintaining such a “data cemetery” violated the GDPR's Article 5 (presumably, the “data minimization” and “storage limitation” principles, in particular) and Article 25.1 (which requires the implementation of “data protection by design”). Notably, the fine did not result from a cybersecurity incident; to the contrary, in penalizing the company for failing to properly limit its retention of personal data, regulators emphasized that the GDPR gives them an avenue for imposing sanctions before such an incident occurs.
While the concept of an affirmative obligation to limit retention has not yet taken hold in the U.S. outside of certain sector- and context-specific laws, no company is impervious to the legal ramifications of a cybersecurity incident. Personal information indefinitely retained in forgotten repositories or legacy systems is particularly vulnerable, as such data environments are often less actively monitored and controlled. By adhering to strict procedures for deleting personal information as soon as its retention is no longer necessary, companies can reduce their exposure in the event such “data cemeteries” are breached.
If the risk of a cybersecurity incident is not enough to make U.S. companies reassess their retention practices, they may soon find themselves subject to the same “data minimization” and “storage limitation” obligations as under the GDPR. This fall, Californians will vote on the California Privacy Rights Act, a ballot initiative that seeks to strengthen the California Consumer Privacy Act, including by expressly prohibiting companies from retaining personal information for longer than reasonably necessary for the purpose for which it was collected.
Use of biometric technology in the workplace
Many U.S. companies are eager to capture the efficiencies of biometric technology in the workplace, from automating employee time tracking to strengthening facility access controls. In theory, age-old challenges such as time theft and “buddy punching” could be all but eliminated by fingerprint scanning or facial recognition. Similarly, the use of retina scans at entry points to secured areas could significantly reduce the possibility of unauthorized access. But such operational gains could quickly be offset by legal liability if companies rush to adopt readily available biometric solutions without carefully considering their obligations under the law.
A recent GDPR enforcement action out of the Netherlands underscores the risk. In April, the Dutch data protection authority, Autoriteit Persoonsgegevens, issued its highest GDPR fine to date (725,000 euros) against an unnamed company in connection with its use of fingerprint scanning technology for employee time-tracking purposes. Regulators found that the company had not obtained valid consent to use its employees’ biometric data and that such use was not necessary for authentication or security purposes. Under those circumstances, the practice violated the GDPR's Article 9, which prohibits the processing of biometric data unless one of its enumerated exceptions applies.
U.S. companies increasingly face similar legal risk when managing their workforce with biometric technology. Under Illinois' Biometric Information Privacy Act, for example, companies must notify employees and obtain an executed written release as a condition of employment before collecting their biometric data. Similar, if less publicized, laws are on the books in Texas and Washington as well (though their application to employee biometric data collected for non-commercial purposes is less clear, and they lack a private right of action). Even companies that are not subject to the biometric laws of these three states will find that certain state laws, including the CCPA, regulate the use of employee biometric data to one degree or another. U.S. companies should therefore proceed with caution in implementing biometric solutions, even if the legal risks are not immediately apparent.
Unexpected collection of personal information on mobile devices
As mobile devices have become ubiquitous, companies are finding increasingly creative uses for their consumer-facing mobile applications. Some have gone so far as to develop features that serve a specific internal purpose for the company but bear little connection to the consumer’s own use of the application. Given the expansive legal definition of “personal information” that is quickly taking hold, companies that enable such features may be unknowingly collecting and processing personal information in a manner inconsistent with applicable U.S. law.
A GDPR enforcement action out of Spain illustrates the potential pitfalls of using mobile applications to collect personal information in ways users may not reasonably expect. In June 2019, the Spanish data protection authority, Agencia Española de Protección de Datos, fined a professional soccer league 250,000 euros in connection with the league’s use of its popular mobile application to combat piracy. In addition to the application’s core function of allowing users to follow matches live, the league was activating the microphones of users’ mobile devices to detect whether bars and restaurants were playing matches without a license and collecting the devices’ geolocation to pinpoint the offending establishments. While users received notice of the practice upon downloading the application, regulators concluded that such notice was insufficient to satisfy the “transparency” principle of the GDPR's Article 5.1 and that the league should have notified users each time the feature was activated. Regulators also found that the process by which users could withdraw their consent to the practice was unclear, in violation of the GDPR's Article 7.3.
The league has appealed the fine, and it remains to be seen whether the Spanish regulators’ interpretation of the GDPR will withstand scrutiny. Regardless of the outcome, though, the case highlights the potential importance of real-time notice when collecting personal information in unexpected ways. Channeling the Spanish regulators’ reasoning, the latest draft of the CCPA regulations goes beyond the CCPA itself in specifically requiring a “just-in-time” notice when collecting personal information from a mobile device “for a purpose that the consumer would not reasonably expect.” As U.S. companies continue to explore innovative uses of mobile applications, they should carefully consider whether they are collecting data points that constitute personal information under relevant laws, and if so, whether they are providing users legally sufficient and timely notice.
As data privacy law in the U.S. continues to move in a European direction, U.S. companies will increasingly find they must challenge their preconceived notions about the proper collection, use and management of personal information. In doing so, they should not overlook the utility of GDPR enforcement actions in shedding light on areas of unforeseen risk.