Recent guidance from Germany’s Baden-Württemberg Commissioner for Data Protection and Freedom of Information (LfDI) instructs EU entities that they cannot lawfully transfer data to cloud, software-as-a-service or other technology providers not organized under the laws of the EU, the European Economic Area or a country offering equivalent protection, placing providers from the U.S. and U.K. off-limits. This applies unless the “exporting” EU data controller and the technology data “importer” implement safeguards that overcome the inability of standard contractual clauses to protect data subjects’ fundamental rights against surveillance.
The LfDI guidance stipulates that “in cases where the controller cannot provide suitable protection even with additional measures, he must suspend/terminate the transfer; [and] if such an adequate level of protection is not ensured, the data protection supervisory authority must suspend or prohibit the transfer of data if the protection cannot be achieved by other means.”
The implications of these requirements extend across the EU. Finland’s Office of the Data Protection Ombudsman recently sent letters to seven technology companies (three U.S. technology companies — Amazon Web Services, Microsoft and IBM; and four Finnish/Scandinavian technology companies — Telia, Elisa, Tietoevry and Valtori) requesting information about their response to the landmark “Schrems II” decision. These letters also reminded the companies that if they fail to stop transfers where data is not adequately protected, the data protection authority has both an obligation and clear authority to stop the processing.
The LfDI guidance goes on to state that “although a transfer based on standard contractual clauses is conceivable, it will only rarely meet the requirements that the ECJ has set for an effective level of protection;” and “in this case, the controller must provide additional guarantees that effectively prevent access by the U.S. secret services and thus protect the rights of the data subjects.” Under the guidance, this would be conceivable in the following cases: “encryption in which only the data exporter has the key, and which cannot be compromised even by U.S. secret services; [and] anonymization or pseudonymization, where only the data exporter can determine the assignment.”
We use the term “protected usable data” to encompass the requirement of the LfDI guidance that identification of data subjects is not possible (anonymization) or not possible without access to additional information (encryption or pseudonymization) retained by the EU-based data exporter. The guidance concludes with the warning that absent guarantees embodied in protected usable data, the Baden-Württemberg commissioner will prohibit the transfer of data based on standard contractual clauses.
The three forms of protected usable data identified in the LfDI guidance are encryption (with keys impervious to surveillance), and anonymization and pseudonymization (with assignment under the exclusive control of the data exporter).
The EU Parliament highlighted this week that, because there is no regulatory grace period, the EDPB will not be suspending enforcement; both binding corporate rules and standard contractual clauses require “additional measures to compensate for lacunae [gaps] in protection of third-country legal systems,” and failing that, “operators must suspend the transfer of personal data outside the EU.” The EU Parliament stressed that absent additional measures capable of defeating “cryptanalytic and quantum computing efforts of intelligence agencies,” the advice from the Berlin, Hamburg and Dutch DPAs is to halt transfers to the U.S., with the Berlin DPA going further and advising the retrieval of all data from all services owned or operated by U.S. entities.
While not directly binding for EU DPAs, it is helpful to review the published positions of Max Schrems’ privacy advocacy organization, NOYB, since it was the successful protagonist in the legal actions that led to the invalidation of Safe Harbor in 2015 and Privacy Shield in July 2020. NOYB most recently filed complaints against 101 companies in August 2020, all for failure to protect the rights of data subjects. NOYB takes the following positions on these matters:
- Encryption: NOYB challenges the notion that encryption can be an effective technical safeguard given the purported ability of the U.S. government to break encryption. More importantly, though, with respect to meeting the challenges of “Schrems II,” encryption alone is not a viable solution because when encrypted, data is protected at rest and in transit but has no utility. And when decrypted to provide utility in use, data is no longer protected.
- Anonymization: NOYB highlights the U.S. government’s use of “selectors” to surveil EU personal data. These selectors may be “strong” (like email addresses, IP addresses or phone numbers) or “soft” (like indirect identifiers, keywords or attributes that by themselves do not identify a particular person, but when combined with other data can lead to reidentification). The use of “selectors” makes the use of anonymized data for “Schrems II” compliance difficult, if not impossible.
The reason anonymization has difficulty supporting the requirements for protected usable data is the prerequisite of determining the “perimeter of use” within which the data will be considered “anonymous.”
This prerequisite exists because adequately protecting data beyond a “perimeter of use,” so that no information from outside the perimeter can be used to reidentify individuals, necessarily requires such aggressive protection of indirect identifiers (i.e., soft selectors as defined above) via masking, data generalization, suppression and perturbation that the data will have minimal residual value.
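To make that trade-off concrete, the following Python sketch (hypothetical field names and values, not drawn from the guidance) applies the suppression and generalization techniques described above to a single record. Note how little analytic granularity survives, and that the transformation is irreversible by design:

```python
# Illustrative record; field names and values are hypothetical.
record = {
    "name": "Maria Keller",       # direct identifier
    "zip": "70173",               # indirect identifier ("soft" selector)
    "birth_date": "1984-03-17",   # indirect identifier ("soft" selector)
    "diagnosis": "J45",           # analytic value
}

def anonymize(rec: dict) -> dict:
    """Aggressively protect indirect identifiers; no mapping is retained."""
    out = dict(rec)
    out["name"] = None                          # suppression of direct identifier
    out["zip"] = rec["zip"][:2] + "***"         # generalization to a coarse region
    out["birth_date"] = rec["birth_date"][:4]   # generalization to birth year only
    return out
```

Because no assignment table or key is kept, the original values cannot be restored; rescuing utility by retaining a reversal mechanism would, as discussed next, defeat the claim of anonymization.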
And, in a classic Catch-22, if attempts are made to rescue data value while protecting it beyond the perimeter by enabling later reversal to permit reidentification, even under controlled conditions by the data controller, then the purported anonymization will fail to satisfy the definitional requirements of EU General Data Protection Regulation Recital 26, which limits anonymous data to “information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable,” taking into account “all the means reasonably likely to be used, such as singling out, either by the controller or by another person.”
The highlighting of pseudonymization in the guidance underscores the importance of the widely unnoticed redefining of the term by the GDPR. It no longer refers to the pre-GDPR privacy-enhancing technique of replacing direct identifiers with tokens to “anonymize” data within a perimeter. To comply with the new GDPR pseudonymization requirements in Article 4(5), both direct identifiers and indirect identifiers (i.e., "soft” selectors) must necessarily be protected using technical and organizational safeguards so that it is not possible to attribute data “to a specific data subject without the use of additional information, provided that such additional information is kept separately and is subject to technical and organizational measures to ensure that the personal data are not attributed to an identified or identifiable natural person.”
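A minimal Python sketch of Article 4(5)-style pseudonymization, under the assumptions that keyed tokenization is the chosen technique and that the key constitutes the “additional information” (the key, field names and data below are all hypothetical), might look as follows. Both direct and indirect identifiers are replaced with keyed tokens, while the key is retained separately by the EU-based data exporter:

```python
import hashlib
import hmac

# Hypothetical secret: the "additional information" of Article 4(5),
# kept separately and under safeguards by the EU-based data exporter.
EXPORTER_KEY = b"retained-only-by-the-eu-data-exporter"

def tokenize(value: str, field: str) -> str:
    """Replace an identifier with a keyed token.

    The field name is mixed in so equal values in different fields do not
    produce correlatable tokens.
    """
    msg = f"{field}:{value}".encode()
    return hmac.new(EXPORTER_KEY, msg, hashlib.sha256).hexdigest()[:16]

def pseudonymize(record: dict, identifier_fields: set) -> dict:
    """Tokenize every direct and indirect identifier; keep other values."""
    return {k: tokenize(v, k) if k in identifier_fields else v
            for k, v in record.items()}

record = {
    "email": "a.schmidt@example.eu",  # direct identifier
    "ip": "192.0.2.7",                # indirect identifier ("soft" selector)
    "purchase_total": 42.50,          # non-identifying analytic value
}
protected = pseudonymize(record, {"email", "ip"})
```

Without `EXPORTER_KEY`, the tokens cannot be attributed to a data subject; with it, the exporter can recompute tokens for authorized reattribution, which is the property the next paragraph relies on.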
The LfDI guidance takes a clear public position on the capability of pseudonymization to enable post-"Schrems II" compliance. Given the expanded GDPR requirement that pseudonymization applies to the entirety of datasets, it seems more than reasonable to conclude that GDPR-compliant pseudonymization can help to deliver protected usable data — namely, data where the identification of data subjects is not possible without access to additional information retained by the EU-based data exporter.
Moreover, because attribution to data subjects, subject to authorization and controls, is expressly provided for as part of pseudonymization, the utility of protected usable data is fully preserved for the exporting data controller without loss of protection for data subjects.
Protected usable data enabled by GDPR-compliant pseudonymization protects the fundamental rights of data subjects while fully preserving data utility: secondary processing results for artificial intelligence and machine learning are comparable in accuracy, fidelity and value to processing the source data, but without requiring the use of identifying information. It also enables privacy-respectful processing for commercial artificial intelligence, machine learning, analytics and other secondary applications that do not raise national security concerns.
A number of EU regulators and industry professionals spoke about the importance of pseudonymization, as newly redefined under the GDPR, in a recent LinkedIn article. In fact, it is a core element of the vendor-neutral Data Embassy proposal submitted to the EDPB for "Schrems II" compliant “additional measures,” the findings of which the EDPB Secretariat confirms “have been circulated to the relevant expert subgroup.”
For those interested in learning more about how to implement technical and organizational measures necessary to support GDPR-compliant pseudonymization, the EU Agency for Cybersecurity has provided recommendations on shaping technology according to GDPR provisions (November 2018) and guidance on pseudonymization techniques and best practices (November 2019).