This is the second installment of a two-part series on Privacy Shield's invalidation. In part one, Schwarz discussed concerns about national security agency access to records and its role in Privacy Shield's demise. Here, Schwarz explores options for transparency as to redress, as well as ways companies can use transparency to bolster confidence and encourage continued data sharing through mechanisms such as standard contractual clauses (SCCs).
As noted in part one of this series, in July, the Court of Justice of the European Union, in its "Schrems II" decision, invalidated the EU-U.S. Privacy Shield due to concerns about U.S. governmental access to data under intelligence authorities. Also of concern was the lack of adequate redress for EU citizens.
It is, of course, axiomatic that companies cannot create a right of action that U.S. law does not provide for. Nonetheless, companies can commit to strenuously advocating to minimize the disclosure of EU citizens’ personal information upon receipt of government access requests, essentially serving as a proxy for redress.
Notably, there have been a few high-profile challenges to U.S. government access requests, the most public being the 2016 Microsoft and 2017 Google challenges to U.S. government search warrants seeking customer records stored outside the U.S., resolved with the passage of the Clarifying Lawful Overseas Use of Data Act. But these cases under-represent the overall extent of individual corporate efforts in this area, which, again, is where transparency comes in.
U.S. data importers need to highlight their efforts to narrow and challenge production requests and to hold the government accountable for minimizing the disclosure of personal information, whether those requests stem from national security letters, Foreign Intelligence Surveillance Act orders or traditional legal process, such as search warrants, court orders or subpoenas.
To begin with, a company should post its policy on handling data access requests from governmental entities and private parties, although the CJEU’s redress concern was focused primarily on governmental access (private legal actions generally allow affected parties to intercede). Such a policy should include:
- Procedures for notifying the data exporter (when permitted by law) that a party has demanded access to data.
- The circumstances under which it will challenge or contest access requests.
- Whether and when it will allow a data exporter to intercede in a challenge (where legally permitted).
- A commitment to actively narrow access requests to the greatest extent possible.
To demonstrate that a company speaks truth to power and dispel the notion that U.S. companies are merely a funnel for EU data to U.S. national security agencies, posting aggregated metrics of efforts to tailor and narrow disclosures to governmental authorities, whether through direct negotiation or court intervention, is key. Such metrics could include:
- The frequency or proportion of times the company has contested data access requests.
- The number or percentage of cases in which the company’s efforts resulted in a reduction in the number of records disclosed, customers affected and fields disclosed within records.
| January – June 2019 |
|---|
| Total # of U.S. government access requests (national security & criminal) |
| Number of access requests challenged |
| Percentage where all requested data was disclosed |
| Percentage where # of records disclosed was less than initially requested |
| Percentage where # of fields disclosed was less than initially requested |
| Percentage where # of customers/accounts disclosed was less than initially requested |
In short, the use of transparency, coupled with a record of advocating for privacy, contesting legal process and holding U.S. authorities accountable — ensuring that only what is “strictly necessary” is disclosed — should assist a data exporter in concluding that SCCs provide “legally adequate protection” for personal data transferred from the EU to the U.S.
Of course, this conclusion is as yet untested, and there is no guarantee that an EU regulatory authority will concur. On the other hand, if the EU invalidates all data sharing mechanisms with the U.S. and doesn't work to find viable alternatives for redress, it will neuter a relationship estimated to be "the largest in the world, with goods and services trade of $1.3 trillion in 2019." Likewise, the EU risks becoming a virtual data island unto itself, to the extent similar concerns about law enforcement and intelligence access, and/or redress, exist with other international partners.
Companies should also bear in mind the European Data Protection Board’s guidance about conducting “an assessment of the law of the third country, in order to check if it ensures an adequate level of protection.” Don’t underestimate the value of sector-specific and state privacy laws, given the absence of a federal privacy law.
For example, if the data in question is covered by the U.S. Health Insurance Portability and Accountability Act, the HIPAA Privacy Rule provides stringent safeguards for protecting personal health information, sets limits and conditions on use and disclosure, and provides for the right to access, examine, obtain a copy of and request corrections to records. Alternatively, if, for example, the data is student records, the U.S. Family Educational Rights and Privacy Act — a law that arguably needs some updating — provides for the right to access, the right to seek to amend records, limits on disclosure, etcetera.
As the EDPB observed, “supplementary measures” can also include “technical or organizational measures,” which, again, is where transparency comes into play.
Implementing spot-check and audit mechanisms evidencing fulfillment of EU General Data Protection Regulation–type rights, such as the rights to access, amend, be forgotten, etcetera, and then making those results public confirms that the company handles such requests in line with how they would be handled in the EU. In particular, companies could spot check and audit the following (grouped in six-month periods, akin to national security transparency disclosures):
- Number/percentage of access requests, broken into the number/percentage where identity was validated and access granted, and where identity could not be validated and access was denied.
- Number/percentage of amendment requests, broken into the number/percentage where amendment was performed and the amendment was denied.
- Number/percentage of requests for restricting processing, broken into the number/percentage where restriction was granted and restriction was denied.
- Number/percentage of requests for data portability, broken into the number/percentage provided to the data subject in a machine-readable format and transmitted to another data controller.
- Number/percentage of erasure requests (i.e., right to be forgotten), broken into the number/percentage where deletion was performed and where deletion was denied.
| January – June 2019 | Total | Outcomes tracked |
|---|---|---|
| Requests for access to records received | 85 | Identity validated/access granted; identity not validated/access denied |
| Requests for amendment of records received | 43 | Amendment request granted; amendment request denied |
| Requests for restricting processing of a record | 12 | Restriction request granted; restriction request denied |
| Requests for data in a machine-readable/transportable form | 12 | Provided to data subject; transmitted to another data controller |
| January – June 2019 |
|---|
| Total # of requests for erasure of data received = 25 |

Basis for requests for erasure of data (see GDPR Article 17):

| Basis for erasure request |
|---|
| PII no longer necessary for the purposes for which it was collected/processed |
| Data subject withdraws consent (and no other legal ground for processing) |
| Data subject objects to the processing of personal data (and no overriding legitimate grounds for processing) |
| Delisting requested where personal data was collected in relation to the offer of information society services to a child |
| Personal data erased for compliance with a legal obligation |
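For companies that keep structured logs of data subject requests, metrics like those in the sample tables above can be generated automatically rather than tallied by hand. The following Python sketch is purely illustrative — the request types and outcome labels are hypothetical, not drawn from any actual reporting schema — and shows one way to aggregate a request log into per-type totals and outcome percentages:

```python
from collections import Counter

# Hypothetical data subject request log; the request types and outcome
# labels loosely mirror the categories in the sample tables above.
requests = [
    {"type": "access", "outcome": "identity_validated_access_granted"},
    {"type": "access", "outcome": "identity_not_validated_access_denied"},
    {"type": "amendment", "outcome": "granted"},
    {"type": "amendment", "outcome": "denied"},
    {"type": "erasure", "outcome": "deleted"},
]

def transparency_metrics(log):
    """Aggregate a request log into per-type totals and outcome percentages."""
    totals = Counter(r["type"] for r in log)
    outcomes = Counter((r["type"], r["outcome"]) for r in log)
    report = {}
    for (rtype, outcome), n in outcomes.items():
        entry = report.setdefault(rtype, {"total": totals[rtype], "outcomes": {}})
        entry["outcomes"][outcome] = round(100 * n / totals[rtype], 1)
    return report

metrics = transparency_metrics(requests)
```

Publishing only these aggregates, rather than the underlying log, keeps the disclosure itself privacy-preserving.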
Companies may also wish to consider providing transparency as to things that cannot easily be boiled down into metrics. As Albert Einstein is quoted as saying (though the attribution is disputed), "Not everything that counts can be counted." Items that fall into this category include:
- Corporate policies on compliance oversight (redacted, where necessary), data retention (potentially broken out into retention for personally identifiable information and metadata), and the use of anonymization and pseudonymization (as defined by the GDPR).
- A description of the sector-specific and/or state privacy laws by which the organization is bound.
- The cybersecurity framework the organization adopted (e.g., the NIST Cybersecurity Framework, ISO/IEC 27001, etcetera), providing additional insight into how the company handles cybersecurity, breaches, etcetera.
- If the company uses encryption, the type of encryption, whether the encryption is “end-to-end,” and whether it’s used for data in transit, at rest, or both. Given the CJEU’s concern with the interception of EU citizen data by national security authorities, there’s clearly value in explaining how the data importer protects data while in transit.
Finally, given the EDPB’s urging that data exporters conduct a fact-specific review of each data transfer relationship to ensure adherence to the SCCs, companies could design, implement and publicly post a sampling of audits and spot checks validating adherence to some of the agreed-upon safeguards incorporated into the SCC Appendix.
This is very much akin to the strategy we implemented at the National Counterterrorism Center in response to negative press, exacerbated by the Snowden leaks, when we publicly posted the results of our spot-check and audit program validating our handling of data in compliance with the varied legal and operational requirements, as dictated by the sources of such data. We also posted our compliance processes, enhanced safeguards for sensitive datasets, and even compliance incident reviews and findings, eventually earning the trust and respect of Congress, the Privacy and Civil Liberties Oversight Board, the EU (from whom some NCTC datasets originated) and even the advocacy community.
If this can be done by the Intelligence Community, with limited budgets and maximum secrecy, any company can successfully do the same.
At the end of the day, the transparency recommended in this article may be a departure from how companies have historically provided information to the public.
But the Snowden leaks gave rise to an air of distrust, which has only been exacerbated by the perceived vacuum created in the absence of a federal privacy law, the lack of insight into state and sector-specific privacy laws, and a superficial understanding of how U.S. companies protect privacy on their own.
As such, companies need to actively rewrite that narrative, latching on to the age-old axiom that “seeing is believing.” It is only through concerted actions and purposeful transparency that companies can take the initiative and develop positive momentum.
As a potential secondary effect, transparency may attract the attention of EU exporters seeking reliable U.S. partners after the "Schrems II" decision invalidated the framework underpinning their previous data-sharing relationships.