
Privacy Perspectives | The GDPR Catch-22 in the Dutch COVID-19 contact-tracing app


Last week, the Dutch Parliament voted in favor of an amendment to the bill regulating use of the Dutch COVID-19 contact-tracing application, one that improves the privacy of the app's users while, at the same time, sidelining the EU General Data Protection Regulation.

The amendment, which prohibits anyone from linking the data processed on the app's backend server to the identity of the user, follows a recommendation I made in my second opinion on the app's data protection impact assessment (DPIA) for the Ministry of Health in mid-August.

Back in April, in the middle of the "smart lockdown" in the Netherlands, the Dutch Parliament held a round-table discussion with a number of experts on the development of a COVID-19 contact-tracing app. Just two weeks earlier, the Ministry of Health had held a weekend-long "appathon," live-streamed on YouTube. The "appathon" was a disaster: none of the seven submissions withstood the scrutiny of the various panels, including one on privacy and security, and neither did apps already in development or in use in other countries, such as Singapore, Austria and France. The Ministry took matters into its own hands and decided to build its own app. Parliament, in turn, wanted information from experts and stakeholders about the risks of a contact-tracing app.

In my contribution to the round table, I said that the most privacy-friendly app would be one that would paradoxically not be covered by the GDPR.

First of all, the government should not be a controller for the data processing in the app and between users' phones. After all, the last thing a democracy would want is the appearance of the government being present inside citizens' phones. This requires the government's role to be that of a mere software publisher, with no access to or control over the data on the user's phone, similar to other government apps, such as the one Dutch citizens use to prepare their income tax returns. So it needed to be crystal clear from the app's design that its use would take place entirely in the private domain and would therefore fall outside the GDPR per Article 2(2)(c).

Secondly, the data a user shares with the backend server should not be personal data. The keys that are uploaded are technically pseudonyms as defined in Article 4(5) of the GDPR, but they should never be linked to any other data that the Ministry or any other party, such as the regional health service, holds on the user. After all, as long as that linkability exists, there is a risk of the user being identified and of the data being used for other purposes, which would undermine trust in the app. Linkability of the data to the user therefore needed to be prevented.

The Dutch notification app builds on the Google/Apple Exposure Notification framework, which in turn is built on the DP3T architecture. In their reference DPIA, the DP3T team concluded that the architecture does not require the processing of personal data. However, because they cannot technically rule out the processing of personal data in the system, they recommend implementing the GDPR's data protection principles in apps that use the DP3T decentralized design, to be on the safe side.

The data processed on the app's backend server, which is fully under the Ministry's control, qualifies as "near anonymous data." The app's privacy and security design team, led by cybersecurity specialist Brenno de Winter, has separated the two identifiers, the IP address and the authorization code, from the uploaded keys (temporary exposure keys, or TEKs). The IP addresses are split off from the TEKs as soon as they reach the server and are stored separately for seven days for security reasons. The user's authorization code, which the user's regional health service validates in advance after a positive COVID-19 test, is stored in a separate portal, which the server checks to validate a user's TEK upload. In this setup, the regional health service has no access to the data in the app's backend server. The Ministry's DPIA stated that the risk of linkability was addressed by concluding agreements with the parties involved that prohibit identification of the user.
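The data flow described above can be sketched in a few lines of code. The following Python sketch is purely illustrative: the class and method names are hypothetical, and it says nothing about how the real server is implemented. It only models the separation principle, in which the IP address and the authorization code are kept apart from the uploaded TEKs.

```python
from datetime import datetime, timedelta

class AuthCodePortal:
    """Stand-in for the separate portal filled by the regional health service."""
    def __init__(self):
        self.valid_codes = set()

    def register(self, code):
        # Done by the health service after a positive COVID-19 test.
        self.valid_codes.add(code)

    def is_valid(self, code):
        return code in self.valid_codes

class BackendServer:
    """Illustrative model of the backend: identifiers are never stored with the TEKs."""
    def __init__(self, portal):
        self.portal = portal
        self.tek_store = []   # bare TEKs only, with no identifiers attached
        self.ip_log = []      # IP addresses kept apart, for security, for seven days

    def receive_upload(self, ip_address, auth_code, teks):
        # The IP address is split off immediately and logged separately.
        self.ip_log.append((ip_address, datetime.now()))
        # The authorization code is checked against the separate portal;
        # the server learns nothing else the health service knows about the user.
        if not self.portal.is_valid(auth_code):
            return False
        # Only the bare TEKs are retained for distribution to other phones.
        self.tek_store.extend(teks)
        return True

    def purge_old_ips(self):
        # IP addresses are deleted after seven days.
        cutoff = datetime.now() - timedelta(days=7)
        self.ip_log = [(ip, t) for ip, t in self.ip_log if t > cutoff]
```

Once the three stores are kept apart like this, no single record on the server ties a TEK to an IP address or to a health-service record, which is precisely the linkability the amendment now also prohibits by law.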

The problem with this design is that, in view of Recital 26 of the GDPR, the TEKs uploaded by an infected user would still qualify as personal data, because the Minister of Health, who under Dutch law qualifies as the controller of the data on the backend server, is in direct control of the infected users' IP addresses. Furthermore, the Public Health Act tasks the Minister of Health with leading the fight against COVID-19. This makes the regional health service, which processes all kinds of personal data about infected users, including their authorization codes, not fully independent from the Ministry.

Therefore, from a GDPR point of view, the Minister does, in the context of this data processing, have the means at his disposal to identify the user who uploaded the data to the backend server without much effort “in terms of time, cost or man-power” (see the Court of Justice of the European Union in the Breyer case, para. 46).

In my second opinion on the Ministry's DPIA, I recommended that the act regulating the contact-tracing app include a prohibition on relating the data on the backend server to the IP address or the authorization code, thus rendering those data truly anonymous (see Breyer, para. 46: “the identification of the data subject was prohibited by law”). According to Breyer, only the use of lawful means needs to be considered when assessing the “means reasonably likely to be used” to identify a data subject. This means that protecting non-personal data against the use of illegal means to identify a person is the domain of information security, not of the GDPR. So, by including the amendment in the bill, Parliament effectively turned the data on the backend server into non-personal data and, paradoxically, put those data outside the scope of the GDPR's protection.

This felt like a GDPR Catch-22.

For the GDPR to apply, the data would need to be personal data, which requires more than a theoretical possibility of lawful means being used to identify a user of the app. But to achieve maximum privacy protection for the user, the law would need to prohibit that identification, meaning the GDPR would not apply.

I strongly believe that including the amendment is the best possible outcome for the protection of the app’s users.

First of all, the user may now reasonably expect the prohibition to be enforced by the Ministry, which could help increase trust in the app. Any (now unlawful) identification of a user would give that user a right of action against both the Ministry and the person or organization identifying them. More importantly, a successful identification of a user would immediately amount to illegal processing of personal data, so the Dutch Data Protection Authority could fine the person or organization responsible. The alternative, relying on Article 6(4) to prohibit secondary use and on internal agreements not to link the data to a user, is not as strong a protection as a full-blown prohibition on linking the data.

Secondly, the Ministry's DPIA stated that it was well understood that identification of the user would be difficult, so if a user were to invoke his or her right to be forgotten, or any other GDPR right, within the data's limited lifespan (a maximum of 24 hours), the Ministry would simply rely on Article 11 of the GDPR to reject the request.

In other words, as long as the data are personal data, the app users have GDPR rights, but because of the privacy by design features implemented around the app’s backend server, they could not effectively invoke those rights. With the amendment, that problem goes away: if the data are not personal data, the users have no GDPR rights and cannot be disappointed when their rightful claim is rejected.

Parliament understood that, given the widespread awareness of the GDPR among the Dutch population, it would take tremendous political courage to have the GDPR apply neither to the data on the app's backend nor to the data processed in the app itself. Yet, after the Ministry had shown that courage for the data on users' phones, the Dutch Parliament followed suit and improved the privacy of the app's users by excluding the backend data from the GDPR.

Now it is up to the Minister to explain all this to users in the app’s privacy statement, in clear and plain language. After all, we should never forget that the GDPR’s fair information principles — such as data minimization, data quality, transparency and security — can well be achieved without the GDPR.

Photo by Adrien Olichon on Unsplash

