After five years in privacy, I finally attended my first CPDP this year (it stands for Computers, Privacy & Data Protection, apparently, but you wouldn't know that from being there, and for some reason "computers" seems particularly quaint), here in Brussels. It is a unique experience, even for someone who has been to hundreds of conferences, both in and out of privacy and data protection. 

If you want to geek out on the details of privacy and data protection, this is your spot. 

This year, at least, it was devoid of pomp and circumstance. There were no "keynotes," yet many of the leading voices in privacy were on stage at one point or another, sometimes getting just a few minutes to speak (six or seven people on a panel is the norm). There were no official networking events where you could acquire drinks and snacks, at least none that I could find, but there were numerous "side events" with networking potential, at least one of which provided drinks and snacks (and dancing!). No, not much window dressing at all. Rather, there was a ton of content, some of it overlapping, and almost all of it presented at breakneck speed. 

In lieu of full stories, which may come later, here, at least, are three thoughts (the internet likes lists, right?) on what I happened to take in:

1. The DPIA requirements in Articles 35 and 36 of the GDPR could really mess with DPAs. One of the interesting things about the GDPR is that it places obligations on the data protection authorities themselves. Previously, DPAs were like regulators in most industries: They simply needed to police their area. Now they have any number of specific jobs to do, some with deadlines. 

Take the area of data protection impact assessments: First, the member state DPA has to draw up a list of the types of processing that trigger a DPIA and make it public. Good luck with that. Do they have to stay one step ahead of industry, which is constantly coming up with new types of data use? Then, they may also publish a list of processing activities that DO NOT trigger the DPIA requirement. I'm sure civil society will have plenty to say about that list. 

Further, both of those lists are subject to the consistency mechanism and have to be in line with the lists published by other member state DPAs, "where such lists involve processing activities which are related to the offering of goods or services to data subjects or to the monitoring of their behaviour in several Member States, or may substantially affect the free movement of personal data within the Union."

Are there any processing activities that only happen in certain member states? I'm not sure why the GDPR didn't just have the European Data Protection Board publish these lists, but I'm guessing it's a home rule kind of thing. 

Regardless, DPAs now find themselves in a bit of a quandary. If they publish a list of DPIA triggers that leans to the side of privacy, they'll set a lower threshold for "high risk," which means more businesses will have to consult with DPAs when their DPIAs indicate a "high risk" for data subjects. DPAs will then have to, "within period of up to eight weeks of receipt of the request for consultation, provide written advice to the controller and, where applicable to the processor, and may use any of its powers referred to in Article 58. That period may be extended by six weeks, taking into account the complexity of the intended processing."

So, a business is going to say, "hey, we want to do this thing, but it looks like it's going to infringe on people's rights and freedoms. Help a sister out?" And the DPAs are going to, what, figure out how to do that processing for the business? Or maybe the written advice is just going to be, "yeah, don't do that."

Paul Breitbarth, now at Nymity and formerly with the Dutch DPA, expressed incredulity at the idea. He doesn't think anyone will actually file for this kind of prior consultation. And he also wonders about the position it puts DPAs in. Remember, DPAs are not the law. What happens if a DPA offers advice on how to make something compliant, then civil society files suit, and the CJEU decides that the processing was, in fact, against the law?

Wouldn't the business then sue the hell out of the DPA? It just seems totally unworkable in practice.

2. Is there anything left to say on government access to citizen data? I counted something like 16 panels that traded on questions about how much surveillance is appropriate, whether in the context of biometrics at border crossings or Privacy Shield discussions. On virtually none of them was there a representative of a body with any actual power to change current practice. 

Sure, there were often privacy representatives from police forces or intelligence agencies, but they don't make the rules. They just make sure the organization follows the rules in place. And it's the rules everyone is concerned with. 

Perhaps I'm too cynical, but I just have a hard time believing that a panel discussion at CPDP (or anywhere) is going to move the needle on how police and intelligence forces act as they gather data in the process of trying to find the bad guys. Nor does it really educate citizens, since just about everyone in the audience already knows what the issues are. So, what's the point?

I witnessed one exchange where someone was passionately arguing that it shouldn't matter whether you're a citizen of the United States when intelligence services are deciding whether to gather your data. If you think the U.S. is going to treat all people, citizens and non-citizens alike, in the same manner, I don't think you've been paying attention. 

3. Encryption will not save you. I've seen an awful lot of security companies looking at the data breach provisions in the GDPR and telling potential customers: "Hey, just encrypt everything and you'll never have to notify anyone of a data breach." 

As panelist after panelist made clear, this is bad messaging. Sure, you should encrypt data at rest and in transit. The GDPR does seem to set that as a baseline, and this is something we've seen in the U.S., where the Office for Civil Rights has set encryption as a baseline in HIPAA enforcement actions. But encryption isn't some kind of panacea. 
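For the curious, the baseline itself is the easy part. Here's a minimal sketch of encryption at rest, assuming Python and the third-party cryptography package's Fernet recipe (the record contents and file name are mine, purely for illustration); encryption in transit is usually just TLS, which your HTTP client or server handles for you.

```python
# Minimal sketch of encryption at rest, using the Python "cryptography"
# package's Fernet recipe (AES-128-CBC plus an HMAC integrity check).
# Install with: pip install cryptography
from cryptography.fernet import Fernet

# Generate the key once and store it AWAY from the data --
# a key sitting next to the ciphertext protects nothing.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"data subject: jane.doe@example.com"

# At rest: only the ciphertext ever touches the disk.
with open("record.enc", "wb") as f:
    f.write(fernet.encrypt(record))

# Reading it back requires the key.
with open("record.enc", "rb") as f:
    assert fernet.decrypt(f.read()) == record
```

Note the comment about where the key lives; it's about to matter.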

First, you still have human error to consider. I'm not a computer security expert, but panelists had all manner of scenarios where people do stupid things and encryption doesn't work. Things like letting a laptop go to sleep and then waking it up, instead of powering it down (yes, that's a thing: a sleeping machine keeps its decryption key in memory, so whoever wakes it walks right past the full-disk encryption), or losing the key along with the data, so the bad guys can decrypt the ciphertext just as easily as you can. 

Then there was the idea that an encryption method might be ironclad today but easily crackable in five years. Imagine this scenario: You've lost data, but it's encrypted. No need to notify. Then, five years later, that method is cracked. Now you've got to notify. Are you going to remember to do that? Are you going to keep a record of everything that's been lost and the method you used to encrypt it, and then periodically check to see if that's still all good? I'm guessing not. 
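To be fair, the bookkeeping itself is trivial; remembering to do it is the hard part. Here's a hypothetical sketch in Python (every name and field is mine, not anything from the panels or the GDPR) of a register that records what was lost and how it was encrypted, so the "is that still all good?" question can at least be asked on a schedule.

```python
# Hypothetical breach register: track lost-but-encrypted data along with
# the cipher protecting it, so old incidents can be re-assessed when an
# algorithm falls out of favor. All names here are illustrative.
from dataclasses import dataclass
from datetime import date

# Ciphers your security team currently considers adequate.
# This set is the part that changes over the years.
STILL_TRUSTED = {"AES-256-GCM", "ChaCha20-Poly1305"}

@dataclass
class LostDataRecord:
    description: str      # what was lost (laptop, backup tape, ...)
    lost_on: date
    cipher: str           # how it was encrypted at the time
    data_categories: str  # whose data, and what kind

def records_needing_review(register):
    """Past losses whose encryption is no longer trusted -- i.e.,
    incidents that may have just become notifiable breaches."""
    return [r for r in register if r.cipher not in STILL_TRUSTED]

register = [
    LostDataRecord("stolen laptop", date(2017, 2, 1),
                   "3DES", "customer contact details"),
    LostDataRecord("lost backup tape", date(2016, 11, 12),
                   "AES-256-GCM", "HR records"),
]

# Run this from a recurring calendar entry, not from memory.
for r in records_needing_review(register):
    print(f"Re-assess: {r.description} ({r.cipher}), lost {r.lost_on}")
```

The point isn't the code; it's that someone has to own the recurring review, or the record is just a list of future surprises.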

Sure, encrypt. Obviously. But don't let your organization get complacent in thinking that's a get-out-of-jail-free card. It's still going to be vital that data minimization, data destruction, and other best practices are carefully managed to limit both the possibility of a breach and the risks to the rights and freedoms of data subjects.