
Canada Dashboard Digest | Notes from the IAPP Canada Managing Director, Jan. 25, 2019


So, the CNIL (the privacy regulator in France) levied a massive fine against Google this week. Google is appealing, which is not too surprising considering the size of the fine (in excess of 50 million euros). It’s a bit of a risky move by Google because, as I understand it, the court that will hear the appeal could actually levy a steeper fine.

Of course, in Canada, if a regulator disagrees with an organization’s handling of personal information, they cannot levy fines — let alone massive ones like the one levied by the CNIL. In fact, the federal privacy commissioner only has the power to recommend that the organization do things differently.

Which is the better model for achieving compliance with data protection laws? I see merit in both, but I also recognize that many people in Canada who have studied this issue have recommended that we adopt a more European approach. Case in point: The ETHI Parliamentary Committee recommended to the government that the privacy commissioner be given the power to levy fines.

There is one area in PIPEDA where an organization can be penalized for getting it wrong: the data breach notification regime. So, it’s not too surprising that we’ve been busy at my firm helping organizations deal with privacy incidents. In one such case, after a thorough investigation, we concluded that there was no "real risk of significant harm" (aka RROSH) arising from the incident that occurred. But, seeing as we haven’t had any meaningful interpretation of that legal threshold, we wanted to ask the Office of the Privacy Commissioner whether it agreed with our assessment. Unfortunately, while the OPC was professional, courteous and tried to be helpful, it ultimately just reiterated its own formula for assessing RROSH: It will look at the "sensitivity" of the information and the "likelihood" of that information being misused.

Unfortunately, these are subjective terms, and it's not like there's any universal scale for evaluating how sensitive data is or how likely misuse is.

This left us wondering whether it was worth asking the OPC in the first place. It’s challenging for organizations that want to comply when, historically, the regulator has only provided general, high-level direction. We did not get a definitive answer as to whether they believed there was a RROSH, but at least we have now notified them of the incident, and we can’t help but wonder if that makes us more likely to be audited in the future.

While I understand the OPC is hoping for the power to levy fines like the one the CNIL levied this week, I fear that this might come without the true interpretive guidance needed to ensure that organizations trying to do the right thing can actually do so with any confidence.
