
Privacy by design — GDPR’s sleeping giant


Today’s approach to privacy by design, or data protection by design as it’s referred to in the EU General Data Protection Regulation, is fundamentally flawed. Privacy by design is the concept that privacy must be “baked” into every stage and aspect of a new product’s design, as well as a company’s organizational structures.

Privacy by design as a concept has been around for a long time. In 2009, Ann Cavoukian, then Information and Privacy Commissioner for Ontario, Canada, along with her colleagues, came up with "The 7 Foundational Principles" of privacy by design. According to Cavoukian and her colleagues, privacy must be (1) proactive and not reactive; (2) embedded into a product’s design; (3) user-centric; (4) visible and transparent; (5) positive-sum, not zero-sum; (6) full lifecycle protection; and (7) the default setting. Though for many years privacy by design was considered best practice, it was never a formal legal requirement. Until the GDPR. Article 25 of the GDPR introduces a legal requirement to implement privacy by design and “bake” privacy into product development and organizational processes.

So why is our approach to this topic flawed? If you ask anyone who advises on GDPR compliance what is required to comply with Article 25’s requirements, they will probably say the following: When you are designing a new product or launching a new service, carry out a privacy impact assessment, mitigate any risks it identifies, and then decide whether to proceed with the new product as is or modify it.

I believe this approach is wrong and does not comply with Article 25’s requirements for three main reasons. The first reason is purely legal/textual. Article 35 of the GDPR already deals with PIAs, when they are necessary, the process for carrying them out and so on. There is no need for an entirely separate article, Article 25, simply to introduce another instance in which a PIA is needed. The second reason is more practical and takes into account how businesses operate. By the time a PIA is carried out, a product is already conceptualized and planned and, in some instances, a beta is ready to be tested. By that stage, it might already be too late to change the functionality. A PIA is merely a reactive tool, and privacy by design calls for proactivity. The third reason is also practical and has to do with what a PIA is designed to uncover. A PIA is inherently narrow and tailored specifically to the new product being developed. It will not generally highlight systemic organizational issues rooted in a company’s structure, such as whether a company’s policies are being adhered to. I will elaborate more on this in a minute.

So why is privacy by design so important, and why are we talking about it now? The answer is that this requirement is slowly creeping into the spotlight and gaining importance. European authorities are placing greater emphasis on it, as well. On Nov. 13, 2019, the European Data Protection Board published its guidelines on data protection by design and by default, the official “handbook” detailing Article 25’s requirements. The EDPB accepted public comments through January 2020.

Perhaps of equal significance is the attention this requirement has received on the national-enforcement stage. National data protection authorities are beginning to enforce this requirement and fine companies for noncompliance. On June 26, 2019, the Romanian National Supervisory Authority for Personal Data Processing fined Unicredit Bank 130,000 euros for unlawfully publicly disclosing the information of 337,042 individuals who were transferring funds via the bank. On Oct. 7, 2019, the Hellenic Data Protection Authority fined OTE, a telecommunications service provider, 400,000 euros for unlawfully retaining details of subscribers and not properly deleting them in accordance with legal requirements. And, on Oct. 30, 2019, the Berlin Commissioner for Data Protection and Freedom of Information fined a property company, Deutsche Wohnen, a staggering 14,500,000 euros (the fifth-largest GDPR fine to date) for unlawfully retaining customer data. To be clear, the property company had a retention policy in place. However, it failed to implement a proper process to ensure the automation and enforcement of that policy. In all three cases, the authorities cited breaches of Article 25 as one of the reasons for the fines.

What is amazing about the German Deutsche Wohnen fine is that enforcement of a retention policy is probably not something that would come up in a standard PIA. A standard question in a PIA would be “is the data collected retained in line with the company’s retention policy or not?” If the answer is yes, the examiner of the PIA will move on. This will not prompt a check of the retention policy itself and the company’s adherence to it. It is also interesting to note that none of the fines imposed thus far for a breach of the privacy-by-design article have been for new processes or products. They go to the core processing activities and practices of the company — its systems and its controls. Clearly, PIAs are not the answer.
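To make this concrete, below is a minimal sketch of the kind of automated retention enforcement that was found lacking: a recurring job that deletes personal data once its retention period has elapsed. The table name (personal_data), the column names (category, collected_at), the record categories and the retention periods are all illustrative assumptions of mine, not details from the case or from any prescribed implementation.

```python
from datetime import datetime, timedelta, timezone
import sqlite3

# Illustrative retention schedule; real periods would come from the company's
# retention policy and the applicable legal requirements.
RETENTION = {
    "prospect_data": timedelta(days=180),
    "tenant_financials": timedelta(days=365 * 10),
}

def purge_expired(conn, now=None):
    """Delete records whose retention period has elapsed; return how many were removed."""
    now = now or datetime.now(timezone.utc)
    deleted = 0
    for category, period in RETENTION.items():
        # Assumes collected_at is stored as an ISO-8601 UTC timestamp string.
        cutoff = (now - period).isoformat()
        cur = conn.execute(
            "DELETE FROM personal_data WHERE category = ? AND collected_at < ?",
            (category, cutoff),
        )
        deleted += cur.rowcount
    conn.commit()
    return deleted

if __name__ == "__main__":
    conn = sqlite3.connect("records.db")  # hypothetical store of personal data
    print(f"Purged {purge_expired(conn)} expired records")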

So what is the answer? This is a very tough question. Looking at the EDPB’s guidelines, it is clear that a more systematic and deeply rooted approach is needed to comply with this complex requirement. It is perhaps the most fundamental requirement in the entire GDPR because it encompasses everything — every process, every level, from the minutiae of product design to companywide policies and procedures. It forces companies to consider not just new products and services, but also the essence of their processes and take stock of whether they are privacy-friendly and privacy-centric or not. In my opinion, the fines we have seen in this space are only the beginning, and there will be a much greater wave of fines once the EDPB guidelines are adopted in final form. This requirement might very well be the GDPR’s sleeping giant, and it is only a matter of time before it rears its ugly head.

So, what can companies do to comply with this requirement? Looking at some of the suggestions made by the EDPB in its guidelines, it is clear that the Article 25 requirement is intended to ensure that privacy is essentially “baked” into corporate processes. Ensuring that privacy notices are clear, transparent and accessible; automating data flows so information is not unnecessarily copied; and placing appropriate access controls on data are just some of the measures that used to be best practice and are now legal requirements. Privacy is not something that can be added on as an afterthought; rather, it must become part of a company’s DNA. If a company doesn’t take these steps, it will very quickly find itself running afoul of Article 25.
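As a purely illustrative example of what avoiding unnecessary copying of information can look like in practice, the sketch below strips a customer record down to the fields registered for a given processing purpose before it is handed to a downstream system. The purposes, field names, ALLOWED_FIELDS mapping and example record are my own assumptions for illustration; a real mapping would come from a company’s records of processing activities.

```python
# Hypothetical allow-list of fields per processing purpose.
ALLOWED_FIELDS = {
    "billing": {"customer_id", "name", "iban"},
    "analytics": {"customer_id", "signup_month", "plan"},
}

def minimize(record: dict, purpose: str) -> dict:
    """Return only the fields permitted for the stated purpose (data minimization)."""
    try:
        allowed = ALLOWED_FIELDS[purpose]
    except KeyError:
        raise ValueError(f"No processing purpose registered for {purpose!r}")
    return {k: v for k, v in record.items() if k in allowed}

# Example: the copy sent to analytics never receives the name or bank details.
customer = {
    "customer_id": 17,
    "name": "A. Jensen",
    "iban": "DE89 0000 0000 0000 0000 00",
    "signup_month": "2020-01",
    "plan": "basic",
}
print(minimize(customer, "analytics"))
```

The design point is that the allow-list is enforced in code at the point of transfer, so a purpose with no registered fields simply cannot receive data, rather than relying on each team to remember the policy.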

In November 2019, Apple CEO Tim Cook spoke at a Salesforce conference. He very eloquently summarized the essence of privacy by design by saying, “You don't bolt-on privacy, you think about it in the development process of products... You have to design it in.”




1 Comment


  • Rachel Thompson • Jul 6, 2020
    Very true. So I think the question then becomes not "how" do we do it, but "who" should own/build this process?