
It will be harder to use cloud computing to process personal data compliantly under the proposed General Data Protection Regulation. The GDPR would enshrine the prescriptive, cloud-impracticable elements of the Article 29 Working Party’s cloud guidance WP196, while omitting the elements that recognize how cloud works. One GDPR objective is to modernize data protection laws but, rather than making laws technology-neutral, the GDPR would perpetuate the 1970s models of computing/outsourcing embedded in the current Data Protection Directive.

The GDPR would significantly complicate all modern outsourcing arrangements, not just cloud; cloud is just the prime example. Building on my previous article, this article focuses on controller-processor contract requirements and the GDPR’s Article 26 on processors.

The main problem

Infrastructure cloud providers are treated as processors; equipment manufacturers, vendors and lessors are not. Individuals and organizations use infrastructure services for computing, storage and networking purposes instead of buying their own equipment. Under the GDPR, if controller-customers process any personal data in-cloud, the providers of these services (i.e., IaaS/PaaS and pure storage SaaS) are considered “processors”. This approach may deter cloud use despite its cost and flexibility benefits.

For example, organizations buying or renting equipment for self-processing of personal data needn’t specify in the contract the processing’s subject-matter, duration, nature, purpose or types of personal data, etc. But if organizations use infrastructure cloud for the same processing, Article 26(2) of the GDPR would force them to include this information in their contracts or face fines up to €10m or 2 percent of total annual turnover if higher.

How can that improve data protection?

Consent and “instruction”

GDPR Articles 26(1a) and 26(2)(d) require controllers’ prior consent to sub-processors (strictly, “another processor”) who are “enlisted” by processors. Cloud providers offer services already built on sub-providers’ pre-existing services (for example, Dropbox’s SaaS storage on Amazon’s IaaS). In such situations, surely customers must either consent, or not use that provider’s service. Providers can’t re-engineer their services to work with another sub-provider’s service just because one prospective customer objects to a sub-provider. Similarly, if a sub-provider (e.g., a connectivity provider) is changed for strategic business reasons, an objecting customer will, in reality, have to stop using the cloud service. (Cloud providers may have little say anyway in decisions made by their sub-providers, particularly large IaaS/PaaS sub-providers.)

Cloud providers don’t actively process personal data as customers “instruct.” Customers use cloud services directly themselves, for example self-service uploading, editing and deleting of personal data processed using the cloud service. Discussing processing of personal data under controllers’ “instructions” (GDPR Article 26(2)(a)) makes no sense in cloud. The Data Protection Directive’s “instructions” requirement was intended to target processors’ unauthorized use or disclosure of personal data for their own purposes, e.g., the Salems case (Sweden).

This objective could equally be met by banning processors’ use or disclosure of personal data without controller authorization, instead of referring to “instructions.” But the GDPR would perpetuate the “instructions” requirement; indeed, requiring “documented instructions” may force more logging, with customers ultimately bearing the costs.

Security measures

Controllers and processors must take security measures appropriate to the processing and its risks (GDPR Articles 26(2)(c), (f), (h), 30). A controller knows the nature/purposes of its intended processing and can secure its data by encrypting data pre-upload or implementing backups internally or to another cloud, for example. But public cloud providers can’t tailor security measures for each of their hundreds, even thousands, of customers. Their services are standardized; the numbers involved make individual customization impracticable. Yet Article 30 would make them investigate every customer’s intended processing/risks and customize security accordingly, or risk €10m or 2 percent fines.
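The controller-side measure mentioned above, encrypting data before upload, can be sketched concretely. The following is a minimal illustration, not a production design: the names are invented for this example, and the SHA-256 counter-mode keystream is a toy construction used only to keep the sketch dependency-free. A real deployment would use a vetted authenticated cipher (e.g., AES-GCM from an established cryptography library). The point is that the controller keeps the key and uploads only ciphertext, so the infrastructure provider never holds intelligible personal data.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from SHA-256 in counter mode (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext with a fresh-nonce keystream; return nonce + ciphertext."""
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext; only the key-holding controller can do this."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# The controller keeps the key; only the opaque blob is uploaded to the cloud.
key = secrets.token_bytes(32)
record = b"name=Jane Doe;email=jane@example.com"
blob = encrypt_record(key, record)   # this ciphertext goes to the provider
assert decrypt_record(key, blob) == record
```

On this model the provider stores only ciphertext it cannot read, which underlines the argument that the controller, not the infrastructure provider, is best placed to assess and apply security appropriate to the data.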

These providers could implement the highest possible security, passing costs on to all customers – or offer tiered services where, e.g., services for sensitive data would have higher security levels and cost more. But if a customer deliberately processes sensitive data using a cheaper low-security service, could the provider be liable for not checking the customer’s real use and applying higher security levels accordingly?

Cue longer contracts, with extra customer disclosures, representations, warranties and indemnities, and even intrusive provider monitoring of customer usage (which customers don’t want). All this could increase costs and perhaps reduce performance. Compelling provider intrusion into customers’ processing won’t improve data protection. For infrastructure services, shouldn’t controllers be primarily responsible for assessing services’ appropriateness for their intended processing, including security levels, rather than making processors pry?

Audits and notification

Even WP196 (p.22) acknowledged, “individual audits of data hosted in a multi-party, virtualized server environment may be impractical technically and can in some instances serve to increase risk,” recognizing that third-party audits (although by customer-selected auditors) might be acceptable. Yet GDPR Article 26(2)(h) requires contractual audit rights for controllers. Processors must notify “personal data breaches” to controllers without undue delay (GDPR Article 31(2)), but what about breaches affecting others but not that controller? This unclear obligation will be difficult to apply to public cloud. Similarly with the obligatory contract term requiring processors to “assist” with controllers’ breach notification obligations, which will again increase costs.

Sub-processors

Flowing down processor contract obligations to sub-processor contracts (GDPR Article 26(2a), mirroring WP196) may be difficult, even impossible. If cloud providers are “processors,” their infrastructural sub-providers must be sub-processors: think data center operators and connectivity or network service providers. This problem goes beyond cloud. Imagine asking every third-party data center operator and carrier involved in a technology service’s delivery to sign a contract on Article 26(2)’s prescriptive terms for every customer who processes personal data using that service. Thousands of tailored contracts for every service involving personal data processing, cloud or not, would be impracticable, and wouldn’t necessarily improve data protection.

Of possible assistance is that this flow-down is required only where sub-processors are enlisted for a “specific processing,” but the meaning of this is insufficiently clear. Could commoditized cloud escape being considered “specific processing” – or not?

The consequences

I argue that treating providers of infrastructure services (cloud or not) as “sub-processors” (strictly, “another processor”), with obligations beyond maintaining certain minimum security measures, regardless of whether they can access intelligible data, doesn’t make sense, and may lead to absurdities.

One consequence (presumably unintended) of the GDPR is that only large customers and providers are likely to have the resources and bargaining power to make the entire supply chain sign the detailed, prescriptive (and cloud-inappropriate) contracts that Article 26 would require. Small customers and providers will be unable to operate – or may choose not to comply, despite potentially huge fines, gambling that regulators with limited resources will not go after a “small fry.” But that would bring the GDPR into disrespect.

In practice, controller-processor contracts will be the best place to address the GDPR’s uncertainties, and similarly with sub-processor contracts – e.g., detailing the scope of cloud providers’ obligation to “assist” controllers with security, etc (and payment for that assistance).

There’s no “grandfathering” of pre-existing contracts that comply with current DPD requirements. For contracts that will still be running when the GDPR takes effect, change-control or change-of-law clauses should be included now, so that the contracts can be amended to comply with GDPR Article 26 (and to allocate responsibility among supply-chain actors with appropriate liability/indemnity clauses).

Space doesn’t permit discussion of other problems the GDPR will cause for cloud, indeed sourcing and outsourcing generally. My key point is, laws drafted for old outsourcing models just won’t work for commoditized, standardized, pre-built, self-service public cloud, particularly infrastructure cloud, and other modern outsourcing models. Twenty-first century laws should strive towards technology-neutrality, rather than enshrining 1970s models of computer services bureaux.

This whole thing puts me in mind of a song. You know that piece "Killing Me Softly" by the Fugees, and before them Roberta Flack? Well, sing along with me if you will (with apologies to both the Fugees and Roberta Flack):

The GDPR’s coming, soon to be law they say
Middle of 20-18 may be the fateful day!
What will this mean for clo-ud? Will cloud be here to sta-ay?
Don’t want to be pessimistic, not sure how we’ll find a way
Killing cloud quickly with DP, killing cloud quickly, with DP,
tearing up Sa-aS, Pa-aS and I-aaS
Killing cloud quickly with DP…


6 Comments


  • comment Stuart Ritchie • Mar 21, 2016
    Excellent article, Kuan! Good legal research and I've bookmarked it.
    
    I have a few queries on "tech/cultural" and methodological assumptions etc. ("What do we even mean by cloud?", "what do we really mean by technology-neutrality?", "Are we being excessively March-2016-centric?", "who is the dog, who is the tail, and who is wagging who?" etc). Getting those wrong might fatally undermine the utility of any conclusions: we need to take care cloud doesn't follow big data into vacuity. Also, we shouldn't underestimate the capacity of modern business to innovate. Even technology companies innovate sometimes (I'm not being ironic!), just not as fast as privacy law... yet. I have confidence technological innovation already is solving some of the issues you raise.
    
    Also, I'd be grateful to know if your thoughts from your August 2015 post (which I've just seen) have significantly changed following the December 2015 GDPR draft.
  • comment Andrew Sanderson • Mar 22, 2016
    Thought-provoking article, Kuan – thank you.
    So if a SaaS service provider is by default a processor, then an auto manufacturer is a taxi company? Hmmm. 
    @Stuart – innovation coming up …
    Suppose we have a chain: SaaS vendor - business controller & processor – consumer. And the SaaS vendor aims to fulfil GDPR but distance itself from the risks of being responsible for the business controllers’ actions (which it cannot control). 
    Could a SaaS vendor create a legally valid position as a “technical and organisational measure” that enables business controllers to process consumer data alone - and without subcontracting back to the SaaS vendor?
    What steps might be necessary to achieve this: Re-design the service so that the SaaS legal entity was not a "recipient" of consumers’ personal data? Re-design the database that business controllers use to hold the consumer data, so it is indexed with pseudonyms? Ensure that the fields containing the consumers’ personal data would only be visible to the SaaS vendor at the request of / under instruction of the business controller? 
    At the same time, the SaaS vendor would probably have to upgrade security and safety features, so that the product could be “certified” [under Article 39] as ‘fit for purpose’ (just like auto manufacturers adding seat belts & headrests in the last century). In fact, they would probably want certification – first vendor in gets a competitive advantage. 
    So, speaking as a marketer, it would appear that the task for SaaS vendors now is to clearly define their role under GDPR via careful (re-)design of functionality and T&Cs. (Privacy law nudges T&C design into the remit of mainstream product marketing … brave new world, indeed.)
    What marketers need is our legal colleagues to help us define what is necessary, what combination is sufficient. Ideas, anyone?
  • comment Kuan Hon • Mar 25, 2016
    @Stuart @Andrew, many thanks for your thoughtful comments! I'll respond in order.
    
    @Stuart - I had only 2k words so couldn't go into foundational issues.
    
    What's cloud - see http://ssrn.com/abstract=2200592
    Tech-neutrality - yes many meanings, but where the same functionality can be provided by both cloud and traditional outsourcing, and regulation applies to one but not the other, it's not tech-neutral. Eg the "instructions" requirement perpetuated by GDPR. See http://www.iicom.org/intermedia/intermedia-january-2016/dark-clouds
    What do you mean by "March-2016-centric?", "who is the dog, who is the tail, and who is wagging who?" - can you please expand?
    
    Technological innovation - yes, startups (or upstarts?) may be disruptive partly because they ask forgiveness rather than permission, but non-tech-neutral or just bad regulation may deter incumbents (who have more to lose and can't be seen to be non-compliant) from innovating. The huge potential fines under GDPR will also make everyone more cautious to innovate. I'd be interested to know in exactly what ways you think innovation (and what innovation) is already solving some of those issues?
    
    Aug 2015 post - the draft of Art. 26 hasn't changed much from Aug 2015 so my thoughts haven't changed much. Good that the strange "processor as lawyer for controller" provision has moved from being statutory to a contractual obligation, but it'll trigger some negotiation I'm sure. Hopefully regulators/courts will require that the processor's breach contribute at least in part to the "event" (or "damage"?) before holding it liable under Art. 77, but whoever's unlucky enough to be sued first still has to pay in full before it can claim recourse against others in the supply chain who might perhaps be more at fault.
    
    @Andrew - absolutely! Part of the problem is that "processing" is defined so broadly as to include merely storing/holding personal data (when the underlying mischief targeted is really unauthorised use/disclosure of personal data). So in your example, if the SaaS vendor's infrastructure/system is used to store personal data even momentarily, it's a "processor". Pseudonymised data are treated as personal data (see recital 23). Some DPAs, particularly from the Continent (I had a long argument with one about this at a conference!), consider that even encrypted personal data remain personal data for all purposes, even in the hands of someone without the decryption key (whereas I feel that, bearing in mind the original legislative objective of data protection law, encrypted data should be considered as anonymous data when held by someone without the key, even though the controller with the key must still treat it as personal data). I argue that data protection obligations, controller or processor, should only be imposed on those with access to intelligible personal data, as someone without such access can't misuse or disclose the data, so they pose no privacy risk. Again please see http://www.iicom.org/intermedia/intermedia-january-2016/dark-clouds However, that's not what data protection laws say, or how they're interpreted by many regulators.
  • comment Martin Hoskins • Mar 29, 2016
    This is a very useful illustration of what happens when policymakers agree a 'political text' without giving 'experts' an opportunity to stress-test the text. So what will happen?  Simple - those parts of the Regulation will be ignored by the vast majority of data controllers who don't fall under the regulation of the Data Protection Taliban.  As for those controllers who do face the wrath of a 'Regulator on a mission', just remember you can always relocate to a more pragmatic jurisdiction, and hope that said Regulator isn't representing too many unhappy consumers in that part of Saxony (or wherever) ...
  • comment Kuan Hon • Apr 1, 2016
    Martin, yes indeed. But there were opportunities. As part of an EU-funded project where I was involved, some of these points and many others were made, in 2014 - http://www.a4cloud.eu/sites/default/files/D25.1%20White%20paper%20on%20new%20Data%20Protection%20Framework.pdf and http://ssrn.com/abstract=2405971 - but policymakers don't seem to have taken them into account. Moving to a more pragmatic Member State within the EU may not help very much given the one-stop shop provisions, which were simple and clear in the Commission's initial draft, but have ended up much more convoluted. Worst case scenario, some non-EU organisations may decide to relocate outside the EU altogether, and we in the EU may ultimately be the poorer for it.
  • comment Stuart Ritchie • Apr 9, 2016
    @Kuan 25 March. Thanks for your response! I liked your SSRN piece on cloud, because unusually it doesn't appear to claim any technical (as distinct from commercial) innovation in cloud computing. Likewise the 2.1 restriction for 'Unilateral' types: "In fact, [a cloud supplier] does nothing for [the cloud customer] except provide and maintain the shared hardware and software infrastructure and environments...". If it were left at that, there would be no problem, because the controller makes all decisions. The difficulty is it is not left at that, as the rest of your paper makes clear - the extent that the "controller" loses control. As you rightly observe (p4) not all control is lost; however it is sufficient that some is lost.
    My references to tech-neutrality, 2016-centricity, and tails wagging dogs are all the same thing. We've lost the plot. Arguably law should change to accommodate new technology, NOT new commerce. Otherwise we would have done away with fraud laws years ago; and anything else commercially inconvenient, such as consumer torts and penalizing homicide where commercial interests might be involved. So: remote storage tech (re-branded as "cloud") hasn't really changed for 35 years (commercial modalities, products, and mini-technologies slaved to those don't count). Rhetorically: should we change our cultures, our laws, and our constitutions, just to accommodate the latest commercial hype cycle du jour (as Gartner might put it)? And: where the answer is in the affirmative, methodologically do we not automatically teeter on the edge of the naturalistic fallacy "cliff"?
    
    The idea of the GDPR inhibiting innovation is curious, if only because the ever-malleable patent law, R&D grants, etc achieve that already. But here's a counter-example: innovation in the so-called Internet of Things, which the GDPR is set to supercharge via enforcing portability (which undeniably is also a trivial exercise, the technical problems having been solved routinely in IT and marketing for decades).
    
    As for innovation solving the legal problems, the fact that tech companies refuse to try out simple exercises in innovation to solve them, constructively preferring ignoring the law to protect market share of a segment of commerce, is absurd. I've developed a technology solution at both the PIA and API-transactional levels, self-adapting to change of law (a necessary design constraint because privacy law "innovates" faster than even Google) and to changing individual customer circumstances. Patent pending and all that and I'm happy to show it to you, but the more disturbing point is: if I can do it, surely anyone can, and why didn't someone do it during the ten years I was wondering why they wouldn't?
    
    @Andrew: interesting thoughts. There are some simple options: for example ignoring your premises(!), and just licensing the software; and recognizing that in long-standing law the SaaS provider remains a controller and deal with it.
    
    My own instinctive offering, at which I confess I arrived years ago for commercial reasons, but prima facie it seems reusable, is to adopt a kind of "landlord/tenant" model: license the *service* back to the putative customer/controller and let them run it themselves directly. Divide up the infrastructure to make sure nobody else can see it, and ensure even the vendor can only look as absolutely necessary. I haven't thought the data protection viability through (it might be indistinguishable from the software licensing model!)  but I developed the infrastructure for some of my weird projects...