
Did the WP29 misinterpret the GDPR on automated decision-making?


""

When the European Union's data protection authorities last month released their draft guidelines interpreting the General Data Protection Regulation's sections on profiling and automated decision-making, one of their claims in particular raised some eyebrows.

According to the Article 29 Working Party, the body through which the regulators coordinate their approaches, the GDPR prohibits purely automated decision-making that includes profiling, with limited exceptions involving explicit consent or a contract between the data subject and controller. 

The thing is, that doesn't seem to be what the GDPR itself says. And, according to at least two leading privacy lawyers, the WP29's interpretation could hit a variety of sectors — from education to housing — with problems that weren't envisaged by the legislators who came up with the regulation.

For one thing, the WP29's stated interpretation could effectively outlaw the practice of automated differential pricing in online commerce, said Eduardo Ustaran, the leader of Hogan Lovells' European privacy and information management practice.

"I think that regarding Article 22(1) as an outright prohibition is not entirely in line with the risk-based approach of the GDPR," said Ustaran, who warned of "considerable uncertainty" for industry in a recent article on the subject.

"The WP29 stance on this would appear to be different from that of the [U.K.] ICO who talk about it very much in the context of a right that may be exercised by a data subject and a right that doesn’t apply if certain exceptions are in place, as opposed to the data controller not being able to do the activity in the first place unless the exceptions are there," said Elle Todd, the head of digital and data at CMS London.

Here's what Article 22(1) says:

The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.

And here's what the WP29 said in its draft guidelines:

Article 22(1) acts as a prohibition on solely automated individual decision-making, including profiling with legal or similarly significant effects. Instead of the data subject having to actively object to the processing, the controller can only carry out the processing if one of the three exceptions covered in Article 22(2) applies.

"The key issue is that the provisions dealing with automated decision-making are in 'Chapter III – Rights of the data subject.' All of the articles in that chapter are expressed as rights of individuals which can be exercised under certain circumstances and conditions," Ustaran told The Privacy Advisor. "This chapter is a key pillar of the GDPR, of course, but its overall aim is to give people control of their data, not to place prohibitions on what controllers or processors may do with such data."

Ustaran pointed out that there are parts of the GDPR that do provide flat-out prohibitions or restrictions, with language that leaves no doubt about that being the intent.

Article 6, for example, states that the processing of personal data "shall be lawful only if and to the extent that" certain conditions apply. Article 9, which deals with the processing of special categories of personal data, says that the processing of data "revealing racial or ethnic origin, political opinions" and so on, "shall be prohibited." 

As the WP29 noted in its draft guidelines, the GDPR does not define the "legal" or "similarly significant" effects to which it refers in Article 22(1). The regulators said that a legal effect "suggests a processing activity that has an impact on someone’s legal rights, such as the freedom to associate with others, vote in an election, or take legal action," while a similarly significant effect would refer to situations where "the data subjects could still be impacted sufficiently to require the protections under this provision.

"The decision must have the potential to significantly influence the circumstances, behaviour or choices of the individuals concerned," the working party said, before turning specifically to the issue of online advertising and marketing. 

Even if advertisers' and marketers' processing of personal data wouldn't normally have much of an impact on most people, the WP29 suggested, it could have a "significant effect" on minority groups or vulnerable adults, thereby incurring the prohibition.

The guidelines raised the example of "someone in financial difficulties who is regularly shown adverts for online gambling" — if they sign up as a result and "potentially incur further debt," the prohibition may apply. "Automated decision-making that results in differential pricing could also have a significant effect if, for example, prohibitively high prices effectively bar someone from certain goods or services," the WP29 added.

Ustaran said, "If that is the case, such practices would be rendered unlawful by default and the only way to overcome that unlawfulness would be to obtain the prior explicit consent of the consumer — which is a very unrealistic proposition." 

According to Ustaran and Todd, the WP29's guidelines on automated decision-making could have a broad impact as presently phrased.

"There is a fundamental difference between a right that can be exercised and a default prohibition. This has very wide-reaching consequences for companies in many different sectors and it is critical that this is clarified and addressed in the final guidance," said Todd. 

"In my view, the key sectors affected by this will be those that are more significant to people's lives, such as education, finances, health or housing.  Given the role of profiling in decision making for providers of these services, they will need to keep a close eye on their practices," Ustaran said. "I also think that this would affect differential pricing practices, which would effectively be prohibited."

Todd added that this wasn't the only area of concern in the WP29's draft guidelines. "The suggestion that all demographic segmentation constitutes profiling that an individual could object to will catch a huge number of activities and be extremely difficult for companies to apply a simple technical solution to in the context of existing database architecture," she said.

The Article 29 Working Party took comments from the public on the draft guidelines until Nov. 28.

The Privacy Advisor repeatedly asked the working party for its response to the concerns over the last couple of weeks, but had not received any comment at the time of writing.

12 Comments


  • Mike O'Neill • Nov 29, 2017
    There was no misinterpretation. The subject has "the right not to be subject to a decision based solely on automated processing, including profiling". This can only work if they can exercise the right before the processing occurs, because what effects it leads to may not be capable of being undone. 
    If there was a commonly understood and easily available way to indicate objection, e.g. the Do Not Track signal, then perhaps this could suffice, but as everyone knows this is currently ignored in most cases.
    The Parliament's agreed version of the ePrivacy Regulation requires that this and similar signals be "binding on, and enforceable against" all parties, but it is unfortunately being held up by the Council.
    So the only option is the explicit opt-in.
  • Jeroen Terstegge • Nov 29, 2017
    I don’t agree with Eduardo. Apparently, Eduardo reads the right under art. 22 GDPR as a right that must be invoked (as an opt-out of sorts), and that therefore the controller has the obligation to cease the profiling of that individual, or to cease the automated decision making that significantly affects that person, only if and when that person objects. In my view, the Article 29 Working Party seems to build on the same interpretation that many Member States have given to article 15(1) of Directive 95/46/EC, namely that it is a prohibition. Art. 15(1) states that “Member States shall grant the right to every person not to be subject...”. Many states have read this as a right that a data subject HAS (therefore it’s prohibited for others) rather than a right that a data subject must EXERCISE. See for instance § 6a Bundesdatenschutzgesetz in Germany, which uses the phrase “may not solely” (“dürfen nicht ausschließlich”), thus suggesting a clear prohibition. A little less clear, but suggesting the same, are article 42 of the Dutch Personal Data Protection Act and § 49 of the Austrian Data Protection Act, which both state that “nobody may be subjected to”. Article 15(2) of Directive 95/46, which states the exceptions, uses the phrase “may be subjected” and recital 71 of the GDPR uses “should be allowed”, which both suggest an exception to a prohibition rather than an exception to an exercisable right of the data subject.
    I also don’t agree with Eduardo that all rights of the data subject in Chapter 3 are expressed as rights that can be exercised. Articles 12, 13 and 14 are clearly expressed as obligations on the part of the controller. I also don’t think the rewording of art. 22 GDPR suggests a deviation from the current data protection acts in Europe under art. 15 of the 95/46 Directive, nor do I think that this should be the unintentional consequence of the new wording.
  • Jari Perko • Nov 29, 2017
    Agree with Ustaran and Todd. I would just add that the definition of profiling has at its core the term "evaluate" ("evaluate certain personal aspects"), and in many of the use cases discussed in relation to art. 22(1) there is no factual evaluation, just an automated process. It's a pity that art. 22 ended up as messy as it is.
  • Jeroen Terstegge • Nov 29, 2017
    (As a follow-up to my comments above.)
    See also art. 10 French Data Protection Act, which also states a clear prohibition ("ne peut"). Idem art. 12bis of the Belgian Data Protection Act and art. 26a Polish Data Protection Act.
    But I can see where Eduardo got his idea of a right that must be exercised, since article 13(2) of the Spanish Data Protection Act is worded that way...
  • Jeroen Terstegge • Nov 29, 2017
    (Final update)
    To paint a complete picture, I have reviewed the rest of the current laws. Section 12(1) of the UK Data Protection Act has implemented art. 15(1) as a right that must be exercised: “(1) An individual is entitled at any time, by notice in writing to any data controller, to require the data controller to ensure that no decision taken...”. Spain, Sweden, Denmark, Greece, Slovakia, Latvia, Romania and Malta seem to do the same. On the other hand, the Czech Republic, Italy, Luxembourg, Finland, Estonia, Lithuania, the Netherlands, Belgium, Austria and Bulgaria share the German/French idea of a prohibition. Portugal seems to suffer from the same ambiguity as the GDPR. And I couldn’t find any article about this matter in the Irish Data Protection Act. (*)
    
    So, lawyers from a lot of European countries are understandably confused that art. 22 GDPR is interpreted by the Article 29 Working Party as a prohibition. Yet, given the significance of art. 22 for the information society, where robotics and artificial intelligence enter all aspects of human life, the German/French view (and that of most other Member States) that art. 22 is a prohibition is to be preferred over the Spanish/UK and some other countries’ view that it is a right that must be invoked. Article 22 is the cornerstone of the protection of fundamental rights in smart environments, which perceive and observe human behaviour, taking small and big decisions with varying impact on people’s lives. Therefore, I fully agree with the Article 29 Working Party’s interpretation that art. 22 is a prohibition. This interpretation is just better equipped to handle the complexity of the (future) information society and its (future) business models.
    
    (*) This analysis is based on unofficial translations of the Member States’ data protection laws as published on the internet by the various data protection authorities.
  • Jeroen Terstegge • Nov 29, 2017
    Sorry, I forgot about Cyprus: it’s a right that must be invoked.
  • Jeroen Terstegge • Nov 30, 2017
    Jules Polonetsky asked a good question: How did the final language of art. 22 GDPR evolve? So, here it is from my archives:
    - Commission’s Interservice Copy (2011): 1. Every natural person shall have the right not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person's performance at work, creditworthiness, economic situation, location, health, personal preferences, reliability or behaviour.
    - Commission Proposal (January 2012): Every natural person shall have the right not to be subject to a measure which produces legal effects concerning this natural person or significantly affects this natural person, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this natural person or to analyse or predict in particular the natural person's performance at work, economic situation, location, health, personal preferences, reliability or behaviour.
    - LIBE rapporteur Albrecht (2012): 1. The processing of personal data for the purposes of profiling, including in relation to the offering of electronic information and communication services, shall only be lawful if it:...
    - LIBE amendments 1545 (right to object) and 1546 (right to request) have been rejected.
    - LIBE final report, Amendment 115, distinguishes between a right to object for ordinary profiling, and a prohibition for profiling which produces legal effects or similarly significant effects.
    - ITRE opinion: right not to be subject...
    - IMCO opinion: “1. A data subject shall not be subject to a decision which is unfair or discriminatory, and which is based solely on automated processing intended to evaluate certain personal aspects relating to this data subject.”
    - JURI opinion: right not to be subject....
    - Parliament First Reading: idem as LIBE position.
    - Council, various Presidency compromise texts (i.e. Ireland, Lithuania, Italy, Greece) use the Commission’s phrase of a “right not to be subject”.
    - DAPIX documents on Chapter 3 (2015) use “right not to be subject”. No Member States’ comments/reservations on this phrase were made during the years of negotiations, except from Germany, to clarify the concept of profiling.
    - Council First Reading uses the phrase “right not to be subject”.
    - It doesn’t seem to have been an issue in the Trilogue either.
    In sum, the wording “not to be subject” as proposed by the Commission was never an issue in the Council. Amendments in the Parliament to rephrase art. 22(1) to a right that must be invoked have been rejected. Ergo, the history of art. 22, especially in the Parliament, supports the view that art. 22 must be read as a prohibition.
  • Jeroen Terstegge • Nov 30, 2017
    And last but not least, current law in Hungary: a prohibition.
  • Jason Cronk • Dec 1, 2017
    I disagree with Eduardo's interpretation, though I admit I have not done a thorough analysis of the legal precedent or legislative history of the clause (see the comments below for others' interpretations of that). I point to Art. 13(2)(b), which clearly provides that the individual must be informed of the "existence of" their rights to access, rectification, erasure, restriction or objection. Why must they be informed of the existence of their right? So that they are aware that they are able to exercise those rights. However, 13(2)(f) requires that individuals be informed about automated decision making. It does not require that they be informed of the "existence of the right" not to be subject to automated decision making. The drafting is key. Had the drafters wanted individuals to be able to opt out, they would have required, similar to the other rights, that the individual be informed of the "existence of the right" so they could exercise it. Second, the logical conclusion of this being an exercisable right is that people would always exercise it when it's in their interest. If meaningfully informed of the significance & consequences per 13(2)(f), then they will exercise it. Consider a price based on an automated decision: "Your price is $x, but this was decided based on automated decision making, and if you exercise your right not to be subject to that decision, your price is $y." As long as $y < $x, individuals will exercise their right. Thirdly, the only time automated decisions would be allowed would be when they were subject to paragraph 2 of Art. 22: a) necessary for a contract, b) authorized by law, or c) with explicit consent. If you read paragraph 3 of Art. 22, it requires safeguards and the option for human intervention, but only for those automated decisions that meet the requirements of paragraph 2. If other automated decision making were allowed [under some lower standard than a) required by contract, b) authorized by law, or c) explicit consent], then paragraph 3 wouldn't apply and the law would not require safeguards and the option for human intervention. If anything, you would think these additional requirements would be more pertinent for automated decision making that didn't meet these higher standards. It stands to reason that the interpretation that Art. 22(1) is a flat-out prohibition is more accurate than not.
  • Eduardo Ustaran • Dec 1, 2017
    Fascinating debate. Clearly my thinking is polluted by the two jurisdictions I happen to have law degrees from.
  • Jeroen Terstegge • Dec 2, 2017
    One more argument why art. 22 should be read as a prohibition is a systematic one. All processing which is not specifically regulated elsewhere is governed by article 6 GDPR. Art. 22 specifically regulates ‘solely automated decisions, including profiling, which produces legal effects or similarly significant effects’. So, automated decision making/profiling that does NOT have such effects, as well as all non-automated decision making/profiling, is governed by article 6. Where such profiling, after having met the proportionality test, could be based on 6(1)(e) or 6(1)(f), article 21 would give the data subject the right to object. Any interpretation of art. 22(1) suggesting it would be a right that must be invoked by objecting to automated decision making/profiling which produces legal effects or has similarly significant effects would mean that art. 22(1) would be completely superfluous. In that case, art. 22(2) would also not make any sense, except that the consent would need to be explicit instead of unambiguous.
  • Jane Gill • Dec 19, 2017
    Does anybody have any thoughts as to how to reconcile this with the recent Art 29 WP guidance on consent? They seem to leave the HR assessment industry potentially having to build very complicated systems. Consent of employees or potential employees will virtually never be freely given, but they cannot be subjected to automated decision-making (i.e. sifting solutions) without consent?