
WP29 releases guidelines on profiling under the GDPR


""

""

Editor's Note: The Article 29 Working Party adopted final guidelines on automated individual decision-making and profiling on February 6, 2018, available here.

The Article 29 Working Party has adopted new draft guidelines covering profiling and automated decision-making under the forthcoming GDPR. 

The proposed guidelines acknowledge two general benefits of these technologies: first, increased efficiencies and, second, resource savings. And they note the potential to “better segment markets and tailor services and products to align with individual needs.” However, the WP29 warns that profiling and automated decision-making technologies can pose “significant risks for individuals’ rights and freedoms” and can “perpetuate existing stereotypes and social segregation” absent appropriate safeguards.

The new guidelines, which are open to public comment until November 28, 2017 (via JUST-ARTICLE29WP-SEC@ec.europa.eu and presidenceg29@cnil.fr), are divided into five chapters, briefly summarized below.

Chapter I: Definitions

Profiling: The guidelines define profiling as a “procedure which may involve a series of statistical deductions … often used to make predictions about people” and analyze the definition in Article 4(4) of the GDPR as comprising three elements:

  • An automated form of processing.
  • Carried out on personal data.
  • For the objective of evaluating personal aspects about a natural person.

The working party emphasizes that the GDPR defines profiling as automated processing of data to analyze or to make predictions about individuals, meaning that “simply assessing or classifying individuals based on characteristics” could be considered profiling, with or without a predictive purpose.

Three ways of using profiling are identified in the GDPR:

  • General profiling.
  • Decision-making based on profiling.
  • Solely automated decision-making, including profiling.

Automated Decision-making: WP29 defines automated decision-making as “the ability to make decisions by technological means without human involvement.” Automated decision-making can take place with or without profiling, and can be based on any type of data.

Chapter II: Specific Provisions Applying to Solely Automated Decision-Making

The working party discusses the specific provisions of Article 22 of the GDPR that govern automated decision-making. The guidelines summarize Article 22 as providing three major rules:

  • A prohibition (Article 22(1)) on fully automated individual decision-making, including profiling, that has a legal or similarly significant effect.
  • Exceptions to that prohibition (Article 22(2)).
  • A requirement (Article 22(3)) that measures be in place to safeguard the data subject’s rights and freedoms and legitimate interests.

The guidelines emphasize that the prohibition on “fully automated decision-making” only applies when the decision based on such technology “has a legal effect on or similarly significantly affects someone.” If a recommendation about a data subject is produced by an automated process but is reviewed by a human being who “takes account of other factors in making the final decision,” it is not based solely on automated processing. However, fabrication of human involvement (e.g., human employees rubber-stamping automatically generated profiles) will not enable a controller to avoid Article 22’s prohibition. “Meaningful” oversight must be by “someone who has the authority and competence to change the decision.”

Legal and Similarly Significant Effects: The WP29 clarifies the GDPR’s use of “legal” or “similarly significant” effects. The guidelines note possible impingements on the freedom to associate with others, vote in an election, or take legal action, or an effect on legal contractual status or rights, as examples of “legal effects” under the GDPR. “Similarly significant” effects need not necessarily be legal ones — the working party suggests that the threshold is the significance of the decision’s impact on the data subject — so to qualify, the processing “must be more than trivial … the decision must have the potential to significantly influence the circumstances, behavior, or choices of the individuals concerned.”

The guidance notes the difficulty of drawing a precise boundary for the types of decision that would qualify as significant, but points to recital 71 of the GDPR, which gives “automatic refusal of an online credit application” and “e-recruiting practices without any human intervention” as examples. The guidelines suggest that online advertising may qualify, depending on the following characteristics:

  • The intrusiveness of the profiling process.
  • The expectations and wishes of the individuals concerned.
  • The way the ad is delivered.
  • The particular vulnerabilities of the data subjects targeted.

Significant effects may apply to certain groups when they do not apply to individuals generally. Differential pricing technology is specifically mentioned as a potential example.

Exceptions: Article 22(2) sets forth three exceptions to the Article 22(1) prohibition:

  • The processing is necessary for the performance of or entering into a contract.
  • It is authorized by Union or Member State law to which the controller is subject and which also lays down suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests.
  • It is based on the data subject’s explicit consent.

Necessity must be interpreted narrowly, as described in the working party opinion on “legitimate interests”. The GDPR does not define explicit consent, but suggests it must be “specifically confirmed by an express statement.” The working party will address the issue in forthcoming guidelines.

Rights of Data Subjects: The working party specifically addresses the following rights in the context of automated decision-making:

  • The right to be informed.
  • The right of access.
  • The right not to be subject to a decision based solely on automated processing, including profiling.

The Right To Be Informed: The guidelines expand on the data subject’s right to be informed and the controller’s obligations. Per Articles 13(2)(f) and 14(2)(g), controllers utilizing automated decision-making must:

  • Tell the data subject that they are engaging in this type of activity.
  • Provide meaningful information about the logic involved.
  • Explain the significance and envisaged consequences of the processing. 

The working party suggests that “it is good practice” to provide that information regardless of whether the processing falls within the Article 22(1) definition of automated decision-making. Controllers are instructed to “find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision” by providing information “meaningful to the data subject.” To inform data subjects of the processing’s significance, the WP29 suggests that “real, tangible examples of the type of possible effects should be given.”
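By way of illustration only (the guidelines do not prescribe any particular technique), a controller using a simple linear scoring model could surface the criteria that weighed most heavily in a decision as a plain-language notice. The feature names and weights below are hypothetical.

```python
# Illustrative sketch only: listing the criteria that contributed most to a
# hypothetical linear credit score, as plain-language text for the data subject.
# The feature names and weights are invented for this example.
WEIGHTS = {"years_at_address": 0.4, "existing_debt": -0.9, "income": 0.6}

def main_factors(applicant: dict, top_n: int = 2) -> list:
    """Return the features that contributed most to the score, largest first."""
    contributions = {name: WEIGHTS[name] * applicant[name] for name in WEIGHTS}
    ranked = sorted(contributions, key=lambda n: abs(contributions[n]), reverse=True)
    return [
        f"{name} ({'raised' if contributions[name] > 0 else 'lowered'} the score)"
        for name in ranked[:top_n]
    ]

if __name__ == "__main__":
    # e.g. ['existing_debt (lowered the score)', 'income (raised the score)']
    print(main_factors({"years_at_address": 2, "existing_debt": 5, "income": 3}))
```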

Right Not To Be Subject to Automated Decision-Making: Data controllers must act affirmatively to provide data subjects with access to “at least the right of human intervention,” even in cases where one of the Article 22 exceptions applies. Controllers must provide a “simple way for the data subject to access these rights” that will enable the subject to express his or her view and contest a decision.

Data Controller’s Obligations: Article 22(3) obligates controllers to create safeguards to protect the rights, freedoms, and legitimate interests of data subjects; Article 22(2)(b)’s Member State law exception requires any Member State law authorizing processing to include suitable protections for data subjects. The working party suggests that, per Article 22 and recital 71, minimum safeguards must give the data subject:

  • An explanation of the decision reached.
  • A way to obtain human intervention.
  • A way to express his or her point of view.
  • A way to contest the decision.

Controllers are instructed to carry out frequent assessments on their datasets to check for bias, and “develop ways to address any prejudicial elements.” These measures must be used continuously, not only at the design stage, as the profiling is applied to individuals.
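As an illustration only of the kind of recurring assessment described above (the working party does not mandate any particular method), a minimal sketch might compare outcome rates across groups in the decision data; the column names and the disparity threshold are hypothetical.

```python
# Illustrative sketch only: a recurring check that compares automated-decision
# outcomes across groups in a dataset. Column names ("group", "approved")
# and the 20% disparity threshold are hypothetical.
import pandas as pd

def outcome_rates_by_group(decisions: pd.DataFrame) -> pd.Series:
    """Share of positive decisions per group."""
    return decisions.groupby("group")["approved"].mean()

def needs_review(decisions: pd.DataFrame, max_gap: float = 0.20) -> bool:
    """Flag the dataset for human review if approval rates diverge too widely."""
    rates = outcome_rates_by_group(decisions)
    return (rates.max() - rates.min()) > max_gap

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"group": ["A", "A", "B", "B", "B"], "approved": [1, 1, 0, 1, 0]}
    )
    print(outcome_rates_by_group(sample))
    print("Needs review:", needs_review(sample))
```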

Chapter III: General Provisions on Profiling and Automated Decision-Making

The new guidelines also address general provisions of the GDPR as they apply to both profiling and automated decision-making. The working party discusses the following principles from Article 5(1) in relation to both, and gives several examples to assist controllers in complying.

Transparency

  • Controllers must provide subjects with “concise, transparent, intelligible, and easily accessible information about the processing of their personal data.”

Fairness

  • Processing must be fair: It should not deny people access to employment opportunities, credit or insurance, or target them with excessively risky or costly financial products.

Further processing/purpose limitation

Whether additional processing is compatible with the original purpose depends on:

  • The relationship between the purposes for which the data were collected and the purposes of further processing.
  • The context in which the data were collected and the reasonable expectations of the data subjects.
  • The nature of the data and the impact of further processing on the data subjects.
  • The safeguards applied by the controller to ensure fair processing and avoid undue impact on the data subject.

Data minimization

  • The need to collect and hold personal data should be explained, or controllers should consider alternatives such as aggregation or anonymization. 
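
Purely as an illustration of the aggregation alternative mentioned above (neither the technique nor the field names come from the guidelines), individual-level records might be replaced by coarse summaries before any longer-term retention:

```python
# Illustrative sketch only: replacing per-person records with aggregate
# summaries so that individual-level data need not be retained.
# Field names ("age", "spend") are hypothetical.
import pandas as pd

def aggregate_for_reporting(records: pd.DataFrame) -> pd.DataFrame:
    """Keep only counts and average spend per ten-year age band."""
    banded = records.assign(age_band=(records["age"] // 10) * 10)
    return (
        banded.groupby("age_band")
        .agg(customers=("spend", "size"), avg_spend=("spend", "mean"))
        .reset_index()
    )

if __name__ == "__main__":
    raw = pd.DataFrame({"age": [23, 27, 35, 41], "spend": [10, 20, 30, 40]})
    print(aggregate_for_reporting(raw))  # individual rows are discarded downstream
```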

Accuracy

  • Accuracy must be considered at all stages of profiling, including collection, analysis, profile building, and decision-making applying a profile.
  • Controllers must introduce “robust measures” to verify and ensure that data re-used or obtained indirectly is up to date.

Storage limitation

  • Controllers are warned that storing collected personal data for extended periods may conflict with the proportionality consideration even if it satisfies the purpose specification and relevance requirements.

Lawful Bases for Processing: While Chapter II of the guidelines addresses the Article 22 exceptions as they apply to automated decision-making, this chapter enumerates a broader list of legal bases for profiling, not all of which apply to automated decision-making. The broader list includes profiling:

  • With the data subject’s consent.
  • Necessary for the performance of a contract.
  • Necessary for compliance with a legal obligation.
  • Necessary to protect vital interests.
  • Necessary for the performance of a task carried out in the public interest or exercise of official authority.
  • Necessary for the legitimate interests pursued by the controller or third party.

The WP29 emphasizes the limitations of the consent exception to the prohibition on automated decision-making; it also sharply circumscribes the efficacy of the “necessity for performance of a contract” exception by making clear that the exception is not satisfied simply by including profiling in the small print of an otherwise unrelated contract.

Under Article 6(1)(f) the legitimate interests of a controller or third party may provide a basis for profiling. Determining if this is so requires balancing those interests against the data subjects’ interests, fundamental rights, and freedoms; the working party gives the following factors:

  • The level of detail of the profile.
  • The comprehensiveness of the profile.
  • The impact of the profiling on the data subject.
  • The safeguards aimed at ensuring fairness, non-discrimination, and accuracy in the profiling.

Data Subject Rights: Much of this section is similar to what is discussed in Chapter II. The rights discussed and their bases in the GDPR include:

  • Right to be informed (Articles 5, 13 and 14; recital 60).
  • Right of access (Article 15; recital 63).
  • Right of rectification (Article 16), including the right to supplement with additional data.
  • Right to erasure (Article 17).
  • Right to restriction of processing (Article 18).
  • Right to object (Article 21).

Some data subject rights are further clarified in this section, in particular:

The Right to Object: The WP29 expands on the requirements of the data subject’s right to object. The guidelines cover the need for affirmative conduct by the controller to inform data subjects of the right to object, and the need for controllers to respond quickly if a subject exercises that right. Controllers are required to interrupt profiling unless they can demonstrate “compelling legitimate grounds” that outweigh the interests, rights, and freedoms of the data subject. While the GDPR does not explain what would qualify, the guidelines suggest that “profiling beneficial to society at large” and “not just the business interests of the controller” might suffice. Controllers must show two specific additional elements to continue profiling over a subject’s objection:

  • The impact to the data subject is limited to the minimum necessary to meet the particular objective.
  • The objective is critical for the organization.

Data subjects have an unconditional right to object to processing for direct marketing purposes.

Chapter IV: Children and Profiling

The working party concludes that the GDPR does not impose a blanket prohibition on the use of profiling and automated decision-making in relation to children, as recital 71’s admonition that such techniques should not “apply to” children is not reflected in the text of Article 22 itself. The working party admonishes controllers to “refrain, in general, from profiling [children] for marketing purposes,” although it concludes that processing of children’s data may be carried out on the basis of Article 22(2)’s exceptions “as appropriate.” The guidelines suggest that this might be the case when the processing is “to protect [the children’s] welfare.”

Chapter V: DPIAs

The guidelines highlight Article 35(3)’s requirement that controllers carry out DPIAs for evaluations “including profiling” and “based on automated processing.” The working party concludes this language encompasses “decision-making including profiling with legal or similarly significant effects that is not wholly automated, as well as solely automated decision-making defined in Article 22.” See our recent analysis of the DPIA guidelines here.

Annexes

The guidelines also provide several annexes with helpful information for controllers. Annex 1 breaks out specific good-practice recommendations for profilers by the applicable provisions of the GDPR. Annex 2 provides a list of all GDPR provisions that engage with profiling. And Annex 3 provides a list of other working party opinions and related literature referenced throughout.

3 Comments


  • Elizabeth Buggy • Oct 18, 2017
    Can you provide a link to the guidance? Don't see it in the article.
  • Isabel Barbera Garcia • Oct 18, 2017
    http://ec.europa.eu/newsroom/document.cfm?doc_id=47742
  • Loris Rossi • Oct 19, 2017
    After this reading I understand that using Google AdWords Remarketing or automatically clustering guests as newsletter recipients is not "profiling"... is that right?