Meta's EU General Data Protection Regulation adventure has taken a turn with sweeping consequences for the online ecosystem. After Ireland's Data Protection Commission levied a 390 million euro fine on it, subsequently affirmed by the European Data Protection Board's binding decision, the technology giant could no longer rely on contractual necessity or legitimate interests as a legal basis to process user data for personalized advertising. It had exhausted all available options save one: consent.
In response to the DPC's order and the Court of Justice of the European Union's ruling in Meta v. Bundeskartellamt, and ahead of the EDPB publishing its urgent binding decision, Meta signaled its pivot toward operationalizing consent as its lawful basis for personalized advertising across its Facebook and Instagram services in the EU. Users in the EU can refuse to consent and may continue to use Facebook and Instagram without advertisements via a monthly subscription service — a model colloquially dubbed "pay or OK."
The dust has yet to settle on how EU data protection authorities view Meta's latest approach, but the adtech ecosystem will feel reverberations from the litany of past enforcement and litigation. More is likely to come as a bright light is shone on the parameters and conditions of consent as a legal basis.
The history of consent in data processing
Consent, as applied to personal data processing, well predates the GDPR. The original German and French data protection acts, Germany's Bundesdatenschutzgesetz of 1977 and France's Act No. 78-17 of 1978, explicitly referenced consent as a means of permitting access to personal data. The 1980 OECD Guidelines called for personal data to be obtained "with the knowledge and consent of the data subject," a principle later incorporated into the 1995 Data Protection Directive that shaped modern consent under the GDPR.
For better or worse, consent undergirds much of privacy law today. It plays a part in facilitating a data subject's personal risk assessment and subsequent acceptance. It helps broker decisions related to personal information. However, it's been argued that consent has, over time, been reduced to a "procedural approach to maximizing individual control over data rather than individual or societal welfare," burdening data subjects with largely meaningless notices and limited choice.
For this reason, EU policymakers have trodden carefully, heightening requirements for validity where appropriate to prevent consent from becoming unworkable and subsuming the entire data protection field — as some argue it has in the U.S.
Through legislation, opinions and enforcement by the regulators and the courts, consent, despite its limitations, has evolved into a sometimes desired or even sole remaining basis for certain data processing activities. Its reach extends beyond the GDPR and into other areas of digital regulation centering on user choice and consent, like the EU Digital Services Act, as well as into other data protection and privacy regimes internationally.
Article 4(11) of the GDPR clearly defines the requirements for consent: "'consent' of the data subject means any freely given, specific, informed and unambiguous indication of the data subject's wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her."
The GDPR provides additional guidance in Article 7 and Recitals 32, 33, 42 and 43 regarding how controllers may comply with the main elements of the consent requirement.
Freely given
"Free" requires real choice and control for data subjects and is perhaps the strictest element to satisfy for valid consent. Regulators will — and thus data controllers should — ensure a data subject has freely given consent by assessing how closely consent is tied to contracts or the provision of a service.
This highly fact-specific analysis will result in invalidation of consent if an entity fails to enable freely given consent, determined after looking at a non-exhaustive list of factors, including:
Imbalance of power. Consent for processing will not be freely given where there is a clear imbalance of power between the data subject and controller. Commonly raised in situations concerning processing by public authorities, employers, or, recently, hospitals, this prohibition may apply more broadly where a data subject has no realistic alternative to accepting the terms of processing of a controller. EDPB guidelines require "no risk of deception, intimidation, coercion or significant negative consequences (e.g., substantial extra costs) if he/she does not consent."
Meta and similarly situated large-scale processors of personal data for behavioral advertising may find this factor impassable. Norway's Data Protection Authority's Head of International Department, Tobias Judin, forewarned: "Considering the power imbalance between Meta and its users ... we doubt that the purported consents will be 'freely given' as required by the GDPR." In turn, a recent complaint filed by privacy rights group NOYB pins an imbalance of power argument on Meta over its dominant market position.
Meta is familiar with this argument. In July's Bundeskartellamt decision, the CJEU held that "the existence of such a dominant position may create a clear imbalance … between the data subject and the controller," in the context of terms imposed on data subjects under the guise of contractual necessity. It will face a similar argument in coming legal challenges to the use of its new basis.
Companies without comparably dominant market power and network effect lock-in should more easily stave off efforts to invalidate consent on such grounds.
Conditionality. The performance of a contract, including the provision of a service, should not be conditional on consent to the processing of personal data that is not necessary for the performance of that contract. Practically, consent will be invalidated if bundled in nonnegotiable terms and conditions.
In Instagram and Facebook decisions on processing data under contractual necessity, the EDPB acknowledged personalized ads and content "may (but [do] not always) constitute an essential or expected element of certain online services," and did not regard them as contractually necessary for Meta's provision of its Instagram and Facebook services. Regulators may apply a similar analysis to this prong and may find consent invalid if it is obtained as a condition of Meta's provision of its services, as noted by NOYB in its complaint.
In an earlier case, the U.K. Information Commissioner's Office in 2018 invalidated consent in an early "pay or OK" model implemented by The Washington Post after it failed to offer a free alternative to accepting cookies for behavioral advertising. Although this decision lacks authority today, given the U.K.'s departure from the EU, businesses offering such subscriptions must avoid stumbling over conditionality by detaching processing for personalized advertising from the provision of service, as prescribed by Article 7(4) GDPR.
Further complicating matters, the CJEU in Bundeskartellamt stated that, in the event of a refusal of consent to the processing for personalized advertising, users must be offered an equivalent alternative for an "appropriate fee," if necessary. Otherwise, the provision of the service is conditional on consent to the processing of personal data not necessary for the performance of that contract. The decision lacked further context defining "appropriate," but European regulators are currently investigating this issue in the context of Meta's "pay or OK" model.
If charging a subscription fee, organizations should anticipate the importance of documenting and justifying their assessment of the fee's appropriateness, for example, by showing the fee provably relates to the revenue reduction that would result from showing only contextual ads, or no ads at all.
Alternative considerations for assessing the appropriateness of a subscription fee, offered by media strategist Eric Seufert, are whether (1) the price represents overall average revenue per unit parity for the business between the pre-subscription and post-subscription periods, and (2) the fee is comparable to that charged by similar services. This is an intricate and dynamic calculation, so while awaiting further guidance, organizations should consider how best to adhere to long-standing and overarching principles of fairness and transparency in these fee determinations.
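Seufert's first criterion, average-revenue-per-user parity, can be approximated as a simple calculation. The sketch below, with hypothetical figures, is one plausible reading of that heuristic, not a prescribed method.

```python
def arpu_parity_fee(monthly_ad_revenue: float, monthly_active_users: int) -> float:
    """Hypothetical ARPU-parity heuristic: the monthly subscription fee at
    which a user who refuses personalized ads contributes roughly the same
    revenue as the average ad-funded user did pre-subscription."""
    if monthly_active_users <= 0:
        raise ValueError("user count must be positive")
    return monthly_ad_revenue / monthly_active_users

# Illustrative figures only: 1M EUR of monthly ad revenue across 250k users
fee = arpu_parity_fee(1_000_000.0, 250_000)  # 4.0 EUR per month
```

A fee materially above this parity level would be harder to justify as "appropriate" under Seufert's framing; the second criterion, comparability with similar services, would still need to be checked separately.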
Granularity. Controllers must provide specific granularity when seeking consent for different purposes, which is why a user must approve each nonessential cookie before it is placed on that user's device.
In assessing the legality of the ad-free subscription model in the context of the Austrian newspaper Der Standard, Austria's DPA criticized the newspaper's methods of obtaining consent. In particular, the newspaper wrongfully used a single consent for several purposes, including targeted advertising, analytics, and social media plug-ins. The DPA avoided addressing the "pay or OK" model, instead finding consent invalid for lack of granularity.
So long as services implementing the subscription models properly seek separate consent for personalized advertising, granularity should not present an obstacle to proving freely given consent for that purpose.
Detriment. The controller must show that the data subject may refuse or withdraw consent without detriment, such as costs or degradation of services.
The EDPB's guidance leaves room for interpretation on the imposition of costs, but several DPAs, including Norway's in Grindr, have indicated or stated explicitly that some fee may be appropriate and will not create detriment, assuming a controller abates the above-referenced concerns over conditionality.
Specific
Consent must be given in relation to "one or more specific" purposes of data processing, giving a data subject choice about each. This element ensures a degree of user control and transparency for the data subject.
The specificity of consent is closely linked to its granularity. Where the specificity element mandates that consent be given for a specific purpose, granularity requires separating those purposes and obtaining consent for each.
According to the EDPB, to satisfy this element, "a controller must apply:
- Purpose specification as a safeguard against function creep;
- Granularity in consent requests; and
- Clear separation of information related to obtaining consent for data processing activities from information about other matters."
Informed
Data subjects must be informed of all rights and circumstances surrounding the data processing and its consequences to enable them to make informed decisions. EDPB guidelines enumerate what information must, at a minimum, be provided to satisfy this element; any less and the consent may be invalidated as illusory. Controllers relying on consent must separately follow the information duties in Articles 13 and 14 of the GDPR.
This element stems from the fundamental GDPR principles of fairness and transparency. Accordingly, information must be easily understandable to the average person relative to an organization's identified audience, such as, for instance, minors.
Simply put, informed consent generally only becomes an issue where information is withheld or improperly provided, for example, where data subjects could not identify a controller (Tested.Me), the categories of data collected (IAB Europe), or for what purposes processing was done (Persónuvernd).
In the context of "pay or OK" subscription models, the scope of what consent to processing for personalized advertising covers, and thus what information must be conveyed to the data subject, may expand to include:
- The alternative forms of advertising that a business may use (e.g., contextual or no advertising).
- What data collection and processing will remain whether the data subject consents or refuses.
- If consent is refused, the legal basis for any continued data collection.
Unambiguous indication of wishes
An "indication" requires that the data subject's expression of wishes be active, and that the data subject maintain a high degree of autonomy when opting to consent. This element flows naturally from the EU's long-standing predilection toward opt-in frameworks.
Preselected or pre-ticked boxes categorically do not meet this threshold, as affirmed by the CJEU in Orange Romania SA and Planet49. In its complaint, NOYB analogized Meta's "pay or OK" model to a pre-ticked box, requiring payment to opt out of or revoke consent to data processing, and thus argued the scheme is expressly outlawed by the text of GDPR Article 6(1)(a) and ensuing case law.
Demonstrate consent. Controllers must prove that a data subject has validly consented to processing, imposing recordkeeping requirements often facilitated by consent management platforms. Businesses should find little trouble demonstrating consent but, in doing so, should refrain from collecting or processing data beyond what is necessary for that purpose.
Withdrawal of consent. Controllers must be cognizant of enabling withdrawal of consent, a concept that contributes to the analysis of freely given consent but also plays a prominent role in crafting many of the processes discussed herein.
For example, the principle that withdrawing consent should be as easy as giving consent requires streamlining withdrawals and refusals so as not to create a dark pattern that obstructs user experience and discourages data subjects from effecting privacy choices.
Regulators have not offered much leeway in this regard, especially when controllers utilize paid subscriptions as consent mechanisms to effectuate a withdrawal. For example, in the Grindr case, Norway's DPA, Datatilsynet, did not take into account that subscribing requires payment, a process that inherently involves more steps. The DPA simply compared the number of steps required to give consent with the number required to withdraw it, found the latter greater, and held that this violated GDPR Article 7(3). Controllers implementing a "pay or OK" model must undertake a similar calculation for their withdrawal and refusal processes lest consent be invalidated.
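The Datatilsynet step count described above reduces to a trivial comparison. The sketch below is a rough compliance proxy under that reading of Article 7(3); the user flows are hypothetical.

```python
def withdrawal_is_compliant(consent_steps: list, withdrawal_steps: list) -> bool:
    """Coarse Article 7(3) proxy, echoing the Grindr decision: withdrawing
    consent must take no more steps than giving it."""
    return len(withdrawal_steps) <= len(consent_steps)

# Hypothetical flows for a "pay or OK" service
consent_flow = ["open banner", "click 'Agree'"]
withdrawal_flow = [
    "open settings",
    "find privacy menu",
    "select subscription plan",
    "enter payment details",
    "confirm withdrawal",
]

withdrawal_is_compliant(consent_flow, withdrawal_flow)  # False: 5 steps vs. 2
```

A raw step count ignores the effort each step demands, so it is best treated as a red-flag screen rather than a full legal assessment.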
Much remains in flux with respect to "pay or OK" models. EU regulators took the practice under consideration on 12 Dec. 2023 in what will certainly not be the final commentary on the matter. Until the story has fully played out, the adtech industry remains in limbo, uncertain of the lawfulness of its processing and of the industry at large.
In the interim, data controllers should lean on existing guidance and data processing principles to craft their own subscription models. Basic practices, like calculating fees fairly and transparently, avoiding bundled consent, and ensuring easy withdrawal of consent, will help ensure valid consent. Closely following coverage of coming legal challenges to processing personalized ads will provide insight into accepted practices.