Like most of the "free" internet, online social media is funded through online advertising tailored to individual users' behavior and interests. The Court of Justice of the European Union's decision in Case C-252/21 concerns one such platform, Meta, and its online social network, Facebook. The case is noteworthy for the advertising industry because it involves a competition authority determining data protection issues and calls into question whether platforms can carry out personalized advertising on a basis other than consent.
To use Facebook, users are required to accept the platform's general terms when they sign up. The user data provided during sign-up is linked to other data collected from user activity both on and off the social network. The off-network data includes visits to third-party websites and apps that use the platform's advertising technologies, and the use of other online services provided by the platform's group, such as other social media and instant messaging services. These profiles enable detailed inferences to be drawn on user preferences and interests.
A competition authority in Germany prohibited the platform's general terms from making the use of its social network conditional on processing users' off-network data and from processing that data without consent. It also required the platform to change its terms to make clear that off-network data will not be collected, linked with user accounts, or used without user consent.
The authority emphasized that consent is not valid where it is a condition for the use of the social network, based on its view that this processing did not comply with the EU General Data Protection Regulation and, therefore, constituted an abuse of Meta's dominant position in the online social network market. The platform challenged the decision, and the appeal court referred various questions to the CJEU.
The main themes that relate to personalized advertising in the decision are:
1. Competence: A competition authority can make findings about GDPR compliance in the context of examining the abuse of a dominant position. However, the competition authority is bound by the decisions of data protection authorities and must cooperate "sincerely" with them.
2. Special category data: Where users visit or enter information into (when making purchases or registering on) websites or apps related to special categories of data listed in GDPR Article 9(1), e.g., "flirting apps, gay dating sites, political party websites or health-related websites," data about such visits or information is considered special category data. Therefore, when that data is collected through integrated interfaces, cookies or similar storage technologies and linked to a user account, it is considered processing special category data, which is prohibited unless a derogation applies, e.g., "manifestly made public" in GDPR Article 9(2)(e).
3. Manifestly made public: Merely visiting such websites or apps does not mean the user has manifestly made special categories of data related to that visit public. Where a user enters information into websites or apps, uses integrated "like" or "share" buttons, or logs on to websites or apps using credentials linked to their social media accounts, telephone numbers, or email addresses, they manifestly make public special categories of data. But this is only the case when the user explicitly expresses their choice beforehand, through individual settings selected with full knowledge of the facts, to make their data publicly accessible to an unlimited number of people or, in the absence of such settings, with their explicit consent.
4. Contractual necessity: Collecting off-network data and linking it to users' accounts for subsequent use is only necessary for the performance of the contract with those users if the processing is objectively indispensable for achieving a purpose that is an integral part of the contractual service intended for those users. In other words, the main object of the contract must not be achievable in the absence of that processing. Personalization of content might be useful, but in this case the court considered that it did not appear necessary in order to offer the social network services in question.
5. Legitimate interests: Recital 47 of the GDPR recognizes that processing of personal data for direct marketing can potentially be carried out in the controller's legitimate interests. However, those interests must be balanced against the interests and fundamental rights of users, which may override them. In that balancing exercise, particular attention must be paid when the data subject is a child, because Recital 38 recognizes that children merit specific protection, particularly in relation to marketing, the creation of user profiles, or services offered directly to them. Therefore, in this case, the balance tipped in favor of the users, given:
- Their reasonable expectations. Although the social network is free of charge, users would not reasonably expect the platform to process their personal data without their consent for the purposes of personalized advertising.
- The scale of the processing. The processing is particularly extensive as it relates to potentially unlimited data.
- The impact on them. The network has a significant impact on users, given that a large part of their online activities are monitored by the platform, "which may give rise to the feeling that his or her private life is being continuously monitored."
6. Consent: Being in a dominant position does not automatically invalidate consent. It is, however, an important factor in determining its validity, particularly as it is liable to affect users' freedom of choice and create a manifest imbalance between them and the platform. Users should be able to refuse specific data processing operations that are not necessary for the performance of the contract without being forced to stop using the social network. Equivalent alternative services, such as a paid version, should be offered to the user. Given the expectations, scale and impact of the processing on users, separate consent should be required for off-network data.
Many issues at the heart of this decision will already be familiar to EU regulators such as Ireland's Data Protection Commission. Earlier this year, the DPC concluded two inquiries on the lawful basis for behavioral advertising. On its blog, the DPC explained it initially viewed "personalised services that also feature personalised advertising" as "central to the bargain struck between users and their chosen service provider, and forms part of the contract concluded at the point at which users accept the Terms of Service." However, other regulators disagreed during the consultation process, and the European Data Protection Board intervened. It determined, as a matter of principle, the platform was not entitled to rely on contractual necessity as the legal basis for its processing of personal data for the purpose of behavioral advertising.
But on the issue of so-called "forced consent," i.e., conditional access to services based on user acceptance of the terms, the outcome ultimately was a decision by Ireland's DPC that: "the legal basis for processing of personal data under the Terms of Service […] does not, as a matter of law, have to be consent under Article 6(1)(a) GDPR." Clearly, that outcome is now at odds with this decision by the CJEU. It is hard to think the court considers any lawful basis other than consent to be appropriate when it comes to behavioral advertising by platforms.
Another conflict is the CJEU's apparent endorsement of "pay or okay" models, where nonconsenting users are "offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations." In other words, users should be able to choose between their data being used or buying a subscription. This flies in the face of the adverse regulatory reaction to cookie walls that were thrown up post-GDPR. Even the U.K. Information Commissioner's Office issued the Washington Post with an extraterritorial warning back in 2018. That said, regulators have more recently softened on this issue. France's data protection authority, the Commission nationale de l'informatique et des libertés, issued more balanced guidance on assessing the legality of cookie walls last year, followed by FAQs issued by Austria's DPA. This is just as well, given that Max Schrems' organization NOYB currently has the bit between its teeth on this issue and his blog suggests it may need to go up to the EU's highest court. Without even touching on other issues, such as the lawful basis for sharing with law enforcement, there is still much to unpack in this decision as it goes back down to the referring court.