Many companies are facing uncertainty following the Court of Justice of the European Union's finding that Meta's processing of personal data collected from third-party services and linked with personal data collected on Facebook for behavioral advertising can be based on neither contractual necessity nor legitimate interest under the EU General Data Protection Regulation. Meta subsequently introduced a pay-or-consent model that has been criticized by data protection authorities, further complicating the situation.

While the current regulatory proceedings focus on Meta, the potential consequences extend to the wider criticism of targeted advertising. Multiple companies may be affected, since pay-or-consent models are commonly used. This also engages the fundamental freedom to conduct a business in accordance with EU law and national laws and practices, as set out in Article 16 of the Charter of Fundamental Rights of the European Union.

Any limitation must therefore be proportionate and necessary, and genuinely meet recognized objectives of general interest or the need to protect the rights and freedoms of others. In other words, the fundamental right of Meta — or any other company regardless of size — to conduct business must be balanced against the general right to privacy.

For instance, a recent decision by the Norwegian Privacy Appeals Board, Personvernnemnda, which handles appeals against decisions by Norway's data protection authority, Datatilsynet, found that Grindr is not obliged to offer a gay dating app for free. It also recognized that a key feature of the social media business model is that registrants accept that their personal data is used commercially, for example by being disclosed to advertising partners.

All companies need to finance their operations to provide services to users. Some argue contextual advertising could offer an alternative source of income. But for a social media company like Meta, which is not a search-based service, contextual advertising may not be viable, since contextual conclusions are drawn from user interactions with content and sites.

Appropriate fees — what should be allowed?

A hurdle for the data protection authorities who have reacted negatively to the pay-or-consent model is the CJEU's statement that "users must be free to refuse individually, in the context of the contractual process, to give their consent to particular data processing operations not necessary for the performance of the contract, without being obliged to refrain entirely from using the service offered by the online social network operator, which means that those users are to be offered, if necessary for an appropriate fee, an equivalent alternative not accompanied by such data processing operations." That is one very long sentence from the CJEU.

The consequence is that the CJEU may appear to have already confirmed that the door has been open for pay-or-consent models all along, and that consent may be freely given even though the alternative is to pay a fee to use the service. This is also in line with earlier decisions.

Furthermore, it seems rather unproblematic to argue that a fee is "necessary" if its purpose is to fund the service. The remaining question therefore concerns the requirement that the fee must be "appropriate." The CJEU has not given any further explanation. However, the requirement was likely included to ensure that a user who withholds consent is not, in effect, forced to refrain entirely from using the service by an unreasonably high fee. This leaves the rather complex question of when a fee is inappropriate.

There are good reasons why the EDPB and the data protection authorities should refrain from overregulating the pricing of a service. This is a complicated question, and neither the authorities nor the EDPB is likely qualified to determine what a service should cost. A balance must be struck between the fundamental right to conduct business and the fundamental right to privacy. Regulating Meta's pricing would constitute a limitation of Meta's right to conduct business and must therefore be necessary and proportionate. This calls for caution.

Uncertainty ahead

The GDPR is a complex legal framework. Not only do businesses differ in how they interpret its requirements and comply in practice, but so do data protection authorities. It is also interesting to note that the German data protection authorities issued a joint opinion in March 2023, ahead of the CJEU ruling, legitimizing the subscription model as a way to obtain consent for behavioral advertising and focusing instead on certain features, such as transparency and the granularity of consent where purposes beyond behavioral advertising were also at stake.

The Meta decision, and the uncertainty it creates, impacts the industry at large. Meta has been called the canary in the coal mine when it comes to GDPR enforcement. Although hardly canary-sized, whatever affects Meta affects others.

Some data protection authorities presently seem unwilling to read the CJEU decision, and the earlier German joint opinion, as opening a path to GDPR compliance that preserves a business's economic viability through a pay-or-consent model.

If consent is now necessary for behavioral advertising that uses information from third parties, how should corporate groups with many entities draw the line for what constitutes acceptable behavioral advertising going forward?

Quite possibly, behavioral advertising based on legitimate interest could still be legal for a company that uses its own information (first-party data) to profile users. However, the CJEU decisions offer little guidance on the threshold at which processing should be regarded as something the user cannot reasonably expect the company to do without their consent.

It could also be argued that users of free network services are generally aware that such services are sustained by income from behavioral advertising. Users would also have the right to object to such processing based on legitimate interest. Following the recent developments, users may now face the choice of consenting, paying for services, or discontinuing their use. This prompts the question of whether a better and more effective solution would have been to allow such processing based on legitimate interest while imposing stricter requirements on information and the visibility of the user's choice, rather than rejecting legitimate interest and thereby pushing companies to start charging for free services.

Many also fear the legal developments may stifle innovation. Small, up-and-coming digital platforms cannot afford to offer their services free of charge without relying on personalized advertising, and they likely do not (yet) have a user base large or loyal enough to pay for their services. Taken together, this may harm competition and create a barrier to market entry for new innovators.

In this context, it is worth noting that the Norwegian Consumer Council's "Out of Control" report heavily criticizes the adtech business in general. The council is especially critical of long vendor chains and of how some companies gather information on individuals over time. It even writes: "As demonstrated throughout this report, it is unreasonable to assume that consumers can give informed consent to the excessive tracking, sharing, and profiling that pervades in the adtech industry. It is therefore highly doubtful that this comprehensive system of commercial surveillance can be fixed by providing new consent mechanisms or better designed legal documents." This is notable, as it suggests the council does not see how any change in consent mechanisms could help.

Recent developments will likely spur discussions on the efficacy of the one-stop-shop mechanism and on whether reform is needed. After all, this most recent order came from an urgent referral to the EDPB from Norway's DPA, not from Ireland's Data Protection Commission, which is Meta's lead data protection authority.

In practice, we now see the EDPB very much making the calls on difficult legal questions. Criticism has been leveled at the fact that companies have no procedural rights in the EDPB processes, which are internal to the data protection authorities.

What next? 

In accordance with Article 64 of the GDPR, the data protection authorities of Norway, the Netherlands and the German state of Hamburg asked the EDPB to issue an opinion. Is pay or consent legal? Some organizations say no. Should a company then be forced to render its services for free? Should one differentiate between which companies must render their services for free and which need not? Or should limits be placed on the remuneration that may be asked? There are many important questions to address.

The normal procedure is for the EDPB to adopt a position within eight weeks by simple majority. That period may, however, be extended by an additional six weeks given the complexity of the subject matter. The questions before the EDPB may have their largest impact not on Meta but on the very many other companies in the European Economic Area.

Fundamental and complicated questions relate to the requirement of an "appropriate fee" and to striking a balance between the fundamental right to conduct business and the right to privacy and data protection. A few weeks to decide on that seems frighteningly short. Further, it sits poorly with ordinary adversarial principles that interested companies, and individuals, have only a limited possibility to comment in the procedure.

The EDPB is given the power to address many important questions surrounding the GDPR, and it seems appropriate that it may issue opinions on the interpretation of the law. But it is also appropriate to ask whether the procedural framework the EDPB operates within is equipped for rendering opinions on questions tantamount to materially changing the law for the whole EEA.


Editor’s note: This author has provided legal advice to Meta Platforms in Norway and published a more exhaustive article on the matter.