
The Privacy Advisor | The implications of extraterritorial delisting in RTBF requests


An appeal by Google against a decision of the French data protection authority raises questions about the future of right-to-be-forgotten requests. The Google Spain case is by now well known: the company was required to delist indexed content on the basis that it caused prejudice to the data subject. Now, in France, Google has attempted a compromise with the French data protection authority under which content would be removed on a geographical basis. But doing so could be problematic.

Some history

Europe has flirted with the right to be forgotten for years. Article 12 of the EU Data Protection Directive forms the basis for the right under current EU law, albeit a tenuous one. The right was cemented by a 2014 ruling of the Court of Justice of the European Union (CJEU). In Costeja, the claimant wanted content and search results relating to a past foreclosure removed, alleging that they were irrelevant, dated news. The court ordered Google, Inc., and its Spanish subsidiary to delist the offending links from search results. The original content was left undisturbed on the grounds that the newspaper had a valid justification for publishing it. The court observed:

As the data subject may, in the light of ... fundamental rights under Articles 7 and 8 ... request that the information in question no longer be made available to the general public ... in such a list of results, those rights override...not only the economic interest of...the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question.

France reaches farther

As would be expected, there was a flurry of delisting requests immediately after the Costeja ruling. In France, upon Google’s refusal to remove certain listings, the data protection authority, CNIL, notified Google to remove search results in respect of these requests, across all (not merely European) domains. Google complied partially, delisting some results on European domains. It subsequently offered the CNIL a compromise: in addition to delisting results on European domains, it would adopt geographical barriers to access. The CNIL disagreed, stating that “ … varying the respect for people’s rights on the basis of the geographic origin of those viewing the search results does not give people effective, full protection of their right to be delisted.”

Google has appealed the CNIL’s decision before the Conseil d’État, France’s highest administrative court. In an interesting turn of events, the Wikimedia Foundation has sought to join the appeal as an affected party. Two crucial pillars of their challenge are the extraterritorial application of law by the CNIL and the principle of free access to information.

Extraterritoriality and the internet

This is not the first, nor will it be the last, extraterritorial exercise of jurisdiction over the internet. Given the boundless nature of the internet, states are compelled to find ways to regulate activities in any corner of the world that may affect their citizens online and, increasingly, in their offline lives. Additionally, as the CNIL points out in its June 2015 order, means exist to circumvent domain-specific delistings and geographical barriers to access. This fuels the perceived necessity for states to exercise extraterritorial jurisdiction. Further, there are situations where extraterritorial censorship may be equitable. In situations involving “revenge porn,” or leaks of personal information without the data subject’s consent, for instance, limiting delisting, or even content removal, to specific domains or geographies will not completely address the data subject’s grievance.
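The difference between the two delisting scopes at issue, and why geographic barriers are circumventable, can be sketched in a few lines. This is a purely illustrative model, assuming hypothetical names (`DELISTED`, `is_suppressed`); it does not reflect Google's actual implementation.

```python
# Hypothetical model of the two delisting scopes discussed above.
# All names and data here are illustrative assumptions, not Google's code.

DELISTED = {"https://example.com/old-article"}  # URLs delisted for a given name query

EU_DOMAINS = {"google.fr", "google.de", "google.es"}  # sample European domains
EU_COUNTRIES = {"FR", "DE", "ES"}                     # sample geolocated countries

def is_suppressed(url: str, serving_domain: str, client_country: str) -> bool:
    """Return True if a delisted result should be hidden from this search."""
    if url not in DELISTED:
        return False
    # Domain-scoped delisting: hide the result only on European domains.
    if serving_domain in EU_DOMAINS:
        return True
    # Geo-scoped delisting (the proposed compromise): also hide the result
    # when the client's IP address geolocates to a relevant country.
    return client_country in EU_COUNTRIES

# A French user on google.fr does not see the delisted result...
assert is_suppressed("https://example.com/old-article", "google.fr", "FR")
# ...but a user whose traffic exits through a non-EU proxy or VPN appears
# to come from elsewhere, so the geographic barrier no longer applies:
assert not is_suppressed("https://example.com/old-article", "google.com", "US")
```

The second assertion is the CNIL's circumvention point in miniature: because the geolocation check keys on the apparent origin of the request, routing traffic through a proxy defeats both the domain barrier and the geographic one.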

But something is missed in this approach. If the information sought to be delisted is likely to cause great harm, it is also likely to trigger civil and criminal remedies and be capable of being taken down. Further, in situations such as revenge porn complaints, intermediaries are less likely to refuse to take down or delist content across domains.

Moreover, however well-intentioned a rule may be, applying it extraterritorially is problematic, especially in ambiguous situations where activities are legal in some states and illegal in others. This acquires particular significance when considering internet regulation and censorship by undemocratic nations: what if they could limit the online activities of citizens of democratic nations? This is one of the arguments Google makes. Selective compliance with extraterritorial laws does not bode well for a private company, and total compliance could shackle the internet.

There is also a degree of logical fallacy in the arguments about circumvention techniques. A user who intentionally circumvents IP-based and domain-based restrictions on searches for a particular individual is, unless using proxies as a matter of course, most likely targeting that individual. In that case, what the CNIL is attempting to do is prevent online stalking or harassment. But that is arguably not within the scope of the CNIL’s authority: its aim is to protect and enforce a general right to privacy, not to create new rights beyond its powers.

Privacy and the right to access information

Privacy and free accessibility of information on the internet come into conflict in the implementation of the right to be forgotten. Removing search listings can have a stifling effect on information accessibility and constrain free speech. In fact, this is one of the reasons for the Wikimedia Foundation's challenge — about half its traffic comes from Google searches.

The right to be forgotten results in an inherently conflicting system — original content remains available, but search results (i.e., access to content) must be deleted. This creates a paradox and appears to diminish greatly the right to access publicly available information. The reality, however, is that without specific prior knowledge of the existence of content or search results to direct one, there is little chance of anyone stumbling upon such content. Hence the concern about search results and insistence on delisting, especially in situations affecting individual privacy rights.

The real issue then boils down to balancing the genuine privacy concerns of EU residents with issues of public importance. The CJEU in Costeja, the Article 29 Working Party in its subsequent guidelines, and the CNIL all place the onus on private companies. This detracts from the basics of intermediary liability, which requires intermediaries to be neutral in the selection and transmission of content in order to enjoy immunity. It is further arguable that requiring intermediaries to perform this analysis abdicates to private parties the powers of the state, undermining free access to information and freedom of speech. After all, intermediaries may well choose to delist content rather than retain it and face potential liability and sanctions.

Is there a solution?

Delisting may be ethically and legally necessary in some situations. However, in the absence of collective protection of the common right to information, the world may end up with the most regressive controls, as extra-jurisdictional decrees of the least democratic nations will become binding rules.  

There are two concerns to address here. First, who decides what content is delisted? Google has appointed an advisory council for this purpose. While this approach is prima facie neutral, the EU could, in order to reduce complaints to national data protection authorities, consider establishing a neutral, independent body to arbitrate delisting requests. Alternatively, the EU could engage neutral professional bodies (such as the IAPP) to analyze requests. This would involve costs for the EU, but it is arguable that such costs are better absorbed by the state than by private parties as compliance costs.

This cascades into a second concern over whether extraterritorial delisting should be mandatory. In certain situations such as complaints against revenge porn/child pornography, where ethics and local laws would generally justify even content removal, intermediaries could choose to delist. But in all other cases, delisting, and particularly, extraterritorial delisting, gets very complicated. Given the guidelines as currently in place, and the appeals against the CNIL’s decision, this thorny issue remains to be judicially resolved.
