What Place Do Search Engines Have Between Personal Data Law and Freedom of Speech?
Published: March 2013
The question raised in the title of this article is the complex and long-debated one that the Court of Justice of the European Union (CJEU) will have to answer, following a reference for a preliminary ruling made by the Spanish High Court.[1] As a preliminary matter, for the purposes of this article “search engine” means those services which, on request, automatically index web pages with a program called a spider or robot and which display the results in the form of a list of links to the external websites that are found.
The case leading to this question is between Mr. X, a Spanish national, and Google. Mr. X noticed that when he typed his name into the Google search engine, two of the results redirected the user to pages of a newspaper that included details of the auctioning of a property seized due to a failure by Mr. X to pay social contributions. Mr. X brought a claim against the newspaper and Google before the Spanish Personal Data Protection Authority. The Authority dismissed Mr. X’s petition against the newspaper, stating that the publication was legal and benefited from the protection of the Spanish "right to information." However, the Authority granted his petition against Google and ordered Google to remove the links from its results page and to prevent any further indexing of this property seizure decision. This decision by the Spanish Authority, given on July 30, 2010, created the somewhat paradoxical outcome in which the disputed content remained online, whilst the search engine was ordered to deindex it. The situation amounts to one in which a book is included in a library but is withdrawn from the catalog.
In order to reach this result, the Spanish Authority adopted the following three steps of legal reasoning: Step one, the indexing of web pages from the name of an individual constitutes a processing of personal data; step two, the controller of such data within the meaning of European law[2] is Google Inc., and step three, persons affected are entitled to oppose having their data processed in this manner pursuant to applicable European rules.
Yet, as the Forum des Droits sur l’Internet (Forum of Rights on the Internet or FDI) stated in 2003 in its recommendation in relation to “creators of hyperlinks,” “a claim brought against the creator of a link, and not against the publishers or host of the linked content, will not stop the distribution of the illegal content. Such claim, therefore, cannot have as its purpose to remedy the harm caused by the act of distributing the illegal content.”[3] The FDI added that, “insofar as the illegal content remains online, such content continues to be accessible by other paths (other search engines, directories, portals, all other sites on which such content may be indexed and by direct knowledge of the URL). For consistency and the sake of effectiveness, it therefore appears necessary to preserve search engines from any claims that do not also target the direct authors of a violation or, at the very least, their hosts.”[4]
In the case referred to the CJEU, the disputed content is not illegal. The FDI’s reasoning is, therefore, even more relevant. As Professor Ricard Martinez Martinez states with great imagery, “shooting the messenger will not suffice. Even if Google, Bing or Yahoo deletes the information again and again, it will come back to infinity.”[5]
The recommendation of the Committee of Ministers to the Member States of the Council of Europe also followed this logic.[6] Indeed, the recommendation states that search engines “should promptly respond to users’ requests to delete their personal data from (extracts of) copies of web pages that search engine providers may still store (in their “cache” or as “snippets”) after the original content has been deleted.”[7] This is the opposite of the process that was followed in Mr. X’s case.
However, how does one explain that people now look more to search engines than to the publishers of the disputed content or, to use the preceding metaphor, more to the “messenger” than to the “message”? Maybe this is the result of what appears to be the growing confusion as to the role of the Internet and the role of search engines. Most of us now use the home page of a search engine to find a website even if we know the address. Consequently, would it not be more effective, and above all simpler, to serve writs on search engines to have disputed content deindexed rather than to file claims against controllers of the websites that distributed the content?
In the end, the case before the CJEU reveals a certain trend in Europe that is transforming regulations on data protection into an instrument used for deindexing, with the result that the digital visibility of information that is deemed to be inconvenient by the relevant parties disappears. Search engines are therefore progressively being vested with duties that they did not originally have: The duty to regulate Internet content and the duty to grant people’s requests insofar as the disputed content is merely disparaging, although legal. Search engines are therefore caught between several legal fields which are difficult to reconcile: Data protection, the right to privacy and freedom of speech. This is what is at the heart of the questions referred to the CJEU.
This article first proposes to discuss the applicability to search engines of the notion of the processing of personal data, and being a “controller” in relation to that processing, relying on the work of the data protection authorities and on case law handed down in this area (I). Then, second, this article will examine the risks involved in applying personal data law in such a manner that may restrict freedom of speech, when individuals’ rights to the protection of their privacy can be ensured by making use of other legal avenues (II).
I. Search engines, “data controllers” despite themselves?
A. Do search engines “process personal data”?
This is the first question to which the CJEU must reply.[8] According to Directive no. 95/46 of October 24, 1995 (the “Directive”), processing of personal data includes, “any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction.”[9] As one can see, this definition is particularly broad: No operation, even purely technical, on personal data present on the web would appear to be outside its scope.
Furthermore, the “plasticity” of this notion of processing has led to some questions and even some criticism by specialized authors. This is the case with Professors J. Frayssinet and A. Lepage, who have emphasized “the potentially excessive scope of the notion of data processing.”[10] Others, like J. Le Clainche, have worried about the “dangers of systematically applying the Data Processing, Data Files and Individual Liberties Act to search engines” due to the “drawing power” of the scope of that Act resulting from the “particularly broad definition of the notion of processing.”[11]
As J. Le Clainche states, this definition of data processing was “created in order to remedy the obsolescence of the notion of a file and to ensure the broadest and most evolutive protection possible for the data subjects. This praiseworthy objective indirectly results in having a threat hover over freedom of speech and it risks including too many behaviors to be effective. Its adaptation, but not its re-formulation, is therefore needed.”[12] This author points out that this definition of processing “has on occasion been the subject of restrictive interpretations by the French Supreme Court, which has notably held that ‘a sworn agent pursuant to Article L. 331-2 of the French Intellectual Property Code, who merely notes the IP address, does not necessarily perform data processing.’”[13]
This was also the approach taken by the Paris Court of Appeal, in a claim about data entry by the French Competition Authority. In this case, the relevant company argued that such data entry constituted processing of personal data within the act’s meaning and that, therefore, it was subject to certain prior formalities which the French Competition Authority had failed to perform, thereby invalidating the evidence collected. The Court of Appeal ruled out the application of the notion of "data processing" to the French Competition Authority’s data entry, stating, “that one cannot consider that the data entry as performed, which was duly authorized, mentioning a legal entity, is likely to breach the Data Processing, Data Files and Individual Liberties Act.”[14] This reasoning, although somewhat elliptical, nevertheless illustrates this desire to limit the application of the notion of data processing with the aim of reducing its substantive scope, deemed to be too “broad”.
In addition to this legal dimension, application of the definition of personal data "processing" to search engines raises a practical problem. Indeed, a search engine is a robot which indiscriminately indexes all contents existing on the web, whether it be images, common names, places, technical data[15] or personal data.
It is clear that part of this information is personal data insofar as it identifies, even indirectly, a person.[16] Other information—most of the information—is not personal data.
It follows from the foregoing that, if a search engine were to be considered to process personal data, this categorization would likely apply to only a small part of its technical activity.
B. Can search engines be deemed “data controllers”?
Once the first question has been answered, the second question referred to the CJEU will be to ascertain whether, “the company which operates the ‘Google’ search engine is a controller of the personal data contained on the web sites that it indexes?”[17]
The 1995 Directive defines the “data controller” as the organization which “alone or jointly with others determines the purposes and means of the processing of personal data.”[18] The question of applying the status of a controller to search engines has been the subject of profuse commentary, both by the regulators themselves and by the courts in charge of applying the law.
1) Purposes difficult to determine
According to the French act implementing the directive, the purposes must be “specified, explicit and legitimate,” with the understanding that the data collected within this framework must not be “further processed in a way that is incompatible with those purposes.” But, what are the purposes of the processing operations performed by search engines? There is no simple answer to this simple question. Indeed, does the automatic indexing of all the information on the Internet at a global level constitute a “specified” purpose? Is improving the service any more of a “specified” purpose? As the Art. 29 WP itself states, these purposes “are too broadly defined to offer an appropriate framework to judge the legitimacy of the purpose.”[19]
This finding led Art. 29 WP to identify three categories of bases for the legitimacy of processing for search engines: The person’s consent, the performance of a contract, and “the legitimate interests” of the controller, including the improvement of the service, the system’s security, preventing fraud or customized advertising. Despite these various categories, which are part of the regulators’ praiseworthy desire to define a legal framework that is applicable to search engines, the following two questions remain:
- To what extent does a search engine performing a person’s search request determine the processing’s purpose, since it obeys an outside order which itself results from an objective and, therefore, from a purpose? Is the controller not the Internet user who orders the search? This path, which in our view has been inadequately explored, was nevertheless quickly contemplated by the Art. 29 WP, which stated in a footnote: “Users of the search engine service could strictly speaking also be considered as controllers, but their role will typically be outside the scope of the directive as ‘purely personal activity’.”[20]
- Consequently, to what extent can a search engine guarantee that the personal data collected will not be further processed in a way which is incompatible with the initial purposes, since the search engine is unaware of the initial purposes of the website’s publisher or the Internet user?
2) Are search engines “intermediaries” as the Art. 29 WP proposes?
The other difficulty that the desire to apply the status of a data controller to search engines brings about is the fact that search engines index third-party website content which they, therefore, do not control. Here again, the initial purpose is not determined by the search engines; it is determined by the website publisher.
This is why, in its 2008 opinion, the Art. 29 WP stated that, within the framework of only the search and indexing activity, “(t)he principle of proportionality requires that to the extent that a search engine provider acts purely as an intermediary, it should not be considered to be the principal controller with regard to the content related processing of personal data that is taking place. In this case the principal controllers of personal data are the information providers.” In this respect, the Art. 29 WP is in line with Recital no. 47 of the directive, which states as follows: “Where a message containing personal data is transmitted by means of a telecommunications or electronic mail service, the sole purpose of which is the transmission of such messages, the controller in respect of the personal data contained in the message will normally be considered to be the person from whom the message originates, rather than the person offering the transmission services.”
In its conclusion, the Art. 29 WP states that one must clearly distinguish between the two main roles of the search engines: “(In) their first role, as controllers of user data, such as the IP addresses they collect from users and their individual search history, they are to be held fully responsible under the Data Protection Directive. In their second role, as providers of content data, such as the data in the index, generally they are not to be held as primarily responsible under European data protection law for the personal data they process. Exceptions are the availability of a long-term ‘cache’ and value added operations on personal data, such as search engines aimed at building profiles of natural persons.”[21]
These two notions, “intermediaries” and “primarily responsible” mentioned by the Art. 29 WP are of interest because they reflect the functional reality of search engines as creators of hyperlinks from indexing. Yet, it has to be noted that the first notion is not defined by the directive. As for the second notion, it is based on the desire to exploit the legal potential of Article 2 d) of the directive, which refers to the fact that the data controller is the organization which “alone or jointly” determines the purposes and means of processing. Yet, the notion of a co-controller was not implemented under national laws. By way of illustration, this notion does not exist under French law, thereby preventing the relevant authorities and courts from referring to it.
3) The CNIL: A regulator with nuanced positions
In its 2001 recommendation on the publication of case law, and the personal data included in that case law, on the Internet,[22] the CNIL highlights at great length the role of search engines, as follows: “All court decisions which include the identity of the parties can be indexed by search engines…This is the true ‘revolution’ caused by the Internet, which requires that particular precautions be taken in order to protect people’s privacy.” Fearing that these case law databases could “notably through use of search engines, be transformed into veritable information files on the persons cited in court decisions,” the CNIL called for “a fair balance between the public nature of court decisions and the rights and freedoms of the persons involved.” This is why it recommends that “publishers of databases of court decisions freely accessible on websites refrain, for the sake of respecting the privacy of the relevant individuals and the essential ‘right to be forgotten’, from placing on the websites the name and address of the parties to the trial.”
In so doing, the CNIL addresses only the website publishers, because, in its view, they are the only data controllers. No recommendation is made vis-à-vis search engines, even though they are omnipresent in its reasoning, which shows that the CNIL does not consider them to be data controllers.
Conversely, in two public sanctions that it issued,[23] the CNIL considered that "people" search engines that collected, in the form of name files, all data about the people available on social networks, including last name, first names, photographs, experiences, education, location, etc., were controllers. On this basis, they have to perform all obligations provided for by the Data Processing, Data Files and Individual Liberties Act, notably: The obligation to inform persons and to respect their right to oppose the processing of their data, the guarantee of fairness in collecting data and updating them. It was precisely because these "people" search engines did not respect such obligations as data controllers that they were sanctioned.
It follows from the foregoing that the CNIL makes a clear distinction between, first, search engines as creators of hyperlinks, which are not data controllers within the act’s meaning and, second, people search engines, which fall within the category of data controllers.
4) Courts hand down contradictory decisions
Developing an analysis that is comparable to that of the Spanish Authority, certain French courts have ruled that search engines, including as creators of hyperlinks, were data controllers. Such was the approach taken by the Montpellier Court of Appeal, which, on September 29, 2011, ruling in summary proceedings,[24] found that Google was a data controller, granting the petition to deindex certain Internet pages.
This decision led to lively criticism in doctrine. For example, A. Lepage noted that, “undoubtedly,” it “appeared to be easier to mention only Google, Inc., rather than seeking to join editors or hosts of pornography websites containing the disputed video or redirecting to it. Consequently, the outcome, even if it is a response to Mrs. C’s claim, cannot be deemed satisfactory. Ordering the search engine to delete all search results…almost amounts to breaking the thermometer in order not to see the fever: In so doing, the fever does not just disappear. Similarly, supposing that Google, Inc., did deindex all the aforementioned results, that would not make the video disappear from the web, where it will continue to be freely accessible, for example through another search engine or, directly, by knowing the URL."[25]
Regarding this decision, A. Debet[26] states that the judge considered “in a very peremptory manner” that Google “was a data controller, which could undoubtedly be disputed. The Montpellier Court of Appeal, also in summary proceedings, takes an overly cut-and-dried position.” Mr. Debet finds that “the identification of the data controller, therefore, is not simple and should be put to a more fleshed-out debate.”[27]
Other courts have shown themselves to be more reserved. Such was the case of the Paris Civil Court[28] (Tribunal de Grande Instance or TGI) ruling on similar facts. Although it ordered Google to remove links from its results page, it did so by invoking the Internet technical intermediaries’ liability regime, and in particular the “host” liability as defined by the 2004 Act for Trust in the Digital Economy (LCEN). Indeed, under this act a technical intermediary (providing storage) is liable for the content that it hosts only if, once informed that such content is disputed or causes manifestly illegal harm, it fails to take prompt action to remove it. In addition, it was the lack of control over the indexed content which led the court to make a finding that the search engine was a “host” and not a “data controller.”
It was in a more clear-cut manner that a recent decision by the Paris Civil Court of September 19, 2012,[29] found that the host category excluded the categorization of the same entity as a data controller: “The host provider of a blog which provides an online communication system available to third-party content controllers, and whose behavior toward the third-party content controllers is neutral, technical and passive, cannot be qualified as being the data controller within the meaning of the Act of January 6, 1978.” This case involved a blog hosted by Google that contained information considered to violate the privacy of the petitioners, who, in order to ask for the information’s removal, had invoked both the LCEN and the LIL.
C. Impossible obligations?
Once the CJEU has ruled on whether Google, Inc., is a data controller, it will then have to make a decision on individual rights based on the following terms: “Must the right to obtain the erasing and locking of personal data and the right to oppose having such data processed be interpreted as allowing the data subject to contact search engines in order to prevent the indexing of the information regarding such person from being published on third-party websites, by expressing his/her wish that such information not be known by Internet users when (s)he considers that such information may cause him/her harm or (s)he wishes that such information be forgotten, although this information is published legally?”[30]
All controllers must guarantee to the data subjects whose personal data they process that they respect their rights as provided for by the directive.[31] Obviously, there are restrictions to these rights, even exceptions. This is the case, for example, with the obligation to provide certain information to the data subject if complying with that obligation “proves impossible or involves a disproportionate effort.”[32] This exception already applies to search engines.[33] Yet, what about the other obligations? How will search engines be able to obtain individuals’ consent, particularly if case law determines that it is not possible for search engines to provide the required information to the data subject in advance of the processing of their personal data?
As regards the right to rectify data, insofar as search engines did not originally publish the data on the Internet, they will not be able to comply with this obligation. How, then, does one enforce this obligation against search engines knowing that none of the exemptions provided for by European law or French law apply to them?[34]
What is mainly at stake, in terms of applying individual rights to those search engines considered to be data controllers, is the ability to exercise the right of opposition. Applied to search engines, the right to oppose processing takes the form of deindexing the content, the scope of which may call into question freedom of speech and communication as explained in detail in the second part of this article.
In addition to these questions about individual rights, other provisions of the directive would be difficult to apply to search engines, insofar as they would be categorized as controllers. This is true of the principle of proportionality of the data collected in the context of the relevant purpose[35] since, as we have seen, it is not straightforward to identify the purposes themselves.
Finally, the principle of prohibiting the processing of “sensitive” data within the meaning of the directive, meaning data disclosing the racial or ethnic origin, political opinions, religious convictions or data related to the individual’s sex life,[36] or the prohibition against processing data related to offences and criminal convictions,[37] present difficulties for search engines. Indeed, many websites, blogs and profiles on social networks include such information, which the search engines index.
In order to comply with this ban on the processing of data relating to offences and criminal convictions, some countries, such as France, have implemented a policy of anonymizing court decisions published online. Others, such as Spain, have not, with the paradoxical consequences which the Mr. X case reveals.
Consequently, if such activities constituted processing of personal data, then search engines would be structurally and continuously in violation of European legislation, which appears difficult to contemplate without calling into question the very legality of their activity, and, therefore, in fine, their existence.
D. The need for another legal category?
Adopted more than 17 years ago, and drafted at the beginning of the 1990s, the 1995 directive has difficulty taking into account the realities of the digital world today. The multiple questions as to the legal nature, role and scope of the activity of search engines are an illustration of this. This finding, inter alia, led the European Commission to propose a new legal framework for personal data on January 25, 2012.[38] Yet, the fact remains that, while waiting for this new legal instrument to enter into force, the 1995 Directive remains applicable.
1) Search engines: Processors?
If the activity of search engines as creators of hyperlinks must be characterized with respect to the directive’s provisions, which is debatable, should our focus not be on the notion of “processor”?
As we have said, the Art. 29 WP itself recognizes that the “main” controller is the provider of information, meaning the Internet user itself. Moreover, websites which create information and publish it are technically able to decide if they wish, or do not wish, to be indexed by search engines. Thanks to the insertion of a “noindex” metadata tag in the codes of their HTML pages, or by using the exclusion protocols called “robots.txt”, source websites are able to prevent indexing. In so doing, with respect to the directive, they are therefore determining the purpose—making or not making information public—and the means used—indexing or not. It is also on this basis that they must be considered as the sole data controllers.
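By way of illustration, the following sketch shows what these two mechanisms typically look like in practice; the directory path is hypothetical, but the syntax of the “robots.txt” exclusion protocol and of the “robots” meta tag is standard and recognized by the major search engines:

    # robots.txt file placed at the root of the website:
    # asks all compliant spiders not to crawl the (hypothetical) /archives/ directory
    User-agent: *
    Disallow: /archives/

    <!-- "noindex" meta tag placed in the <head> of an individual HTML page:
         asks spiders not to include this page in their results -->
    <meta name="robots" content="noindex">

Compliance with these instructions is voluntary on the search engine’s side, but the main engines honor them, which is what gives source websites effective control over whether their pages are indexed.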
Yet, since the search engines index only content which the websites have decided to make indexable, they are executing an order from the controllers. In so doing, it could be argued that the search engines are processing data “on behalf” of the publisher websites, which corresponds to the directive’s definition of a processor.[39] We believe that this legal path, little explored by doctrine, should be disregarded. Indeed, the link between a processor and the controller follows a certain degree of formalism through the existence of a contract. Moreover, Article 35 of the January 6, 1978 act, as amended, provides that the controller that chooses a processor must ensure that the processor has “sufficient guarantees” in terms of security. Yet, neither the publisher websites, nor a fortiori Internet users, have signed a contract with the search engines. Nor do they have any control over the search engines’ security. It follows that search engines cannot be categorized as processors.
2) Search engines: Internet technical intermediaries?
Other possibilities may be contemplated. As we have seen, the Art. 29 WP itself advanced the notion, not mentioned in the directive, of an “intermediary”. This is the path that some authors have taken, seeking to identify, outside the Data Processing, Data Files and Individual Liberties Act, a legal category that would be applicable to search engines. Hence, Mr. Le Clainche states that one “can argue more convincingly that the automatic indexing and the indispensable nature of search engines for access to information places them close to the Internet access services mentioned in Article 6 of the LCEN.” As we have seen, certain courts have also taken the path of the LCEN by considering search engines as hosts.[40]
What is at stake with these various categorizations is being able to identify a legal regime which corresponds to the functional reality of search engines, without “perverting”[41] data processing, data files and individual liberties regulations to the detriment of freedom of speech, while effectively guaranteeing individual rights, and in particular the right to privacy. This reconciliation is complicated, yet possible.
II. Risks of overreach in applying the Data Processing, Data Files and Individual Liberties Act and identification of more suitable means of action
A. Risks of overreach in applying the Data Processing, Data Files and Individual Liberties Act to search engines
In addition to the difficulties in applying the Data Processing, Data Files And Individual Liberties Act to search engines, as denounced in the first part of this article, such an application includes a real risk of overreach, which would have consequences for the very operation of the web and the social usefulness of search engines (A). Yet, substantive law includes more suitable and effective means of action to satisfy requests for rectification and deletion of content that is harmful, while preserving the balance between the various legitimate interests involved (B).
1) Misappropriation of the actual purpose of the Data Processing, Data Files and Individual Liberties Act (Loi Informatique et Libertés or LIL)
One of the rights that is most often mentioned in the various lawsuits brought against search engines in order to have information indexed by them removed is the “right to be forgotten.” Yet, for the time being, this right is expressly mentioned neither in Directive 95/46/EC of October 24, 1995, nor in the national legislation[42] implementing it, such as the French LIL.
The LIL’s provisions used to enforce the right to be forgotten within the current legal context are mainly Article 36, on limited storage of data; Article 38, on the right of any individual to oppose, for legitimate reasons, having his or her personal data processed, and Article 40, on the right to rectify inaccurate, incomplete, questionable, outdated data, etc.
However, if one reads these provisions, one wonders whether the conditions for exercising the right to be forgotten against search engines are relevant:
- The right of opposition is grounded on a legitimate reason which, by essence, search engines cannot assess. Indeed, if the information indexed is legal and does not breach the LIL on the original website, how would it be more legitimate to order search engines to deindex this information based only on the LIL?[43]
- The right to request that information be deleted, as provided for by Article 40, implies that the relevant information is inaccurate or incomplete, or even outdated, which the search engine also cannot assess insofar as it is not the source of the content and only reproduces, like a mirror, contents and images published by third parties.
- The exercise of the right to be forgotten against the search engine does not allow for the permanent deletion of the information, as required by Article 40; it merely makes access to such information more difficult by deindexing it, as the source of the information remains the original website. In addition, one should note the very limited geographic scope of deindexing ordered by a national court, which will be effective only for the relevant country, but cannot prevent indexing of the disputed information on versions of the search engine for other countries.
- Lastly, application to search engines of the right to be forgotten also leads to giving a broad meaning to the notion of processing of personal data as defined by Article 2 of the LIL, whereby processing occurs if the name of an identified or identifiable person is merely present, thereby going well beyond the law’s spirit.[44]
Therefore, although one can understand the purpose of the right to be forgotten, particularly on the Internet, where its aim is the final and effective deletion of one’s personal data, presuming that this is technically possible, and specifically when such data are related to harmful content, one doubts the relevance of exercising such a right against search engines. Indeed, given their simple role as technical intermediaries, they cannot in any respect satisfy such requests. As certain authors have emphasized,[45] one should therefore “not give a more important meaning and scope to the right to be forgotten than what the Internet’s technical operation actually permits.”
In this respect, it should furthermore be noted that, in cases where a request to deindex disputed information was made, some of the decisions mentioned more fully below,[46] in which such a right to be forgotten had been exercised, demonstrate an attempt to misappropriate the LIL by basing the exercise of the right to be forgotten on a right of opposition or a right of deletion under the LIL, whose conditions of exercise—proof of a legitimate reason for the right of opposition, or inaccurate or incomplete information for the right of rectification—were often not satisfied, leading the relevant courts to rule out the use of this ground.
This misappropriation is all the more concerning since it can be exercised at the discretion of an individual who generally wishes thereby to avoid the more restrictive provisions protecting privacy and freedom of speech that would apply against the websites that are the source of the relevant information. By principally targeting search engines, whose visibility makes lawsuits easier to bring, and if the applicability of the LIL to a given situation is accepted whenever merely one piece of personal data appears in the index of results displayed, the LIL, which is much less restrictive than the provisions on respecting privacy and freedom of speech, can become a formidable means of action.
2) Role of censor likely to be granted to search engines
The ease with which lawsuits based on data protection law can be brought against search engines is not without risk, since the bringing of these lawsuits forces the search engines to reply to individual requests, the merits of which may be entirely questionable.
This ease also makes the search for reliable information more fragile: A politician, based on the right to be forgotten, or on the right of opposition or deletion, could request that a search engine delete all links to pages that mention, by name, his legal problems or other escapades in which he may have been involved. If such a request were sent directly to the author of the article in question, it would certainly result in a defensive reaction aimed at proving that the information is non-libelous on the basis of the relevant facts. On the other hand, in the case of one link among the thousands of links indexed every day, it is not really in the interest of the search engine to fight, all the more so as it cannot benefit from the same protective measures as the accused newspaper. Consequently, search engines will be inclined to withdraw the link without the intervention of a judge, thus resulting in the concerned article having lower visibility and, ultimately, in a public opinion that is based on distorted information.
Similarly, search engines may have to comply with an individual’s request that is aimed at controlling his image or reputation by deindexing pages that mention his name, although those pages do not violate respect for his privacy and are not libelous.
As we can see, in addition to the risk of misappropriation of the very spirit of regulations on personal data, the application of the LIL to search engines produces the perverse result of using the search engine as a censorship tool to be wielded at the sole discretion of individuals by allowing them to force the search engines to deindex only information that they deem to be unpleasant, with the information that is deemed to be the most laudatory being maintained.
Such a situation not only makes individuals less personally responsible—anyone may at any time ask that information (s)he considers to be bothersome be deleted—but also calls into question the right to information and freedom of speech, leading to a society without a history and without a memory.
3) An actual calling into question of freedom of speech and the right to information
The above analysis testifies to the real danger of having personal data protection law used in order to limit freedom of speech without any limit of proportionality or any attempt to create a balance between the various legitimate interests involved.
Fortunately, French courts have not been indifferent to such a danger and have decisively rejected the application of the LIL in cases in which, as they could not take action on the customary bases of the civil or criminal liability of the press, the LIL had been used as a last resort or as an alternative means of action.
Such was the case of a decision by the Paris Civil Court, handed down on June 23, 2008,[47] in which a well-known person had used the right of opposition to request that articles accessible on anonymous blogs, which served as editorials on local political life in Valence, be deleted. The court ruled that the plaintiff erroneously used the right of opposition of Article 38 of the LIL, and the court even appeared “to suspect misappropriation of the LIL to avoid the protective provisions of freedom of speech of the Act of July 29, 1881, and of Article 93-3 of the Act of July 29, 1982 (for communication to the public by electronic means).”[48] The court inferred that the plaintiff’s interests “in this case could not justify disregarding the provisions of the Act of 1881” and the strict conditions that its provisions lay down precisely to prevent abuses such as, for example, bypassing the three-month statute of limitations period applicable to the press.
The court also excluded the possibility of using the LIL in a civil court decision handed down on October 12, 2009,[49] in summary proceedings, in a lawsuit brought not against a search engine but against the website that was recognized to be processing personal data.
In this case, it was claimed that Mr. C. had stated on his personal website that Mrs. X. had been the mistress of Mr. Philippe de V., a well-known politician, a fact that she had allegedly disclosed herself in an autobiography. In addition to compliance with Article 9 of the French Civil Code, Mrs. X invoked against Mr. C. Article 7, paragraph 5, of the LIL, arguing that Mr. C.’s personal website had processed her personal data without her consent. Mrs. X. had cleverly used the LIL as an additional measure in the event that her claim based on Article 9 of the French Civil Code was denied, since she herself had disclosed the disputed facts. As the Civil Court was concerned with avoiding a misappropriation of the LIL in order to limit freedom of speech, it ruled that, “As the processing of personal data performed by Mr. C. without Mrs. X.’s consent had no other purpose than to permit Mr. C’s public speech (here, criticizing the politician), the principle of freedom of speech, guaranteed by the constitution and by treaties, prevents a finding of a separate harm related to a violation of the rules enacted by the Act of January 6, 1978, which is not one of the standards that was specifically enacted to limit such freedom." The Civil Court therefore ruled out the application of the LIL, while nonetheless acknowledging a violation of privacy based on Article 9 of the French Civil Code.
This decision was criticized because it appeared to establish, quite awkwardly, a hierarchy between freedom of speech and the right of protection of personal data, both categories of freedoms being of equal standing. As Jean Frayssinet states in his note in relation to this decision,[50] in so ruling, the Civil Court in fact applies a principle that was established by the French Supreme Court (Cour de Cassation) in clearer terms in its decision dated July 9, 2003:[51] "The rights to respect for privacy and freedom of speech, carrying with them, with respect to Articles 8 and 9 of the European Convention and Article 9 of the French Civil Code, identical normative weight, require a court ruling on the matter to search for the most protective outcome for the most legitimate interest."
The court therefore rightly seeks to prevent the use of the LIL’s provisions—right of opposition, deletion or consent—from limiting freedom of speech.
In another decision by a civil court dated June 25, 2009,[52] a company director had discovered that a search with his first and last names on the Google search engine displayed, in the first two results, two links redirecting to two articles in the French newspaper Les Echos, which detailed sanctions ordered against him by the French stock market regulatory authority, although several subsequent court decisions had established that, in fact, he had been the victim of fraud. The petitioner had sued only the newspaper Les Echos and not the search engine, asking for the deletion of all articles mentioning his name from the newspaper’s archived database or the withdrawal of his name from all articles, which was a form of implementation of the right to be forgotten. The court rejected this claim by stating that it was “radically contrary to what is required by freedom of speech and the requirements of documentary, journalistic or history research on previously published information.” On the other hand, the court preferred deindexing, ordering the newspaper to request that Google “proceed with the deindexing” of the disputed articles. This order by the court is somewhat surprising, insofar as the deindexing remains under the newspaper’s control, as it is the only party that can implement the necessary measures on its website to avoid the indexing of its articles, here, paid articles, by search engines.[53]
On the other hand, a petition for deletion of the indexing was denied in another very recent decision by the Paris Civil Court dated May 9, 2012.[54] In this case, the petitioners, the Dokhan brothers, had served a writ on Les Echos pursuant to Article 38 of the LIL and Article 1382 of the French Civil Code, notably due to “use of their last name as a keyword on search engines providing, in the first position” an article archived on the lesechos.fr website and published in the physical newspaper in 2006. This article, entitled “Le Conseil d'Etat a réduit la sanction des frères X. à un blâme” (“The French Administrative Supreme Court Lowered the Sanctions of the X. Brothers to a Reprimand”), included information on a disciplinary case investigated by the Council of Financial Markets (Conseil des Marchés Financiers) against the petitioners.
The Civil Court rejected the petitioners’ arguments in wording that merits being highlighted insofar as it states the limits of claims based on the LIL against the authors of disputed articles, because such claims may infringe freedom of speech and the right to information. A fortiori, when the claims are brought against search engines, they should be rejected:
"It should be noted that, unlike what they claim, neither the title of the disputed article, which appears on the first page of the Google search engine when their last name is used as a keyword, nor the article itself, which is freely accessible on the Les Echos website, are biased, questionable or inaccurate…”; “that these texts do not contain any inaccuracy or unfair or biased presentation,” and “that no subsequent event, other than the passing of time” affects its relevance. It should be noted that the court insisted on the legality of the disputed article and on its relevance from a historical perspective.
In addition, “as regards the argument based on the fact that the power and widespread use of modern search engines are an obstacle to erasing this old case in the collective human memory, which, without any limitation in time, is still in the first rank…of the Google search engine,” the court ruled, “that this argument is in and of itself insufficient to order that the relevant article be deleted or even that it be deindexed based on the plaintiffs’ last and first names…as such measures infringe freedom of speech and one of its corollaries, that of receiving information." This excerpt forcefully shows the absence of a natural, implicit link between the right of deletion and the automated and unlimited storage of information that is inherent in search engines.
This position is consistent with that adopted by the CNIL in its 1995 recommendation regarding personal data that is processed or used by written-press or audiovisual organizations for journalistic and editorial purposes, in which it stated that “collection, recording and preparation of information are inherent in the exercise of freedom of the press.” Some information, therefore, cannot be deleted, even after the fact, since storing the information is part of the news company’s purpose to inform, in which search engines participate by indexing the articles it publishes. This right of deletion, therefore, appears to be even less justified with respect to search engines, which merely re-transcribe the information published by the news company.
This last decision by the Civil Court of May 9, 2012, is all the more interesting since it testifies to a consistent and more fully reasoned desire by French courts to avoid the misappropriation of the LIL to limit freedom of speech, including through requests to delete indexing by search engines.
It should be noted that, in all the aforementioned cases, the claims based on the LIL had been brought against the websites that were the source of the disputed information and which were the actual data controllers of personal data, which data may have also appeared in the content indexed by the search engines. This did not prevent the courts, based on the legitimate interests at issue, from laying down serious limits when they found that the use of the LIL was obviously misappropriated and could affect freedom of speech.
A fortiori, this same path must be followed when using the provisions of the LIL against search engines which, by essence, are tools for accessing information. Granting a right to individuals to exercise the right to be forgotten against search engines would affect not only the fundamental role of search engines of guiding Internet users based on their requests in the context of the information society, but also the activity of search engines. To this is added the risk of discrimination and the loss of competitive advantage, which may be caused by requests for deindexing against certain search engines, rather than against other search engines that are less used and which will continue to index the disputed information. Lastly, one cannot disregard the economic impact that a petition that is based on the right to be forgotten may have on the economic viability of the online commercial press, whose sole visibility to end users is provided by search engines and which will be seriously affected if, on the basis of a successful petition by an individual, their articles could be deindexed.
Personal data law is not only inapplicable to search engines, but is also dangerous due to the obvious risk of overreach that such application may cause. In contrast, other, much simpler and more direct, means of action remain available in order to allow the data subjects to enforce their rights against the source of the disputed information, and without affecting third parties’ rights.
B. More suitable and effective legal grounds for protecting individuals’ data
The indexing service of a search engine is above all that of an intermediary service provider and cannot be considered to give the search engine actual control over the content that is published online. It is, therefore, up to the holder of the relevant right, who does not wish his content to be indexed in an online communication to the public, to take adequate measures for controlling that indexing.[55] The Paris Court of Appeal, moreover, stated that, due to the automated nature of search, the search engine result that leads to online content does not establish that the search engine could actively control such content. The court stated that the form of display—a mosaic of thumbnails—gives a simple view which “meets the requirement of neutrality laid out by the directive.”[56]
In order to be effective, the means of action must, therefore, be taken against the original publishers of the content that includes the personal data in respect of which removal is requested, namely the websites that originally published the information.
1) Making the websites that are the source of content responsible for the control of search engines’ indexing of information
Taking action against the search engine instead of against the data controller very quickly appears to be ineffective since the legal content remains accessible online on the publisher’s website and also on the other search engines that are not requested to remove the content. As the website publisher is the party that decides on the content that is indexed by search engines, it is, therefore, more appropriate to take action only against that publisher in order to require it to take any necessary measures in order to prevent such indexing.
Indeed, publishers may prohibit search engines in general, or one search engine specifically, from indexing certain pages via an exclusion protocol.[57]
They can also define rules for the content indexing performed by search engines by using HTML tags in the codes of their pages. For example, a tag may specify “unavailable as from [date]”, “no indexing”, or “no view in the result pages” (see the illustrative tags below).
As indexing is a continuous process, any new instruction will be quickly taken into account by search engines. Publishers therefore have precise control over the content that will be indexed.
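To make the three examples just mentioned concrete, here is a sketch of how such instructions are commonly expressed as “robots” meta tags in a page’s HTML code. The “noindex” and “nosnippet” directives are widely recognized by the major engines; “unavailable_after” is, to our knowledge, specific to Google’s crawler, and the date shown is purely illustrative:

    <!-- do not include this page in the search index -->
    <meta name="robots" content="noindex">

    <!-- do not display an extract (snippet) of this page in the results -->
    <meta name="robots" content="nosnippet">

    <!-- treat this page as unavailable after the (illustrative) date below -->
    <meta name="googlebot" content="unavailable_after: 25-Jun-2013 15:00:00 GMT">

Because these tags sit in the page itself, each new crawl takes them into account, consistent with the continuous-indexing point made above.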
This important role of website content publishers in the indexing of the content was, moreover, affirmed by the Paris Civil Court in its decision dated June 25, 2009,[58] as quoted supra. In addition, as suggested by certain authors,[59] it also appears quite possible and compliant with the preparatory work of the 2004 Act, which amended the LIL, to promote self-regulating initiatives by encouraging news companies to implement a voluntary deindexing policy. Indeed, even if, pursuant to Article 67 of the LIL, the news companies benefit from an exception regarding the application of the LIL, Article 67 refers to compliance with the ethical rules of professional journalism, which require the implementation of an editorial policy both as regards the permanence of data on the website and the storage of that data in the electronic archives.
The conditions that apply to the indexing of data by news sites on the Internet must form part of this editorial policy, by ensuring that a balance is sought between freedom of speech, the “right to information and to its history” and the protection of personal data. Furthermore, it is in this direction that the “Policy on the Right to Be Forgotten on Collaborative Websites and Search Engines”[60] provides that the publishers of content should be made responsible by providing for measures that are aimed at facilitating their control over search engines’ indexing of that content, including personal data.
2) Action more targeted at actors placing information online
a) Regarding lawful information
Tag-along right
This tag-along right, which allows one to correct information published in the past without affecting rights to the information and to its history, had already been mentioned by the CNIL in its recommendation of January 24, 1995.[61] The CNIL recommends that review of the data that is the subject of a court claim, a request for rectification or a right to respond should “trigger the mandatory review of the final court decision, the rectifications or responses provided which may have been previously published or distributed.”
In its recommendation of February 25, 1997,[62] similarly, the Art. 29 WP states that “Limits to the right of access and rectification prior to publication could be proportionate only in so far as data subjects enjoy the right to reply or obtain rectification of false information after publication.”
In practice, it should be noted that some courts have already implemented this tag-along right by relying on the CNIL recommendation of January 24, 1995. Hence, in its decision of June 25, 2009, C. Vernes c/ Sas Les Echos, the civil court ordered the company Les Echos to take all appropriate measures to ensure that, “review of its archives base, accessible on the Internet, shall be accompanied by an attached text which will have to be immediately accessible through a hypertext link from the page reviewed on which the disputed articles are published, under the title ‘Tag-Along Right’.” The measure allowed the relevant data subject’s information to be rectified, whilst preserving freedom of the press. It should be noted that such a tag-along right can only be exercised against the relevant content publisher, because it alone can grant the request and allow a disputed article to be published with the rectifications attached that the data subject wished to make. The search engine will then re-transcribe the corrected information in its automatic update of the indexing of its results. A claim against the search engine provider would not have permitted the implementation of such a tag-along right, whose benefit is also that it preserves the chronology of events and the rights of the original information publisher.
The LCEN’s right of response
Article 6.IV of the LCEN provides the option to request correction of the message: “Any person named or designated in an online communication service to the public has a right of response to a message published online, without prejudice to any request for correction or deletion of the message that (s)he may address to the online communication service provider.”
The scope of this provision, which lays down an option and not a right, is rather broad since it mentions all online communication services to the public, with the exception of online press services, which may not be requested to correct or delete content, as such an option would be an open door for censoring information.
In the case of content that is disputed, but not illegal, this option therefore allows the data subject to directly contact the authors or producers of online content, for example on blogs or discussion forums, in order to request rectification or deletion of information concerning him or her. It generally translates into a hypertext link inserted on the content publisher’s website, allowing any relevant individual to make a request related to content that (s)he thinks is not acceptable. The publishing website will act as moderator, in keeping with its editorial responsibility.
b) Claims which may be contemplated in the event of illegal information
Claims based on the right to privacy and press law against content publishers
Privacy is an important principle recognized by French law and expressly protected by Article 9 of the French Civil Code. This legal ground is often invoked against disputed website content which is the source of information; e.g. websites disclosing the extra-marital relationship of a politician,[63] or the sanctions ordered by the French stock market regulatory authority against a company director even though a court decision had subsequently established that such sanctions resulted from him being a victim of fraud.[64]
This approach is more logical, since the website publisher is the only party that controls the content that it publishes online and, therefore, the only party that is able to assess the legitimacy of the claim made in respect of that content.
When the offence is committed through the press, notably in cases of defamation or slander, and by using an online means of communication to the public, Article 93-2 of Act no. 82-652 of July 29, 1982, on “audiovisual communication” provides a cascading liability regime similar to that of Article 42 of the Act of July 29, 1881. The claim may be brought: 1) against the publication’s director when the information in dispute has been recorded prior to its communication to the public; 2) failing this, against the author, and 3) failing the author, against the producer as the principal author.
When the offence results from the content of an Internet user’s contribution in a personal contributions space identified as such, the director of the publication will be liable only if, once effectively aware of the message, he fails to act promptly to remove it.
As search engines are only intermediaries, they do not, as a general rule, correspond to the legal definitions of "publication director", "author" or "producer".
It is only if the claim does not bear fruit against the primary controllers of the content, as defined by the legislation, that the petitioner may take action against the host of the illegal content, based on the provisions of the LCEN.
Claims based on the LCEN[65]
Before the matter is referred to one of the technical intermediaries designated in Article 6.I.2 of the LCEN, the act requires that a copy of the correspondence; i.e., the request for interruption, deletion or modification of the information in question, be sent to the author or publisher of that information or content. Alternatively, the individual making the request must prove that the author or publisher could not be contacted.
Article 6.I.2 of the LCEN puts in place a liability regime which applies only to technical intermediaries. This category encompasses hosts, notably blog and discussion forum websites, and Internet service providers, as well as referencing providers (prestataires de référencement) such as search engines.
These technical intermediaries can be found liable only if, after having become aware of illegal activity or information, they do not promptly remove it. In this respect, we should distinguish “illegal” content, whose removal may be ordered only by a court, from “manifestly illegal” content, which must be made inaccessible by technical intermediaries as soon as they become aware of it, meaning upon notice by the victim. “Manifestly illegal” content means content that praises crimes against humanity, incites hatred, includes child pornography, or incites violence or harm to human dignity.[66]
Violating privacy or unauthorized processing of personal data does not, therefore, fall within the definition of “manifestly illegal” and presupposes an assessment by a court, in summary proceedings or on an ex parte petition, which may order “any measures,” according to the Civil Procedure Code, to cause the alleged act to cease.
As technical intermediaries, search engines may therefore be found liable under Article 6.I.2 of the LCEN, pursuant to a court decision handed down upon an ex parte petition or in summary proceedings, in order to have the illegal content removed.
Then, on a case-by-case basis, one must consider whether the status of "host," within the meaning of Article 6.I.2 of the LCEN, can be applied to search engines in respect of their search activity. In essence, search engines only perform automatic indexing of information that is accessible on the Internet and do not provide content storage (a distinction sketched below), such that systematic application of hosting provider status may not be relevant. In this respect, some decisions have clearly excluded this categorization, finding that search engines are mere intermediaries,[67] whereas others have found it to be applicable.
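The distinction at stake is a technical one: a spider retains pointers and metadata, not the pages themselves. The following is a minimal, self-contained sketch of spider-style indexing, using only the Python standard library, offered purely to illustrate the argument; the seed URL is a placeholder, and real crawlers are of course vastly more elaborate.

```python
# Illustrative sketch of spider-style indexing: only the URL, the page
# title and the outbound links are kept for the index; the page body is
# discarded, so the crawler stores no third-party content of its own.

from html.parser import HTMLParser
from urllib.request import urlopen

class TitleAndLinkParser(HTMLParser):
    """Collects the <title> text and all href attributes of <a> tags."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def index_page(url: str) -> dict:
    """Fetch a page and return only the metadata kept in the index."""
    html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    parser = TitleAndLinkParser()
    parser.feed(html)
    # The content itself remains stored solely on the source website;
    # the index holds a pointer, a title and the link graph.
    return {"url": url, "title": parser.title.strip(), "links": parser.links}

if __name__ == "__main__":
    print(index_page("https://example.com/"))
```

On this model, the engine holds a pointer rather than a copy, which is precisely the feature courts weigh when asking whether hosting provider status fits.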
In a decision dated February 15, 2012, for example, a civil court found Google liable as a host, although the petitioner had cumulatively invoked Article 6.I.2 of the LCEN, i.e., the host’s liability, and her right to oppose the processing of her data,[68] in order to obtain from Google the removal of links to pornographic videos in which she had acted at an earlier time in her life. The civil court ordered Google to remove the links from its results page based only on the “host” liability regime, ruling that this basis was better suited to granting the deindexing petition. The court in summary proceedings held that there were grounds for ordering Google to deindex the disputed pages, after finding that those pages manifestly violated the plaintiff’s right to privacy. It is surprising to note, however, that the court remained silent as to the role of the website publishers who originally placed the disputed videos online.
The fact that the court excluded the application of the LIL, thereby acknowledging that it was irrelevant and ill-adapted to the petitioner’s request with respect to the technical intermediaries of which search engines are a part, is to be welcomed.
Hence, using Article 6.I.2 of the LCEN against search engines is the path preferred by courts, because it preserves freedom of speech and has the further advantage of not making search engines practice censorship: they will grant a request to remove or delete content only upon a court order, issued on an ex parte petition or in summary proceedings, and not on the basis of an individual request, which is by nature subjective and can undermine third parties’ rights.
Nonetheless, several conditions must be met before such grounds can be used against search engines: formal notice of the disputed content must be given as required by the LCEN; action must first be taken against the primary controller of the content; and a court must then be petitioned, since only a court may assess the "manifestly illegal" nature of the harm caused and order search engines to perform deindexing.
In addition to the substantive difficulties of applying data privacy law, with its cardinal notions of data processing and data controller, to search engines’ activities, such application is itself, as we have seen, likely to conflict with other fundamental freedoms such as the right to freedom of communication and speech. Yet the classification as “personal data” is not required, as explained above, to obtain the withdrawal of disputed content that is damaging to an individual, since other means; e.g., control of indexing by site editors, the LCEN, the right to privacy or rights under press law, may provide satisfactory solutions.
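The first of those means, control of indexing by site editors, rests on the robots exclusion convention mentioned in footnote 57. As a brief, hedged illustration, the sketch below uses Python’s standard urllib.robotparser module to show how a compliant crawler consults a site’s robots.txt file before indexing a page; the URLs are placeholders.

```python
# Illustrative sketch: a compliant spider consults the site editor's
# robots.txt exclusion file before indexing a page. URLs are placeholders.

from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")
robots.read()

page = "https://example.com/archives/disputed-article.html"
if robots.can_fetch("*", page):
    print("Indexing allowed:", page)
else:
    # The editor has excluded this path; a compliant spider will neither
    # crawl nor index it, achieving deindexing at the source.
    print("Excluded by the site editor:", page)
```

Exclusion declared at the source in this way is honored by every compliant engine at once, without any individual claim having to be brought.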
While a conflict between two rights is a familiar situation for lawyers, its resolution must still, in our legal systems, be determined by a court. In this respect, the French Constitutional Council had already, in a decision dated June 10, 2009,[69] clearly ruled that the right to freedom of speech shall now be understood to encompass "the right to access the Internet, i.e., the right to freely access online public communication services, any limitation of which may be decided only by a court." This essential principle; i.e., courts guaranteeing the freedom of speech, was recently confirmed by the European Court of Human Rights (ECHR) in a decision delivered on December 18, 2012.[70] In that decision, the ECHR specified that blocking access to a website and, consequently, to a piece of information breaches Article 10 of the European Convention on Human Rights (the Convention), which establishes freedom of speech "regardless of frontiers." Such a restriction may be allowed only if it is based on domestic legal provisions that offer protection against arbitrary breaches of the rights guaranteed by the Convention. In our legal system, such protection is generally ensured by a court.
The European institutions have nonetheless decided to take on this issue through the draft regulation reforming data privacy law, presented by the European Commission in January 2012. As Viviane Reding stated in relation to the draft regulation in her speech of January 22, 2012, the right to be forgotten, now expressly provided for in Article 17 of the draft, is not an absolute right and cannot justify limiting freedom of speech. The choices that the European Parliament will have to make between these rights will have to take their overall implications into consideration and ensure that the fundamental rights involved are respected, and not lost at the cost of promoting a right to be forgotten whose consequences could be as unexpected as they are dangerous. Has the debate only just begun?
Footnotes
- [1] On March 9, 2012, the Audiencia Nacional referred several preliminary questions to the CJEU in the dispute between Mr. X and Google Inc. and Google Spain.
- [2] Pursuant to Article 2 b) and d) of Directive 95/46 of October 24, 1995.
- [3] Recommendation of October 23, 2003, p. 7.
- [4] Op. cit., p. 8.
- [5] “Designing the Right to be Forgotten”, an article published in the Spanish newspaper ABC, November 29, 2012.
- [6] Recommendation on “the protection of human rights with regard to search engines”.
- [7] Recommendation CM/Rec (2012)3, adopted by the Council of Ministers on April 4, 2012, in the 1139th meeting of the Ministers’ Delegates, p. 3, point 8 of II.
- [8] Point 2.1 of the preliminary questions of March 9, 2012.
- [9] The directive was implemented under French national law by the Act of August 6, 2004, modifying the Act of January 6, 1978, referred to as the “Data Processing, Data Files and Individual Liberties” Act.
- [10] “L’Hébergeur et la Fausse Concurrence de la LCEN et du Droit de la Presse avec le Droit d’Opposition ‘Informatique et Libertés’” (“Hosts and the False Competition of the LCEN and Press Law with the ‘Data Processing, Data Files and Individual Liberties’ Right to Object”), by Jean Frayssinet, RLDI, 2008, no. 1393; quoted by Mrs. Lepage in “The Tricky Relationship Between the January 6, 1978 Act and Freedom of Speech,” Communication Commerce Electronique, February 2010, comm. 17.
- [11] In “Les Moteurs à la Recherche d’un Statut Juridique” (“Search Engines Searching for a Legal Status”), by Julien Le Clainche, Légipresse no. 284, June 2011, p. 368.
- [12] Same author, in “Droit des Données Personnelles et Liberté d’Expression: Hiérarchisation ou Conciliation?” (“Law of Personal Data and Freedom of Speech: Hierarchy or Reconciliation?”), in the Lamy du Droit de l’Immatériel review, 2009, no. 1798.
- [13] Op. cit., no. 284, p. 368. The decisions of the Criminal Chamber of the French Supreme Court (Cour de Cassation) to which the author refers are those of January 13 and June 15, 2009.
- [14] Paris Court of Appeal, order of November 15, 2011, Société Biotronik v/ Competition Authority.
- [15] A list of these technical data is attached to the opinion dated April 4, 2008 (on data protection aspects related to search engines) of the Working Party established pursuant to Article 29 of Directive 95/46, referred to as “Art. 29 WP”.
- [16] The last name and first name, of course, but also the IP address, insofar as this allows one to identify a person.
- [17] Question 2.2, referred to the CJEU by the Audiencia Nacional.
- [18] Article 2 d) of Directive 95/46.
- [19] Opinion of April 4, 2008, quoted supra, p. 18.
- [20] Opinion quoted supra, footnote no. 17, p. 15.
- [21] Opinion quoted supra, p. 27. This Opinion includes an exhibit listing the data processed by search engines.
- [22] Deliberation no. 01-057 of November 29, 2001. This recommendation was the subject of a report made public on January 19, 2006.
- [23] Two public warnings were issued: the first against the company Pages Jaunes (deliberation no. 2011-203 of September 21, 2011), the second against the company Yatedo France (deliberation no. 2012-156 of June 1, 2012).
- [24] Examination of an order of the Montpellier Civil Court in summary proceedings, October 28, 2010.
- [25] A. Lepage, Op. cit., Communication Commerce Electronique, no. 5, May 2011, comm. 47.
- [26] Member of the CNIL between 2004 and 2009.
- [27] A blog host is a controller of the processing within the meaning of the Data Processing, Data Files and Individual Liberties Act; A. Debet, Communication Commerce Electronique, no. 4, April 2012, comm. 41.
- [28] Paris Civil Court, February 15, 2012, Diana Z case.
- [29] Paris Civil Court, summary proceedings order, September 19, 2012, JP Berenguier, R Mallois et E. Deschamps c/ Google Inc., JFG Networks et Automattic Inc.
- [30] Point 3.1.
- [31] Rights already mentioned. These mainly involve the right to be informed of the processing, the obligation to obtain consent for the processing, and the right to oppose processing for “overriding and legitimate” reasons.
- [32] Article 11, 2) of Directive 95/46, implemented in Article 32, III of the Act of January 6, 1978, as amended.
- [33] Order of the Montpellier Civil Court, October 28, 2010.
- [34] These exemptions essentially provide for data processing performed by the State which involves public security, pursuant to the provisions of Article 12 of the aforementioned directive, implemented in Article 41 of the Act of January 6, 1978, as amended.
- [35] Article 6 c) of Directive 95/46, implemented in Article 6 of the Act of January 6, 1978, as amended.
- [36] Article 8 of the directive, implemented in Article 8 of the aforementioned act.
- [37] Article 8 of the directive states that this prohibition is not applicable to public authorities.
- [38] Draft Regulation regarding the protection of individuals with respect to processing of personal data and the free movement of such data (2012/0011 (COD)).
- [39] Article 2 e) of the directive, implemented in Article 35 of the Act of January 6, 1978, as amended.
- [40] Cf. the decision of the Paris Civil Court of February 15, 2012, quoted supra.
- [41] Using Mr. Le Clainche’s expression.
- [42] Although the right to be forgotten does not exist in the legal texts, the French Supreme Court already referred to this right in a decision dated November 20, 1990, to exclude its application, because the court considered that “the facts relating to the private life of Mrs. X had been put, in the past, at the disposal of the public […]; thus, they had been disclosed lawfully and, consequently, were no longer part of her private life; Mrs. X … had no ground for requiring the enforcement of the right to be forgotten in order to prevent a new mention of these facts.” Bull. Cass., Appeal no. 89-12580.
- [43] However, this was the paradoxical outcome of the decision of the Spanish Data Protection Authority of July 30, 2010, quoted supra.
- [44] See our analysis on this point, Part I A.
- [45] “L’Oubli Numérique est-il de Droit Face à une Mémoire Numérique Illimitée?” (“Is Being Digitally Forgotten a Right in the Face of Unlimited Digital Memory?”), Mathieu Berguig and Corinne Thiérarche, Revue Lamy Droit de l’Immatériel, 2010/62, lamyline.fr.
- [46] Paris Civil Court, summary proceedings order, October 12, 2009, Note by Jean Frayssinet - RLDI 2009/54 no. 1798 – see our analysis, Part II-2.1 c).
- [47] Paris Civil Court, summary proceedings order, June 23, 2008, Note by Frayssinet – RLDI 2008/42 no. 1393.
- [48] In line with this, see Jean Frayssinet – Note under the Paris Civil Court summary proceedings order, June 23, 2008 - Note 44.
- [49] Paris Civil Court, summary proceedings, October 12, 2009, X., Sté L. & Com c/ J.-H.Z – Note by Jean Frayssinet - RLDI 2009/54, no. 1798.
- [50] See references, Note 48.
- [51] Publication: Bulletin 2003 I, no. 172 p. 134.
- [52] Civil Court, June 25, 2009, C. Vernes v/ Sas Les Echos – Note by Nathalie Mallet-Poujol, Légipresse, no. 266, November 2009.
- [53] See, our analysis, Part II – B 1).
- [54] Paris Civil Court (17th civil chamber) of May 9, 2012 (S. et P. Dokhan v/ Sas Les Echos) – Note by Nathalie Mallet-Poujol, Légipresse - no. 297, September 2012.
- [55] In line with this, see Paris Court of Appeal, January 26, 2011, Saif v/ Sté Google – Legipresse, July-August 2011, no. 285-14, p. 400.
- [56] In line with this, see Paris Court of Appeal, February 4, 2011, Sté Google v/ Sté Auféminin.com - Legalis.net.
- [57] E.g., "robots.txt".
- [58] Order of the Paris Civil Court, summary proceedings, June 25, 2009, C. Vernes v/ SAS les Echos, Note by N. Mallet-Poujol – Légipresse, no. 266, November 2009.
- [59] In line with this, note by Nathalie Mallet-Poujol, “Du Droit à l'Oubli Numérique: Variations Autour de l'Article 38 de la LIL” (“Regarding the Digital Right to be Forgotten: Variations Around Article 38 of the LIL”), Legipresse, no. 297, September 2012.
- [60] In French, “Charte du Droit à l'Oubli dans les Sites Collaboratifs et les Moteurs de Recherche”. Created by the French Secretary of State in charge of e-Economy Forecasting and Development, Nathalie Kosciusko-Morizet, the Policy on the Right to Be Forgotten on Collaborative Websites and Search Engines (charte du droit à l’oubli numérique) was signed on October 13, 2010 by twelve signatories, representing collaborative sites and search engines.
- [61] CNIL recommendation, October 24, 1995, relative to personal data processed or used by written press or audiovisual organizations for journalistic and editorial purposes.
- [62] Article 29 Working Party, Opinion 1/97 of February 25, 1997, on data protection law and the media.
- [63] Paris Civil Court, summary proceedings, October 12, 2009, Mme X c/ M. C. V., cited in note 46.
- [64] Paris Civil Court, June 25, 2009, C. Vernes c/ Sas Les Echos - See citation note 53.
- [65] Act no. 2004-575 of June 21, 2004, for trust in the digital economy (LCEN).
- [66] Decision no. 2004-496 DC, June 10, 2004.
- [67] In line with this, see Paris Civil Court, summary proceedings, February 27, 2006, Afflelou c/ Google - legalis.net: "Google Images just automatically indexes images that are accessible on the Internet and it does not have information as to the identity of the images’ publishers”. The court heard Google’s argument that its “Images” service operated in the same way as the ordinary search engine service: it merely indexes images that are visible on websites in the directory. Therefore, Alain Afflelou was not justified in serving a writ on Google as a host in order to obtain information on the publisher of the disputed image, which [he] wished to have removed.
- [68] Paris Civil Court, February 15, 2012, Diana Z. c/ Google - CCE no. 5, May 2012, comm. 54.
- [69] Decision of the French Constitutional Council no. 2009-580 DC of June 10, 2009, on the review of the law favoring the dissemination and protection of creation on the Internet.
- [70] European Court of Human Rights - Req. N° 3111/10 - Case Ahmet Yildirim v. Turkey dated December 18, 2012.