This article is the third installment of a three-part series that unpacks platform liability laws in the U.S. and the EU and analyzes their potential application to generative AI. Part 1 explores the fundamentals of platform liability, while Part 2 analyzes the legal landscape in the U.S.
In the late 20th century, the booming internet economy was causing legal shifts not only in the U.S., but also in the EU. While the economic benefits of electronic commerce were becoming apparent, divergent national laws created legal uncertainty for platforms operating across borders. As a result, the EU adopted the Electronic Commerce Directive of 2000 to create an internal market framework for online services and bring more legal certainty for both service providers and consumers. In addition to providing liability exemptions for intermediary service providers, the e-Commerce Directive harmonized rules on transparency and information requirements for online service providers, commercial communications and electronic contracts. It also encouraged the creation of voluntary codes for greater cooperation among member states.
With rapid advancements in technology, the internet has transformed in unforeseeable ways. In 2022, the EU adopted the Digital Services Act to keep up with this evolution, though the e-Commerce Directive also continues to be in force today. However, the DSA predates the mainstream use of generative AI; in other words, it was not drafted with this technology in mind.
e-Commerce Directive 2000
The e-Commerce Directive was and continues to be a cornerstone for digital regulation in the EU in various respects. In the context of intermediary liability, it extended liability exemptions only to services that played a neutral, merely technical and passive role toward the hosted content, provided they met specific conditions set out in the directive. As such, the directive only covers three types of services: mere conduits in Article 12, caching in Article 13 and hosting in Article 14.
Although information location tools, such as search engines and hyperlinks, were excluded from the e-Commerce Directive, they could be covered by national laws, as in Spain, Portugal and Austria. The three safe harbors only covered situations in which the technical activity, i.e., transmission/access, caching and storage, dealt with information provided by the service recipient and did not apply to the service provider's own editorial content. In other words, the safe harbors in the EU, inspired by the safe harbors in the U.S. Digital Millennium Copyright Act, created a framework under which service providers could not be held liable for third-party information.
Additionally, Article 15 of the directive provides that member states cannot legally require general monitoring from intermediary services to detect illegal content, as this could have negative repercussions on the principle of freedom of expression. However, specific monitoring may be required under national or sector-specific EU legislation.
The e-Commerce Directive did not create an absolute harmonized framework. Rather, several matters were left for experimentation at the national level, such as notification obligations for illegal content, administrative and judicial preventative ad hoc measures, or the details of notice-and-takedown procedures. In other words, the directive did not establish a basis for liability; it only determined instances in which intermediary service providers were exempt from liability. Liability was to be established on a case-by-case basis through national or sector-specific EU laws.
Digital Services Act of 2022
The online environment has been developing rapidly since the introduction of the e-Commerce Directive in 2000. When it comes to online platforms, there has not only been exponential growth but also major changes in terms of the services they offer, the technologies they employ and the business models they use. With these developments, the need to revamp the 20-year-old rules governing digital services was apparent.
Moreover, the directive did not provide needed clarity regarding the liability of online platforms due to the differences in EU member-state national laws transposing it, diverging interpretations of some of its notions, as well as possible overlaps and contradictions with sectoral rules. The Digital Services Act was adopted in 2022 as part of the Digital Services Act package. With the DSA, the European Commission's plan is to "make the online world safer" and end the digital Wild West. The DSA is a regulation, hence its rules apply directly in all EU member states without the need to transpose them into national law, as is the case with directives, which often lead to fragmented interpretation.
The main goal of the e-Commerce Directive was to facilitate cross-border provision of online services in the EU. In contrast, the DSA focuses on safeguarding users from illegal and harmful activities online and from the spread of disinformation. Therefore, the adoption of the DSA does not affect the core rules regulating e-commerce that were set out in the directive.
Platform liability exemptions are no longer governed by the directive after the adoption of the DSA. Articles 12 to 15 of the directive have effectively been replaced by Articles 4, 5, 6 and 8 of the DSA. In principle, this does not change much, as the DSA adopts the liability exemptions already established under the directive. That is, mere conduit, caching, and hosting services remain exempt from liability when they meet certain conditions. Moreover, the DSA upholds the prohibition of general monitoring obligations.
The DSA has tried to remedy the lack of clarity concerning certain rules established in the e-Commerce Directive, especially in the context of today's digital environment, which is defined by a variety and complexity of online services. For instance, the DSA tries to clarify the hosting services exemption in Recital 22, by confirming "the fact that the provider automatically indexes information uploaded to its service, that it has a search function or that it recommends information on the basis of the profiles or preferences of the recipients of the service is not a sufficient ground for considering that provider to have 'specific' knowledge of illegal activities carried out on that platform or of illegal content stored on it." Recital 29 also clarifies that "'hosting services' include categories of services such as cloud computing, web hosting, paid referencing services or services enabling sharing information and content online, including file storage and sharing. Intermediary services may be provided in isolation, as a part of another type of intermediary service, or simultaneously with other intermediary services."
Recital 29 also recognizes, due to the evolving nature of current online services, "whether a specific service constitutes a 'mere conduit', 'caching' or 'hosting' service depends solely on its technical functionalities, which might evolve in time, and should be assessed on a case-by-case basis."
The DSA also brings changes concerning information and transparency requirements for online platforms that were established under the directive. It introduces additional obligations and responsibilities, and structures them in a tiered approach. All intermediary services covered by the DSA are assigned a baseline set of obligations. The obligations are more extensive if a service falls within the scope of hosting services, and more elaborate still for online platforms. Finally, only services designated as very large online platforms (VLOPs) and very large online search engines (VLOSEs) must comply with the most extensive and stringent obligations.
The DSA became fully applicable in February 2024, but investigations concerning compliance with its rules have already started, even leading to the first preliminary findings, and some platforms have already contested their designations under this law. However, these cases concern the breach of obligations under the DSA rather than the scope of liability exemptions. Hence, it remains to be seen how the scope of liability exemptions will be interpreted under the DSA and whether this will bring any changes to the existing interpretation based on the e-Commerce Directive. It is important to note that, even if a certain service falls within the scope of liability exemptions regarding illegal content, the breach of transparency, information and other obligations under the DSA can itself result in substantial penalties: fines for noncompliance can reach up to 6% of the annual worldwide turnover of the online intermediary service provider.
Generative AI and platform liability in the EU
At the outset, it may be difficult to classify stand-alone generative AI tools as one of the three intermediary services that are exempt from liability under the DSA, i.e., mere conduits, caching and hosting. This is because content generated by user prompts does not neatly fit into the definitions of those intermediaries, as this technology was not contemplated by the DSA. However, generative AI tools are now also being incorporated into search engines, which generate the information users are seeking instead of directing them to other websites as traditional search engines do. When generative AI is integrated into existing online platforms such as social media, gaming and cloud hosting services, content may be generated not only through a user prompt, but arguably can also be modified by the generative AI tool itself. This creates uncertainty for the application of the liability exemption provided under the DSA.
A recent paper argues that whether the liability exemption extends to generative AI under the DSA depends on whether it functions as a hosting service, an online search engine or a tool embedded into an online platform.
Hosting service
Article 3(g)(iii) of the DSA defines a hosting service as "consisting of the storage of information provided by, and at the request of, a recipient of the service." Recital 18 clarifies that the liability exemptions should not apply "where, instead of confining itself to providing the services neutrally by a merely technical and automatic processing of the information provided by the recipient of the service, the provider of intermediary services plays an active role of such a kind as to give it knowledge of, or control over, that information. Those exemptions should accordingly not be available in respect of liability relating to information provided not by the recipient of the service but by the provider of the intermediary service itself, including where the information has been developed under the editorial responsibility of that provider."
Similar to liability exemptions in the U.S., a hosting service is not liable for the information stored at the request of a recipient of the service if it does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent. It is also not liable if it acts expeditiously to remove or disable access to the illegal content upon obtaining such knowledge or awareness.
As per the interpretations of Article 14 of the directive, hosting services fall into three categories, which cover a broad range of services, including:
- Storage and distribution, such as web hosting; online media platforms such as YouTube, WordPress and SoundCloud; file storage and sharing such as Dropbox; and cloud computing such as Amazon Web Services.
- Networking, collaborative production and matchmaking, e.g., online marketplaces, social media and online gaming.
- Selection, search and referencing, such as Google Search, or ratings and reviews, such as Yelp.
Whether generative AI fits the definition of hosting services depends on who is responsible for the generation of content. This determination can potentially be influenced by factors such as whether the role the intermediary service played in generation of harmful content could be considered active, the kind of knowledge and control it exercised over that content, whether the user prompt to generate content can be considered storing of information by the model, and who ultimately is responsible for the dissemination or distribution of the harmful content.
Online search engine
Although online search engines were excluded from the e-Commerce Directive, and are also not considered a stand-alone intermediary service under the DSA, Article 3(j) of the DSA defines them as an "intermediary service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found."
However, the DSA does not indicate whether online search engines belong in the category of mere conduits, caching or hosting services. Based on directive jurisprudence, such as the Court of Justice of the European Union's Google France and Google v. Louis Vuitton judgment, search engines may be considered hosting services if they pass the neutral role test, meaning they do not play an active role toward the hosted content. Moreover, search engines were implemented as part of mere conduits or hosting services under national legislation in some member states.
As generative AI is being integrated into search engines, these services very likely fall within the scope of the DSA. Moreover, such services could also be designated as very large online search engines (VLOSEs), which would subject them to the DSA's most stringent requirements, such as risk mitigation.
Embedded as part of an online platform
Although generative AI chatbots have already been embedded into a number of existing social media platforms for personal communication, such communication and private messaging fall outside the scope of the DSA. However, there are many other foreseeable and unforeseeable ways generative AI could, over time, be embedded into existing platforms that are subject to the DSA's requirements. For example, synthetic content could be generated through WhatsApp or Instagram's chatbots and disseminated by the user on those or other platforms. Such hybrid forms of use were not envisaged by the DSA, and as a result, the applicability of the liability exemptions is highly uncertain. The only certainty in this regard can be found in the guidance of Recital 29: intermediaries' technical functionalities will evolve over time and must be assessed on a case-by-case basis.
Conclusion
The DSA represents a significant step forward in the regulation of online platforms and content moderation within the EU. However, its application to generative AI in the context of platform liability remains an open question. As generative AI continues to evolve and becomes more integrated into online services and platforms, whether the conditional liability exemption extends to synthetic content will be determined by a law that was not created with this new technology in mind. The extent to which generative AI will benefit or be excluded from the DSA's liability exemptions depends on the specifics of each case.
Laura Pliauskaite is the European operations coordinator for the IAPP. Uzma Chaudhry is the former IAPP AI Governance Center Fellow.