This article is part of a series on the operational impacts of the EU AI Act. The full series can be accessed here.


Published: August 2024


What should you do if you are using or handling a high-risk AI system but are not a provider? The AI Act imposes a comprehensive set of obligations on deployers, importers, distributors and authorized representatives to ensure the safe and compliant use of high-risk AI systems within the EU.

These stakeholders must work together to uphold the principles of transparency, safety and accountability to foster trust and innovation in the AI ecosystem. Through diligent adherence to these obligations, the risks associated with AI can be mitigated and its benefits realized while protecting the fundamental rights and safety of individuals. The obligations imposed on deployers, importers, distributors and authorized representatives appear in Chapter III, Section 3, of the AI Act.


Obligations of deployers

Deployers have a general obligation to take appropriate technical and organizational measures to ensure they are using high-risk AI systems in accordance with the instructions for use accompanying such systems. Deployers may be subject to further obligations under EU or national law in addition to those in the AI Act.


Obligations of importers

Importers that place high-risk AI systems on the EU market are subject to rigorous verification obligations to ensure those systems comply with the AI Act. See the table below for a comparison of the obligations of importers and distributors.


Obligations of distributors

Distributors that make high-risk AI systems available on the EU market must ensure these systems comply with the AI Act's requirements. A number of these obligations are similar, but not identical, to those of the importers. See the table below for a comparison of the obligations of importers and distributors.


Comparison of importer and distributor obligations

 
Due diligence

Importers
• Verify that the provider has:
  • Conducted the proper conformity assessment in accordance with Article 43.
  • Drawn up the technical documentation of the AI system.
  • Appointed an authorized representative in accordance with Article 22(1), when required.
• Verify that the AI system bears the required CE marking and is accompanied by the EU declaration of conformity and instructions for use.

Distributors
• Verify that providers and importers have fulfilled their obligations, including conformity assessments.
• Verify that AI systems bear the CE marking and are accompanied by a copy of the EU declaration of conformity and instructions for use.

Compliance

Importers
• Indicate their name, registered trade name or registered trademark, and address on the AI system's packaging or its accompanying documentation.
• Ensure the storage or transport of the AI system does not jeopardize its compliance with Section 2, Chapter III of the AI Act.
• Not place any AI system that is noncompliant, falsified or accompanied by falsified documentation on the market until it has been brought into conformity.

Distributors
• Ensure the storage or transport of the AI system does not jeopardize its compliance with Section 2, Chapter III of the AI Act.
• Not make any noncompliant AI system available on the market until it has been brought into conformity.
• When a noncompliant AI system has been made available on the market, take the corrective actions necessary to bring it into conformity, withdraw it or recall it, or ensure the provider, importer or any relevant operator takes those corrective actions, as appropriate.

Recordkeeping

Importers
• Keep a copy of the EU declaration of conformity, the certificate issued by the notified body and the instructions for use for 10 years after the high-risk AI system is placed on the market or put into service.
• Ensure the technical documentation is available to regulatory authorities upon request.

Distributors
• N/A

Reporting

Importers
• Inform the provider of the AI system, the authorized representative and the market surveillance authorities, but not the deployer, of any risks posed by the AI system.

Distributors
• Inform the provider or the importer of an AI system of any risks, as defined under Article 79(1), posed by that AI system.
• Immediately inform the provider or importer and the competent authorities when the distributor has made available on the market an AI system that presents a risk, as defined under Article 79(1), and does not comply with Section 2, Chapter III of the AI Act, including the details of the noncompliance and any corrective actions taken.

Cooperation with the authorities

Importers
• Provide the authorities with all necessary information and documentation related to the AI system, including technical information, upon reasoned request.
• Cooperate in any action taken by the authorities in relation to high-risk AI systems.

Distributors
• Provide the authorities with all necessary information and documentation regarding actions taken to demonstrate the conformity of an AI system with Section 2, Chapter III of the AI Act, upon reasoned request.
• Cooperate with any action taken by the authorities in relation to high-risk AI systems made available on the market, in particular to reduce and mitigate the risks they pose.


Obligations of authorized representatives

The roles and obligations of authorized representatives are directly linked to providers. Providers that are established in third countries outside the EU are required to appoint authorized representatives in the EU prior to making their high-risk AI systems available on the EU market.

Authorized representatives can be any natural or legal person located or established in the EU who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to perform and carry out, on the provider's behalf, the obligations and procedures established by the AI Act.


Responsibilities along the AI value chain

Article 25 of the AI Act addresses the situations in which a deployer, importer, distributor or other third party becomes the provider of a high-risk AI system as a result of specific actions.

Indeed, any distributor, importer, deployer or other third party shall be considered a provider of a high-risk AI system and shall be subject to the obligations of the provider under Article 16 if:

  • They put their name or trademark on a high-risk AI system that has already been placed on the market or put into service, regardless of the contractual arrangements between the parties.
  • They make a substantial modification to a high-risk AI system that has already been placed on the market or has already been put into service in such a way that it remains a high-risk AI system under Article 6.
  • They modify the intended purpose of an AI system, including a general-purpose AI system, that was not classified as high-risk when it was placed on the market or put into service, in such a way that it becomes a high-risk AI system.

Under the AI Act, if a deployer, importer or distributor takes any of the above actions, the provider that initially placed the high-risk AI system on the market or put it into service is no longer considered the provider of that specific AI system. Depending on the situation, there is therefore a change of designated role and a shift of responsibility from the initial provider to the new provider, which was previously a deployer, importer or distributor.

Nonetheless, the initial provider must closely cooperate with any new providers, make the necessary information available and provide the reasonably expected technical access and other assistance required to enable the new providers to fulfill their obligations, especially regarding the conformity assessment of high-risk AI systems.

However, in its contract with a deployer, importer or distributor, the initial provider can stipulate that its AI system is not to be changed into a high-risk AI system. In those cases, the initial provider has no obligation to hand over the documentation. Presumably, this only applies to AI systems that are not classified as high-risk when they are initially placed on the market or put into service.

For high-risk AI systems that are safety components of products, product manufacturers are considered providers and must comply with the obligations imposed on providers under Article 16 in two cases: when the high-risk AI system is placed on the market together with the product under the product manufacturer's name or trademark, or when the high-risk AI system is put into service under the product manufacturer's name or trademark after the product has been placed on the market.

Finally, any third party that supplies an AI system, or tools, services, components or processes that are used in or integrated into a high-risk AI system, must enter into a written agreement with the provider. The agreement should specify the necessary information, capabilities, technical access and other assistance, based on the generally acknowledged state of the art, to enable the provider of the high-risk AI system to fully comply with its obligations under the AI Act. This requirement does not apply to third parties that make tools, services, processes or components, other than general-purpose AI models, accessible to the public under a free and open-source license.

The AI Office may develop and recommend voluntary model terms for contracts between providers of high-risk AI systems and third parties similar to the standard contractual clauses between controllers and processors under the GDPR.


Additional resources


Top 10 operational impacts of the EU AI Act

Coming Soon

  • Part 7: AI Assurance across the risk categories
  • Part 8: Post-market monitoring, information sharing, and enforcement
  • Part 9: Regulatory implementation and application alongside EU digital strategy
  • Part 10: Leveraging GDPR compliance
