This article is part of a series on the operational impacts of the EU AI Act. The full series can be accessed here.


Published: July 2024





Providers of high-risk AI systems will need to know Chapter III of the EU AI Act very well. Sections 2 and 3 of Chapter III set out the requirements a provider must meet when making a high-risk AI system available on the EU market.

One way to categorize these requirements is to divide them broadly into organizational, documentation, system-design and regulatory requirements, while recognizing that certain articles perform dual roles.


Product safety, financial institutions and AI literacy

A different approach is available for products containing an AI system that are also subject to the EU harmonization legislation listed in Section A of Annex I. These products are regulated by other EU directives and regulations, e.g., on machinery, safety of toys, lifts, medical devices and in vitro diagnostic medical devices.

In these circumstances, according to Article 8(2), providers "have a choice of integrating, as appropriate, the necessary testing and reporting processes, information and documentation they provide with regard to their product into documentation and procedures that already exist and are required under" the EU harmonization law. This approach is encouraged to ensure consistency, avoid duplication and minimize additional burdens. For instance, a provider of a high-risk AI system can rely on a single set of technical documentation, as permitted under Article 11(2).

Likewise, the AI Act allows for providers that are financial institutions subject to the EU financial services law to avoid duplication in certain instances. The obligation to implement certain aspects of a quality management system under the AI Act can be fulfilled by the provider complying with the rules on internal governance arrangements or processes under EU financial services law, according to Article 17(4).

Under Article 4 of the AI Act, a provider of any AI system, whether high risk or not, must ensure sufficient AI literacy within its organization. Staff and "other persons," presumably contractors, etc., who deal with the operation and use of the AI system are expected to have sufficient skills, knowledge and understanding to make an informed deployment of the AI system, as well as to be aware of the opportunities and risks of AI and the possible harm it can cause. Of course, this does not mean each individual needs to demonstrate the same level of AI literacy — this obligation takes into account the context in which the AI system will be used, who will use it and the affected persons.


Articles 8-22

This section provides an overview of the individual Articles 8-22 of Chapter III of the EU AI Act, which contain the core requirements imposed on providers of high-risk AI systems.

By way of example, for high-risk AI systems used for remote biometric identification, referred to in point 1(a) of Annex III, Article 12(3) requires the logging capabilities to provide, at a minimum:

    a. "recording of the period of each use of the system (start date and time and end date and time of each use);

    b. the reference database against which input data has been checked by the system;

    c. the input data for which the search led to a match;

    d. the identification of the natural persons involved in the verification of the results, as referred to in Article 14(5)."

Similarly, under Article 14(3), human oversight is to be ensured through one or both of the following types of measures:

    a. "measures identified and built, when technically feasible, into the high-risk AI system by the provider before it is placed on the market or put into service; or

    b. measures identified by the provider before placing the high-risk AI system on the market or putting it into service and that are appropriate to be implemented by the deployer."



Annex

Articles 8-15 comprise Section 2 of Chapter III and Articles 16-27 make up Section 3. If we treat Articles 8-22 as containing the core requirements on high-risk AI providers, it is possible to break down the requirements as set out in the table below. Please note that our assessment of the obligations applying to the use of high-risk AI systems by actors other than providers will follow in Part IV of this series.

| Article | Links with | Requirement type |
| --- | --- | --- |
| Article 8: Compliance with the requirements | Section 2 | Organizational process |
| Article 9: Risk-management system | Articles 72, 13 and 60 | Documentation and system design |
| Article 10: Data and data governance | None | System design |
| Article 11: Technical documentation | Annex IV | Documentation |
| Article 12: Record keeping | Articles 79, 72 and 26 | Documentation and system design |
| Article 13: Transparency and provision of information to deployers | Articles 12, 14 and 15 | System design |
| Article 14: Human oversight | Annex III | System design |
| Article 15: Accuracy, robustness and cybersecurity | None | System design |
| Article 16: Obligations of providers of high-risk AI systems | All of Section 2 of Chapter III, Articles 17-20, Articles 43, 47-49 | Organizational, documentation, system design and regulatory |
| Article 17: Quality-management systems | Articles 9, 72 and 73 | Documentation |
| Article 18: Document keeping | Articles 11, 17 and 47 | Documentation |
| Article 19: Automatically generated logs | Article 12 | Documentation |
| Article 20: Corrective actions and duty of information | Article 79 | Regulatory |
| Article 21: Cooperation with competent authorities | Article 12 | Regulatory |
| Article 22: Authorised representatives | Section 2, Articles 47 and 49 | Regulatory |


Additional resources


Top 10 operational impacts of the EU AI Act

Coming Soon

  • Part 7: AI Assurance across the risk categories
  • Part 8: Post-market monitoring, information sharing, and enforcement
  • Part 9: Regulatory implementation and application alongside EU digital strategy
  • Part 10: Leveraging GDPR compliance

