
In scope or not? An EU AI Act decision tree and obligations


As with a colorblindness hue test, if you stare at the new version of the EU Artificial Intelligence Act for long enough, some patterns form (or maybe you are just losing your vision or your mind). Below is a decision tree to help assess whether you fall in scope.

Is it an AI system?

Per the pending legislation, an AI system is "a machine-based system that is designed to operate with varying levels of autonomy and that can, for explicit or implicit objectives, generate outputs such as predictions, recommendations or decisions that influence physical or virtual environments." And it is not one exclusively for military use.

To answer, take a close look at your product and service offerings, work through the steps below and determine whether the definition applies (a rough first-pass screen is sketched after the list).

  1. Spell out every process, product and component of a product that involves an automated machine-based system.
  2. Spell out the nature and degree of automation in each case.
  3. Spell out the degree of human involvement, if any, in each process.
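
If it helps to keep that inventory consistent, the three steps can be captured in a simple record and a first-pass screen against the definition. The sketch below is purely illustrative: the field names, the autonomy labels and the screening logic are bookkeeping assumptions, not terms from the Act, and a positive result only flags a system for proper legal review.

```python
from dataclasses import dataclass

@dataclass
class AutomatedSystem:
    """One entry in your inventory of automated, machine-based systems.

    Field names are illustrative, not taken from the Act.
    """
    name: str
    is_machine_based: bool          # step 1: automated machine-based system?
    degree_of_autonomy: str         # step 2: e.g., "none", "partial", "high"
    human_involvement: str          # step 3: e.g., "human in the loop", "none"
    generates_outputs: bool         # predictions, recommendations or decisions
    influences_environment: bool    # physical or virtual environments
    exclusively_military: bool = False

def may_be_ai_system(s: AutomatedSystem) -> bool:
    """Rough screen against the Act's definition of an 'AI system'.

    A positive result only flags the system for closer legal review.
    """
    return (
        s.is_machine_based
        and s.degree_of_autonomy != "none"
        and s.generates_outputs
        and s.influences_environment
        and not s.exclusively_military
    )
```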

Is it a foundation model?

Under the law, a foundation model is an AI model that is trained on broad data at scale, is designed for generality of output and can be adapted to a wide range of distinctive tasks.

Follow the same process as above to determine whether this definition applies to your products.

Is it a high-risk AI system?

Under the proposed EU AI Act, a high-risk AI system is an AI system that is intended to be used as a safety component of a product, or is itself a product, covered by the EU's harmonization legislation listed in Annex II, where that product (or the AI system itself as a product) is required to undergo a third-party conformity assessment related to health and safety risks with a view to placing it on the market or putting it into service pursuant to that harmonization legislation.

It is also a system listed under Annex III if it poses a significant risk of harm to the health, safety or fundamental rights of natural persons and, where an AI system falls under Annex III point 2, the environment.

Annex III includes AI systems that are biometrics or biometrics-based (i.e., identification, inferences); for management or operation of critical infrastructure; education and vocational training; employment, workers management and access to self-employment; access to and enjoyment of essential private services and public services and benefits; law enforcement; migration, asylum and border control management; or administration of justice and democratic processes.
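
For triage purposes, the Annex III areas above can be kept as a simple lookup. The sketch below assumes each system in your inventory has already been tagged with the area it operates in; the area labels are informal paraphrases of Annex III, not its exact wording, and the Annex II product-legislation route is a separate check that is not modelled here.

```python
# Informal paraphrases of the Annex III areas listed above.
ANNEX_III_AREAS = {
    "biometrics_or_biometrics_based",
    "critical_infrastructure",
    "education_vocational_training",
    "employment_worker_management_self_employment",
    "essential_private_and_public_services",
    "law_enforcement",
    "migration_asylum_border_control",
    "justice_and_democratic_processes",
}

def may_be_high_risk(area: str, significant_risk: bool) -> bool:
    """First-pass Annex III screen: an in-scope area plus a significant risk
    to health, safety or fundamental rights (and, for critical infrastructure
    under Annex III point 2, the environment)."""
    return area in ANNEX_III_AREAS and significant_risk
```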

Follow-up questions

Spell out every stand-alone process, product and component of a product that involves one or more automated machine-based systems and, for each of them, answer the following (a simple record format for the answers is sketched after the list):

  1. What is the process/product?
  2. What is the automation? What is the output of the automation?
  3. Is there any human involvement in the process? What is it?
  4. What are the processes or products in which the automation is incorporated?
  5. What is the impact of the automation?
  6. What are the possible consequences or risks from the automation or decision?
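
One way to keep the answers comparable across teams is to record them in a fixed structure per system. The sketch below simply mirrors the six questions; nothing about the field names or the structure is prescribed by the Act.

```python
from dataclasses import dataclass, field

@dataclass
class AISystemAssessment:
    """Answers to the six follow-up questions for one automated system.
    Purely an internal record-keeping aid, not something the Act prescribes."""
    process_or_product: str        # 1. what is the process/product?
    automation_and_output: str     # 2. what is automated, and what does it output?
    human_involvement: str         # 3. any human involvement, and what is it?
    incorporated_into: str         # 4. processes or products the automation sits in
    impact: str                    # 5. impact of the automation
    risks: list[str] = field(default_factory=list)  # 6. possible consequences or risks
```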

What is your role with respect to the system?

If your system falls in scope of the legislation, you have obligations dependent on your role. Ask yourself these questions:

Are you placing the product on the market? This includes any supply of an AI system for distribution or use on the EU market in the course of a commercial activity, whether in return for payment or free of charge.

Do you provide the product? This means you are "a natural or legal person, public authority, agency or other body that develops an AI system or that has an AI system developed with a view to placing it on the market or putting it into service under its own name or trademark, whether for payment or free of charge." You would also be deemed a provider if your name or trademark is on a high-risk system already on the market, you make substantial modifications to a high-risk system already on the market or you make substantial modifications such that the system becomes high risk.

Are you a manufacturer? This is not a defined term in the legislation and the obligations are the same as those of a provider.

Do you deploy the product? This is any natural or legal person, public authority, agency or other body using an AI system under its authority, except where the AI system is used in the course of a personal nonprofessional activity.

The proposed law would apply to a provider with a main establishment outside the EU where either member state law applies by virtue of public international law or the output produced by the system is intended to be used in the EU; to a provider or deployer located in the EU, even if it places systems on the market outside the EU; and to a deployer of the system with an establishment in the EU.

To answer this, and to assess whether you fit into the other roles, assess the nature of your EU establishment. Is it a subsidiary, reseller, distributor or employee? Provide information on the role of the EU entity, its relation to you and what it does.

For your EU entity, assess whether you are an importer of the product. This is "any natural or legal person established in the (European) Union that places on the market or puts into service an AI system that bears the name or trademark of a natural or legal person established outside the Union."

Finally ask whether you are a distributor of the product. This is "any natural or legal person in the supply chain, other than the provider or the importer, that makes an AI system available on the Union market without affecting its properties."
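
Because the obligations follow the role, it can help to make the role determination explicit for each system you have inventoried. The sketch below strings the questions above together in a fixed order; the helper names and the simplified conditions are assumptions for illustration, and any real determination needs review against the full definitions (including the provider triggers around trademarks and substantial modification, which are not modelled here).

```python
from enum import Enum

class Role(Enum):
    PROVIDER = "provider"        # develops, or has developed, under its own name or trademark
    DEPLOYER = "deployer"        # uses the system under its authority
    IMPORTER = "importer"        # EU entity placing a non-EU provider's system on the market
    DISTRIBUTOR = "distributor"  # makes the system available without affecting its properties
    OUT_OF_SCOPE = "out of scope"

def determine_role(
    develops_under_own_name: bool,
    uses_under_own_authority: bool,
    eu_entity_imports_non_eu_system: bool,
    supplies_without_modification: bool,
) -> Role:
    """Simplified, first-pass role triage based on the questions above."""
    if develops_under_own_name:
        return Role.PROVIDER      # manufacturers carry the same obligations as providers
    if eu_entity_imports_non_eu_system:
        return Role.IMPORTER
    if supplies_without_modification:
        return Role.DISTRIBUTOR
    if uses_under_own_authority:
        return Role.DEPLOYER
    return Role.OUT_OF_SCOPE
```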

Once you have all the AI aspects and your role with respect to them mapped out, proceed to assessing your unique obligations under the Act.


4 Comments


  • John Kropf • Jun 15, 2023
    Great article. Curious, though, what was the context for adding the sentence, "And it is not one exclusively for military use."?
  • Camille Talbot • Jun 16, 2023
    Thank you, great article, super precise, very useful!
  • Ilia Dubovtsev • Jun 20, 2023
    Thank you so much, Odia. A good guide to the AI Act: definitions and key roles.
  • Frederika Fransonette De Koster • Jul 2, 2023
    Thank you Odia, this is very insightful.