Table of contents

Chapter III: High-Risk AI Systems

Section 1: Classification of AI Systems as High-Risk

Article 6: Classification Rules for High-Risk AI Systems

Article 7: Amendments to Annex III

Section 2: Requirements for High-Risk AI Systems

Article 8: Compliance with the Requirements

Article 9: Risk Management System

Article 10: Data and Data Governance

Article 11: Technical Documentation

Article 12: Record-Keeping

Article 13: Transparency and Provision of Information to Deployers

Article 14: Human Oversight

Article 15: Accuracy, Robustness and Cybersecurity

Section 3: Obligations of Providers and Deployers of High-Risk AI Systems and Other Parties

Article 16: Obligations of Providers of High-Risk AI Systems

Article 17: Quality Management System

Article 18: Documentation Keeping

Article 19: Automatically Generated Logs

Article 20: Corrective Actions and Duty of Information

Article 21: Cooperation with Competent Authorities

Article 22: Authorised Representatives of Providers of High-Risk AI Systems

Article 23: Obligations of Importers

Article 24: Obligations of Distributors

Article 25: Responsibilities Along the AI Value Chain

Article 26: Obligations of Deployers of High-Risk AI Systems

Article 27: Fundamental Rights Impact Assessment for High-Risk AI Systems

Section 4: Notifying Authorities and Notified Bodies

Article 28: Notifying Authorities

Article 29: Application of a Conformity Assessment Body for Notification

Article 30: Notification Procedure

Article 31: Requirements Relating to Notified Bodies

Article 32: Presumption of Conformity with Requirements Relating to Notified Bodies

Article 33: Subsidiaries of and Subcontracting by Notified Bodies

Article 34: Operational Obligations of Notified Bodies

Article 35: Identification Numbers and Lists of Notified Bodies Designated Under this Regulation

Article 36: Changes to Notifications

Article 37: Challenge to the Competence of Notified Bodies

Article 38: Coordination of Notified Bodies

Article 39: Conformity Assessment Bodies of Third Countries

Section 5: Standards, Conformity Assessment, Certificates, Registration

Article 40: Harmonised Standards and Standardisation Deliverables

Article 41: Common Specifications

Article 42: Presumption of Conformity with Certain Requirements

Article 43: Conformity Assessment

Article 44: Certificates

Article 45: Information Obligations of Notified Bodies

Article 46: Derogation from Conformity Assessment Procedure

Article 47: EU Declaration of Conformity

Article 48: CE Marking

Article 49: Registration

Chapter IX: Post-Market Monitoring, Information Sharing and Market Surveillance

Section 1: Post-Market Monitoring

Article 72: Post-Market Monitoring by Providers and Post-Market Monitoring Plan for High-Risk AI Systems

Section 2: Sharing of Information on Serious Incidents

Article 73: Reporting of Serious Incidents

Section 3: Enforcement

Article 74: Market Surveillance and Control of AI Systems in the Union Market

Article 75: Mutual Assistance, Market Surveillance and Control of General Purpose AI Systems

Article 76: Supervision of Testing in Real World Conditions by Market Surveillance Authorities

Article 77: Powers of Authorities Protecting Fundamental Rights

Article 78: Confidentiality

Article 79: Procedure for Dealing with AI Systems Presenting a Risk at National Level

Article 80: Procedure for Dealing with AI Systems Classified by the Provider as Not High-Risk in Application of Annex III

Article 81: Union Safeguard Procedure

Article 82: Compliant AI Systems Which Present a Risk

Article 83: Formal Non-Compliance

Article 84: Union AI Testing Support Structures

Section 4: Remedies

Article 85: Right to Lodge a Complaint with a Market Surveillance Authority

Article 86: Right to Explanation of Individual Decision-Making

Article 87: Reporting of Breaches and Protection of Reporting Persons

Section 5: Supervision, Investigation, Enforcement and Monitoring in Respect of Providers of General Purpose AI Models

Article 88: Enforcement of Obligations on Providers of General Purpose AI Models

Article 89: Monitoring Actions

Article 90: Alerts of Systemic Risks by the Scientific Panel

Article 91: Power to Request Documentation and Information

Article 92: Power to Conduct Evaluations

Article 93: Power to Request Measures

Article 94: Procedural Rights of Economic Operators of the General Purpose AI Model

Annexes

Article 74: Market Surveillance and Control of AI Systems in the Union Market

1. Regulation (EU) 2019/1020 shall apply to AI systems covered by this Regulation. For the purposes of the effective enforcement of this Regulation:

(a) any reference to an economic operator under Regulation (EU) 2019/1020 shall be understood as including all operators identified in Article 2(1) of this Regulation;

(b) any reference to a product under Regulation (EU) 2019/1020 shall be understood as including all AI systems falling within the scope of this Regulation.

2. As part of their reporting obligations under Article 34(4) of Regulation (EU) 2019/1020, the market surveillance authorities shall report annually to the Commission and relevant national competition authorities any information identified in the course of market surveillance activities that may be of potential interest for the application of Union law on competition rules. They shall also annually report to the Commission about the use of prohibited practices that occurred during that year and about the measures taken.

3. For high-risk AI systems related to products covered by the Union harmonisation legislation listed in Section A of Annex I, the market surveillance authority for the purposes of this Regulation shall be the authority responsible for market surveillance activities designated under those legal acts.

By derogation from the first subparagraph, and in appropriate circumstances, Member States may designate another relevant authority to act as a market surveillance authority, provided they ensure coordination with the relevant sectoral market surveillance authorities responsible for the enforcement of the Union harmonisation legislation listed in Annex I.

4. The procedures referred to in Articles 79 to 83 of this Regulation shall not apply to AI systems related to products covered by the Union harmonisation legislation listed in section A of Annex I, where such legal acts already provide for procedures ensuring an equivalent level of protection and having the same objective. In such cases, the relevant sectoral procedures shall apply instead.

5. Without prejudice to the powers of market surveillance authorities under Article 14 of Regulation (EU) 2019/1020, for the purpose of ensuring the effective enforcement of this Regulation, market surveillance authorities may exercise the powers referred to in Article 14(4), points (d) and (j), of that Regulation remotely, as appropriate.

6. For high-risk AI systems placed on the market, put into service, or used by financial institutions regulated by Union financial services law, the market surveillance authority for the purposes of this Regulation shall be the relevant national authority responsible for the financial supervision of those institutions under that legislation in so far as the placing on the market, putting into service, or the use of the AI system is in direct connection with the provision of those financial services.

7. By way of derogation from paragraph 6, in appropriate circumstances, and provided that coordination is ensured, another relevant authority may be identified by the Member State as market surveillance authority for the purposes of this Regulation. National market surveillance authorities supervising credit institutions regulated under Directive 2013/36/EU, which are participating in the Single Supervisory Mechanism established by Regulation (EU) No 1024/2013, should report, without delay, to the European Central Bank any information identified in the course of their market surveillance activities that may be of potential interest for the prudential supervisory tasks of the European Central Bank specified in that Regulation.

8. For high-risk AI systems listed in point 1 of Annex III to this Regulation, in so far as the systems are used for law enforcement purposes, border management and justice and democracy, and for high-risk AI systems listed in points 6, 7 and 8 of Annex III to this Regulation, Member States shall designate as market surveillance authorities for the purposes of this Regulation either the competent data protection supervisory authorities under Regulation (EU) 2016/679 or Directive (EU) 2016/680, or any other authority designated pursuant to the same conditions laid down in Articles 41 to 44 of Directive (EU) 2016/680. Market surveillance activities shall in no way affect the independence of judicial authorities, or otherwise interfere with their activities when acting in their judicial capacity.

9. Where Union institutions, bodies, offices or agencies fall within the scope of this Regulation, the European Data Protection Supervisor shall act as their market surveillance authority, except in relation to the Court of Justice of the European Union acting in its judicial capacity.

10. Member States shall facilitate coordination between market surveillance authorities designated under this Regulation and other relevant national authorities or bodies which supervise the application of Union harmonisation legislation listed in Annex I, or in other Union law, that might be relevant for the high-risk AI systems referred to in Annex III.

11. Market surveillance authorities and the Commission shall be able to propose joint activities, including joint investigations, to be conducted by either market surveillance authorities or market surveillance authorities jointly with the Commission, that have the aim of promoting compliance, identifying non-compliance, raising awareness or providing guidance in relation to this Regulation with respect to specific categories of high-risk AI systems that are found to present a serious risk across two or more Member States in accordance with Article 9 of Regulation (EU) 2019/1020. The AI Office shall provide coordination support for joint investigations.

12. Without prejudice to the powers provided for under Regulation (EU) 2019/1020, and where relevant and limited to what is necessary to fulfil their tasks, the market surveillance authorities shall be granted full access by providers to the documentation as well as the training, validation and testing data sets used for the development of high-risk AI systems, including, where appropriate and subject to security safeguards, through application programming interfaces (API) or other relevant technical means and tools enabling remote access.

13. Market surveillance authorities shall be granted access to the source code of the high-risk AI system upon a reasoned request and only when both of the following conditions are fulfilled:

(a) access to source code is necessary to assess the conformity of a high-risk AI system with the requirements set out in Chapter III, Section 2; and

(b) testing or auditing procedures and verifications based on the data and documentation provided by the provider have been exhausted or proved insufficient.

14. Any information or documentation obtained by market surveillance authorities shall be treated in accordance with the confidentiality obligations set out in Article 78.

The text used in this tool is the ‘Artificial Intelligence Act, Corrigendum, 19 April 2024’. Interinstitutional File: 2021/0106(COD)