Cambridge Commentary on EU General-Purpose AI Law

Chapter V
Authorised representatives of providers of general-purpose AI models
Hannes Bastians

AI Act provision

Article 54: Authorised representatives of providers of general-purpose AI models

  1. Prior to placing a general-purpose AI model on the Union market, providers established in third countries shall, by written mandate, appoint an authorised representative which is established in the Union.
  2. The provider shall enable its authorised representative to perform the tasks specified in the mandate received from the provider.
  3. The authorised representative shall perform the tasks specified in the mandate received from the provider. It shall provide a copy of the mandate to the AI Office upon request, in one of the official languages of the institutions of the Union. For the purposes of this Regulation, the mandate shall empower the authorised representative to carry out the following tasks:
    (a) verify that the technical documentation specified in Annex XI has been drawn up and all obligations referred to in Article 53 and, where applicable, Article 55 have been fulfilled by the provider;
    (b) keep a copy of the technical documentation specified in Annex XI at the disposal of the AI Office and national competent authorities, for a period of 10 years after the general-purpose AI model has been placed on the market, and the contact details of the provider that appointed the authorised representative;
    (c) provide the AI Office, upon a reasoned request, with all the information and documentation, including that referred to in point (b), necessary to demonstrate compliance with the obligations in this Chapter;
    (d) cooperate with the AI Office and competent authorities, upon a reasoned request, in any action they take in relation to the general-purpose AI model, including when the model is integrated into AI systems placed on the market or put into service in the Union.
  4. The mandate shall empower the authorised representative to be addressed, in addition to or instead of the provider, by the AI Office or the competent authorities, on all issues related to ensuring compliance with this Regulation.
  5. The authorised representative shall terminate the mandate if it considers or has reason to consider the provider to be acting contrary to its obligations pursuant to this Regulation. In such a case, it shall also immediately inform the AI Office about the termination of the mandate and the reasons therefor.
  6. The obligation set out in this Article shall not apply to providers of general-purpose AI models that are released under a free and open-source licence that allows for the access, usage, modification, and distribution of the model, and whose parameters, including the weights, the information on the model architecture, and the information on model usage, are made publicly available, unless the general-purpose AI models present systemic risks.

Recitals

Recital 82

To enable enforcement of this Regulation and create a level playing field for operators, and, taking into account the different forms of making available of digital products, it is important to ensure that, under all circumstances, a person established in the Union can provide authorities with all the necessary information on the compliance of an AI system. Therefore, prior to making their AI systems available in the Union, providers established in third countries should, by written mandate, appoint an authorised representative established in the Union. This authorised representative plays a pivotal role in ensuring the compliance of the high-risk AI systems placed on the market or put into service in the Union by those providers who are not established in the Union and in serving as their contact person established in the Union.

Note: The recital refers explicitly only to AI systems, not to general-purpose AI (“GPAI”) models. As Article 54 is similar to Article 22 (which Recital 82 addresses), Recital 82 might still be useful as guidance for the interpretation of Article 54.

Select bibliography

  • Clemens Bernsteiner & Thomas Reiner Schmitt, KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck, in KI-VO: Verordnung über Künstliche Intelligenz (Christiane Wendehorst & Mario Martini eds., 2nd ed., C.H. Beck 2026).
  • Michael Beurskens, KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck, in KI-VO: Verordnung über künstliche Intelligenz (David Bomhard, Fritz-Ulli Pieper & Susanne Wende eds., Fachmedien Recht und Wirtschaft 2025).
  • Timo Bosman, KI-Systeme von Nicht-EU-Anbietern – Wertschöpfungskette, Pflichten, Anbieterfiktion (2025) Kommunikation und Recht 217.
  • Adrian Schneider, KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck, in Beck’scher Online-Kommentar KI-Recht (Jens Schefzig & Robert Kilian eds., C.H. Beck 2026).

Commentary

1. General remarks

1.1. Introduction

1 Article 54 serves the purpose of enabling and ensuring enforcement of the AI Act – particularly Articles 53 and 55 – in cases in which the provider of a GPAI model is not established in the Union.1 In those cases, the authorised representative functions as a point of contact for the authorities and can provide them with the necessary information from within the Union.

2 A provision containing rules on the appointment of authorised representatives by GPAI model providers was not included in earlier drafts of the AI Act. This can be attributed to the fact that provisions concerning GPAI in general were incorporated into the legislative text for the first time only during the trilogue negotiations. By contrast, the provision governing the obligation of providers of high-risk AI systems established in third countries to appoint an authorised representative (Article 22), as well as Recital 82 relating to Article 22 and the definition of ‘authorised representative’ in Article 3(5), were already included in the Commission’s first proposal.2 The latter two underwent comparatively minor changes during the legislative process: from Recital 82, an addition proposed by the Commission and the Council (‘where an importer cannot be identified’) was deleted, while only part of the clarification proposed by the Council, which emphasises the ‘pivotal role’ of the authorised representative, was incorporated. The definition of the term ‘authorised representative’ also underwent minor changes: compared to the Commission proposal and the European Parliament’s mandate, it was added that the designated representative must also accept the written mandate. The provision now set out in Article 22 was subject to more substantial amendments. It is readily apparent that the structure and wording of Article 54 have drawn on Article 22. Nevertheless, certain differences between the two provisions can be identified and will be addressed below to the extent they are relevant for interpreting Article 54. It remains partly unclear, however, whether these divergences were intentional or rather the result of the time constraints and political complexity of the trilogue negotiations. Arguments based on the differing wording of the two provisions should therefore be treated with caution.

3 Systematically, the provision is most closely interlinked with Articles 53 and 55, which, together with Article 54, fall under Chapter V on ‘General-Purpose AI Models’.

4 The provision addresses providers established in third countries placing their GPAI models on the Union market, as well as their authorised representatives. According to Article 54(6), providers of general-purpose AI models without systemic risk that are released under a free and open-source licence are exempt from the obligations laid down in Article 54. The provision has been in force since 2 August 2025 (Article 113(3)(b)).

1.2. Article structure overview

5 Article 54 comprises six paragraphs. Paragraph (1) stipulates the core obligation for providers of GPAI models established in third countries to appoint an authorised representative prior to placing their model on the Union market. Paragraph (2) provides that GPAI model providers established in third countries must enable the authorised representative to perform the tasks specified in the mandate. Further details on the scope of these tasks are set out in paragraphs (3) and (4). Pursuant to paragraph (5), the authorised representative must terminate its mandate under certain conditions. Finally, paragraph (6) lays down an exemption from Article 54(1) for GPAI models released under a free and open-source licence, subject to a counter-exception for GPAI models with systemic risk. The commentary on the substance of the provision will follow the aforementioned structure.3

1.3. Authorised representatives in EU law

6Provisions governing the appointment of representatives by actors established in third countries are not a novelty introduced by the AI Act within European law. The idea underlying rules on the appointment of representatives is to ensure efficient enforcement of the respective legal acts by guaranteeing the accessibility and availability of actors not established in the EU through simplified and shortened communication channels.4

7Rules comparable to Article 54 can be found in European digital law in Article 27 GDPR5, Article 17 Terrorist Content Online Regulation (“TCOR”)6, Articles 11(3) and 19(3) Data Governance Act (“DGA”)7, Article 26(3) NIS2 Directive8, Article 13 Digital Services Act (“DSA”)9 and Article 31 European Health Data Space Regulation (“EHDSR”)10. The concept of the representative is also familiar from product safety law as well as market surveillance and product liability law; for instance, it appears in Article 11 Medical Device Regulation (“MDR”)11, Article 8(1)(c)(ii) and (iii) Product Liability Directive (“PLD”)12 and Article 5 Market Surveillance Regulation (“MSR”)13. Although these respective legal acts evidently pursue different objectives and regulate distinct subject matters, cross-instrument systematic considerations may nevertheless offer valuable insights for interpreting Article 54 AI Act, since the respective provisions on the appointment of authorised representatives are ultimately all based on a common underlying principle. For this reason, this chapter will, at times, refer to some of the provisions listed above.14

1.4. Enforcement of Article 54

8The consequences arising from breaches of obligations by either the authorised representative or the represented provider are not regulated in Article 54 but are nonetheless of considerable relevance. Where a provider of a GPAI model established in a third country fails to comply with its obligation to appoint an authorised representative, it faces, pursuant to Article 101(1)(a), an administrative fine of up to 3% of its total worldwide annual turnover in the preceding financial year, or €15m, whichever is higher. Article 101 of the AI Act does not (expressly) provide for the liability of the provider for breaches committed by the authorised representative. Instead, the wording of Article 101 AI Act addresses only infringements committed by the provider itself (‘when the Commission finds that the provider intentionally or negligently …’).

9Moreover, according to the wording of Article 99(4)(b), penalties against the representative itself seem to be envisaged solely with regard to infringements committed by authorised representatives mandated pursuant to Article 22. An identical provision does not exist with respect to the representative mandated pursuant to Article 54 – neither in Article 99 nor in Article 101. One could argue that Article 99(5) may be applicable. This stipulates that administrative fines of up to €7.5m or, where the offender is an undertaking, up to 1% of its total worldwide annual turnover for the preceding financial year may be imposed where ‘incorrect, incomplete or misleading information’ is supplied to notified bodies or national competent authorities in reply to a request. However, Article 99 is not applicable to GPAI model providers and their representatives at all, so non-compliance by the representative cannot be fined under Article 99.

10Joint and several liability between the authorised representative and the provider is also not envisaged. Recital 56a of the Council’s proposal, which – with regard to the representative under Article 22 – states that ‘it is appropriate to make the authorised representative jointly and severally liable with the provider for defective high-risk AI systems’, was not incorporated into the final text, neither for the representative pursuant to Article 22 nor for the representative pursuant to Article 54. From a cross-instrument systematic perspective, this conclusion is also supported by a comparison with Article 11(5) MDR and Article 31(4) EHDSR, both of which expressly establish such joint and several liability (‘jointly and severally with’).15 Joint and several liability would additionally give rise to difficulties in practical implementation. Whilst Article 54(3)(a) tasks the representative with verifying that the provider fulfils its obligations under Article 53 and, where applicable, Article 55, it is unclear how a representative could ensure or enforce that the provider terminates an infringement of the AI Act.16 The representative would equally find it difficult to ensure that the provider would reimburse it for any fine payments made – not least because the provider is established in a third country.17

11Ultimately, therefore, the representative cannot be held liable under the AI Act if it is non-compliant with its obligations laid down in Article 54 or if the provider that appointed it is non-compliant with its obligations pursuant to Articles 53 and/or 55. This is especially problematic given that most providers of GPAI models with systemic risk will not be established in the Union. A provision similar to Article 99(4)(b), which allows penalties to be imposed on the Article 22 representative of high-risk AI system providers established in third countries, would have been helpful to incentivise compliance, at least with respect to providers of GPAI models with systemic risk.

12This raises the question of whether the provider can be penalised if the representative does not comply with its obligations under the AI Act. Looking at Article 101(1)(a), one could argue that this is not the case, since Article 101 only addresses the provider infringing the relevant provisions of the AI Act, and the provider’s obligation is merely to appoint the representative and enable it to perform its tasks. However, Article 54 would risk becoming a largely ineffective tool should it not be possible to hold the provider liable for non-compliance by the representative. From a policy perspective, it therefore seems plausible to argue that providers can be held liable for non-compliance by their representatives, which would also give them an incentive to carefully select a suitable and competent representative. Otherwise, the main goal of enabling enforcement would be contradicted.

13The Commission will be able to make use of Article 91(1) to request the provider to supply the information necessary to assess compliance with Article 54 – especially on who the representative is, its contact details and when it was mandated. A provision like Article 49(1) or (2), which mandates the registration of the representative in the database referred to in Article 71, does not exist for representatives of GPAI model providers.18 However, it seems clear that it lies in the very nature of the role of the representative as a point of contact that the appointment must be communicated to the AI Office.19 Without prior notification, the representative could not fulfil its role, the AI Office could not file a request in the sense of Article 54(3)(c) and the representative could not effectively cooperate with the AI Office in the sense of Article 54(3)(d).20

2. Substance

2.1. Article 54(1): Appointment of an authorised representative

2.1.1. Providers established in third countries

14The main requirement for the applicability of the obligation under Article 54(1) is that a provider of a GPAI model21 is established in a third country. Article 2(1) as well as Recital 21 et seq. show that ‘third country’ refers to all countries outside the EU.22 That means that, at the moment, GPAI model providers from Iceland, Liechtenstein and Norway must also appoint an authorised representative according to Article 54(1) prior to placing their model on the Union market. This situation will change, however, once the EEA countries decide to apply the AI Act, at which point they will no longer be considered third countries.23

2.1.2. Prior to placing a GPAI model on the Union market

15The GPAI model provider must appoint its authorised representative prior to placing its GPAI model on the Union market. Article 3(9) defines such placing on the market as ‘the first making available of an AI system or a general-purpose AI model on the Union market’. The making available on the market in turn is defined in Article 3(10) as ‘the supply of (…) a general-purpose AI model for distribution or use on the Union market in the course of a commercial activity, whether in return for payment or free of charge’.

16The AI Act does not provide a more detailed definition of the term ‘prior’. Looking at the purpose of the provision, it might be arguable that any moment prior to making the GPAI model available is sufficient, since it is only from that point in time that a contact person for the authorities in the EU is needed. Nevertheless, it is advisable for providers to appoint a representative as early as possible, since the AI Office will encourage close informal cooperation with providers already during the training phase of the model in order to facilitate compliance and to ensure timely placement on the market.24

17It should be noted that an actor may become a provider – and thus be subject to the obligation under Article 54 – even if they have not developed the original model but ‘modified or fine-tuned [it] into new models’.25 The Commission guidelines suggest that a downstream modifier of a model becomes the provider of the model ‘if the modification leads to a significant change in the model’s generality, capabilities or systemic risk’26 – one indication being ‘that the training compute used for the modification is greater than a third of the training compute of the original model’.27

2.1.3. Appointing an authorised representative established in the Union

18 Third-country providers of GPAI models must appoint an authorised representative that is established in the Union. The term ‘authorised representative’ is defined in Article 3(5) as ‘a natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to, respectively, perform and carry out on its behalf the obligations and procedures established by this Regulation’.

19 The authorised representative must be a natural or legal person.28

20 The representative can also be a subsidiary or parent undertaking that is not itself involved in making the GPAI model available on the Union market.29 No provision of the AI Act prohibits this, and ultimately there seems to be no reason to do so. It can even be argued that subsidiary and parent undertakings may be best suited to the role of point of contact in the EU because they are closely linked to the GPAI model provider itself.30 From a cross-instrument viewpoint, one can add that Recital 44 DSA also allows the representative to be a subsidiary or parent undertaking. It is equally permissible to appoint a downstream provider as the GPAI model provider’s authorised representative, even though this could lead to conflicts of interest in some cases.31

21 Under the definition of Article 3(5), an authorised representative must be ‘located or established’ in the Union, while Article 54(1) requires the provider to appoint an authorised representative which is ‘established in the Union’. This raises the question of whether these are different requirements. According to the wording of Article 3(5), the first alternative appears to refer to the physical location of the representative, while the second refers to its registered office.32 Support for this may also be found in the legislative history: the addition ‘physically present’ in the Council’s proposal was changed to ‘located’ during the trilogue negotiations.33 However, it remains unclear why Article 54(1) refers only to the representative being ‘established’ in the Union, while the legal definition in Article 3(5) speaks of it being ‘located or established’ in the Union. On the one hand, this could be an oversight: the addition of ‘located’ may simply have been omitted from Article 54(1) under the time pressure and political complexity of the trilogue negotiations. On the other hand, Article 22 likewise refers only to the representative being ‘established’ in the Union and was not changed throughout the whole legislative process. Ultimately, what matters therefore seems to be whether the representative is ‘established’ in the Union.34 Following a purposive reading, this requirement would not be met, for example, where the representative is a mere letterbox company or an empty shell, or where the authorised representative only occasionally travels to the EU to perform its duties.35

22 The AI Act does not contain any express requirements regarding the professional competence and technical and legal expertise of the authorised representative. However, this is not unique to the AI Act: the vast majority of EU legal acts with provisions on representatives do not contain such requirements either.36 This initially suggests that third-country providers of GPAI models are, in this regard, generally free to decide whom they appoint as a representative.37 This understanding is, however, contradicted by the fact that the effective enforcement of the AI Act – the main purpose of the provision – cannot be guaranteed if the representative acts without any professional competence or expertise.38 It is also called into question by Article 54(3)(a), according to which one task of the representative is to verify the provider’s fulfilment of Article 53 and, if applicable, Article 55 – a task that cannot be performed without technical and legal knowledge. Article 15(6) MDR, which stipulates that the representative ‘shall have permanently and continuously at their disposal at least one person responsible for regulatory compliance who possesses the requisite expertise’, could serve as a model for future regulations or guidance.39

23 Additionally, the question arises as to whether one person can act as the authorised representative for several providers. The absence of any express provision to the contrary seems to be the main argument in favour of this possibility. In addition, a cross-instrument systematic argument supports the possibility of one representative being mandated by different providers: Article 11(1) MDR speaks of a ‘sole’ representative that needs to be appointed. This addition – not present in Article 54 AI Act – could be understood to mean that a representative may only act for one provider at a time. On the other hand, a clarification like the one found in Recital 44 DSA – that it ‘should be possible for a legal representative to be mandated, in accordance with national law, by more than one provider of intermediary services’ – is also absent from the AI Act. One could therefore argue that, in the absence of such a clarification, it is not permissible for a representative to act on behalf of several providers.40 However, the force of an argument based on the omission of a certain clarification in the recitals is limited by the fact that the recitals of the AI Act offer only very limited guidance for the interpretation of Article 54 in general. From a policy point of view, there are arguments both for and against allowing a single representative to act on behalf of several providers. Against such an option is the fact that it would allow one representative to respond to inquiries by authorities on behalf of competing providers and to manage the technical documentation of potential competitors, leading to confidentiality risks. On the other hand, one can argue that permitting one representative to be mandated by multiple providers serves the precise goal of the provision by centralising expertise, promoting compliance and ensuring enforcement across multiple providers.

24 Lastly, one could ask whether one person can act as the authorised representative for the same entity under several legal acts at the same time. Concerns in this regard could target confidentiality concentration risks as well as capacity problems. For example, a representative acting as such under the AI Act, the GDPR and a national law implementing the NIS2 Directive would have access to the model’s technical documentation, the provider’s DPIA and existing cybersecurity incident reports. This would significantly increase the risk of misuse and disclosure and make the representative an attractive target for malicious actors. In the first place, however, Article 54 does not expressly forbid a representative from acting in this role for the same person under several acts. Additionally, as the AI Act does not seem to impose any express requirements regarding the competence of the representative, one could argue that this also allows a representative to act in this role under several legal acts. A further argument in favour is that acting as a representative under several legal acts does not hinder the representative from serving as a point of contact in the Union.

2.1.4. By written mandate

25 Article 54(1) AI Act stipulates that the authorised representative shall be appointed by written mandate. Article 3(5) clarifies that this means that the authorised representative has ‘received and accepted a written mandate’ from the provider.41 More detailed requirements cannot be derived from the AI Act. However, the wording (‘received and accepted’) implies that a written contract must be concluded between the provider and the representative.42 This is also indicated by Article 54(5), which speaks of the termination of the mandate by the representative under certain circumstances.

26 However, the question arises of what exactly is meant by ‘written’. Here it seems useful to recall the purpose of the provision: any form of written communication is sufficient to establish an externally recognisable point of contact. If one does not see the purpose of the requirement in protecting against hasty decisions or verifying the authenticity of the mandate, simple text form should suffice – a handwritten signature would then not be necessary.43 An argument for this could be that the representative does not need to be protected from hasty decisions, because there is no joint liability along with the provider. However, it could be argued that at least a certain degree of integrity and authenticity as well as protection against hasty decisions must be ensured and that a signature is therefore necessary.44 If the latter view is taken, it remains open whether an electronic signature is sufficient. Against its sufficiency, one could point to the fact that the AI Act expressly mentions the possibility of electronic signatures in Article 47(1), but not in Article 54.45

2.2. Article 54(2): Enabling the representative to perform the tasks specified in the mandate

27 Article 54(2) stipulates that the provider shall ‘enable its authorised representative to perform the tasks specified in the mandate received from the provider’. At a minimum, this means that the provider must not hinder the representative in fulfilling its tasks.46 In addition, commentators seem to agree that the verb ‘enable’ must be understood as requiring active action on the part of the provider.47 The provider would therefore have to provide the representative with all resources necessary to fulfil its obligations under Article 54(3)–(5), including access to all necessary information and documents. Where the representative is a subsidiary of the provider, a failure to enable could be found, for example, if the subsidiary is subject to reconstruction proceedings, bankruptcy, or personal or corporate insolvency.48

2.3. Article 54(3): Tasks of the representative

2.3.1. Tasks specified in the mandate

28 According to Article 54(3), first sentence, the authorised representative shall ‘perform the tasks specified in the mandate received from the provider’. This formulation raises the question of whether there are limits to the transfer of tasks by the provider. At the outset, it is clear that the representative must be entrusted with the tasks specified in Article 54(3), third sentence. However, the AI Act does not indicate whether the transfer of further tasks beyond this is permissible; that is, whether the provision merely establishes a minimum threshold.49 From a cross-instrument systematic perspective, there are arguments against such permissibility. Other provisions governing the appointment of authorised representatives expressly state that the representative must be entrusted with ‘at least’ the further specified tasks (Article 31(2) EHDS; Article 11(3) MDR). The absence of such wording in the AI Act suggests that only the obligations set out in Article 54(3), third sentence, are to be transferred. This interpretation is, however, countered by the German version of the AI Act, which speaks of ‘zumindest’ (at least) the specified tasks.

29 Ultimately, there appears to be no reason why the representative could not be entrusted with additional powers, provided that certain upper limits are respected: although the AI Act does not contain a provision like Article 11(4) MDR, which expressly excludes certain specified tasks from being transferred to the representative, it can be assumed that certain core responsibilities under the AI Act are non-transferable and must remain with the provider.50 Indications supporting this view may be taken from the Commission’s Blue Guide on the implementation of the product rules 2022,51 which, in its general explanations on the New Legislative Framework, states that ‘the manufacturer may neither delegate the measures necessary to ensure that the manufacturing process assures compliance of the products nor the drawing up of technical documentation, unless otherwise provided for. Further, an authorised representative cannot modify the product on his own initiative.’ The precise boundary between permissible and impermissible delegation of tasks can hardly be determined in the abstract and will need to be assessed on a case-by-case basis. Considering the purpose of the rules on authorised representatives (Recital 82 with regard to high-risk AI systems: ‘enable enforcement’), the permissibility of delegating additional tasks should be examined in light of whether the delegation serves to enable the enforcement of the AI Act or pursues other objectives.

2.3.2. Providing a copy of the mandate to the AI Office

30 Article 54(3), second sentence, stipulates that the representative shall ‘provide a copy of the mandate to the AI Office upon request, in one of the official languages of the institutions of the Union’. Unlike Article 54(3)(c) (‘provide the AI Office, upon a reasoned request, with all the information and documentation…’), Article 54(3), second sentence, refers only to a ‘request’ by the AI Office and not to a ‘reasoned request’. Article 54 therefore does not require the request to be reasoned, and the representative must always provide a copy of the mandate.

31One could ask what exactly the representative has to provide when the AI Act speaks of ‘a copy of the mandate’, since the representative, according to Article 3(5), also needs to accept the mandate. In practice, the representative will likely send a signed copy to the AI Office. The official languages of the institutions of the Union, in one of which the copy of the mandate must be provided, are Bulgarian, Croatian, Czech, Danish, Dutch, English, Estonian, Finnish, French, German, Greek, Hungarian, Irish, Italian, Latvian, Lithuanian, Maltese, Polish, Portuguese, Romanian, Slovak, Slovenian, Spanish and Swedish.52 The fact that the copy must be provided in one of these languages does not, however, mean that the mandate itself must be drawn up in one of them – providing a translation to the AI Office is sufficient.53

32It remains unclear whether the entire copy of the mandate must be provided to the AI Office or whether certain information, such as the exact salary arrangements, can be redacted. The AI Act is silent on this point. A request by the AI Office for disclosure of the complete mandate will be justified, for example, if there are indications that the salary arrangements or other agreements prevent the representative from performing its tasks; in other words, if there is a concern that the provider is not ‘enabling’ the representative within the meaning of Article 54(2).54 Concluding contractual agreements that may obstruct the representative’s ability to fulfil its role might cause the provider to be in breach of Article 54(2). From a teleological perspective, this also follows from the purpose of Article 54 to ensure effective enforcement of the AI Act.

2.3.3. Specific tasks

33According to Article 54(3)(a), the representative shall ‘verify that the technical documentation specified in Annex XI has been drawn up and all obligations referred to in Article 53 and, where applicable, Article 55 have been fulfilled by the provider’. Neither Article 54 nor the recitals expressly provide further details on the exact content of this task. Accordingly, at first glance, it does not seem mandatory for the representative to carry out an on-site verification. If the representative can adequately fulfil its obligations in another way, it does not appear to be required to be present at the provider’s premises.55 However, some obligations laid down in Article 55 in particular indicate that it may sometimes be advisable for the representative to conduct on-site verification. One example is Article 55(1)(d), which stipulates that providers of GPAI models with systemic risk must ensure an adequate level of cybersecurity protection for the model itself as well as for the physical infrastructure of the model. The Code of Practice suggests that this also includes measures to prevent unauthorised physical access to sensitive working environments.56

34Apart from that, the exact obligations of the representative are not clear from Article 54(3)(a) either. It appears that the representative is not required to review the content of the technical documentation, but only to check that the documentation has been drawn up at all, as follows from the wording of Article 54(3)(a) (‘that [it] has been drawn up’). It seems conceivable, however, to require the representative to at least point out any obvious deficiencies.57 This is because, if such deficiencies exist, it could be argued that no technical documentation as ‘specified in Annex XI’ has been drawn up.58

35The question that then arises is what exactly is meant by the representative verifying that the obligations referred to in Articles 53 and 55 have been fulfilled. It stands out that Article 54(3)(a) expressly mentions the obligation to verify that the technical documentation has been drawn up, even though this already forms part of the obligations pursuant to Article 53 (Article 53(1)(a)), whose fulfilment the representative must verify anyway.59 On the one hand, one could understand the express clarification in Article 54(3)(a) as lex specialis to the general obligation to verify that the obligations in Articles 53 and 55 have been fulfilled. On this reading, only the verification threshold for the technical documentation would be comparatively low, while the representative would have to meet a higher threshold when verifying the fulfilment of the other obligations. On the other hand, the express mention could simply be an oversight without any particular meaning, in which case the requirements are the same.

36Ultimately, the meaning of verification can be understood in light of Article 54(5), which obliges the representative to terminate the mandate where it considers or has reason to consider that the provider is acting in breach of its obligations. Since the representative will have to terminate the mandate in such cases, there is no scope for a full investigation to establish that the provider really is non-compliant with Articles 53 and 55. It therefore seems more plausible that verification in this regard means that the representative must take reasonable measures that would enable it to become aware of reasons to consider that the provider is acting contrary to its obligations. Such measures could include requesting additional information, conducting interviews or inspecting the physical premises.

37A problem in this regard is that the representative cannot itself be held liable under the AI Act for non-compliance at all, especially since the joint liability between representative and provider proposed by the Council did not make it into the final legal text. Failing to take appropriate measures to verify compliance with Articles 53 and 55 will therefore not lead to fines being imposed on the representative under the regime of the AI Act.

38Article 54(3)(b) stipulates that the representative shall ‘keep a copy of the technical documentation specified in Annex XI at the disposal of the AI Office and national competent authorities, for a period of 10 years after the general-purpose AI model has been placed on the market, and the contact details of the provider that appointed the authorised representative’. No specific type or format is prescribed for the copy of the documentation, so any form seems sufficient, provided it enables the AI Office to ascertain the information that the technical documentation must contain under Annex XI.

39According to Article 54(3)(c), the representative shall ‘provide the AI Office, upon a reasoned request, with all the information and documentation, including that referred to in point (b), necessary to demonstrate compliance with the obligations’ in Chapter V (general-purpose AI models). Unlike Article 54(3), second sentence, Article 54(3)(c) refers not merely to a request but to a ‘reasoned request’. This implies that the requirements placed upon the request are higher than those under Article 54(3), second sentence. Nevertheless, these requirements cannot be set too high, given that, according to Recital 82 AI Act, it is precisely the function of the authorised representative to act as the contact point for the authorities. A look at the Blue Guide suggests that, in general, EU product law proceeds on the assumption that ‘for a request to be reasoned it is sufficient the market surveillance authority explains the context in which the information is requested (e.g. inspection on specific characteristics of the products, random checks, etc.)’.60 Even though the AI Act cannot be considered a purely product regulation, but rather also a model and entity regulation, it seems plausible to adopt this approach here.61 In any case, a request is to be regarded as reasoned where the requirements of Article 91(4) are met, namely that the request ‘shall state the legal basis and the purpose of the request, specify what information is required, set a period within which the information is to be provided, and indicate the fines provided for in Article 101 for supplying incorrect, incomplete or misleading information’. The general power of the Commission to request information is addressed in greater detail in Article 91.

40Article 54(3)(d) provides that the representative shall ‘cooperate with the AI Office and competent authorities, upon a reasoned request, in any action they take in relation to the general-purpose AI model, including when the model is integrated into AI systems placed on the market or put into service in the Union’. The clarification, found solely in Article 54(3)(d), that the representative must also cooperate with the AI Office where the model has been integrated into an AI system, could be interpreted a contrario to mean that the obligations not covered by Article 54(3)(d) – that is, those under Article 54(3)(a)-(c) – no longer apply once the model has been integrated into an AI system that has been placed on the market or put into service in the Union. However, this interpretation is countered by the consideration that providers of GPAI models are often best placed to assess the risks arising from the integration of the model into an AI system.62 Additionally, Recital 97 clarifies that the ‘specific rules for general-purpose AI models and for general-purpose AI models that pose systemic risks’ should ‘apply also when these models are integrated or form part of an AI system’.

2.4. Article 54(4): Representative as point of contact

41Article 54(4) additionally stipulates that the mandate shall ‘empower the authorised representative to be addressed, in addition to or instead of the provider, by the AI Office or the competent authorities, on all issues related to ensuring compliance with this Regulation’. In particular, this means that the Commission may request documentation or additional information pursuant to Article 91(1) from the representative instead of the provider. Furthermore, the structured dialogue pursuant to Article 91(2) can be initiated with the representative instead of the provider.

2.5. Article 54(5): Termination of the mandate by the representative

42Article 54(5) contains a provision on the termination of the mandate by the representative. The representative shall terminate its mandate ‘if it considers or has reason to consider the provider to be acting contrary to its obligations pursuant to this Regulation’. In such cases, the representative shall also ‘immediately inform the AI Office about the termination of the mandate and the reasons therefor’. The latter serves to make the AI Office aware of cases in which action on its part may be required.63

43The wording ‘considers or has reason to consider the provider to be acting contrary to its obligations’ seems at first glance to establish a very low threshold for terminating the mandate. Other regulations, such as Article 11(2)(h) MDR, use stricter wording (‘if the manufacturer acts contrary to its obligation’).64 The disjunction between ‘considers’ and ‘has reason to consider’ implies that the representative would have to terminate the mandate even if it does not itself consider that the provider is acting contrary to its obligations, but already where there are indications of a violation that would lead an objective third party to assume one.65 Since Article 54(5) does not qualify the breach in question, it could be argued that any, even minor, violation of the AI Act would suffice.

44However, this understanding is contradicted by the overall purpose of the provisions on authorised representatives Article 3(5) AI Act: ‘authorised representative’ means a natural or legal person located or established in the Union who has received and accepted a written mandate from a provider of an AI system or a general-purpose AI model to, respectively, perform and carry out on its behalf the obligations and procedures established by this Regulation. in general, as well by those set out in the AI Act specifically. According to Recital 82,66 the representative in the Union is intended precisely to ensure the enforcement of the AI Act and to serve as a contact point for authorities. Even if one could argue that the precise aim to ensure enforcement is fulfilled since the representative must inform the AI Office Article 3(47) AI Act: ‘AI Office’ means the Commission’s function of contributing to the implementation, monitoring and supervision of AI systems and general-purpose AI models, and AI governance, provided for in Commission Decision of 24 January 2024; references in this Regulation to the AI Office shall be construed as references to the Commission. about its reasons to terminate the mandate, this does not fully reflect the overall purpose of the provision. Firstly, the representative will likely never conduct a full investigation of a breach in question.67 Secondly, the role of the representative as a point of contact for ensuring the enforcement of the AI Act is particularly necessary in situations where a breach of the AI Act is at issue. 
An overly broad interpretation of the obligation to terminate the mandate would lead to the contradictory outcome that, precisely in situations where a contact person within the Union is needed most, no such person would be available, because the representative would have been required to terminate the mandate.68 Moreover, the representative does not face any penalties for non-compliance by the provider, and it is therefore reasonable to expect that the representative should not terminate the mandate at the slightest indication of a minor violation by the provider.69 There are three pathways to achieve this goal. One could, firstly, read the provision as if it said ‘if it considers or has reason to consider the provider to be acting significantly contrary to its obligations pursuant to this Regulation’.
Another way to avoid an overly broad interpretation would be to oblige the representative to terminate the mandate in all cases where it itself considers the provider to be acting contrary to its obligations, but to require stronger indications where there are merely ‘reasons to consider’ a violation and the representative is not itself convinced. Finally, one could interpret Article 54(5) as establishing merely the right, rather than the obligation, of the authorised representative to terminate its mandate.70

45 If one were to adhere strictly to a literal interpretation of Article 54(5), one possible partial solution could be the conclusion of mutual assistance treaties between the EU and relevant third countries, such as the US and the UK, offering the AI Office a path to achieve (some degree of) enforcement of the AI Act against providers established (or residing) beyond the AI Office’s territorial jurisdiction. Perhaps this could be achieved, or at least facilitated, by the AI Office’s participation in the global network of AI Safety Institutes.

46 It also seems problematic that Article 54 contains no provisions governing a change of authorised representative, as can be found in, for example, Article 12 MDR71 or Article 31(3) EHDS.72 Article 12 MDR provides that detailed arrangements in this regard are to be made between ‘the manufacturer, where practicable the outgoing authorised representative, and the incoming authorised representative’, addressing inter alia the transfer of documents as well as the obligation of the outgoing authorised representative, after the end of the mandate, to forward any complaints to the manufacturer or incoming authorised representative. Since no such provision exists in the AI Act, GPAI model providers established in third countries cannot be required to conclude such arrangements with their respective representatives. Nevertheless, it would be advisable for them to take account of the possibility of a change of representative in their respective mandates.73

47 The AI Act leaves open whether termination by the representative for reasons other than those specified in Article 54(5), as well as termination by the provider itself, is possible. Allowing termination in such circumstances would be consistent with the purpose underlying the rules on authorised representatives. In particular, the represented party should have the option of terminating the mandate if the representative fails to adequately fulfil its tasks so as to ensure the enforcement of the AI Act.74 If one interprets Article 54(5) as leaving the representative the possibility of termination on other grounds,75 it should be borne in mind that it may be difficult for providers established in third countries to appoint a new representative without delay. This argues against permitting termination without notice.

2.6. Article 54(6): Exception for free and open-source models

48 Article 54(6) provides for an exemption from the obligation on GPAI model providers established in third countries to appoint an authorised representative where the model is ‘released under a free and open-source licence that allows for the access, usage, modification, and distribution of the model, and whose parameters, including the weights, the information on the model architecture, and the information on model usage, are made publicly available’. A counter-exemption applies – meaning that GPAI model providers established in third countries must nonetheless appoint a representative – where the model is classified as a GPAI model with systemic risk. The exception in Article 54(6) makes sense insofar as, according to Article 53(2), providers of open-source models are themselves exempt from the obligations under Article 53(1)(a) and (b).76 However, since the representative’s tasks encompass verifying the GPAI model provider’s compliance with all obligations under Article 53, including those from which even open-source model providers are not exempt, one could ask whether the complete exception for open-source providers in Article 54(6) makes sense from a policy point of view, since even in those cases a representative could help to ensure enforcement.

49 Even though the wording of Article 54(6) differs slightly from that of Article 53(2), the Commission guidelines strongly suggest that both exemptions are identical in content. For a detailed explanation, refer to the commentary on Article 53(2).77

  1. See also Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024 laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) [2024] OJ L 1689/1 (“AI Act”) recital 82 regarding AI systems. ↩︎
  2. European Commission, ‘Proposal for a Regulation of the European Parliament and of the Council laying down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) COM (2021) 206 final’. ↩︎
  3. See Section 2. ↩︎
  4. Paul Voigt and Hannes Bastians, ‘Pflicht zur Benennung von EU-Vertretern im europäischen Digitalrecht – “Representative” als neues Mittel zur Rechtsdurchsetzung’ (2022) Multimedia und Recht 930. ↩︎
  5. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1. ↩︎
  6. Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online [2021] OJ L172/79. ↩︎
  7. Regulation (EU) 2022/868 of the European Parliament and of the Council of 30 May 2022 on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act) [2022] OJ L152/1. ↩︎
  8. Directive (EU) 2022/2555 of the European Parliament and of the Council of 14 December 2022 on measures for a high common level of cybersecurity across the Union, amending Regulation (EU) 910/2014 and Directive (EU) 2018/1972, and repealing Directive (EU) 2016/1148 (NIS 2 Directive) [2022] OJ L333/80. ↩︎
  9. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act) [2022] OJ L277/1. ↩︎
  10. Regulation (EU) 2025/327 of the European Parliament and of the Council of 11 February 2025 on the European Health Data Space and amending Directive 2011/24/EU and Regulation (EU) 2024/2847 (European Health Data Space) [2025] OJ L 63/1. ↩︎
  11. Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) 178/2002 and Regulation (EC) 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC (Medical Devices Regulation) [2017] OJ L117/1. ↩︎
  12. Directive (EU) 2024/2853 of the European Parliament and of the Council of 23 October 2024 on liability for defective products and repealing Council Directive 85/374/EEC [2024] OJ L2853/1. ↩︎
  13. Regulation (EU) 2019/1020 of the European Parliament and of the Council of 20 June 2019 on market surveillance and compliance of products and amending Directive 2004/42/EC and Regulations (EC) 765/2008 and (EU) 305/2011 [2019] OJ L169/1. ↩︎
  14. Also see forthcoming chapter on Interpreting the AI Act through Systematic Analogies in this work. ↩︎
  15. Adrian Schneider, ‘KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck’ in Jens Schefzig and Robert Kilian (eds.), Beck’scher Online-Kommentar KI-Recht (5th edn, C.H. Beck 2026) para 29; see also Timo Bosman, ‘KI-VO Art. 22 Bevollmächtigte der Anbieter von Hochrisiko-KI-Systemen’ in Jens Schefzig and Robert Kilian (eds.), Beck’scher Online-Kommentar KI-Recht (5th edn, C.H. Beck 2026) para 28; see similarly for article 13 DSA (‘Joint and several liability would require an explicit provision in the DSA’) Paul-John Loewenthal, ‘Article 13 Legal representatives’ in Folkert Wilman, Saulius Lukas Kalėda and Paul-John Loewenthal, The EU Digital Services Act: A Commentary (Oxford University Press 2024) para 13. ↩︎
  16. Cf. ibid; also see Michael Beurskens, ‘KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck’ in David Bomhard, Fritz-Ulli Pieper and Susanne Wende (eds.), KI-VO: Verordnung über künstliche Intelligenz (Fachmedien Recht und Wirtschaft 2025) para 22. ↩︎
  17. ibid. ↩︎
  18. From a cross-instrument perspective, one could also point to the fact that a provision like article 13(4) DSA, which mandates that providers shall actively notify the name, postal address, email address and telephone number of their legal representative to the relevant authority, cannot be found in the AI Act. ↩︎
  19. Clemens Bernsteiner and Rainer Schmitt, ‘KI-VO Art. 54 Bevollmächtigte der Anbieter von KI-Modellen mit allgemeinem Verwendungszweck’ in Mario Martini and Christiane Wendehorst (eds.), KI-VO: Verordnung über Künstliche Intelligenz (2nd ed, C.H. Beck, 2026) para 11. ↩︎
  20. ibid. ↩︎
  21. More detailed information on which actor is to be considered the provider of the GPAI model can be found in the Annex to the ‘Communication to the Commission – Approval of the content of the draft Communication from the Commission – Guidelines on the scope of the obligations for general-purpose AI models established by Regulation (EU) 2024/1689 (AI Act)’ C(2025) 5045 final paras 48–71. ↩︎
  22. Timo Bosman, ‘KI-Systeme von Nicht-EU-Anbietern – Wertschöpfungskette, Pflichten, Anbieterfiktion’ (2025) Kommunikation und Recht 217. ↩︎
  23. ibid. ↩︎
  24. Commission Guidelines (n 21) para 102. ↩︎
  25. See recital 97 AI Act. ↩︎
  26. Commission Guidelines (n 21) para 62. ↩︎
  27. In more detail, Commission Guidelines (n 21) paras 60–71. ↩︎
  28. See, more extensively, the analysis on Article 3(5) discussed in forthcoming commentary on Article 3 in this work. ↩︎
  29. In the same sense for article 13 DSA Loewenthal (n 15) para 3; also see Schneider (n 15) para 9. ↩︎
  30. Bosman (n 22) 219. ↩︎
  31. Beurskens (n 16) para 9; agreeing Schneider (n 15) para 9. ↩︎
  32. Jonathan Kirschke-Biller and Anna Lena Füllsack, ‘KI-VO Art. 3 Begriffsbestimmungen’ in Jens Schefzig and Robert Kilian (eds.), Beck’scher Online-Kommentar KI-Recht (3rd edn, C.H. Beck 2025) para 137.1. ↩︎
  33. See also ibid para 137.2; however, one could also use this argument the other way around and argue that ‘physically present’ therefore should not be understood to mean ‘located’. ↩︎
  34. See in the same sense Kirschke-Biller and Füllsack (n 32) para 139; different understanding (established for legal persons; located for natural persons) Bernsteiner and Schmitt (n 19) para 16. ↩︎
  35. Same result at Bernsteiner and Schmitt (n 19) para 17; see also, more extensively, the analysis on Article 3(5) discussed in forthcoming commentary on Article 3 in this work. ↩︎
  36. Voigt and Bastians (n 4) 935. ↩︎
  37. Schneider (n 15) para 9; Bernsteiner and Schmitt (n 19) para 14, who argue that the authorised representative must possess the necessary organisational and professional skills and resources. However, this requirement does not arise from the text of the AI Act and would, moreover, likely be far too vague to establish a breach of article 54(1). ↩︎
  38. Bernsteiner and Schmitt (n 19) para 14; in cases where the representative is a legal person, this competence and expertise will be provided through the representative’s staff, see ibid; differing opinion at Beurskens (n 16) para 5. ↩︎
  39. The requirements concerning the competence of these individuals are further defined in article 15(1) MDR. ↩︎
  40. Keep in mind again the limitations regarding such cross-instrument arguments. See, more extensively, forthcoming chapter on Interpreting the AI Act through Systematic Analogies in this work. ↩︎
  41. Opposing opinion (no acceptance necessary) Schneider (n 15) para 6. ↩︎
  42. Bernsteiner and Schmitt (n 19) para 11; in more detail on the legal nature and the applicable law Susanne Lilian Gössl, ‘KI-VO Art. 22 Bevollmächtigte der Anbieter von Hochrisiko-KI-Systemen’ in Mario Martini and Christiane Wendehorst (eds.), KI-VO: Verordnung über Künstliche Intelligenz (2nd ed, C.H. Beck, 2026) para 17 ff. ↩︎
  43. See for article 27 GDPR Carlo Piltz, ‘DS-GVO Artikel 27 Vertreter von nicht in der Union niedergelassenen Verantwortlichen oder Auftragsverarbeitern’ in Peter Gola and Dirk Heckmann (eds.), Datenschutz-Grundverordnung – Bundesdatenschutzgesetz (C.H. Beck, 2022) para 15; similarly (simple email can suffice) Beurskens (n 16) para 7. ↩︎
  44. Bernsteiner and Schmitt (n 19) para 12. ↩︎
  45. ibid. ↩︎
  46. Bosman (n 15) para 53. ↩︎
  47. Schneider (n 15) para 8; Bosman (n 15) para 54. ↩︎
  48. See recital 44 DSA. Note, however, that article 13(2) DSA does not contain the exact same wording as article 54(2) AI Act but rather states that providers of intermediary services ‘shall provide their legal representative with necessary powers and sufficient resources to guarantee their efficient and timely cooperation with the Member States’ competent authorities, the Commission and the Board, and to comply with such decisions’. ↩︎
  49. Schneider (n 15) para 10; similar Bernsteiner and Schmitt (n 19) para 11 and 18. ↩︎
  50. Bosman (n 22) 219; see also Bosman (n 15) para 95; agreeing Bernsteiner and Schmitt (n 19) para 18 and Schneider (n 15) para 16; seemingly differing opinion at Beurskens (n 16) para 11 (‘It goes without saying that the representative may also be granted a general power of attorney, or may be assigned specific additional tasks beyond those falling within the competence of the representative pursuant to paragraph 3 — such as distribution and support — in relation to third parties’, translated from German). ↩︎
  51. Commission Notice, The ‘Blue Guide’ on the Implementation of EU Product Rules 2022 [2022] OJ C247/1. ↩︎
  52. Council Regulation (EEC) No 1 determining the languages to be used by the European Economic Community [1958] OJ 17/385, art 1. ↩︎
  53. Beurskens (n 16) para 8; agreeing Schneider (n 15) para 8. ↩︎
  54. Similar for MDR Susanne A Wagner, ‘VO (EU) 2017/745 Art. 11 (Art. 11 IVDVO) Bevollmächtigter’ in Wolfgang A Rehmann and Susanne A Wagner (eds.), MP-VO: Verordnung (EU) 2017/745 über Medizinprodukte mit integrierter erläuterter Verordnung (EU) 2017/746 über In-vitro-Diagnostika (C.H. Beck, 2023) para 7; Bosman (n 15) para 63 argues that a distinction between the mandate and the underlying civil law contract needs to be drawn and that, therefore, aspects like salary agreements do not need to be disclosed. ↩︎
  55. Schneider (n 15) para 11. ↩︎
  56. European Commission, ‘Code of Practice for General-Purpose AI Models – Safety and Security Chapter’ (2025) <https://ec.europa.eu/newsroom/dae/redirection/document/118119> accessed 19 September 2025, appendix 4.2(6) and 4.5(2). ↩︎
  57. Wagner (n 54) para 8. ↩︎
  58. Bosman (n 15) para 67. ↩︎
  59. Bernsteiner and Schmitt (n 19) para 20. ↩︎
  60. Blue Guide (n 51) fn 128; see also Bosman (n 15) para 81. ↩︎
  61. See, more extensively, the forthcoming chapter on Product, Model and Entity Regulation in this work. ↩︎
  62. Schneider (n 15) para 14. ↩︎
  63. Bernsteiner and Schmitt (n 19) para 23. ↩︎
  64. However, such arguments must be treated with caution. See forthcoming chapter on Interpreting the AI Act through Systematic Analogies in this work. ↩︎
  65. Schneider (n 15) para 20. ↩︎
  66. Which, as noted above, expressly only addresses high-risk AI systems. ↩︎
  67. See Section 2.3.3. ↩︎
  68. Schneider (n 15) para 21. ↩︎
  69. cf. Bernsteiner and Schmitt (n 19) para 24; see also Bosman (n 15) para 104. ↩︎
  70. Beurskens (n 16) para 23. ↩︎
  71. Article 12 MDR reads: ‘The detailed arrangements for a change of authorised representative shall be clearly defined in an agreement between the manufacturer, where practicable the outgoing authorised representative, and the incoming authorised representative. That agreement shall address at least the following aspects:
    (a) the date of termination of the mandate of the outgoing authorised representative and date of beginning of the mandate of the incoming authorised representative;
    (b) the date until which the outgoing authorised representative may be indicated in the information supplied by the manufacturer, including any promotional material;
    (c) the transfer of documents, including confidentiality aspects and property rights;
    (d) the obligation of the outgoing authorised representative after the end of the mandate to forward to the manufacturer or incoming authorised representative any complaints or reports from healthcare professionals, patients or users about suspected incidents related to a device for which it had been designated as authorised representative.’ ↩︎
  72. Article 31(3) EHDS reads: ‘In the event of a change of the authorised representative, the detailed arrangements for such change shall address at least the following:
    (a) the date of termination of the mandate of the outgoing authorised representative and the date of the beginning of the mandate of the incoming authorised representative;
    (b) the transfer of documents, including confidentiality aspects and property rights.’ ↩︎
  73. Schneider (n 15) para 25. ↩︎
  74. ibid. para 23. ↩︎
  75. Beurskens (n 16) para 27. ↩︎
  76. Schneider (n 15) para 26. ↩︎
  77. See, more extensively, the analysis on article 53(2) discussed in commentary on Article 53 paras 110 ff. in this work. ↩︎
Hannes Bastians, 'Article 54: Authorised representatives of providers of general-purpose AI models' (Cambridge Commentary on EU General-Purpose AI Law, 1 Mar 2026) <https://cambridge-commentary.ai/article-54/>