There are around 2,000 AI market surveillance authorities in the EU
The AI Act is a uniform regulation on AI applicable to all 27 Member States, but it will have to be interpreted, applied and enforced by around 2,000 market surveillance authorities and by 208 fundamental rights protection authorities, each with a potentially different interpretation.
It would of course be easier to have a single AI regulator and market surveillance authority in the EU, or at least one per Member State, as mostly happens with data protection authorities (one per Member State in all EU countries except Germany, with some additional regional authorities in Belgium, Spain[1] and Finland). Nevertheless, this is not possible for AI. The reason is that AI is a transversal technology: it is (or will soon be) present in all sectors and activities. It is no longer possible to regulate the banking sector without regulating the use of AI by banks (this is why the central banks should be, and indeed are, market surveillance authorities for banks using AI: Recital 158 and art. 74.6 for all financial institutions), and the same can be said of all other regulated sectors, as the AI Act itself recognizes: for insurance (Recital 158), for biometrics for law enforcement, migration, justice and democracy (Recital 159 and art. 74.8), etc. The same applies to products: the market surveillance authorities for each of the products subject to the harmonized legislation listed in Annex I of the AI Act should be AI market surveillance authorities when those products have AI components. Given the ubiquitous nature of AI, and its intimate entanglement with the essence of products and services, its market surveillance cannot belong to a single authority in all cases.
Of course, there are exceptions to this dispersion of regulators, and the two main ones are EU bodies and general-purpose AI models. Indeed, the European Data Protection Supervisor is the only body competent to impose fines on the Union institutions, bodies, offices and agencies falling under the scope of the AI Act (art. 100). And the Commission is the only body competent to impose fines on providers of general-purpose AI models (art. 101), with the help of the AI Office (with powers to monitor and supervise compliance, art. 75, as well as powers of implementation and compliance, art. 89). Sometimes the boundaries of the powers of the AI Office will be difficult to determine: as explained in Recital 161 of the AI Act, “where an AI system is based on a general-purpose AI model and the model and system are provided by the same provider, the supervision should take place at Union level through the AI Office, which should have the powers of a market surveillance authority”. This may create difficult problems as regards agentic AI: when agents are provided by the same provider as the general-purpose AI model, their supervision will, in principle, remain with the AI Office.
In all other cases, the AI Office is expected to help harmonize enforcement and to provide guidance to national authorities.
Furthermore, some countries may try to centralize part of the surveillance of compliance with the AI Act, as is the case in Spain, where the existing Draft Bill[2] appoints the Agencia Española de Supervisión de Inteligencia Artificial as the national surveillance authority for most (though not all) of the prohibitions of art. 5 of the AI Act, for the high-risk AI systems described in paragraphs 1 to 5 of Annex III of the AI Act, and for the transparency obligations applicable to all AI systems under art. 50 AI Act.
In Italy, the recent Legge 23 settembre 2025, n. 132, has appointed the Italian cybersecurity agency (Agenzia per la cybersicurezza nazionale, ACN) as the main national market surveillance authority (art. 20), but there are many others[3].
In Germany, the draft AI Market Surveillance Act appoints the Federal Network Agency (Bundesnetzagentur) as the main market surveillance authority, without prejudice to the existence of many others.
But, still, for the rest of the AI systems included in the scope of the AI Act, and for obligations other than the transparency obligations under art. 50, a vast number of national AI market surveillance authorities will exist in each EU Member State. This decentralized approach may prove challenging and may result in regulatory fragmentation.
As the provisions of the AI Act regarding the governance structure, including art. 70 on national market surveillance authorities, apply from 2 August 2025 (Recital 179 and art. 113.b, referring to Chapter VII), all EU Member States were obliged to notify the EU Commission of the list of their national market surveillance authorities.
And they complied. The EU has recently published the initial list of all notified national market surveillance authorities: there are around 2,000 of them. The definitive list may include data protection authorities; regulators of financial institutions and of insurance, reinsurance and intermediaries; authorities in the field of law enforcement, migration, justice and democracy (including judicial and electoral authorities); as well as the designated market surveillance authorities for all harmonized legislation (Annex I), which covers regulators in 38 different categories of products[4]. The European Commission has published the list of national market surveillance authorities for harmonized products by country[5] and by sector[6].
Furthermore, art. 77 AI Act refers to fundamental rights protection authorities and grants them certain powers. The list of fundamental rights protection authorities has also been published[7]; there are 208 of them[8].
As a consequence of the above, there will be more than 2,200 authorities with power to interpret and enforce the AI Act.
And the powers granted to those authorities are not negligible: all national market surveillance authorities have the powers granted by Regulation (EU) 2019/1020, covering all the activities carried out and measures taken to ensure that AI systems “comply with the requirements set out” in the AI Act “and to ensure protection of the public interest covered by that legislation”. Art. 10 of Regulation (EU) 2019/1020 includes “effective market surveillance”, the “taking by economic operators of appropriate and proportionate corrective action” and the “taking of appropriate and proportionate measures where the economic operator fails to take corrective action”.
The AI Act also provides the market surveillance authorities with additional powers not included in the Regulation (EU) 2019/1020, such as those regarding high-risk systems (testing, notification of incidents, complaints, evaluation, enforcement, etc.), those regarding systems that even if compliant with AI Act may present a risk (art. 82 AI Act), and even the access to source code of high-risk AI systems under certain circumstances (art. 74.13).
The power to be granted access to the source code of a high-risk AI system, attributed to all (approximately) 2,000 EU market surveillance authorities, is subject to two conditions (access must be necessary to assess conformity, and the data and documentation provided must first have been exhausted or proved insufficient), but even so it is probably excessive in most cases. The source code of an AI system may change continuously, it may amount to an enormous volume of code, and it may be too difficult to interpret for most European market surveillance authorities. Such a measure would not be proportionate (or even feasible) in almost any case, except under the most extreme circumstances and a total lack of cooperation. Of course, it will be possible to appeal against any such measure.
Those powers are granted to all national market surveillance authorities, including, as examples, the Regional State Administrative Agency of Northern Finland (Occupational Safety and Health Division), the Hellenic Recycling Agency (EOAN) in Greece and the Regional Inspectorate for Economic Activities in Madeira (Portugal), each one, of course, within the limits of its own competences.
Having said that, it is not surprising that the AI Act has established mechanisms of Union safeguard, coordination and cooperation.
Those mechanisms will be less necessary in light of the recent approval of Guidelines and Codes of Practice: the Guidelines on prohibited AI practices established by the AI Act, the Guidelines on the AI system definition, the Guidelines on the scope of GPAI obligations, the Template for public summaries of training content, the GPAI Code of Practice and (in the field of data protection) the Guidance for Risk Management of Artificial Intelligence Systems of the European Data Protection Supervisor.
But, still, the possibility of contradictory interpretations remains.
The internal market requires a level playing field, which may be incompatible with different interpretations of the AI Act by different authorities. Therefore, the AI Act has established a Union safeguard procedure in art. 81: when a measure has been taken by a market surveillance authority of a Member State and notified to the EU Commission and to the other Member States according to art. 79.5 AI Act, either the European Commission or any other market surveillance authority may raise an objection. In that case, the European Commission will enter into consultation with the market surveillance authority of the relevant Member State and the operator, evaluate the national measure and decide whether it is justified. This will hopefully solve most harmonization problems.
In addition to the above, art. 65 of the AI Act establishes the European AI Board, composed of one representative per Member State, which according to art. 66 will contribute to the coordination among the national competent authorities responsible for the application of the AI Act. One of the two sub-groups within the Board will act as the administrative cooperation group (ADCO) within the meaning of art. 30 of Regulation (EU) 2019/1020. The Board is expected to promote consistency in the national enforcement of the AI Act.
All these measures will, in the medium term, contribute to the functioning of the internal market as a single market for AI purposes. Nevertheless, at least during the first years of application of the AI Act, the existence of around 2,000 different market surveillance authorities, together with the evolving nature of AI technology (which continuously creates new regulatory challenges), will unavoidably result in some unpredictability and in the coexistence of different criteria in the interpretation and application of the AI Act.
[1] In Spain we have the central authority, the Spanish Agency for Data Protection (Agencia Española de Protección de Datos), plus authorities in 3 of the 17 Autonomous Communities, plus the CGPJ (the General Council of the Judiciary) for the Courts of Justice. In some other EU countries there are also regional authorities, but the general scenario is one (relevant) authority per country.
[2] Refer to https://avance.digital.gob.es/es-es/Participacion/Paginas/DetalleParticipacionPublica.aspx?k=468
[3] The same art. 20 mentions the Banca d’Italia, CONSOB and IVASS. Furthermore, the Agenzia per l’Italia digitale is appointed as notifying authority.
[4] Medical devices, cosmetics, toys, personal protective equipment, construction products, aerosol dispensers, simple pressure vessels and pressure equipment, transportable pressure equipment, machinery, lifts, cableways, noise emissions for outdoor equipment, equipment and protective systems intended for use in potentially explosive atmospheres, pyrotechnics, explosives for civil uses, appliances burning gaseous fuels, measurement instruments, non-automatic weighing instruments and pre-packaged products, electrical equipment under EMC, radio equipment under RED, electrical appliances and equipment under LVD, electrical and electronic equipment under RoHS and WEEE and batteries, chemical substances under REACH and classification and labelling regulations, other chemicals, eco-design and energy labelling, tire labelling, recreational crafts, marine equipment, motor vehicles, non-road mobile machinery, fertilizing products, other consumer products under GPSR, biocides, textile and footwear labelling, crystal glass, unmanned aircraft systems, packaging and packaging waste, tobacco, batteries and waste batteries, cyber resilience.
[7] https://digital-strategy.ec.europa.eu/en/policies/fundamental-rights-protection-authorities-ai-act.
[8] 19 for Austria, 20 for Belgium, 7 for Bulgaria, 7 for Croatia, 3 for Cyprus, 2 for the Czech Republic, 7 for Denmark, 3 for Estonia, 10 for Finland, 4 for France, 20 for Germany, 4 for Greece, 1 for Hungary, 9 for Ireland, 5 for Italy, 1 for Latvia, 4 for Lithuania, 3 for Luxembourg, 10 for Malta, 6 for the Netherlands, 4 for Poland, 14 for Portugal, 9 for Romania, 2 for Slovakia, 10 for Slovenia, 20 for Spain and 4 for Sweden.