General-purpose AI models: Obligations and roles for Providers and downstream modifiers under the EU AI Act
Executive summary – Key takeaways
- Role-based obligations for GPAI models: General-purpose AI models (“GPAI models”) are the foundational building blocks of the current AI value chain and are subject to targeted, role-based obligations that are distinct from the obligations applicable to AI systems placed on the EU market.
- Separation of roles and obligations: In operational terms, the AI system provider and the GPAI model provider can be different actors, and obligations may apply at both levels.
- When a modifier becomes a provider: The EU Commission’s guidance of 18 July 2025 clarifies that a downstream modifier becomes the provider of a modified GPAI model only where its modification leads to a significant change in generality, capabilities or systemic risk, and that, in such cases, the modifier’s obligations are limited to the modification itself.
- Market-placement triggers: Market placement of a GPAI model can occur when an AI system using the model is offered on the EU market, including where access is provided via an API[1]; in practice, this is one of the main triggers that providers and integrators need to anticipate when planning compliance.
Background
This article focuses on how the EU AI Act[2] applies to entities involved in the provision or modification of GPAI models, clarifying role-based obligations, the point at which a modifier becomes a provider and market-placement triggers.
Under the EU AI Act, an AI system is a machine-based system that, for explicit or implicit objectives, infers from input how to generate outputs (for example, predictions, content, recommendations or decisions) that can influence physical or virtual environments, and is designed to operate with varying levels of autonomy and may exhibit adaptiveness after deployment.[3]
GPAI models are AI models that display significant generality, are capable of competently performing a wide range of distinct tasks and can be integrated into a variety of downstream systems or applications, excluding AI models that are used for research, development or prototyping activities before being placed on the market.[4] The definition of GPAI models specifically covers models with at least one billion parameters that have been trained on a large amount of data using self-supervision at scale.[5] Large generative models, such as prominent large language models (LLMs), are typical examples of GPAI models, as they can generally be used for a variety of tasks and allow for flexible generation of content in the form of, for example, text, audio, images or video.
Hence, GPAI models do not, by themselves, constitute AI systems under the EU AI Act. However, GPAI models can be seen as building blocks that become part of AI systems once combined with additional components for use (for example, a user interface).[6]
Market placement and role split – The practical implication for providers and integrators
A GPAI model is considered placed on the EU market when it is offered under a provider’s name or trademark, and importantly, market placement of the GPAI model is also deemed to occur when an AI system using that model is placed on the market, including when access is offered via an API.[7] [8] In operational terms, the AI system provider is the actor that places the AI system on the market or puts it into service under its own name or trademark, for example by offering a user-facing application that integrates a GPAI model. By contrast, the GPAI model provider is the actor that places the model itself on the market, which may be a different entity. The consequence is that obligations can apply at both levels: to the provider of the placed AI system and, where applicable, to the provider of the GPAI model which is integrated in the AI system. Hence, responsibilities in the chain are not always placed on a single actor by default.
When downstream modifiers become GPAI model providers – Thresholds and materiality
Under certain circumstances, actors modifying existing GPAI models may themselves be deemed the provider of the resulting modified GPAI model:
- Downstream modifiers: Downstream modifiers are actors who adapt a GPAI model already placed on the market, typically to specialize it for a sectoral or functional purpose before integrating it into their own AI system.
- When a modifier becomes a provider: A downstream modifier becomes the provider of a modified GPAI model only if its modification leads to a significant change in the model’s generality, capabilities or systemic risk, which is the core materiality test in the Commission’s guidance.[9]
- Indicative compute threshold: As an indicative rule of thumb, if the training compute used for the modification exceeds one-third of the training compute of the original GPAI model, the modifier will generally be considered the provider of the resulting modified GPAI model. Where the original model’s training compute is unknown, one-third of the relevant presumption threshold for GPAI models (and, where relevant, for GPAI models with systemic risk) may be used as a proxy.[10]
- Reference for context: A model is presumed to be a GPAI model if its training compute exceeds 10^23 FLOP (floating-point operations) and is presumed to be a GPAI model with systemic risk if its training compute exceeds 10^25 FLOP.[11] [12]
In short, becoming a GPAI model provider of a modified GPAI model is reserved for entities making substantial, material modifications to the original GPAI model; otherwise, downstream actors remain solely AI system providers for their deployed applications.
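As an illustrative sketch only, the indicative compute tests above can be expressed as simple arithmetic. The threshold figures (10^23 and 10^25 FLOP, and the one-third rule of thumb) are taken from the guidelines cited above; the function name and structure are our own, and the actual legal test remains a case-by-case materiality assessment, not a mechanical calculation:

```python
# Illustrative sketch of the indicative compute thresholds described above.
# The figures come from the Commission's guidelines; this function is a
# simplification for illustration and not a substitute for legal analysis.

GPAI_PRESUMPTION_FLOP = 1e23   # presumed GPAI model above this training compute
SYSTEMIC_RISK_FLOP = 1e25      # presumed systemic risk above this training compute
MODIFICATION_RATIO = 1 / 3     # indicative one-third rule of thumb


def modifier_presumed_provider(modification_flop, original_flop=None):
    """Indicative test: is the modifier presumed provider of the modified model?

    If the original model's training compute is unknown (None), one-third of
    the GPAI presumption threshold is used as a proxy, per the guidelines.
    """
    if original_flop is not None:
        return modification_flop > MODIFICATION_RATIO * original_flop
    return modification_flop > MODIFICATION_RATIO * GPAI_PRESUMPTION_FLOP


# Example: fine-tuning a model originally trained with 5e24 FLOP
print(modifier_presumed_provider(1e22, 5e24))  # modest fine-tune -> False
print(modifier_presumed_provider(2e24, 5e24))  # above one-third -> True
```

In the first call, the modification compute (10^22 FLOP) is well below one-third of the original 5x10^24 FLOP, so the modifier would remain solely an AI system provider; in the second, the threshold is crossed and provider status for the modified model is indicated.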
Obligations of the GPAI model provider – Scope and purpose
When a GPAI model is placed on the market, the model’s provider assumes role-specific obligations focused on transparency and documentation, designed to enable safe integration by downstream actors and to allow those actors to meet their own regulatory obligations.[13] In practice, this means maintaining up-to-date technical information about the GPAI model and making sufficient information available to prospective AI system providers regarding its capabilities and limitations, as clarified by the Commission’s guidelines.[14] Where a downstream modifier qualifies as the provider of a modified GPAI model under the materiality threshold, its obligations are limited to the modification itself, such as complementing the original GPAI model provider’s documentation with details of the new data sources and methods used for adaptation.[15]
Practical scenarios
The following scenarios illustrate how the materiality test and the placement concept operate together to allocate responsibilities across the chain:
- Integration without material modification: Consider a company that integrates a GPAI model into a customer-facing chatbot and offers it under its own brand: this actor is the AI system provider at the moment of market placement, and the underlying GPAI model is deemed placed on the market as well, including where access is provided via API. If that company only performs modest fine-tuning that does not materially alter generality, capabilities or systemic risk and sits well below the indicative one-third compute threshold, it remains solely an AI system provider, with no additional GPAI model provider obligations beyond those borne by the original GPAI model provider.
- Material modification triggering provider status: By contrast, if the modification involves substantial training that crosses the one-third compute threshold and materially alters the model’s generality or capabilities, or raises systemic risk concerns, the modifier becomes the provider of the modified GPAI model and must satisfy GPAI model provider obligations limited to the modification itself, including complementary documentation.
Compliance roadmap
A GPAI model provider should maintain a documentation package that covers model capabilities, limitations and integration guidance sufficient for downstream actors to implement safe use and meet their own regulatory obligations, keeping this information current as the GPAI model is updated or evolves. Downstream actors who remain solely AI system providers should document their integration decisions, mitigate the model’s known limitations, and record why their modifications do not meet the materiality thresholds that would otherwise reclassify them as GPAI model providers. Where a downstream actor crosses the threshold and becomes the provider of a modified GPAI model, its compliance focus should be limited to the modification, supported by clear records of additional data sources and training methods so that its obligations are scoped to the change actually introduced.
Conclusion
The EU AI Act’s role-based framework ensures that the party placing an AI system on the market bears system-level obligations while the party placing a GPAI model on the market bears model-level transparency and documentation duties, with the Commission’s guidance reserving GPAI model provider status for downstream actors only where they make substantial, material modifications. Put simply, two things matter most for downstream modifiers: whether the GPAI model is placed on the market, and whether downstream modifications of the GPAI model meet the materiality thresholds.
Citations:
[1] Application Programming Interface.
[2] Regulation (EU) 2024/1689 (AI Act).
[3] Article 3(1), Regulation (EU) 2024/1689 (AI Act).
[4] Article 3(63), Regulation (EU) 2024/1689 (AI Act).
[5] Recital 98, Regulation (EU) 2024/1689 (AI Act).
[6] Recital 97, Regulation (EU) 2024/1689 (AI Act).
[7] Article 3(3), Regulation (EU) 2024/1689 (AI Act).
[8] Recital 97, Regulation (EU) 2024/1689 (AI Act).
[9] Paragraph 62, Guidelines on the scope of the obligations for general-purpose AI models established by Regulation (EU) 2024/1689 (AI Act).
[10] Paragraphs 63-64, Guidelines on the scope of the obligations for general-purpose AI models established by Regulation (EU) 2024/1689 (AI Act).
[11] Section 2.1, Guidelines on the scope of the obligations for general-purpose AI models established by Regulation (EU) 2024/1689 (AI Act).
[12] Article 51(2), Regulation (EU) 2024/1689 (AI Act).
[13] Article 53, Regulation (EU) 2024/1689 (AI Act).
[14] Paragraphs 2-3, Guidelines on the scope of the obligations for general-purpose AI models established by Regulation (EU) 2024/1689 (AI Act).
[15] Recital 109, Regulation (EU) 2024/1689 (AI Act).