GPAI Models: What you need to know

On 2 August 2025, new EU AI Act rules for general-purpose AI (GPAI) models took effect. In this first insight of our series, we explore what changed, why it matters, and how it impacts providers, users, and integrators of GPAI systems.

What do we mean by “GPAI Models” anyway?

The Act considers GPAI models to be those trained on vast amounts of data using self-supervision at scale, with “sufficient generality” to “competently perform a wide range of distinct tasks”. Note that it is not enough for the model to perform a narrow range of tasks (even if the model is trained on vast amounts of data).

By way of example, according to the Commission's Guidelines on the scope of the obligations for GPAI models (“Guidelines”), a model whose functionality is limited to image upscaling would fall outside the definition of a GPAI model: it lacks the generality needed to competently perform a wide range of distinct tasks.

To be categorised as GPAI, the model must also be capable of generating language (whether text or audio), text-to-image or text-to-video content, and must meet a minimum threshold of computational resources used in training (known as ‘training compute’ and measured in floating-point operations (“FLOP”)). The Guidelines specify the relevant threshold to be 10²³ FLOP. The Guidelines acknowledge that, whilst training compute is currently the most appropriate criterion on which to categorise a model as GPAI, this may be subject to change in the future.
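To illustrate the scale of that threshold, the sketch below estimates a model's training compute using the widely cited heuristic that training compute is roughly 6 × parameters × training tokens. The 10²³ FLOP figure comes from the Guidelines; the 6ND heuristic and the example model sizes are illustrative assumptions only, and a real assessment would follow the estimation methodology in the Guidelines themselves.

```python
# Illustrative sketch only: compares a rough training-compute estimate
# against the indicative 10^23 FLOP threshold from the Guidelines.
# The 6 * N * D approximation is a common rule of thumb, not the
# Guidelines' prescribed methodology.

GPAI_THRESHOLD_FLOP = 1e23  # indicative threshold per the Guidelines


def estimated_training_compute(n_parameters: float, n_tokens: float) -> float:
    """Approximate training compute in FLOP via the 6*N*D heuristic."""
    return 6 * n_parameters * n_tokens


def meets_compute_threshold(n_parameters: float, n_tokens: float) -> bool:
    """True if the rough estimate meets or exceeds the indicative threshold."""
    return estimated_training_compute(n_parameters, n_tokens) >= GPAI_THRESHOLD_FLOP


# Hypothetical example: a 10-billion-parameter model trained on
# 2 trillion tokens gives 6 * 1e10 * 2e12 = 1.2e23 FLOP,
# which is above the indicative threshold.
print(meets_compute_threshold(1e10, 2e12))
```

A smaller model, say 1 billion parameters on 1 trillion tokens (6 × 10²¹ FLOP), would fall well below the indicative figure, which is one reason the Guidelines treat training compute as a workable, if imperfect, proxy for generality.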

Those within the AI value chain should also be aware that a GPAI model may be further categorised as a GPAI model with systemic risk if the model in question has “high impact” capabilities (presumed under the Act where training compute exceeds 10²⁵ FLOP), or if designated as such by the Commission. Providers of GPAI models with systemic risk face additional compliance obligations, which aim to mitigate the elevated risks arising from their potential for widespread impact.

So, what changed on 2 August 2025?

Compliance obligations

It is now mandatory for providers of GPAI models to publish a “sufficiently detailed” summary of the data used to train their models using the official template provided by the EU AI Office. Providers are further responsible for:

  • implementing and maintaining policies to comply with copyright and intellectual property rights
  • preparing and maintaining technical documentation relating to their models

Providers may be required to submit the technical documentation to regulatory bodies (such as national competent authorities) or make the documentation available to “downstream providers” (in other words, those who integrate a GPAI model into an AI system).

Certain exceptions apply to compliance obligations under the Act if the GPAI model is open source (though notably, not if the GPAI model presents systemic risk). Further, the fact that a GPAI model is open source does not negate the need for the provider to publish a summary of the data used to train the model.

Oversight & enforcement 

On 2 August 2025, the EU AI Office became operational, assuming a central role in the governance of the EU’s AI ecosystem. The EU AI Office will be supported by the AI Board as well as an independent scientific panel. Member States were also required to appoint and publish details of their national competent authorities by the August deadline. Those within the AI value chain should prepare for 2 August 2026, when enforcement of the rules will commence. Providers of GPAI models placed on the market or put into service prior to 2 August 2025 will have a longer grace period, needing to ensure compliance by 2 August 2027.

Do I need to act now?

Preparation is key in developing robust compliance processes and promoting transparency. First, assess your organisation’s role within the AI value chain – are you developing or procuring AI models, and if so, do these constitute GPAI models? Document your assessment and revisit it regularly to ensure it remains accurate and up to date.

It may not be immediately obvious that your organisation falls within the scope of a provider under the Act, so consider whether modifications you make to any existing GPAI models are significant enough to change this determination. If deemed to be a provider, consider how the recently published General-Purpose AI Code of Practice (“Code”) can be used to support compliance with the Act’s obligations on transparency, copyright, and safety and security. If procuring rather than providing GPAI models, consider whether the provider has signed up to the Code, and what implications it may have for your internal diligence and governance processes if they have not.

Looking ahead

As the state of the art advances, be aware that current training compute thresholds may change, and models not previously categorised as presenting systemic risk may come to be designated as such. With an enforcement landscape that is not yet mature, organisations need to keep up to date with the latest developments and activity.

Disclaimer

This information is for general information purposes only and does not constitute legal advice. It is recommended that specific professional advice is sought before acting on any of the information given. Please contact us for specific advice on your circumstances. © Shoosmiths LLP 2025.

 
