GPAI · February 10, 2026 · 9 min read

GPAI and LLMs: your specific obligations under Articles 51 to 55 of the AI Act

GPT-4, Claude, Gemini, Mistral, LLaMA: if you use or integrate a general-purpose AI (GPAI) model in your products or services, you have been subject to specific obligations since August 2025.

What is a GPAI model under the AI Act?

A general-purpose AI (GPAI) model is a model trained on large amounts of data that can competently perform a wide range of distinct tasks. GPT-4, Claude, Gemini, Mistral, and LLaMA all meet this definition. The AI Act distinguishes two categories: standard GPAI models and GPAI models with systemic risk.

Obligations of GPAI model providers (Art. 53)

  • Establish and maintain up-to-date technical documentation
  • Make information and documentation available to downstream providers who integrate the model into their AI systems
  • Establish a policy to comply with EU copyright law (Art. 53(1)(c))
  • Publish a sufficiently detailed summary of the content used to train the model

Additional obligations for models with systemic risk (Art. 55)

A model is presumed to present systemic risk when the cumulative amount of compute used for its training exceeds 10²⁵ FLOPs (Art. 51(2)). These models are subject to enhanced obligations:

  • Systemic risk assessment and mitigation, including adversarial testing
  • Reporting of serious incidents to the European Commission
  • Enhanced cybersecurity measures
  • Annual evaluation reports
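The 10²⁵ FLOPs presumption can be sanity-checked with a back-of-the-envelope estimate. A minimal sketch, assuming the common ~6 × parameters × training-tokens heuristic for total training compute (an industry rule of thumb, not part of the AI Act itself; the model sizes below are hypothetical):

```python
# Illustrative only: rough check of the AI Act's systemic-risk
# presumption threshold of 10^25 training FLOPs (Art. 51(2)).

SYSTEMIC_RISK_THRESHOLD_FLOPS = 1e25  # presumption threshold in the AI Act

def estimated_training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training compute with the ~6*N*D heuristic
    (6 FLOPs per parameter per training token)."""
    return 6 * n_params * n_tokens

def presumed_systemic_risk(n_params: float, n_tokens: float) -> bool:
    """True if the compute estimate meets or exceeds the 10^25 FLOPs threshold."""
    return estimated_training_flops(n_params, n_tokens) >= SYSTEMIC_RISK_THRESHOLD_FLOPS

# Hypothetical example: a 70B-parameter model trained on 15T tokens
flops = estimated_training_flops(70e9, 15e12)
print(f"{flops:.2e} FLOPs -> systemic risk presumed: {presumed_systemic_risk(70e9, 15e12)}")
```

Under this heuristic, the hypothetical 70B/15T-token run lands below the threshold, while substantially larger training runs cross it; in practice, classification depends on the actual compute expended, which only the provider can document.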

What this means for deployers

If you integrate a GPAI model into your products or services, you act as a deployer under the AI Act. You must verify that your GPAI provider meets its documentation obligations and ensure that your use of the model complies with the risk level of your final system.
