TLDR: The AI Act, the European regulation on artificial intelligence (AI), has been in force since August 2024, despite opposition from some companies. It imposes obligations on general-purpose AI models, including comprehensive technical documentation, a copyright compliance policy, and a summary of training data. Non-compliant companies risk fines of up to 15 million euros or 3% of their worldwide turnover.
The first provisions of the AI Act, in force since August 2024, began to be enforced last February for AI systems posing unacceptable risk. Despite the "Stop the Clock" moratorium on the regulation's continued rollout requested by around fifty companies from the EU AI Champions Initiative, the obligations concerning general-purpose AI (GPAI) models have been in effect since last Saturday.
The EU has been a pioneer in establishing a regulatory framework for AI based on its potential to cause harm. The goal of the AI Act (AIA) is to ensure that AI systems and models marketed within the EU are used ethically, safely, and with respect for the EU's fundamental rights.
Guidelines published by the European Commission on July 18 clarify the scope of the regulation for GPAI models. Any AI model whose training compute exceeds 10^23 FLOPs (the volume of floating-point operations performed during training), designed without a specific purpose (unlike, say, weather-forecasting or game-playing models) and reusable in a wide variety of contexts, will be presumed to fall into this category.
The obligations cover the entire lifecycle of the models, from pre-training to placing on the market, including post-release updates and modifications. Their providers will have to supply:
- comprehensive technical documentation, intended for downstream providers integrating the model into their AI systems and, upon request, for the European AI Office or the competent national authorities;
- a summary of the training data, following the standardized template that the AI Office will provide;
- a copyright compliance policy aligned with European law.
The fundamental principle of the regulation remains unchanged: the higher the risk, the stronger the requirements. These obligations are reinforced for GPAI models considered to pose systemic risk, namely those exceeding the cumulative threshold of 10^25 FLOPs. These will be subject to enhanced risk-management procedures, particularly in cybersecurity, reporting of serious incidents, and continuous testing: a regulatory burden some providers deem hardly sustainable...
This threshold is not rigid, however: providers can request a re-evaluation based on the model's actual risks.
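The compute thresholds above can be sketched as a simple classifier. The 6 × parameters × tokens rule used here to estimate training FLOPs is a common industry heuristic, not part of the regulation; the thresholds themselves come from the Commission guidelines summarized above.

```python
# Thresholds from the Commission guidelines: a model is presumed to be a
# GPAI model above 10^23 training FLOPs, and a GPAI model with systemic
# risk above 10^25 cumulative training FLOPs.
GPAI_THRESHOLD = 1e23
SYSTEMIC_RISK_THRESHOLD = 1e25

def estimated_training_flops(n_parameters: float, n_training_tokens: float) -> float:
    """Rough compute estimate via the common ~6 FLOPs per parameter per
    training token heuristic (an assumption, not a regulatory formula)."""
    return 6 * n_parameters * n_training_tokens

def classify(training_flops: float) -> str:
    """Map estimated training compute to the presumption it triggers."""
    if training_flops >= SYSTEMIC_RISK_THRESHOLD:
        return "GPAI with systemic risk"
    if training_flops >= GPAI_THRESHOLD:
        return "GPAI"
    return "below GPAI presumption threshold"

# Example: a hypothetical 7-billion-parameter model trained on 2 trillion tokens
flops = estimated_training_flops(7e9, 2e12)  # 8.4e22 FLOPs
print(classify(flops))  # below GPAI presumption threshold
```

Note that either presumption is rebuttable: as the article says, a provider can ask for a re-evaluation based on the model's actual risks.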
Providers, Modifications, and Open Source Status
Any company that places a model on the European market is considered a provider, regardless of where the model was initially developed. A downstream actor that modifies a model using more than a third of its original training compute also becomes a provider and is subject to the same obligations.
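As a sketch, the one-third rule above reduces to a single comparison (the function name and figures are illustrative, not from the regulation):

```python
def downstream_becomes_provider(modification_flops: float, original_flops: float) -> bool:
    """True when a downstream modification uses more than one third of the
    compute spent training the original model, per the Commission guidelines
    summarized above."""
    return modification_flops > original_flops / 3

# Example: fine-tuning with 4e22 FLOPs a model originally trained with 1e23 FLOPs.
# 4e22 exceeds 1e23 / 3 (about 3.33e22), so the downstream actor becomes a provider.
print(downstream_becomes_provider(4e22, 1e23))  # True
```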
Models released under a free and open license benefit from a partial exemption regime. Provided they meet certain criteria (no monetization, no collection of personal data), these models are not subject to the documentation obligations toward downstream providers or authorities. Once they cross the systemic-risk threshold, however, no exemption applies.
Code of Practice
Despite coming into effect on August 2, these rules will only be enforced from August 2026 for new models, and from August 2027 for models already on the market. This gradual approach, overseen by the AI Office, aims to give companies time to adapt.
To help providers comply, the Commission published a code of practice a few days before its guidelines. Providers who choose to adhere to it will benefit from a reduced administrative burden and greater legal certainty than those who demonstrate compliance by other means. Google, OpenAI, Mistral, and Microsoft have already signed it, while Meta refused, citing legal uncertainties and an unjustified extension of the regulatory framework.
Fines for non-compliance with these obligations can reach 15 million euros or 3% of a company's worldwide annual turnover, whichever is higher. Each EU member state must notify the Commission of the supervisory authorities that will monitor companies. While France has not yet designated its authorities, the CNIL is expected to play a central role, with sectoral bodies such as Arcom and ANSSI also being considered.