After almost a year of work, the Code of Practice for general-purpose AI (GPAI) models is now complete. However, companies are not required to follow its rules.
In its AI Act, the EU recognized that generative AI models can cause problems. The responsibility for handling these issues, however, falls largely on the developers and organizations that provide these models: they must document their training, testing, and evaluation processes and manage any risks properly.
Because of the wide variety of AI models and potential problems, the law does not specify exactly what these risks are or how to reduce them. Instead, companies that offer these models are expected to rely on "Codes of Practice" or comparable standards to address the major problems of generative AI. The Code of Practice is a central piece of this framework and has now been released after extensive discussion and a slight delay.
The Code of Practice is divided into three main areas: transparency, copyright, and safety and security. For transparency, companies are expected to fill out a standardized documentation form explaining how they created their models: what data they used, how they trained them, when, and with what resources. This documentation is mandatory for operating in the EU starting August 2, 2025.
In the copyright area, providers of GPAI (general-purpose AI) models commit to certain obligations, such as not using pirated content for training and respecting rights holders' reservations of rights. They should also try to prevent their models from producing outputs that infringe copyright.
The most difficult part to implement involves reducing the specific systemic risks of each model. These risks depend on the model itself, the data it was trained on, and its capabilities. Article 55 of the AI Act requires companies to address these risks, and the Code of Practice provides guidance on the types of risks involved and how to deal with them. These include AI-generated nude images, threats to national security, discriminatory content, inaccurate health advice, and radicalization chatbots. The potential misuse of AI models to develop weapons or tools for cyberattacks must also be considered.
The Code of Practice serves as a reference for how to handle these issues.
The Code of Practice is not legally binding for any company, and signing it does not guarantee that a company will avoid legal trouble if its measures are later found to be insufficient. The Code was developed by independent scientists on behalf of the EU Commission through an extensive consultation process.
Susanne Dehmel of the German IT industry association Bitkom expressed only partial satisfaction with the result. She criticizes the requirement that companies must constantly watch for new risks: "Together with vaguely defined fundamental rights risks and societal risks, for which there are often hardly any established methods for identification and evaluation, new legal uncertainty arises for European AI providers."
Companies have two options: follow the Code of Practice and thereby demonstrate a good-faith effort to comply, or build their own compliance approach from scratch and prove that it meets the legal requirements. Given the effort involved, some AI model operators will likely choose the Code of Practice.
EU Commission Vice-President Henna Virkkunen sees the Code of Practice as "an important step in making the most advanced AI models available in Europe, while ensuring they are not only innovative, but also safe and transparent." She encourages companies to voluntarily adopt the Code.
However, an important piece from the EU Commission is still missing: guidelines defining what qualifies as AI with a general or non-specific purpose, i.e., GPAI. These guidelines are expected to be published soon.
For now, companies need not fear penalties for non-compliance: sanctions for failing to meet AI Act requirements are not expected before August 2026 at the earliest. Many member states, including Germany, have also not yet passed the laws needed to establish national supervisory structures. The German law will clearly not be ready when the next set of AI Act obligations takes effect in August 2025; the responsible Digital Ministry continues to work on it, but the recent change in government means it cannot be finished in time.
Green Party politician Rebecca Lehnhardt said that Europe has delivered its part on time and that the German government now needs to move faster: "Delays or a weakening of the obligations would only create legal uncertainty and mistrust." There has been speculation about a partial postponement, which some member states and companies would welcome. With the publication of the Code of Practice, however, a general delay has become less likely, although individual special provisions could still be discussed.