16.07.2024 | KPMG Law Insights

The AI Act is coming: EU wants to get a grip on AI risks

For many people, artificial intelligence (AI) is the great hope for business, healthcare and science. But there are also plenty of critics who fear the risks of AI and call for rules. With the AI Act, the EU Commission now wants to regulate artificial intelligence across the European Union and thereby bring the greatest risks for users under control. The EU Parliament has already approved the Commission’s draft. The law is scheduled to enter into force in 2024.

The idea is that the higher the risk of an AI system, the higher the associated requirements and obligations. AI regulation is intended to increase user confidence in AI within the EU and thus also create better conditions for innovation for manufacturers and users of AI applications.

Violations can result in fines of up to 30 million euros or up to six percent of total worldwide annual turnover for the previous fiscal year. The sanctions are thus comparable in scale to those of the GDPR.

The AI Act is to be accompanied by an AI Liability Directive. In addition, the EU Commission wants to update the Product Liability Directive. The goal: to close liability gaps in the use of AI systems and to address evidentiary difficulties in the event of legal violations in connection with artificial intelligence.

The obligations of the EU AI Act affect manufacturers, suppliers and distributors of AI systems, product manufacturers who incorporate AI systems into their products, and users of AI systems, i.e. virtually every company.

AI with an “unacceptable” level of risk prohibited by the AI Act

The AI Act divides artificial intelligence into three risk classes: “unacceptable,” “high,” and “low/minimal.”

Placing on the market, putting into service, or using AI systems that pose an unacceptable risk is prohibited. This includes, in particular, AI systems designed to influence human behavior subliminally and to a person’s detriment. AI that serves to exploit the weaknesses of vulnerable individuals is likewise unacceptable and thus prohibited. Also prohibited is the use of AI systems by public authorities to assess or classify the trustworthiness of natural persons (“social scoring”). As a rule, AI systems may also not be used for real-time remote biometric identification of natural persons in publicly accessible spaces for law enforcement purposes.

Special requirements apply to AI systems in the “high” risk class

AI systems that pose a high risk to the health and safety or fundamental rights of natural persons are referred to as “high-risk AI systems.” The fundamental rights concerned include, for example, human dignity, respect for private and family life, the protection of personal data, freedom of expression and information, and freedom of assembly and association.

The AI Act imposes stringent requirements on the design and use of high-risk AI systems, for example with regard to the quality of the data basis, security, functionality, documentation and human oversight, as well as quality and risk management.

Conformity with the AI Act should be made visible with a CE marking.

Lower requirements for systems with “low/minimal” risk class

AI systems that are neither prohibited as posing an unacceptable risk nor classified as high-risk fall into the third category and are subject to less stringent requirements. However, providers of such systems should still establish codes of conduct and are encouraged to voluntarily apply the rules for high-risk AI systems. In addition, the EU AI Act requires that even low-risk AI systems must be safe when they are placed on the market or put into service. Safety can be ensured in particular by voluntarily observing the requirements for high-risk AI systems.
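The three risk classes described above can be thought of as the outcome of a first-pass internal triage of each AI use case. The following Python sketch shows one way such a triage could be structured; the attribute names and the simplified decision logic are illustrative assumptions only and are no substitute for a case-by-case legal assessment.

from dataclasses import dataclass
from enum import Enum

# Illustrative sketch only: attributes and decision logic are assumptions,
# not the statutory criteria of the AI Act.

class RiskClass(Enum):
    UNACCEPTABLE = "unacceptable"   # prohibited practices
    HIGH = "high"                   # strict requirements apply
    LOW_MINIMAL = "low/minimal"     # lighter obligations

@dataclass
class AISystemProfile:
    name: str
    manipulates_behavior_subliminally: bool = False
    exploits_vulnerable_groups: bool = False
    social_scoring_by_authorities: bool = False
    realtime_biometric_id_in_public: bool = False
    affects_health_safety_or_fundamental_rights: bool = False

def triage(profile: AISystemProfile) -> RiskClass:
    """First-pass triage of an AI use case into the three risk classes."""
    if (profile.manipulates_behavior_subliminally
            or profile.exploits_vulnerable_groups
            or profile.social_scoring_by_authorities
            or profile.realtime_biometric_id_in_public):
        return RiskClass.UNACCEPTABLE
    if profile.affects_health_safety_or_fundamental_rights:
        return RiskClass.HIGH
    return RiskClass.LOW_MINIMAL

# Example: a CV-screening tool that can affect applicants' fundamental rights
print(triage(AISystemProfile("cv-screening", affects_health_safety_or_fundamental_rights=True)))

Such a triage only flags candidates for closer review; the final classification should always be confirmed by the interdisciplinary governance team described below.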

With AI governance, companies can mitigate risks

Organizations should actively evaluate each application and incorporate it into a governance structure.

All AI-based solutions in use should also be taken into account, and companies should know their use cases and the associated risks. Manufacturers of an end product must comply with the provider obligations set out in the AI Act and ensure that the AI system embedded in the end product is compliant. These risks also include the liability risk arising from the AI Act.

When building AI governance, a key question is who is responsible for classifying the risks. To make the assessment as objective as possible, the team should be interdisciplinary.
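As a minimal sketch of how such an assessment could be documented, the following Python example defines one entry of a hypothetical AI use-case register; the field names and sample values are assumptions for illustration and are not prescribed by the AI Act.

from dataclasses import dataclass, field
from datetime import date

# Hypothetical register entry: fields are illustrative, not statutory requirements.

@dataclass
class AIUseCaseRecord:
    system_name: str
    business_owner: str
    purpose: str
    risk_class: str                 # e.g. "high" or "low/minimal"
    assessed_on: date
    reassess_by: date               # periodic re-evaluation date
    sign_offs: list[str] = field(default_factory=list)  # interdisciplinary approvals

record = AIUseCaseRecord(
    system_name="cv-screening",
    business_owner="HR",
    purpose="Pre-selection of job applications",
    risk_class="high",
    assessed_on=date(2024, 7, 16),
    reassess_by=date(2025, 7, 16),
    sign_offs=["Legal", "Compliance", "IT Security", "HR"],
)
print(record)

Keeping such records per use case makes risk gradings traceable and supports the monitoring described in the next section.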

How AI risk management can succeed

For effective risk management, companies should establish guidelines, processes and monitoring solutions. Various institutions and organizations such as BSI, IDW or DIN are already developing standards for this.

It is advisable not to separate compliance and performance. Management and IT should therefore work closely with the legal and compliance functions. Only if it is ensured that legal regulations are complied with and liability risks are minimized can the potential of artificial intelligence actually be exploited.

Even though the AI Act has not yet been finalized, companies can already start assessing risks and establishing appropriate governance.

Learn more about “AI risks at a glance”: Our whitepaper provides recommendations for AI governance that ensures responsible use of the new technology. Download now.

Authors:
KPMG AG Wirtschaftsprüfungsgesellschaft: Dr. Justus H. Marquardt, Oleg Brodski
KPMG Law Rechtsanwaltsgesellschaft mbH: Francois Heynike

Contact

Francois Heynike, LL.M. (Stellenbosch)

Partner
Head of Technology Law

THE SQUAIRE Am Flughafen
60549 Frankfurt am Main

Tel.: +49-69-951195770
fheynike@kpmg-law.com

