On August 1, 2024, the EU AI Act entered into force.
It regulates the use of artificial intelligence within the European Union.
As a regulation, the AI Act applies directly without any further act of implementation.
The effects of its provisions may also extend to HR departments.
The HR department must take into account both employment law and other legal requirements when AI is used in the company. The possible applications of artificial intelligence in HR are diverse.
AI can be used to analyze personnel requirements and write a job advertisement to meet these needs.
AI screens and filters incoming applications through automated candidate screening.
A recruiting bot conducts initial interviews. An HR service bot can present interested parties with a needs-based job offer within the company.
This could soon be how the recruitment process works.
AI can also support personalized onboarding and act as a digital mentor or coach.
Artificial intelligence could also assess whether and how well new employees meet the requirements and, on this basis, make recommendations on passing the probationary period, on promotions and on salary increases.
In theory, an AI could also determine the likelihood of individual employees being dismissed.
Reporting, skills management, risk assessments and many other HR processes offer potential applications for artificial intelligence.
It could also be used to check remuneration for gender neutrality or to support the target agreement process for employees.
But what does employment law allow and what legal requirements still need to be observed?
The AI Act is particularly relevant.
The AI Act divides AI systems into risk classes; depending on the class, stricter or less strict requirements apply.
Artificial intelligence that entails an unacceptable risk is prohibited.
Article 5 of the AI Act lists a number of AI systems that fall into this category.
This includes, for example, AI for inferring the emotions of a natural person in the workplace.
Many of the above-mentioned AI systems in the HR sector are likely to be high-risk systems.
These are systems that endanger safety or fundamental rights.
The high-risk AI systems are listed in Annex III of the AI Act. No. 4 of Annex III explicitly covers AI systems for personnel selection as well as systems that make decisions in connection with the conditions of employment or the promotion or termination of employment contracts.
It also covers AI systems that assign tasks on the basis of individual behavior or personal characteristics or traits, or that monitor or evaluate the performance and behavior of individuals in such employment relationships.
High-risk AI systems must meet the requirements set out in the AI Act; compliance with these requirements should be reviewed from a legal perspective.
Among other things, the following applies: the company must ensure an appropriate risk management system over the entire life cycle of the AI.
It must also ensure appropriate data governance and data management procedures.
Companies must technically document compliance with the obligations and log the results.
Uses, data and employee recognition must also be logged.
High-risk AI systems must be registered by providers and operators in an EU database.
If employers use chatbots, these are normally classified as AI with limited risk.
However, transparency obligations must also be observed here.
In particular, the company must disclose that users are communicating with an AI.
Deep fake content must be labeled.
However, the implementation should be legally reviewed together with IT in each individual case.
The HR department works with a lot of personal data, sometimes even sensitive data.
If this data is to be processed using artificial intelligence, it must be ensured that the data cannot leave the company’s IT environment and that no unauthorized persons within the company can access the data.
The employer must also ensure that the AI does not collect any data that the company does not need.
This also applies if the HR department has no intention of evaluating the data.
This is because the GDPR stipulates the principle of data minimization.
If employers want to use artificial intelligence in the company, they should always involve the works council.
Even when planning the use of AI, the company must inform the works council in accordance with Section 90 (1) no. 3 of the German Works Constitution Act (BetrVG).
The use of AI is also generally subject to co-determination under Section 87 (1) no. 6 BetrVG, at least where the employer prescribes its use or provides its own AI systems.
This also applies to guidelines on the use of AI.
Only the voluntary use of ChatGPT via private accounts is not subject to co-determination, according to the Hamburg Labor Court.
Employees could inadvertently violate data protection or infringe copyrights if they use AI at work.
As a rule, the employer will be liable for this vis-à-vis third parties.
For this reason alone, companies should always draw up rules for the use of AI.
AI is often perceived as objective.
However, the decisions it makes are based on the respective language model.
This model is generally not aligned with German labor law, and its decisions could therefore discriminate against people and violate the General Equal Treatment Act (AGG).
Anyone using AI in HR processes must ensure that the language model used is adapted to labor law regulations.
This applies to recruiting, in particular job advertisements and applicant selection, as well as to the analysis of gender-neutral remuneration.
A breach of the prohibition of discrimination can lead to claims for damages.
The list of regulations to be observed was already long.
The AI Act has now added numerous obligations that employers must comply with if they wish to use AI in their company.
The implementation deadlines vary depending on the AI; however, companies should start preparing now.
From an employment law perspective, the works council should always be brought on board at an early stage if the use of AI is planned.
Close cooperation between the IT department, legal department and HR department is important so that the relevant regulatory issues can be properly assessed.
Internal training on AI systems or the design of application examples can be beneficial for acceptance within the company.