Labor law and the implementation of AI in the workplace
Artificial Intelligence poses a major challenge for employers, with far-reaching implications for labor law.
In Italy, employment law may struggle to “catch up” with the fast pace of technological development in AI.
However, European lawmakers are now taking a pioneering role in the comprehensive regulation of AI systems: the Committee of Permanent Representatives (Coreper) agreed on the draft AI Act COM(2021) 206 on February 2, 2024, and parliamentary approval is expected soon.
In light of the above, and considering that companies working with AI may gain a competitive advantage, there is increasing pressure on employers and HR departments to adopt and manage AI, even though the regulatory picture is still uncertain.
Some of the main potential issues that an employer using AI systems may have to face are:
1. remote monitoring of employees (and the impact of Article 4 of Law No. 300/1970);
2. protecting employees' privacy;
3. performance evaluation systems;
4. employee monitoring (e.g., how often employees take breaks);
5. disciplinary proceedings (what disciplinary sanction, if any, should be taken if an employee makes a mistake due to an AI "hallucination"?);
6. liability for damages caused by a fault in the AI system;
7. protection of company know-how, trade secrets and confidential data (increased ransomware risk?);
8. training of employees (older employees may statistically find it harder to adapt to new technologies) and the emergence of new roles in response to AI systems.
In Italy, several regulations already impose requirements on the use of AI systems, and more will certainly be adopted in the coming months and years.
First of all, it should be noted that the employer must inform employees whether AI systems will be implemented, how they will impact the employees' work and, above all, how their performance will be monitored via the AI (pursuant to Legislative Decree No. 152/1997, as recently amended by Law Decree No. 48/2023). The Workers' Statute provides additional requirements and limits on the implementation of AI (Articles 4 and 8 of Law No. 300/1970). For example, it might be necessary or advisable to involve works councils in the AI implementation process.
Further, to protect employees' rights, the AI system must comply with data protection requirements (under the GDPR and Legislative Decree No. 101/2018).
For example, in order to create an account with ChatGPT, employees must provide their name, e-mail address and telephone number. So far, OpenAI does not offer company accounts. In addition, OpenAI, the provider of ChatGPT, reserves the right to use chat histories for training purposes.
To address these and other issues, the EU is currently working on two important instruments: the AI Act and the AI Liability Directive. These instruments are intended, in particular, to clarify liability in connection with AI systems and to ensure their ethically acceptable use, also from an employer's point of view.
The proposed AI Act COM(2021) 206 imposes specific obligations on employers using or operating AI systems, depending on how risky the AI system in question is. This risk-based approach distinguishes between AI systems posing an unacceptable risk, high-risk AI systems and other AI systems.
For example, AI systems used in HR matters (e.g., for recruitment, promotions or dismissals) can be classified as high-risk AI systems. In this regard, the AI Act treats as high-risk those systems which pose significant risks to the health, safety or fundamental rights of employees.
Under the current draft, classifying an AI system as high-risk triggers a variety of obligations for the employer, such as:
- Evaluating the AI system throughout the entire period of use (Art. 29 of the AI Act);
- Using the AI system only in accordance with its instructions for use and monitoring that it is actually used as instructed (Art. 29 of the AI Act);
- Ensuring that the input data is relevant to the intended purpose of the system and retaining the logs automatically generated by the system;
- Carrying out a fundamental rights assessment before using or operating the AI system, taking into account, among other things, the groups of employees that will be affected by it (Art. 29a of the AI Act);
- Consulting employee representatives prior to the use of high-risk AI in the workplace;
- Informing employees that they will be subject to the system (Art. 29(5a) of the AI Act).
Besides that, the EU is working on the draft AI Liability Directive COM(2022) 496, which introduces new rules specifically for damages caused by AI systems. According to its current version, the AI Liability Directive will not apply to contractual liability.
The draft AI Liability Directive identifies the employer who uses or operates an AI system as the correct defendant if someone suffers damage due to a fault in that system. The Directive also makes it easier to assert tort claims, for example by providing for the disclosure of evidence. In practice, this means that records and documentation may have to be created in accordance with the AI Act and then disclosed in the event of a claim, in order to prove that the employer fully complied with its obligations under the AI Act; otherwise, the lack of such counter-evidence would benefit the claimant. Companies working with high-risk AI systems are therefore advised to establish the necessary infrastructure and organisational conditions now, so that they can familiarise themselves with the EU-level rules in good time.
However, it will be some time before the AI Liability Directive comes into force, and significant changes to these rules before then cannot be ruled out.
In any event, the above shows that labor law rules in this area are complex and, due to technological developments, in a continuous state of evolution. Our experts are therefore available to assist you in these matters.