13/06/2024
The impact of the AI Act on Tech M&A due diligence
In mergers and acquisitions (M&A), due diligence is the cornerstone of a careful analysis of the commercial, legal, tax and financial aspects of a company. For this purpose, acquirers and their advisors take a deep dive into a company's key documents, contracts, books and records. In the world of Tech M&A, the rise of artificial intelligence (AI) raises the question of what matters in a due diligence investigation of AI companies. In our recent exploration of the AI Act we introduced the framework and its implications. The AI Act requires companies engaged in AI-related activities to undertake a comprehensive compliance assessment, which has implications for the focus of due diligence on such companies. The AI Act was approved by the European Parliament on 13 March 2024 and given the final green light by the European Council on 21 May 2024. Its application will be staged over two years. Now that unacceptable risk AI systems will already be banned as of December 2024, it is time to zoom in on the various players distinguished in the AI Act from an M&A due diligence perspective.

AI Act players & due diligence

Any company that uses AI may be subject to the AI Act. Each provider, importer, distributor and deployer of AI is required to demonstrate its compliance with the AI Act in a manner unique to its own operations. This highlights the need for a tailored due diligence approach. Particular attention should be paid by companies which envisage acquiring:
- providers placing on the market, or putting into service, AI systems, or placing on the market general-purpose AI models, in the EU, irrespective of whether those providers are based within the EU or in a third country;
- deployers of AI systems that are based within the EU; or
- providers and deployers of AI systems that run their businesses from a country outside the EU but where the output produced by the AI system is used in the EU.
This also means that M&A due diligence must cover not only the AI systems developed or provided by the target company, but also those imported, distributed or deployed by it or by its customers or partners in the EU. As the AI Act adopts a risk-based approach, the first step of any due diligence should be to determine or verify the appropriate classification of the AI systems: unacceptable risk, high risk, limited risk or minimal risk. AI systems classified as unacceptable risk are prohibited. AI systems classified as minimal risk are unregulated, and limited risk AI systems are subject only to light transparency obligations. From the perspective of the AI Act, the focus of due diligence should therefore predominantly be on high risk AI systems. Depending on the risk appetite of the acquiring company, as well as the potential impact on the reputation and competitiveness of the target company, a voluntary conformity assessment is possible for AI systems that are not classified as high risk but which may nevertheless affect the rights or interests of individuals. The AI Act lists prohibited AI practices in Chapter 2, and requirements for providers, deployers and other parties in respect of high risk AI systems in Chapter 3. We will consider the appropriate approach for each of these parties below.

Providers: companies that develop AI systems under their own name and trademark are considered providers. Their obligations are the most extensive of all parties concerned and include the establishment of quality management systems, preparation of technical documentation, CE conformity marking and maintenance of logs. Any distributor, importer, deployer or other third party that makes a substantial modification to a high risk AI system, or changes the intended purpose of an AI system and markets such AI system under its own name or trademark, will be considered a provider under the AI Act.
Furthermore, all documentation of providers must be kept for a period of 10 years.

Importers: importers placing AI systems on the market under the name or trademark of companies established outside the EU must ensure that the required assessments, documentation and CE conformity marking are in place. This should be verified as part of the due diligence. If these assessments and documentation are not in place, the importer is not allowed to place the product on the EU market and risks significant fines.

Distributors: a distributor is any company in the supply chain that makes an AI system available within the EU without affecting its characteristics and that is not a provider or importer. Distributors are responsible for ensuring that the AI system is accompanied by the appropriate technical documents and instructions, including the CE conformity marking, when providers or importers deliver these products to them. In addition, distributors should verify that the provider and the importer have fulfilled their obligations under the AI Act. If not, corrective actions may be imposed, such as bringing the system into conformity or withdrawing the product.

Deployers: companies that only use AI systems in their business must take appropriate technical and organisational measures to comply with the instructions given by the provider. These compliance obligations are lighter than those of the entities responsible for providing the instructions and related documentation. However, deployers should not only follow the instructions; they are also obliged to consider potential risks and to notify the provider or distributor and the relevant market surveillance authority thereof. When making such a notification, they should also suspend the use of the product. Deployers should further keep the logs (automatic recordings of events) available, which should be reviewed as part of the due diligence.
Parties using AI for personal, non-professional activities do not have to comply with these rules. Due diligence requires a thorough investigation to assess the compliance of all of these parties with their obligations under the AI Act. Moreover, the AI Act introduces additional obligations and restrictions on providers and deployers of high risk AI systems that process personal data or that are used for purposes such as identification, detection, risk assessments or the prediction of certain behaviour.

Conclusion

The introduction of the AI Act will reshape due diligence in AI-related transactions. Compliance with the AI Act is essential for all players. Step 1 of any due diligence should be to determine or verify the appropriate classification of the AI systems: unacceptable risk, high risk, limited risk or minimal risk. As part of this first step, acquiring companies should also develop their ‘AI literacy’: the skills and knowledge to better understand how providers and deployers are taking into account their rights and obligations under the AI Act, and to become aware of the opportunities and risks of the relevant AI systems. For high risk AI systems, step 2 of the due diligence requires a tailored and thorough investigation to assess the target’s compliance with its obligations under the AI Act.