Navigating the UPC: Insights from Year One
Join us on 10 July at 3:00 pm (CET) for an insightful online seminar as we delve into the inaugural year of the Unified Patent Court (UPC). Since its opening in June 2023, the UPC has changed the patent enforcement landscape in Europe and affected the patent strategies of many companies. Twelve months into its operation, we will analyse the trends emerging from the first decisions and their implications for overall patent enforcement strategies. During the webinar, you will hear from CMS patent experts (lawyers and patent attorneys) from several jurisdictions, including Austria, France, Germany, the Netherlands and the UK. They will provide a comprehensive overview of the UPC's first year in numbers and discuss issues relating to preliminary injunctions, security for costs and enforcement. You will also hear about the pros and cons of opting out of the system and why you might want to reconsider this decision. Whether you are a patent holder or a licensee, and whether or not you have opted out of the Unitary Patent System, this webinar will provide valuable insights into the impact of the UPC on all companies with innovative activities in Europe.

Topics covered:
- The UPC's year one in numbers – an overview in statistics
- Preliminary injunctions – landmark decisions in the UPC's first year
- Victory – what happens next?
- Should you revisit your opt-out decision?

The webinar will be held in English and attendance is free of charge. Please register by clicking the button below.
2024 Insurance sector webinar programme
Notwithstanding the extraordinary times we've all been operating in, the insurance sector continues to deal with fast-paced changes. Insurance companies should keep up with changing regulations and market trends that will impact their day-to-day operations and long-term business. It has become crucial for (re)insurers, brokers, their risk managers and general counsels to get the right insight and advice across a wide range of claims and coverage, regulatory and corporate issues. To deliver the expert responses that the insurance market needs, the CMS Insurance Group has developed a comprehensive programme for 2024.
Six months of the Omgevingswet: an update for the real estate sector
We warmly invite you to the interactive webinar 'Six months of the Omgevingswet: an update for the real estate sector' on Thursday 4 July from 11:00 to 12:00. New Omgevingswet Since 1 January...
Webinar: From Local to Global: Hiring Internationally in the Dutch Mobility...
We are pleased to invite you to the webinar "From Local to Global: Hiring Internationally in the Dutch Mobility Sector" on Tuesday, 2 July 2024 from 15:00 - 16:00 CEST, online via GoTo Webinar. In today's...
AI Regulations in the EU, UK, and Asia: Essential Insights for HR and Employers
Join the members of the international CMS Employment and Pensions team for a webinar designed specifically for HR professionals, corporate leaders and legal experts who need to stay ahead of developments in AI regulation across jurisdictions.

Why attend? The EU AI Act is poised to reshape the framework within which businesses operate. Understanding its implications is crucial for managing compliance and leveraging AI technology effectively within your HR practices. This webinar, led by employment law experts from CMS, will cover:
- An overview of the most important provisions of the EU AI Act for employers
- Specific impacts of the EU AI Act on roles, AI skills, AI policies and the relationship with the works council
- Insights on AI regulation in the UK and Asia

The session will be conducted in English and is free of charge. We look forward to seeing you.
Cannabis law and legislation in the Netherlands
Medical use The Dutch Opium Act distinguishes drugs with a low risk of harm (‘soft drugs’) from drugs with a high risk of harm (‘hard drugs’). Cannabis is listed under the soft drugs category...
Real estate finance law in the Netherlands
A. Mortgages 1. Can security be granted to a foreign lender? Yes, a mortgage can be granted to a foreign lender. 2. Can lenders take a mortgage over land and buildings on the land? Yes, lenders can take...
International arbitration law and rules in the Netherlands
In the Netherlands, arbitration has generally been the most important form of dispute resolution after court litigation. This is particularly the case for the resolution of construction and numerous other...
EU anti-dumping investigation of Chinese decor paper imports
On 14 June 2024, the EU opened an anti-dumping investigation concerning EU imports of decor paper originating in the People’s Republic of China, which could lead to substantial anti-dumping duties...
Re-establishing an easement? Practice note on Rb. Limburg, 08-11-2023
Must an easement that was extinguished through land redevelopment be re-established on the grounds of unjustified enrichment? Practice note in RN 2024/43, ECLI:NL:RBLIM:2023:6643. The Ruilverkavelingswet...
What’s in a name? The use of a product code in a clinical trial protocol...
The use of product or sponsor codes in clinical trial related documents is common practice in the pharmaceutical space. However, such codes may not always be sufficient to disregard a publication as relevant...
The impact of the AI Act on Tech M&A due diligence
In mergers and acquisitions (M&A), due diligence is the cornerstone of a careful analysis of the commercial, legal, tax and financial aspects of a company. For this purpose, acquirers and their advisors take a deep dive into a company's key documents, contracts, books and records. In the world of Tech M&A, the rise of artificial intelligence (AI) raises the question of what matters in a due diligence investigation of AI companies. In our recent exploration of the AI Act we introduced the framework and its implications. The AI Act requires companies engaged in AI-related activities to undertake a comprehensive compliance assessment, which has implications for the focus of due diligence on such companies. The AI Act was approved by the European Parliament on 13 March 2024 and given the final green light by the European Council on 21 May 2024. Its application will be staged over two years. Now that AI systems posing an unacceptable risk will already be banned as of December 2024, it is time to zoom in on the various players distinguished in the AI Act from an M&A due diligence perspective.

AI Act players & due diligence

Any company that uses AI may be subject to the AI Act. Each provider, importer, distributor and deployer of AI is required to demonstrate its compliance with the AI Act in a manner unique to its own operations. This highlights the need for a tailored due diligence approach. Particular attention should be paid by companies that envisage acquiring:
- providers placing on the market, or putting into service, AI systems, or placing on the market general-purpose AI models, in the EU, irrespective of whether those providers are based within the EU or in a third country;
- deployers of AI systems that are based within the EU; or
- providers and deployers of AI systems that run their businesses from a country outside the EU but where the output produced by the AI system is used in the EU.
This also means that the M&A due diligence must cover not only the AI systems developed or provided by the target company, but also those imported, distributed or deployed by it or its customers or partners in the EU. As the AI Act adopts a risk-based approach, the first step of any due diligence should be to determine or verify the appropriate classification of the AI systems: unacceptable risk, high risk, limited risk or minimal risk. AI systems classified as unacceptable risk are prohibited. AI systems classified as minimal risk are unregulated, and limited risk AI systems are subject only to light transparency obligations. From the perspective of the AI Act, the focus of due diligence should therefore predominantly be on high risk AI systems. Depending on the risk appetite of the acquiring company as well as the potential impact on the reputation and competitiveness of the target company, a voluntary conformity assessment is possible for AI systems that are not classified as high risk but which may nevertheless affect the rights or interests of individuals. The AI Act lists prohibited AI practices in Chapter 2, and requirements for providers, deployers and other parties in respect of high risk AI systems in Chapter 3. We consider the appropriate approach for each of these parties below.

Providers: companies involved in the development of AI systems under their own name and trademark are considered providers. Their obligations are the most extensive of all parties concerned and include the establishment of quality management systems, preparation of technical documentation, CE conformity marking and maintenance of logs. Any distributor, importer, deployer or other third party who makes a substantial modification to a high risk AI system, or changes the intended purpose of an AI system and markets it under its own name or trademark, will be considered a provider under the AI Act.
Furthermore, all documentation of providers must be kept for a period of 10 years.

Importers: importers placing AI systems on the market under the name or trademark of companies outside the EU must ensure that the required assessments, documentation and CE conformity marking are in place. This should be verified as part of the due diligence. If these assessments and documentation are not in place, the importer is not allowed to place the product on the EU market and risks significant fines.

Distributors: a distributor is any company in the supply chain that makes an AI system available within the EU without affecting its characteristics and that is not a provider or importer. Distributors are responsible for ensuring that the AI system is accompanied by the appropriate technical documents and instructions, including the CE conformity marking, when providers or importers deliver these products to them. In addition, distributors should verify that the provider and the importer have fulfilled their obligations under the AI Act. If not, corrective actions may be imposed, such as bringing the system into conformity or withdrawing the product.

Deployers: companies that only use AI systems in their business must take appropriate technical and organisational measures to comply with the instructions given by the provider. These compliance obligations are lighter than those of the entities responsible for providing the instructions and related documentation. However, deployers should not only follow the instructions but are also obliged to consider potential risks and to notify the provider or distributor and the relevant market surveillance authority of them. When making such a notification, they should also suspend the use of the product. Deployers should also keep the logs (automatic recordings of events) available, which should be reviewed as part of the due diligence.
Parties using AI for personal, non-professional activities do not have to comply with these rules. Due diligence requires a thorough investigation to assess the compliance of all of these parties with their obligations under the AI Act. Moreover, the AI Act introduces additional obligations and restrictions on providers and deployers of high risk AI systems that process personal data or that are used for purposes such as identification, detection, risk assessment or the prediction of certain behaviour.

Conclusion

The introduction of the AI Act will reshape due diligence in AI-related transactions. Compliance with the AI Act is essential for all players. Step 1 of any due diligence should be to determine or verify the appropriate classification of the AI systems: unacceptable risk, high risk, limited risk or minimal risk. As part of this first step, acquiring companies should also develop their 'AI literacy': the skills and knowledge to better understand how providers and deployers are taking into account their rights and obligations under the AI Act, and to become aware of the opportunities and risks of the relevant AI systems. For high risk AI systems, step 2 of the due diligence requires a tailored and thorough investigation to assess the target's compliance with its obligations under the AI Act.
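For deal teams that track due diligence findings in structured form, the two-step approach above can be sketched as a simple lookup from risk tier to checklist. This is a minimal illustration only, not legal advice: the tier names follow the AI Act's risk categories, but the checklist entries, the `DILIGENCE_FOCUS` mapping and the `diligence_checklist` function are our own hypothetical examples of how such a triage might be organised.

```python
# Illustrative sketch: mapping AI Act risk tiers to example due diligence
# questions. The tier names mirror the AI Act; the questions are examples.
DILIGENCE_FOCUS: dict[str, list[str]] = {
    "unacceptable": [
        "Flag: prohibited practice - the system may not be marketed in the EU",
    ],
    "high": [
        "Quality management system in place?",
        "Technical documentation and logs retained for 10 years?",
        "CE conformity marking present?",
    ],
    "limited": [
        "Transparency obligations met (e.g. AI interaction disclosed to users)?",
    ],
    "minimal": [
        "No specific AI Act obligations - record the classification rationale",
    ],
}

def diligence_checklist(risk_tier: str) -> list[str]:
    """Return example due diligence questions for a given AI Act risk tier."""
    if risk_tier not in DILIGENCE_FOCUS:
        raise ValueError(f"Unknown risk tier: {risk_tier}")
    return DILIGENCE_FOCUS[risk_tier]

# Step 1 determines the tier; step 2 then applies the matching checklist.
for question in diligence_checklist("high"):
    print(question)
```

The point of the sketch is simply that the classification decided in step 1 drives the scope of step 2: a "high" tier triggers the full provider/importer/distributor/deployer review described above, while a "minimal" tier only requires documenting why that classification was reached.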