The year 2025 was characterised by dynamic developments in EU and German data protection law. New legislative initiatives and regulatory amendments, particularly in the area of EU digital regulation and the use of artificial intelligence (AI), have significantly changed the data protection framework. National legislation and policy initiatives in Germany are also paving the way for the data economy.
This article highlights the most important developments of the past year.
What's new in EU digital regulation?
In November 2025, the European Commission published its draft "Digital Omnibus". It aims to simplify the system of European digital regulation, including the frameworks on data, AI, cybersecurity and platforms, without reducing the existing safeguards. Changes to the EU General Data Protection Regulation (GDPR), the Data Governance Act and the AI Act, among others, are expected. For example, the obligation on providers and deployers, in effect since 2 February 2025, to ensure the AI literacy of persons using AI systems on their behalf is to be dropped. Instead, the European Commission and the Member States are to bear this responsibility as part of their overall policies.
In addition, the applicability of important transparency obligations for providers and deployers of low-risk and high-risk AI systems is to be postponed. The transitional periods continue to run in 2026 (find out more here: AI Act – new transitional periods and German market surveillance).
Data Act applicable since 12 September 2025
The obligations under the Data Act have been binding since 12 September 2025. The Data Act aims to create a comprehensive legal framework that harmonises access to and use of data across the EU. Among other things, users of IoT products have a right to access the data generated by the use of their products and services.
In 2025, the European Commission also published the "Final Report of the Expert Group on B2B data sharing and cloud computing contracts". Of particular relevance are the Commission's non-binding Model Contractual Terms (MCTs) for various data sharing scenarios (e.g. data holder to user) and Standard Contractual Clauses (SCCs) for key aspects of cloud computing contracts (e.g. switching and termination) under Article 41 Data Act, which are now publicly available.
In Germany, the national implementing acts are still at the draft stage (Data Regulation Application and Enforcement Act (DADG)). The Federal Network Agency (BNetzA) is to be the central supervisory authority for the application and enforcement of the Data Act. The CDU/CSU and SPD government elected in 2025 also intends to take action on other data issues.
Data protection and data use under the new federal government
The coalition agreement was presented on 9 April 2025. It contains several statements on data protection and data use: the coalition parties have set themselves the goal of reducing bureaucracy and reforming data protection law. At the same time, the use of data under the German Federal Data Protection Act (BDSG) is to be simplified in favour of innovation and research. The data economy already envisaged at EU level has also found its way into the coalition agreement: in order to make Germany a strong digital location, a "culture of data utilisation and data sharing" is to be established and data treasures are to be unlocked. The coalition agreement also envisages creating a basis for consolidating regulations in a Data Code (Datengesetzbuch).
AI remains a hot topic
The federal government elected in 2025 has also flagged AI as one of its focus areas. It wants to make Germany an "AI nation" and place AI at the centre of its economic and technology policy strategy. Plans include massive investments, a national gigafactory and the establishment of regulatory sandboxes. Although the federal government wants to place the emphasis on openness to innovation, there are also data protection regulations that need to be taken into account when using AI, particularly in areas where sensitive data are involved. How this will all actually be implemented in practice remains to be seen.
The legislature is also taking action on AI. On 12 September 2025, for example, the draft of an "AI Market Surveillance and Innovation Promotion Act" (KI-MIG) was published, building on a draft from the previous year. The KI-MIG indicates how Germany could structure its supervisory framework and organise its authorities to implement the AI Act. For all areas without an existing or legally assigned supervisory authority, the Federal Network Agency is to be the competent market surveillance authority and notifying authority. The draft also contains provisions on cooperation between the competent market surveillance authorities and on the involvement of other authorities, in particular the national data protection authorities. The latter take a critical view of the draft and have said so in various press releases, for example that of Berlin's Commissioner for Data Protection and Freedom of Information.
But it is not just the federal government and the legislature who are addressing the topic of AI. As in previous years, the use of AI is a hot topic throughout the data protection scene. The data protection authorities now provide data controllers with all kinds of guidelines and tools in this regard.
Guidance for risk management of AI systems
In order to protect personal data, the European Data Protection Supervisor (EDPS) published its Guidance for Risk Management of AI Systems on 11 November 2025. The guidance aims to provide insights and practical recommendations to help identify and mitigate common technical risks associated with AI systems. It includes a checklist for each phase of the AI system lifecycle.
Guidance on the data protection implications of generative AI systems using the RAG method
The Conference of Independent Data Protection Supervisory Authorities of the Federal Government and the Länder (Data Protection Conference) has published version 1.0 of its Guidance on the data protection implications of generative AI systems using the Retrieval Augmented Generation (RAG) method. RAG systems combine large language models with targeted access to proprietary knowledge sources in order to provide context-specific responses and increase the accuracy and reliability of AI output, e.g. internal chatbots that access current business data. This allows companies and public authorities to leverage the advantages of modern AI while reducing risks to data subject rights, provided that transparency, purpose limitation and the rights of data subjects are maintained. However, data protection challenges such as transparency and the evaluation of individual processing operations remain and require ongoing technical and organisational measures (TOMs).
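The retrieval step at the heart of the RAG pattern can be sketched in a few lines. The following is a simplified illustration only, not a real implementation: all names and documents are hypothetical, naive keyword matching stands in for vector search, and no actual language model is called. It shows the point at which retrieved internal documents, including any personal data they may contain, flow into the model prompt and thus into the processing.

```python
# Simplified, hypothetical sketch of the Retrieval Augmented Generation (RAG)
# pattern: retrieve internal documents relevant to a query, then combine them
# with the query into a prompt for a language model.

def retrieve(query: str, knowledge_base: list[str], top_k: int = 2) -> list[str]:
    """Rank internal documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    scored = sorted(
        knowledge_base,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, context: list[str]) -> str:
    """Combine retrieved context with the user query.

    Retrieved documents may contain personal data; this is the step where
    safeguards such as transparency, purpose limitation and TOMs must apply.
    """
    joined = "\n".join(f"- {doc}" for doc in context)
    return f"Answer using only this context:\n{joined}\n\nQuestion: {query}"

# Hypothetical internal knowledge base of a company chatbot.
kb = [
    "Travel expense claims must be filed within 30 days.",
    "The cafeteria opens at 8 am.",
    "Expense claims above EUR 500 need manager approval.",
]
question = "How do I file an expense claim?"
context = retrieve(question, kb)
prompt = build_prompt(question, context)
```

In a production system, the retrieval step would typically use embeddings and a vector database, which is precisely where access controls and deletion concepts become relevant for data protection compliance.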
AI and data protection before Cologne Higher Regional Court: Training AI with user data permitted
AI is also already playing an important role in the courts. In May 2025, for example, Cologne Higher Regional Court made an important decision in summary proceedings on the subject of AI training and data protection (Cologne Higher Regional Court, judgment of 23 May 2025 – 15 UKl 2/25). In the proceedings, the Higher Regional Court rejected an application by the NRW consumer advice centre to prohibit Meta from using publicly shared Facebook and Instagram data of adult users to train a large language model (LLM). The consumer advice centre complained, among other things, of breaches of the General Data Protection Regulation (GDPR) and of the prohibition on combining data pursuant to Article 5 (2) (b) Digital Markets Act (DMA) and sought an injunction. Cologne Higher Regional Court rejected the application and held that the data processing was necessary in particular because a legitimate interest was being pursued in accordance with Article 6 (1) (f) GDPR.
Further important data protection decisions from 2025
In 2025, the national and international courts once again dealt extensively with the General Data Protection Regulation (GDPR) and issued decisions in many relevant proceedings. However, not every case came to a conclusion. In its decision of 3 September 2025 (T-553/23), the General Court dismissed the action for annulment of the EU adequacy decision on the transfer of personal data between the EU and the US and confirmed that the US guaranteed an adequate level of data protection at the time of the decision. It is now confirmed that the CJEU will rule on the matter: the MEP and claimant in the Latombe case has lodged an appeal against the General Court's decision, and Latombe v Commission is now pending before the CJEU under case number C-703/25 P. What is noteworthy about the General Court's decision is that the Court did not rule on admissibility, namely the claimant's standing to bring an action, but expressly left this open and ruled exclusively on the merits. Standing is questionable in these proceedings, as an individual is challenging an adequacy decision. The transfer of data to the US therefore remains an ongoing issue, and it remains to be seen how the CJEU will rule on both admissibility and the merits.
CJEU clarifies the concept of personal data
In case C-413/23 P, the CJEU clarified the concept of personal data in the context of the transfer of pseudonymised data to third parties at the beginning of September 2025. In its decision, the CJEU clarified that pseudonymised data do not automatically constitute personal data for all recipients. Rather, it depends on whether third parties can identify the data subject with reasonable effort. The decisive perspective for the assessment of identifiability is that of the data controller at the time of data collection, whereby the circumstances of the individual case must always be taken into account.
Responsibility of the operator of an online marketplace website within the meaning of Article 4 (7) GDPR
Of particular note is the decision of the CJEU of 2 December 2025 (C-492/23). In this case, the CJEU ruled that the operator of an online marketplace website is a controller in respect of the processing of personal data contained in advertisements published on its platform. In particular, before publishing advertisements, the controller must identify those which contain sensitive data and verify whether the advertiser is actually the person whose data is contained in the advertisement or has the explicit consent of the data subject. If such consent has not been given, publication of the advertisement must be refused. According to the CJEU, compliance with these obligations must be ensured by appropriate TOMs.
This decision of the CJEU has far-reaching implications for operators of online marketplaces, as it considerably expands their data protection obligations and establishes clear guidelines for handling (sensitive) personal data. The CJEU is thus setting new standards for control and prevention on digital platforms and increasing the requirements for TOMs.
Important for newsletters: the CJEU rules on "soft opt-ins"
With its decision of 13 November 2025 (C-654/23), the CJEU also issued an important ruling on the permissibility of newsletters in connection with the sale of a product or service, the so-called "soft opt-in". Firstly, the CJEU interprets the term "direct marketing" broadly: an email address is also deemed to have been obtained "in connection with the sale of a product or service" if the service is free of charge for the user but other paid services can be ordered via it, or if the service is financed by advertising. Secondly, the CJEU ruled that the conditions for the lawfulness of processing set out in Article 6 (1) GDPR do not apply (in parallel) where the controller uses a user's email address to send direct marketing in accordance with Article 13 (2) of the Directive on privacy and electronic communications, as that provision supersedes the GDPR as the more specific standard. Article 13 of this "ePrivacy Directive" was implemented in Germany in section 7 German Unfair Competition Act (UWG).
Right of consumer protection associations and competitors to pursue data protection breaches in competition law proceedings
In its judgment dated 27 March 2025 (I ZR 186/17) the German Federal Court of Justice ruled that a social media provider had breached Article 12 (1) sentence 1, Article 13 (1) (c) and (e) GDPR because it had not provided users with sufficient information in a generally understandable form about the nature, scope and purpose of the collection and use of personal data, the legal basis for the processing and the recipients of the personal data. According to the German Federal Court of Justice, this constitutes a violation of competition law pursuant to section 5a (1) German Unfair Competition Act (UWG), which can be pursued by consumer protection associations before the civil courts.
On the same day, the German Federal Court of Justice ruled in two other cases (I ZR 222/19, I ZR 223/19) that pharmacists who sell medicine on an online platform breach data protection provisions if they collect and use their customers' order data without their express consent. According to the Federal Court of Justice, the order data constitutes health data within the meaning of Article 9 GDPR. According to the Federal Court of Justice, these breaches can be prosecuted by competing pharmacists by taking legal action under the law on unfair competition. The court confirmed that Article 9 (1) GDPR is a market conduct rule within the meaning of section 3a German Unfair Competition Act (UWG).
Transmission of "positive data" to credit agencies is permissible
Recently, the courts have dealt with a large number of cases in which claimants sought compensation from a mobile phone company under Article 82 GDPR due to the disclosure of personal data by the company to a credit agency. According to our analysis, the majority of these claims were dismissed on the grounds that the General Data Protection Regulation (GDPR) had not been breached (and/or that no damage had occurred), as most courts considered the processing to be permitted under Article 6 (1) (f) GDPR.
The German Federal Court of Justice has now confirmed this assessment in its judgment of 14 October 2025 (VI ZR 431/24). The case did not concern compensation under Article 82 GDPR, but rather the permissibility of the data processing itself. The action was not brought by a data subject but by a consumer organisation that objected to the transfer of positive data. Its action for an injunction was dismissed by the lower court, and this decision has now been upheld by the German Federal Court of Justice. The court considers the transfer to the credit agency of the consumer master data required for identity verification, as well as of information that a contractual relationship has been established or terminated, to be justified under Article 6 (1) (f) GDPR, with fraud prevention constituting a sufficient legitimate interest on the part of the defendant. This decision will have an impact on compensation proceedings in similar cases.
Compensation under the GDPR continues to be one of the top issues before the courts
In 2025, the issue of compensation in accordance with Article 82 GDPR has again been one of the top issues for the courts. Of the approximately 180 sets of proceedings that we included in our overview of case law on Article 82 General Data Protection Regulation (GDPR) in 2025, our analysis shows that around 120 legal actions have been dismissed so far in 2025. The amount of compensation awarded ranges from EUR 100 to EUR 5,000.
On 28 January 2025, the German Federal Court of Justice ruled that sending an advertising email without the consent of the recipient does not give rise to a claim for compensation (VI ZR 109/23). According to the court, the claimant's pleading did not allow the Court of Appeal to conclude that he had suffered non-material damage as a result of the unauthorised use of his email address: there was no finding of a loss of control, nor could the claimant's fear of such a loss be sufficiently substantiated.
On 28 January 2025, the German Federal Court of Justice also ruled on the transfer of data to a credit agency by a mobile phone company (VI ZR 183/22). At the time of the transfer, the payment claim asserted was disputed between the parties. By way of counterclaim, the person concerned asserted GDPR claims for compensation. The decision primarily concerned the amount of compensation under Article 82 GDPR: as Article 82 GDPR has a purely compensatory and not a punitive function, the German Federal Court of Justice held that the circumstances of the breach could not be taken into account to the defendant's detriment when assessing the amount. Overall, a claim for compensation of EUR 500 was deemed appropriate.
Negative feelings as non-material damage
In response to a referral from the German Federal Court of Justice (order of 26 September 2023 – VI ZR 97/22), the CJEU ruled in its decision of 4 September 2025 (C-655/23) that the term "non-material damage" within the meaning of Article 82 GDPR can include negative feelings experienced by the data subject as a result of the unauthorised transfer of their personal data to a third party, such as worry or annoyance caused by a loss of control over the data, its possible misuse or damage to reputation. However, according to the CJEU, this presupposes that the data subject proves both that they experienced such feelings and their negative consequences as a result of the relevant GDPR breach. The degree of fault is not to be taken into account when calculating the amount of compensation.
The underlying case concerned a bank erroneously forwarding applicant data to an uninvolved third party during the job application process. The applicant was not informed promptly that the data had been forwarded erroneously. In the proceedings, the claimant asserted that he had not only suffered an abstract loss of control over the data, but that the data had been passed on to a third person known to him and working in the same industry. At first instance, Darmstadt Regional Court (judgment of 26 May 2020 – 13 O 244/19) awarded the data subject EUR 1,000, while Frankfurt a.M. Higher Regional Court (judgment of 2 March 2022 – 13 U 206/20) dismissed the action at the next instance.
Compensation for failure to meet information obligations in connection with search engine searches
In its judgment of 5 June 2025 (8 AZR 117/24), the German Federal Labour Court upheld the decision of the lower court, Düsseldorf Higher Labour Court (judgment of 10 April 2024 – 12 Sa 1007/23), and ruled that the compensation awarded by the lower court under Article 82 GDPR was appropriate in the case in question.
The case to be decided by the German Federal Labour Court concerned information that was not properly provided to an applicant who was rejected by the company. The potential employer had found out about the applicant's criminal conviction (which was not yet final and absolute) through a search engine search, but had not informed him about this. According to the German Federal Labour Court, the claimant had not proven any loss beyond the EUR 1,000 already awarded by the lower court. Claims for compensation by employees in the case of data protection breaches are a recurring topic in case law and will continue to be relevant in 2026.
2026: the suspense continues with regard to GDPR compensation
In the large number of proceedings in which claimants are seeking compensation from a mobile phone company pursuant to Article 82 GDPR due to the disclosure of positive data to a credit agency, a referral was made to the CJEU in 2025. While most national courts had previously dismissed the claims on the assumption that the transfer was covered by Article 6 (1) (f) GDPR or that no damage had occurred in any case, Lübeck Regional Court referred the question to the CJEU in its order of 4 September 2025 (15 O 12/24) as to whether the provision applies to these cases and whether this provision should be interpreted as meaning that it "cannot justify the transfer of positive data from mobile phone companies to credit agencies organised under private law without the consent of the data subjects, at least if the credit agencies then also use the transferred data for profiling (scoring)."
Another question referred by Lübeck Regional Court concerns Article 82 GDPR. The Regional Court would like to know from the CJEU whether Article 82 (1) and (2) GDPR are to be interpreted to the effect that a loss of control can also exist "if positive data was transmitted by mobile phone companies to credit agencies organised under private law without the consent of the data subject and not deleted there until after well over a year at the earliest and the consumer concerned was informed of the data transmission when the contract was concluded". In the pending proceedings (C-594/25), the CJEU now has to make a decision.
Enforcement tracking: GDPR fines exceed the EUR 5 billion mark
According to the latest edition of the CMS Enforcement Tracker Report 2025, a total of 2,560 fines were recorded in the period under review up to the editorial deadline of 1 March 2025, almost 160 more than in the previous assessment period. The average fine across all countries for the entire reporting period was around EUR 2.36 million. In addition, the total amount of fines exceeded the EUR 5 billion mark for the first time in spring 2025, standing at around EUR 5.65 billion at that time. Some of the highest individual fines of the year ranged between EUR 125 million and EUR 530 million.
Europe-wide review of the right to erasure and the topic of the 2026 coordinated action
After the right to erasure under Article 17 GDPR was the subject of the data protection authorities' coordinated action in 2025, the EDPB announced the topic of the 2026 coordinated action on 14 October 2025: compliance with the transparency and information obligations set out in Articles 12 to 14 GDPR. These provisions ensure, among other things, that data subjects are informed when their data are processed. The aim of the coordinated action is to assess how companies and authorities implement the GDPR, and the data protection authorities also hope to gain an overview of best practices by controllers. The report on the results of the 2025 coordinated action on the right to erasure will follow shortly.
The year 2025 has shown that data protection law is characterised by dynamism and adaptation to new technological developments. The case law of the CJEU and the national courts has given concrete shape to the requirements for proving non-material damage and to the interpretation of central GDPR provisions. The increasing number and amount of fines underline the growing importance of data protection for companies. Overall, data protection law remains an exciting and constantly evolving field.
In this blog post, we have prepared and summarised the most important news of the year from our quarterly newsletter for you. You can subscribe to the newsletter here (German).