The European Commission published on 14 July 2025 the final version of its highly anticipated Guidelines on the protection of minors online (the Guidelines) [1]. The Guidelines clarify the concrete measures expected of platforms accessible to minors in order to comply with one of their vaguest obligations under the Digital Services Act (DSA): to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service” (Article 28(1)).
The Guidelines reveal the Commission’s intention to reshape the regulatory model for online platforms accessible to minors, with the exception of micro and small enterprises. They aim to shift the compliance paradigm away from a reactive approach focused on takedown of illegal content, towards an ex-ante, risk-based ‘by design’ and ‘by default’ approach.
Age assurance measures, i.e. measures enabling online platforms to establish the age of their users, are central to this new framework. While the French media regulator Arcom expects the Guidelines to “transition the internet to the era of age assurance” [2], the Guidelines acknowledge that such tools may not always be appropriate and proportionate, warranting the adoption of less far-reaching measures in some cases.
Finally, the Guidelines confirm France’s leading role in steering EU regulatory standards for the protection of minors online. The Guidelines follow many of the recommendations formulated by Arcom during the Guidelines’ public consultation process, notably the requirement that adult websites (including pornographic websites) implement effective age verification to check their users’ age [3], as mandated by France’s ‘SREN Law’ adopted in May 2024 and by Arcom’s accompanying technical standard of October 2024 [4].
A risk-based and holistic approach
Echoing the DSA, the Guidelines propose a risk assessment model. Platforms accessible to minors must conduct a risk assessment of their service, at least annually or whenever making significant changes to the platform, to determine which appropriate and proportionate measures to implement [5].
The Guidelines provide an extensive and non-exhaustive list of recommended measures to be implemented if proportionate and appropriate according to the platform’s risk assessment.
One of their most striking characteristics is their holistic nature, addressing all aspects of platform engagement and consumption, including:
- Account settings: Children’s accounts should be set by default to the highest level of privacy, safety and security, such that only accepted contacts can interact with children’s accounts and no account can take screenshots of content posted by minors. Certain features should also be turned off by default, such as the autoplay of videos, push notifications, the display of the number of ‘likes’, the ‘... is typing’ function, filters associated with negative effects on body image, self-esteem and mental health, and recommendations of other accounts.
- Online interface design: Minors should not be exposed to persuasive design features aimed predominantly at engagement, such as the possibility to scroll indefinitely, notifications artificially timed to regain minors’ attention, virtual rewards for performing repeated actions, and prominently displayed AI chatbots and filters. Minors should also be provided with effective time management tools.
- Recommender systems and search features: Recommender systems should prioritise explicit user signals over behavioural signals from children, prevent the recommendation of illegal or harmful content (such as content promoting unrealistic beauty standards or dieting), and enable children to reset their recommended feeds.
- Manipulative or addictive commercial practices: Minors should not be exposed to an excessive volume, frequency or recommendation of commercial content, to integrated AI systems such as chatbots that nudge children for commercial purposes, or to practices that can lead to excessive or unwanted spending, such as paid loot boxes and virtual currencies.
- Moderation, reporting and parental control tools: Measures should be introduced to improve moderation and reporting tools, e.g. ensuring that human review is available in addition to automated content moderation, guaranteeing prompt feedback and appropriate response times, and providing effective parental control tools, support resources and appropriate warnings.
- Governance: Internal policies and procedures should be implemented to ensure and monitor the protection of minors. Dedicated human resources should be assigned to this task, and the relevant staff should be trained.
- Terms and conditions: Transparency disclosures must explain the measures deployed to protect minors, and information presented to children must be child-friendly, age-appropriate, and easily understandable.
A focus on age assurance measures
Age assurance as part of a broader toolkit
While the drive to implement age-based access restrictions has risen up the political agenda across the EU in recent years, most recently with the Commission unveiling a blueprint for an EU-wide age verification solution for determining whether users are over 18 [6], the Guidelines are a reminder that pulling up the drawbridge on access is only one piece of the compliance jigsaw, and not the only possible solution.
Indeed, the Guidelines provide that, before deciding whether to implement access restrictions supported by age assurance methods, platforms should always conduct an assessment to determine whether such measures are appropriate and proportionate, or whether other less far-reaching measures described in the Guidelines may already achieve a high level of protection, for example by age-restricting only the functions, sections or content which pose a risk to minors rather than the service as a whole [7]. Age assurance should be considered a complementary tool to other measures, rather than a substitute for them [8].
What type of age assurance measures should be implemented?
The Guidelines distinguish between three broad categories [9] of age assurance measures:
- Self-declaration methods rely on individuals to confirm their age or age range, typically by ticking a box online to declare that they are above a certain age;
- Age estimation methods allow a provider to establish that a user is ‘likely’ to be of a certain age, fall within a certain age range or be over or under a certain age; and
- Age verification systems rely on physical identifiers or verified sources of identification that provide a ‘high degree of certainty’ in determining the user’s age.
Under the Guidelines, self-declaration is not considered to be an appropriate measure, due to its lack of robustness and accuracy. Instead, platforms should assess whether age verification and/or age estimation is appropriate.
The Commission recommends the most accurate form of age assurance – access restriction supported by age verification – where:
- Products or services pose a high risk to minors, e.g. (i) the sale of alcohol, tobacco or nicotine-related products, or drugs; (ii) access to pornographic content; (iii) access to gambling content;
- The service’s T&Cs prescribe a minimum age of 18 years or older, even if there is no age requirement established by law;
- EU or national law sets a minimum age to access certain services, e.g. defined categories of online social media services; and
- The provider has identified risks to minors which cannot be mitigated by other less intrusive measures.
Since the publication of the Guidelines, Arcom has confirmed that the standard set by the Guidelines for verifying minors’ age on adult websites across the EU is equivalent to that of its own technical standard on pornographic websites of October 2024, providing welcome certainty for platforms already compliant with Arcom’s standard in France.
Finally, access restriction supported by age estimation is recommended in cases where:
- The service’s T&Cs prescribe a minimum age lower than 18, e.g. social media platforms which prescribe a minimum age of 13; and
- The provider has identified medium risks to minors which cannot be mitigated by less restrictive measures.
Who should implement age assurance measures?
The DSA states that “providers of online platforms accessible to minors” are responsible for complying with Article 28(1). The Guidelines add that where a third party is used to carry out age assurance, the online platform remains responsible for ensuring that the age assurance method is effective, including where the provider relies on solutions provided by operating systems or device operators.
Nonetheless, a fierce debate continues to divide technology companies over who should implement age assurance systems. Major online platforms advocate pushing age verification obligations down the tech stack, to app stores or operating system providers. Device makers, on the other hand, argue that the onus should rest with online platforms, which are “closer” to the content and thus better placed to implement tailored gating solutions.
Beyond age assurance measures, the EU and/or member states may also impose complementary requirements (to the extent they are compliant with EU law) on different types of tech operators in order to protect minors online. For instance, the French legislator has introduced a requirement for manufacturers to install parental control systems on devices connected to the internet [10].
Conclusion and Outlook
The Guidelines provide a long-awaited path towards a harmonised regulatory framework for the protection of minors online across the EU. Rather than adopting an overly prescriptive one-size-fits-all model, they take a flexible, risk-based approach, cognizant that different online platforms may pose different types of risks to minors.
Still, implementation will be no small feat. For many platforms, the Guidelines introduce significant new processes, such as carrying out risk reviews, implementing protective measures, developing and updating policies, and complying with governance and transparency obligations. The challenge will be particularly steep for platforms which are neither Very Large Online Platforms nor Very Large Online Search Engines, and which are therefore not already subject to the systemic risk assessment requirement under Article 34 DSA. Risk assessments under Article 28(1) are also likely to overlap with other risk assessment procedures under the GDPR and the AI Act.
Despite the Guidelines’ aim to adopt a unified approach, national divergences may also persist and raise compliance challenges, particularly regarding the establishment of a minimum age for accessing social media.
Setting a minimum age for accessing social media is a political priority in several member states, including France, Denmark, Spain and Greece. In order to achieve such a ‘digital majority’, France enacted a law in July 2023 [11] requiring users to be at least 15 to register on social media platforms (unless their parents give consent) and requiring platforms to verify users’ age against an age verification standard set by Arcom. The law’s entry into force depended on an implementing decree, which was never adopted after then-EU Commissioner Thierry Breton expressed concerns, in a 2023 letter to the French Minister for Europe and Foreign Affairs, that the law seemed contrary to the DSA [12].
French officials claim that the Guidelines leave the door open for national laws to set such a minimum age and give France the green light to implement one [13]. During her annual State of the Union address on 10 September 2025, Commission President Ursula von der Leyen signalled that the EU was also looking into an EU-wide social media ban for minors: she pledged to commission a panel of experts to advise her, by the end of 2025, “on the best approach for Europe” to restrict minors’ access to social media [14]. Stakeholders will be watching closely for the panel’s findings and the developments that follow, at both EU and national levels.
In any case, the Guidelines will provide a compliance framework for future technical standards adopted by national regulators or legislators. In France, this includes the forthcoming technical standard on age assurance solutions aimed at protecting minors from commercial communications for gambling, which Arcom is expected to adopt [15].
Finally, the Commission has indicated that, although following the Guidelines is voluntary and does not automatically guarantee compliance, they will be considered a significant and meaningful benchmark when assessing platforms’ compliance with Article 28(1) DSA. This should prompt platforms accessible to minors to review their compliance frameworks.
[1] The Guidelines follow a period of public consultation on the draft Guidelines, which began in May 2025.
[2] Arcom’s contribution to the public consultation on the guidelines on the protection of minors online under the Digital Services Act.
[3] Arcom, Référentiel technique sur la vérification de l’âge pour la protection des mineurs contre la pornographie en ligne (technical standard on age verification for the protection of minors against online pornography), October 2024.
[4] Law No. 2024-449 of 21 May 2024 to secure and regulate the digital space (the ‘SREN Law’).
[5] Platforms should also make the results of the review publicly available on the online interface of the service.
[6] The blueprint will be tested by five member states (Denmark, France, Greece, Ireland and Spain), online platforms and end users. Once finalised, the model will provide a reference standard for age verification solutions in the EU, which can then be implemented through apps (provided by public or private entities) or integrated into the upcoming EU Digital Identity Wallet (due to become available by December 2026).
[7] Such as adult-restricted sections of a social media platform, or sections containing adult-restricted commercial communications or product placement.
[8] The Commission considers that platforms that deem such measures to be necessary and proportionate should publish information about the identified age assurance measures, their adequacy and effectiveness, as well as an overview of the performance metrics used to measure this.
[9] Mapping age assurance typologies and requirements, European Commission, 2024.
[10] Law No. 2022-300 of 2 March 2022 aimed at reinforcing parental control over means of internet access, and its implementing Decree No. 2023-588 of 11 July 2023.
[11] Law No. 2023-566 of 7 July 2023 aimed at establishing a digital majority and combating online hate.
[12] Letter from then-Commissioner Thierry Breton to the French Minister for Europe and Foreign Affairs, 2023 (1696663577344-Lettre Thierry Breton.pdf).
[13] https://www.lexpress.fr/societe/reseaux-sociaux-le-gouvernement-relance-son-projet-dinterdiction-pour-les-moins-de-15-ans-GRK44PEA2JBFPDQHM3OX7AAELE/
[14] 2025 State of the Union Address by President von der Leyen.
[15] Law No. 2023-451 of 9 June 2023 aimed at regulating commercial influence and combating the abuses of influencers on social networks.