On May 13, 2025, the European Commission (EC) published draft guidelines on the protection of minors online. The guidelines outline the proposed measures that the EC expects online platforms accessible to minors to take to protect minors’ privacy, safety, and security in line with requirements under the Digital Services Act (DSA).
The guidelines come at a formative time in the development of the legal landscape governing minors’ use of online services. In April 2025, Ofcom published its Protecting Children from Harms Online Statement, which provides guidance on the steps that user-to-user services and search services should take to comply with the UK Online Safety Act (OSA). Taken together, these developments highlight the need for platforms to adopt coherent and flexible compliance strategies.
Requirement Under the DSA to Protect Minors
The DSA requires providers of online platforms that are accessible to minors to implement measures aimed at ensuring their protection. An online platform is considered accessible to minors when: i) its terms and conditions permit minors to use the service; ii) its service is directed at or predominantly used by minors; or iii) the provider is otherwise aware that some of the recipients of its service are minors. “Minors” and “children” are defined as individuals under the age of 18 for the purposes of the guidelines.
The guidelines, which have been highly anticipated, elaborate on the language used in the DSA, clarifying that the harms from which minors should be protected include those relating to illegal or harmful content, unwanted contact, cyberbullying, harmful communications, and the extensive use or overuse of a platform.
What Is the Role of Age Assurance?
The guidelines clarify that age assurance will not always be necessary and that, in some cases, it may be proportionate to manage risks to minors through alternative safeguards. Providers should therefore assess whether age assurance is appropriate before implementing such technologies.
The guidelines build on the European Data Protection Board’s statement on age assurance, which we discussed here, and reiterate that, where age assurance is implemented, the onus is on the provider to estimate or verify the age of its users. There are generally three types of age assurance measures:
- Age verification, which uses physical identifiers or verified sources of ID (such as government-issued IDs, including the EU Digital Identity Wallet) that provide a high degree of certainty in determining the age of the user. This is required for certain high-risk services, such as those designed for an adult audience only or where terms and conditions require a minimum age of 18.
- Age estimation, which allows a provider to establish that a user is likely to be of a certain age or fall within a certain age range. This is required for medium-risk services and those which require a minimum age lower than 18 to access the service.
- Self-declaration, which relies on the individual to supply their age. The EC confirms in this guidance that self-declaration alone is not an appropriate age assurance method.
Providers should make more than one age assurance method available (e.g., two different age verification or estimation methods, or one verification and one estimation method). This should help avoid excluding users who cannot avail themselves of a specific age assurance method. To assess the effectiveness of an age assurance method, companies should consider factors such as the accuracy, reliability, and robustness of the method, balanced against users’ rights and freedoms.
What Is Required?
The guidelines state that providers of online platforms should carry out a risk review focused on identifying i) the likelihood of a minor accessing their service; ii) the risks posed by the online platform to minors; iii) measures that are or could be taken to mitigate those risks; and iv) the impact that those measures could have on the rights of minors. The guidelines provide an extensive list of measures that could be taken to protect minors, if proportionate. These include:
- Taking steps to protect minors against economically exploitative practices. The guidelines discourage commercial practices that use design to drive purchases, including hidden or disguised advertising (such as product placements by influencers), practices that lead to excessive spending or addictive behaviors (such as gambling-like features), and the marketing of products that may have an adverse impact on users. Providers are expected to take the proactive measures listed in the guidelines to ensure a safe online environment for minors, such as implementing a responsible marketing and advertising policy.
- Setting accounts for minors to the highest level of privacy, safety, and security by default. Default settings should prevent unwanted contact by individuals seeking to harm minors. For instance, only accounts that a minor has previously accepted as contacts should be able to see that minor’s content and posts, and any geolocation, microphone, or camera features should be turned off. By default, minors should also not be exposed to features such as infinite scrolling or notifications artificially timed to regain their attention.
- Implementing effective and minor-friendly user reporting, feedback, and complaint tools. The guidelines expand on the DSA’s notice and takedown requirements, including by recommending that all users be able to report content or activities that are inappropriate or undesirable for minors. Reports and complaints submitted by minors should also be prioritized.
- Adopting good platform governance. Providers should establish governance practices such as i) internal policies and ii) dedicated personnel or teams responsible for minors’ privacy, safety, and security. Platforms should also adopt monitoring and evaluation practices.
- Tailoring terms and conditions for minors. Platforms should ensure that their terms and conditions detail how minors are protected from harmful content and behavior. The terms should be searchable, easy to find, and enforced.
Next Steps
The guidelines are subject to public consultation until June 10, 2025, with the final version expected to be published in summer 2025. Providers of online platforms that are subject to regulation under both the DSA and the OSA should consider how to benefit from areas of potential overlap between the two regimes.
Join Wilson Sonsini’s Data, Privacy, and Cybersecurity practice for a timely webinar exploring the latest legal developments impacting online services in the U.S., EU, and UK. This session will feature a special focus on children’s privacy and online safety, highlighting key regulatory changes, compliance deadlines, and best practices. Register for the webinar, “Children’s Online Protection: An Update on Key Legal Developments Across the US, EU, and UK,” scheduled for June 10 at 12 p.m. ET / 6 p.m. CET, here.
Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex digital regulation and privacy compliance in the UK and EU. For more information, please contact Cédric Burton, Laura De Boel, Tom Evans, or any member of the Data, Privacy, and Cybersecurity practice.
Marie Catherine Ducharme, Claudia Chan, Jessica O’Neill, and Talya Dostes contributed to the preparation of this post.