Developments in law, regulatory guidance, and enforcement practice across Europe are leading to meaningful changes in how online services are offered to minors. A steady stream of announcements in recent months makes clear that this area will continue to develop at pace, requiring providers of online services to keep their approach to age assurance under regular review.

Maturing Regulatory Regimes

At a high level, age assurance requirements in Europe are driven by a combination of privacy and online safety laws, supplemented by sector-specific rules in certain countries. Although the core legal frameworks underpinning these regimes have been in force for some time, it is increasingly regulatory guidance, supervisory engagement, and enforcement activity that are driving how these requirements are interpreted in practice:

  • Privacy and consent requirements. Where a company relies on consent as its legal basis for processing minors’ data under the General Data Protection Regulation (GDPR), it must take reasonable steps to ensure the consent is provided by a parent or person with parental authority. In practice, this may require age verification. Regulatory guidance, such as that issued by the Irish Data Protection Commission (IDPC) and the UK Information Commissioner’s Office (ICO), stresses that age assurance methods must be effective for their intended purpose and proportionate to the risks present in a service. So far in 2026, the ICO has issued two fines related to parental consent, and has also published an open letter calling on companies to strengthen their age assurance practices.
  • Online safety and platform regulation. Providers of “online platforms” are required by the Digital Services Act (DSA) to take steps to protect the privacy, safety, and security of minors, and providers subject to the UK’s Online Safety Act (OSA) can be required to implement age assurance if certain types of harmful content can be accessed on their services. Guidance issued by the European Commission and Ofcom under the DSA and OSA, respectively, largely echoes the approach of privacy regulators, stressing the need for a proportionate, risk-based approach, with providers of high-risk services required to meet more prescriptive standards. Both regulators have active ongoing investigations into services’ compliance with their age assurance duties.
  • Sector and product-specific restrictions. Most countries have national legislation in place that prohibits minors from accessing certain types of high-risk content, such as pornography and online gambling.

The Emergence of the “Social Media Ban”

Inspired by Australia’s “social media ban”, which came into force at the end of 2025, EU lawmakers are now calling for the introduction of a framework that would require a digital minimum age of 16 for access to social media, video sharing, and AI companions, with an exception for 13- to 16-year-olds where parental consent is obtained. This comes alongside a number of other developments for online platforms, including the proposed introduction of a Digital Fairness Act, which would tackle online consumer protection issues, including by requiring mandatory age assurance for digital products accessible to minors that contain certain commercial practices.

At a national level, a number of European governments are considering implementing restrictions that may have a similar effect. For instance:

  • In France, the government is pursuing an accelerated legislative procedure to enact a new bill before September 2026 that would require a minimum age of 15 for access to certain social media services, with exceptions for encyclopedias, educational and scientific content, and open source software development and sharing platforms.
  • In the UK, the government is currently consulting on a range of proposals to strengthen the country’s online safety regime, including a minimum age for social media services and a potential change to the age of digital consent (currently set at 13 in the UK).
  • The Spanish Government has recently approved the draft Organic Law on the protection of minors in digital environments, which is now being examined by Parliament. The draft law raises the minimum age for consent in digital services (including social media) from 14 to 16. In parallel, the Government has announced a separate review of the law governing the right to honor and self-image, which remains at an early drafting stage. That law focuses on protecting personal rights, such as image and voice: the draft specifies that teens over 16 are able to consent to the use of their image, while those below 16 need parental consent unless they have a sufficient degree of maturity. Consent required under this law is distinct from consent under data protection law; it is a commercial consent based on an individual’s right to their own image.
  • In Norway, following a public consultation, the government recently announced that it will introduce a new law prohibiting social media platforms from offering their services to minors under the age of 16, with platform providers made responsible for age verification. The prohibition would not extend to services such as video games, platforms used for communication related to school or extracurricular activities, or services used for buying and selling goods and services, or for housing or job advertisements.
  • In Austria, the government has committed to publishing a draft law by June 2026 that would establish a minimum age of 14 for access to social media services, along with mandatory age-verification requirements for platforms.

Meeting Age Assurance Requirements

Both the DSA and OSA are technology-neutral, and while Ofcom and the European Commission have each issued guidance addressing some common methodologies, most companies have discretion over the most appropriate methodology to implement, in view of the nature of their service and regulatory exposure. For many companies, this may involve adopting a range of measures, including age estimation tools, alongside more robust age verification mechanisms such as document checks or digital identity solutions. The effectiveness of these measures should be assessed on an ongoing basis, taking into account evolving user behavior, known risks, and technological developments.

Most current approaches to age assurance rely on a combination of commercial solutions and bespoke in-house tools. This will likely evolve over time, as a push across jurisdictions—in particular, the EU—toward digital identity frameworks creates opportunities for more standardized and interoperable approaches to age verification. As an early example, in April 2026, the European Commission announced the launch of a new age verification app, which has been designed to work on any device and is fully open source.

Considerations When Implementing Age Assurance

Companies should consider the following when implementing an age assurance strategy:

  1. Identify applicable regulatory frameworks. Expectations to implement age assurance can arise under the GDPR (including through instruments such as the ICO’s Age Appropriate Design Code or IDPC’s Fundamentals), as well as under the OSA (for user-to-user services) and the DSA (for online platforms). Mapping which regimes apply to a service is a key first step, as the applicable standard for age assurance will vary.
  2. Select and design an appropriate methodology. In practice, this will often involve taking a risk-based approach to determine the level of age assurance required and implementing a combination of measures. Companies should also consider how age assurance interacts with the user journey (e.g., whether it should be required only when accessing certain features or elements of a service).
  3. Carry out a DPIA. Data Protection Impact Assessment (DPIA) requirements continue to apply even where age assurance is mandated by law. Whether a company relies on a third-party solution or builds one in-house, carrying out a DPIA is likely to be necessary to identify any risks associated with the processing.

Wilson Sonsini has extensive experience with both data protection and platform regulation. We help clients design practical strategies that meet DSA and GDPR requirements, reduce compliance risks, and support business goals in the EU. If you have any questions, please contact Cédric Burton, Laura De Boel, Yann Padova, Nikolaos Theodorakis, Tom Evans, Marie Catherine Ducharme, or any member of the Data, Privacy, and Cybersecurity practice.