Legislators and regulators across the European Union (EU) and the United Kingdom (UK) are intensifying efforts to enhance the protection of minors online, responding to growing concerns about children’s safety in the digital space. Recent regulations (including the EU Digital Services Act) and guidance impose increasingly strict obligations on providers to restrict children’s access to harmful content.
To restrict access to such content, many providers, including social media, content sharing, and gaming platforms, use age verification systems built on a range of technologies, including payment card verification, ID verification, and age estimation based on a face scan. While these technologies can contribute to the protection of children, they must also comply with privacy and data protection laws, which include principles such as data minimization and limited retention periods.
Key Takeaways
The European Data Protection Board (EDPB), composed of the national data protection authorities (known as supervisory authorities, or SAs) of the EU, is currently working on a set of age verification guidelines. In the absence of pan-EU guidance, regulators across the EU are moving in to fill the void by publishing their own regulations and guidance on age verification.
For example, Spain published guidance on age verification and protection of minors from inappropriate content in December 2023 setting forth 10 principles for systems to protect minors from inappropriate content. Similarly, regulators in Germany (including privacy, media, and the Digital Services Act regulators) jointly published guidance (in German) in October 2024 suggesting nine specific cornerstones for building an age verification system. The French Data Protection Authority (CNIL) has also published recommendations on the protection of minors online which provide that age verification of minors and collection of parental consent should comply with privacy principles.
In addition, France has taken a more aggressive approach by adopting several laws imposing strict standards on providers of specific platforms with respect to age verification measures. In July 2023, France adopted a law [1] to prevent minors under the age of 15 from creating social media accounts without parental consent, although this law is currently not applicable due to potential inconsistencies with European Union law. More recently, France adopted another law aimed at securing and regulating the digital space (SREN Law) [2] under which providers of adult content (PAC) located in France, outside the EU and, under specific conditions, in other EU Member States, must verify that users are over the age of 18 before granting them access to such content. In October 2024, France published a detailed mandatory technical standard [3] applicable to age verification systems (AVS) used by PACs subject to the SREN Law. This standard became applicable in January 2025 and failure to comply with it can result in fines up to 150,000 euros or two percent of the worldwide annual turnover for a first offense, whichever is higher.
This technical standard is legally binding only for PACs, but it provides useful insight into what regulators would consider an effective and compliant AVS. The core principles are:
- Reliability. AVS should be reliable, which can include being able to detect liveness when estimating the age of an individual using their facial features (i.e., ensuring the user is not showing the photograph of an adult) or determining that an ID card is real and not a copy. Specific measures to prevent the circumvention of the AVS should also be implemented.
- Independence of the AVS provider. AVS providers must be legally and technically independent of any PACs and must guarantee that PACs do not under any circumstances have access to the data used to verify the user’s age.
- Safeguarding rights. AVS providers should allow users to choose between various age verification methods and to challenge the age verification result with the AVS.
- No discrimination. AVS should be effective for the whole population and should not be discriminatory.
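The independence principle above implies a "double-blind" data flow: the AVS provider attests to the user's age without telling the content provider who the user is, and without itself learning which site the user is visiting. As a purely illustrative sketch (all function names are hypothetical, and an HMAC with a key held by the AVS stands in for a real asymmetric signature scheme), an anonymous age token might flow like this:

```python
import hashlib
import hmac
import json
import secrets
import time

# Illustrative sketch only: the AVS issues a short-lived, anonymous proof
# of age; the content provider learns nothing beyond "over 18".
AVS_KEY = secrets.token_bytes(32)  # held by the AVS provider only

def avs_issue_token(user_is_over_18: bool) -> dict:
    """AVS verifies the user's age out of band, then issues a token that
    names no user and no destination site."""
    if not user_is_over_18:
        raise PermissionError("age check failed")
    payload = {
        "claim": "over_18",
        "nonce": secrets.token_hex(8),
        "exp": int(time.time()) + 300,  # short validity, limited retention
    }
    msg = json.dumps(payload, sort_keys=True).encode()
    return {"payload": payload,
            "sig": hmac.new(AVS_KEY, msg, hashlib.sha256).hexdigest()}

def avs_verify(token: dict) -> bool:
    """Signature check; the call carries no information about which site
    is asking or which user presented the token."""
    msg = json.dumps(token["payload"], sort_keys=True).encode()
    expected = hmac.new(AVS_KEY, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(token["sig"], expected)

def site_accept_token(token: dict, verify) -> bool:
    """Content provider: accepts the proof without ever seeing identity data."""
    p = token["payload"]
    return p["claim"] == "over_18" and p["exp"] > time.time() and verify(token)

token = avs_issue_token(True)
print(site_accept_token(token, avs_verify))  # True
```

The key point of the sketch is that the token's payload contains only the age claim, a nonce, and an expiry, so neither party can reconstruct the user's identity or browsing history from it.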
In the UK, the Online Safety Act (OSA) will enter into force in phases over the course of the coming year. Compliance with the legislation will in practice require many providers of regulated services to implement “highly effective” age verification or age estimation measures. Ofcom (the UK regulator for communications services) introduced draft guidance in 2024 setting out examples of the kinds of age verification and age estimation that are, and are not, highly effective. Highly effective measures include open banking, photo ID matching, facial age estimation, and digital identity wallets. The OSA specifically states that relying on users to self-declare their age cannot be considered an effective means of age assurance. Ofcom’s guidance provides that, for an age assurance method to be considered highly effective, it should be technically accurate, robust, reliable, and fair. Ofcom is required to publish reports on the use and effectiveness of age assurance measures, so organizations whose services are likely to be accessed by children can expect inquiries from Ofcom about their age assurance practices.
In addition, the UK Data Protection Authority (ICO) updated its opinion on age assurance under the Age Appropriate Design Code (AADC) in January 2024 to provide guidance on how organizations can meet their data protection obligations while also complying with their duties under the Online Safety Act. The opinion assesses the key risks and benefits of four approaches to age assurance: age verification, age estimation, self-declaration, and waterfall techniques. The ICO has also approved the Age Check Certification Scheme as a certified framework for assessing age verification solutions under the UK GDPR.
Conclusion
These recent developments on protecting children and age verification systems signal a critical need for organizations to scrutinize their existing compliance strategies to ensure adherence to evolving standards.
Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex digital regulation and privacy compliance in the UK and EU. For more information, please contact Yann Padova, Cédric Burton, Tom Evans, or Marie Catherine Ducharme.
Sebastian Thess and Claudia Chan contributed to the preparation of this Alert.
[1] Law No. 2023-566 of July 7, 2023, aimed at establishing a digital majority and combating online hate.
[2] Law No. 2024-449 of May 21, 2024, to secure and regulate the digital space.
[3] Referential determining the minimum technical requirements applicable to age verification systems set up for access to certain online public communication services and video sharing platforms making pornographic content available to the public.