On April 3, 2024, the UK Information Commissioner’s Office (ICO) released a statement setting out its priorities for protecting children’s privacy online. The priorities reflect the ICO’s strategy for the next phase of implementing its Children’s code of practice (also known as the “AADC”) and signal the regulator’s focus on the operations of social media and video-sharing platforms (platforms). The ICO will look at platforms’ default settings for children’s profiles, their recommender systems, and how they obtain consent to process children’s data. The statement also indicates that the ICO will conduct audits of EdTech providers to identify privacy risks and potential noncompliance with applicable legislation.

Implementing the Children’s Code to Date

The privacy of children online has been high on the ICO’s agenda since the Children’s code came into full effect in 2021, and it has been identified as a focus area for the regulator in 2024. The Children’s code sets out 15 standards of age-appropriate design with which online platforms and services that are likely to be accessed by children are expected to comply. To date, the ICO’s strategy for implementing the Children’s code has focused on producing guidance, and on auditing and assessing how some of the world’s largest gaming and social media platforms process children’s data.

ICO’s Focus Areas Going Forward

During the next phase of the Children’s code strategy, the ICO will focus on the following areas:

  1. Default settings for children’s profiles. Children’s profiles (the Children’s code defines a child as a person under the age of 18) should be private, and geolocation settings should be off by default. Profiling children to target ads should also be off by default, unless there is a compelling reason to profile children for this purpose.
  2. Recommender systems. Algorithmic recommender systems that use children’s data, e.g., search results or behavioral profiles, increase the risk of children being shown inappropriate content, such as content relating to self-harm, suicidal ideation, and eating disorders. Recommender systems can also be designed to encourage children to spend more time online and to share more information than they otherwise would.
  3. Age assurance and consent collection. In the UK, children under the age of 13 cannot consent to the processing of their personal data. Where a platform relies on consent as its lawful basis for processing, it must obtain that consent from an adult with parental authority. The ICO will focus on how platforms use age assurance technologies and implement appropriate safeguards to protect children from harms arising from the use of their personal data. Age assurance tools are rapidly evolving, and the ICO recently published an updated opinion on age assurance for the Children’s code to reflect the latest developments in the area.

Next Steps

The ICO will identify key platforms to engage with further as part of its supervision and engagement efforts, focusing on reducing the most serious risks to children’s privacy rights. Meanwhile, the ICO will also gather input from a range of other stakeholders, e.g., parents, and provide further guidance where required. The ICO indicates that it will also complete audits of how EdTech solutions are developed, provided, and used in schools. Platforms and EdTech providers should review their practices against the Children’s code, focusing on the ICO’s key priorities. It will be important to document compliance measures and strategies in case questions arise. Adopting these measures has wider relevance, as children’s privacy is also a priority for data protection authorities in the European Union, such as those in France (CNIL) and Spain (AEPD).[1]

Safeguarding Children Online Is High on the Agenda for Other Digital Regulators in the UK

As it begins to exercise its new powers under the Online Safety Act 2023 (OSA), the UK’s online safety regulator, Ofcom, is focusing on children’s safety online. A number of the platforms in the ICO’s spotlight for the Children’s code are also likely to fall within the scope of the OSA. Ofcom is expected to publish draft guidance in the coming months on how platforms can comply with the OSA’s children’s safety duties, and there is likely to be significant overlap between Ofcom’s expectations under the OSA and the ICO’s approach to implementing the Children’s code. The OSA’s children’s safety duties are expected to start applying in mid- to late 2025, but implementing the principles of the Children’s code can help platforms prepare to comply with those duties.

A Global Trend

Regulators and legislatures in the United States are also increasingly focused on implementing privacy and safety protections for children and teenagers. At the federal level, the Federal Trade Commission has proposed significant changes to the Children’s Online Privacy Protection Rule. Congress is also considering children’s privacy legislation, though nothing has passed yet. States have been active in this space as well: Utah and Florida, for example, recently passed legislation regulating minors’ use of social media services, and during the IAPP Global Privacy Summit, enforcers from California and Connecticut stated that they would focus on children’s data practices.

Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex digital regulation and privacy compliance in the UK and EU, especially in the area of children’s data. For more information, please contact Nikolaos Theodorakis, Libby Weingarten, or Tom Evans.

Hattie Watson contributed to the preparation of this blog post.


[1] Spanish data protection authority (AEPD), ‘The Agency presents its Global Strategy on minors, digital health and privacy’ (January 29, 2024), available at: https://www.aepd.es/prensa-y-comunicacion/notas-de-prensa/la-agencia-presenta-su-estrategia-global-sobre-menores-salud-digital-y-privacidad.