The UK Online Safety Bill (OSB or the Bill) cleared an important hurdle in January 2023 when it passed its third reading in Parliament. First published in May 2021, the Bill has been subject to intense scrutiny. If enacted, it will place extensive obligations on providers of search engines and online platforms that enable user-to-user sharing, with the overall aim of improving safety online.

The OSB will be one of the most significant laws affecting the digital sector to be passed in the UK in recent times. Its focus on protecting users online aligns with that of the EU’s Digital Services Act (DSA), which recently entered into force. The scope of the obligations introduced by the OSB will depend on the size and nature of the platform or search engine, with the most extensive requirements applying to the largest and highest-risk platforms.

The OSB’s progress has been closely followed in the UK’s mainstream media, with a focal point being the duties that will be imposed on services likely to be accessed by children. In January 2023, the UK government confirmed its intention to introduce potential criminal liability for senior managers of companies that fail to comply with the OSB’s child safety rules and thereby risk serious harm to children; sanctions could include fines and imprisonment. This is in addition to the substantial administrative penalties that may be imposed by the UK’s Office of Communications (Ofcom) under the OSB, including fines of up to 10 percent of worldwide revenue.

Duties to Protect Children

The OSB will apply to user-to-user services that have links with the UK; this will be assessed by reference to a service’s target markets, the location of its users, and the likelihood of users in the UK seeking to access the service. Services in scope of the Bill will be required to:

  • Conduct a children’s access assessment. Regulated service providers will need to assess whether their service, or parts of their service, are likely to be accessed by children. This assessment will take into account any systems or processes in place (such as age verification methods) to prevent access by individuals under the age of 18. Ofcom has stated that it expects to start engaging with companies for input and feedback on children’s access assessment requirements in 2023, with final guidance to be released in Spring 2024. Services will likely need to have completed their assessments by mid-2024.
  • Conduct a children’s risk assessment. Regulated services that are likely to be accessed by children will be required to carry out a children’s risk assessment (CRA), keep that CRA up to date, and notify Ofcom of any significant change to the risks identified. Carrying out the CRA will be a significant undertaking and will require service providers to have a detailed understanding of their user base, the content available on their services, and how that content may be amplified by any algorithms deployed. The OSB contains detailed requirements as to the timeframe within which CRAs must be completed; services that are likely to be accessed by children will have three months to complete their CRA once Ofcom’s guidance and codes of practice are published, which is expected to take place in Autumn 2024.
  • Protect children through the design and operation of services. Providers of services that are likely to be accessed by children will be required to comply with duties to design and operate their systems such that:
      ◦ they mitigate and manage the risks of harm to children in different age groups, and the impact of harm presented by content on the service; and
      ◦ they prevent children of any age from encountering certain types of “priority content” (a concept to be expanded on by Ofcom) and protect children in age groups judged to be at risk from encountering other content that is harmful to children.
  • Provide mechanisms to report content that is illegal or harmful to children. Users and affected others, including parents or adults responsible for child users, must be able to easily report content that they consider to be illegal or otherwise harmful to children.
  • Offer a complaints procedure. Services will be obliged to offer easily accessible and transparent complaints procedures that allow users and affected persons to submit complaints about content considered to be harmful to children, or other perceived failures to comply with the duties outlined above. It must also be possible to submit a complaint where content has been removed or restricted on the basis that it is harmful to children.
  • Report child sexual exploitation and abuse content to the National Crime Agency. All user-to-user services will be required to report any child sexual exploitation and abuse content detected on their service that has not already been reported.

Next Steps

It is anticipated that the OSB will be enacted in 2023. Ofcom will then consult on and publish a range of guidance and codes of practice, supporting interpretation of the duties outlined above. A “roadmap to regulation,” published in July 2022, details the broad timeline that Ofcom intends to follow.

Companies that have taken steps to comply with the EU’s DSA and the UK’s Age-Appropriate Design Code are likely to be well placed to begin work on complying with the OSB. At this stage, a priority should be to consider the extent of the risk assessments they are likely to need to carry out, the resources and supporting information that will be required, and whether there is any appetite to engage with the consultations to be run by Ofcom.

For more information, please contact Cedric Burton, Laura De Boel, Maneesha Mithal, or others at the firm’s privacy and cybersecurity or antitrust practices.

Laura De Boel, Maneesha Mithal, Tom Evans, and Hattie Watson contributed to the preparation of this post.