The Online Safety Bill (OSB or Bill) passed its final reading in the UK’s Parliament in September 2023. The Bill will become law in the coming weeks, ushering in a new era for the regulation of digital services in the UK. Online platforms and search services that fall within the scope of the legislation will be subject to proactive content risk assessment and mitigation duties aimed at protecting users, regardless of where those services are established. The Bill has attracted considerable media attention due to its anticipated impact on the operation of online services in the UK, as well as its potential to interfere with freedom of speech.
The OSB will apply to providers of i) online platforms that allow users to generate, upload, or share content with others, and ii) search services (together, “providers”). Online platforms or search services will fall within the scope of the legislation if they: i) have a significant number of users in the UK, ii) target the UK market, or iii) otherwise present a material risk of significant harm to individuals in the country.
The most burdensome requirements under the OSB relate to the protection of children, the completion of risk assessments, and the removal of illegal content. The Bill bears similarities to the EU’s flagship Digital Services Act (DSA), but also has important differences. For example, it will require all in-scope providers, irrespective of their size, to proactively identify risks of harm to children and to take steps to mitigate and manage those risks. For pan-European platforms and services, the OSB therefore imposes a further layer of regulation in an increasingly complex field.
Duties Imposed on Online Platforms and Search Services
Providers of in-scope services are required to fulfill a number of duties under the OSB. Ofcom, the regulator charged with enforcing the legislation, is required to issue guidance and codes of practice in the coming months, setting out recommended measures for complying with these duties. The requirements will only apply once the relevant guidance has been published.
- Duties to protect children online. The extensive duties introduced by the OSB to protect children are arguably the centerpiece of the legislation. These obligations, which were subject to considerable debate in Parliament, will require most providers to assess the risks that their services pose to children and to take steps to address them:
- Children’s risk assessments. Providers of services that are “likely to be accessed by children” must carry out a risk assessment to identify key characteristics of their user base (including the number of users in specific age ranges); the level of risk of children being exposed to content, features, functionalities, or behaviors that are harmful to them; and the risk of harm that could result. To determine whether their platform is “likely to be accessed by children,” providers must carry out a children’s access assessment. Providers will only be able to conclude that their platforms or search services cannot be accessed by children if they implement age verification or age estimation techniques. Ofcom will publish guidance on carrying out children’s access assessments, and launched a call for evidence in January 2023 seeking views on, among other topics, the age verification technologies companies currently use and the cost and impact of deploying them. Guidance will also be published on age verification and age estimation techniques that are effective in determining whether a user is a child (this guidance will focus on the use of such technologies to prevent children from encountering pornographic content).
- Measures to protect children from harm. Providers will have a duty to mitigate and manage the risk of harm to children, including those harms identified through their children’s risk assessment. Providers must also take measures to prevent at-risk age groups from encountering certain types of harmful content, features, functionalities, or behaviors (for example, content that is abusive, incites hatred, amounts to bullying, or depicts real or realistic violence against people or animals).
- Requirements relating to all categories of users. Providers will be subject to a range of obligations in relation to all users of their service:
- Illegal content risk assessments. All providers must carry out a risk assessment that accounts for the user base of their service. This assessment must consider the level of risk of individuals encountering certain forms of illegal content. Online platforms must also assess the risk that the platform could be used to facilitate certain offenses being committed. The risk assessment duty is wide-ranging and must take into account how the platform or search service is used, as well as its design. Guidance will be made available on how to conduct risk assessments, with Ofcom stating that while some services will be able to rely heavily on the resources it publishes, others “will need to do much more work to assess risks accurately and adequately.”
- Protecting users from illegal content and other risks, and taking down content. Online platforms and search services should be designed and operated to proactively (i) prevent individuals from encountering “priority illegal content” (broadly, terrorism content, child sexual exploitation content, or content amounting to certain criminal offenses), and (ii) mitigate the risk of harm to individuals identified through the most recent illegal content risk assessment. Where proportionate, content moderation is one of the measures that online platforms will be expected to implement to protect users from such risks. In addition, online platforms should mitigate the risk of their platforms being used to commit certain criminal offenses. Providers should use systems and processes to minimize these harms and to act when notified of illegal content.
- Content reporting and complaints procedures. Providers must allow users or other affected persons to report content that they consider to be illegal or harmful to children. Users should also be able to submit complaints in relation to matters including the presence of illegal content on the service, or action taken by the provider to remove content or restrict a user’s access.
- Additional obligations for the largest providers. Under the legislation, the largest providers may be designated as “Category 1” (largest online platforms), “Category 2A” (largest search services), or “Category 2B” (anticipated to include large high-risk, high-reach online platforms that do not meet Category 1 user number thresholds). The most stringent obligations include requirements to:
- Implement content control for adult users. Category 1 services must provide tools for adults to increase their control over being shown certain types of content (“user empowerment”). If applied, these tools should either reduce the likelihood that the user will see content of a particular nature, or alert the user to such content when it is present on a service. This includes content that encourages suicide, self-harm, or eating disorders, as well as content that is abusive or incites hatred.
- Offer identity verification. Category 1 services must offer all adults the option to verify their identity, and explain how these verification systems work in their terms of service.
- Maintain transparent terms of service. Category 1 services must summarize the outcome of illegal content and children’s risk assessments in the terms of service. Providers must not act against user-generated content, restrict access, or ban users except in accordance with the terms of service. This requirement will not prevent services from removing content that is illegal or could harm children.
- Protect against fraudulent advertising. Category 1 and Category 2A services must take measures to prevent individuals from encountering content that amounts to fraudulent advertising, and minimize the length of time that such material is available on their service.
Significant Penalties
The Bill provides Ofcom with a range of formal powers to investigate in-scope services and, in certain circumstances, disrupt their business models. This may include seeking orders (subject to court approval) to block access to services, or to alter the operation of the services insofar as they are directed at the UK. Ofcom will also have the power to issue substantial fines of up to the greater of £18 million or 10 percent of qualifying worldwide revenue. In addition, the Bill provides that senior managers and company officers may face criminal liability for certain transgressions, particularly where there has been a failure to protect children online.
Preparing for Compliance with Online Regulation in Europe
It is likely that many services in scope of the Bill will also be required to comply with the EU DSA. The requirements in the DSA will become applicable to all intermediary services on February 17, 2024 (for more on the DSA, see our client alert here). This timing roughly coincides with when the first requirements of the OSB are expected to become applicable.
Phased Next Steps
The OSB introduces novel legal requirements in the UK that will take some time to fully develop. Provisions will become applicable in phases, as Ofcom publishes its guidance and codes of practice. The top priority is guidance on tackling illegal harms, including how all providers should conduct their own risk assessments and identify illegal content. This guidance is expected to be published shortly after the OSB enters into force, and will become applicable three months after publication, likely in early 2024. Guidance on child safety duties and requirements specific to the largest platforms will follow in phases two and three, respectively.
In the meantime, providers of online services that could fall within the scope of the OSB should prepare UK compliance programs centered on risk management, and be ready to conduct illegal content risk assessments of their services by early 2024.
Please join Wilson Sonsini’s European data regulatory team for insights on the UK’s Online Safety Bill and what it means for providers of online platforms. Register here for the virtual webinar, to be held on Thursday, October 19.
Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex digital regulation and privacy compliance in the UK and EU. For more information, please contact Laura De Boel, Cédric Burton, Yann Padova, Nikolaos Theodorakis, or Tom Evans.
Hattie Watson contributed to the preparation of this Wilson Sonsini Alert.