In recent months, politicians and regulators across a number of jurisdictions have called on operators of online platforms to take seriously their legal obligations to promote a safe online environment. The safety of children online has continued to dominate this conversation, with a recent joint UK-U.S. statement (Statement) declaring that online platforms should “go further and faster in their efforts to protect children.”
This alert sets out the regulatory focus areas of the European Commission (EC), the Irish Coimisiún na Meán (CNAM), and the UK’s online safety regulator Ofcom.
Background
The EU Digital Services Act (DSA) was enacted to create a safer online environment for individuals in the EU. It seeks to do this by imposing duties on providers of online services to, for example, moderate illegal content, be transparent in dealings with users, and offer redress where content is disabled or removed. In the UK, the Online Safety Act 2023 (OSA) was enacted to achieve broadly similar aims and will impose a range of novel duties on online services.
The EC acts as the regulator of the largest online platforms and search engines under the DSA. In recent months the EC has issued requests for information to several of these companies. Smaller online services are subject to local enforcement. In September 2024, the Irish CNAM announced that it would follow the EC’s lead by issuing requests for information to a number of online platforms that are based in Ireland, questioning them about their compliance with specific DSA obligations. Ofcom has stated that it will take “strong action” where necessary to enforce the OSA once it comes into force.
The EC’s Approach to Large Platforms and Search Engines
The EC is required to issue guidance on certain duties under the DSA and has taken early action to assess compliance by the largest online platforms (VLOPs) and search engines (VLOSEs) with their obligations. Focus areas for the EC currently include:
- The Impact of Algorithms on Mental Health. The DSA requires VLOPs and VLOSEs to assess any systemic risks arising from the design or functioning of their service, including algorithmic systems. On October 2, 2024, it was announced that the EC had written to a number of VLOPs asking them to explain how their algorithmic systems comply with these obligations.
- Generative AI Risk Assessments. The DSA requires VLOPs and VLOSEs to carry out assessments to identify systemic risks arising from the design or functioning of their services. In March 2024, the EC sent requests for information to six VLOPs and two VLOSEs asking for details of steps taken to assess and mitigate risks associated with Generative AI, including potentially negative impacts of the technology on electoral processes, dissemination of illegal content, and protection of personal data.
- Online Protection of Minors. In July 2024, the EC launched a call for evidence to inform its approach to drafting guidelines on the protection of minors online. The EC expects to adopt these guidelines before summer 2025. They will provide platforms subject to the DSA with guidance on the measures they should take to ensure a high level of privacy, safety, and security for minors, including appropriate age assurance measures.
- Dark Patterns. The DSA prohibits online platforms from designing, organizing, or operating their interfaces in a way that would deceive or manipulate users. In recent months, the EC has indicated that platforms that allow any user to make use of a “confirmed” status might be acting in a misleading manner, given the wider use of such a feature in the industry (e.g., to indicate that a profile belongs to a prominent/public figure). Providers will therefore need to look beyond their own platforms when considering design updates.
- Advertising Transparency. VLOPs and VLOSEs must keep a public advertising repository where individuals can view ads and related information. In September 2024, it was announced that the EC had commissioned a study aimed at assessing the compliance of VLOPs and VLOSEs with this requirement, in order to identify any gaps.
Announcement by the Irish Coimisiún na Meán
In September 2024, the CNAM (the authority responsible for monitoring DSA compliance in Ireland) announced a formal review of online platforms’ systems in response to complaints made by users to the authority about difficulties in reporting illegal content.[1] An initial review of platform practices carried out by the CNAM, along with complaints forwarded by other European regulators, indicated potential compliance issues.
The CNAM’s announcement indicates that it will issue targeted requests for information relating to the following DSA obligations:
- Reporting Mechanisms. Online platforms are required to provide users with a mechanism to report content that they believe is illegal. When an online platform receives a report from a user, it must review the report without undue delay and decide whether any action should be taken in relation to the content at issue, or any user responsible for it. This “notice and takedown” process is highly structured under the DSA.
- Points of Contact for Users. To improve transparency and accountability, online platforms must provide users with a single point of contact with which they can communicate easily and rapidly.
An FAQ accompanying the CNAM’s announcement indicates that the regulator is mindful of parallel investigations being carried out by the EC and will carry out its inquiries accordingly.
Developments in the UK
The OSA imposes requirements on online platforms and search engines to identify and mitigate the risks posed by unlawful content. In October 2024, Ofcom announced that the first obligations under the OSA would come into force in December 2024, with child-specific duties to follow in early 2025. Ofcom has been vocal about its plans to police the OSA, stating in an open letter to online service providers that rather than waiting for the legislation to come into force, companies should “act now.” This message was repeated in the Statement, with a call on online platforms to use the resources available to them to develop “innovative solutions” to child safety issues.
Looking Ahead
Online platforms and search engines operating in the EU and UK face an increasingly complex regulatory environment, with new guidance developing rapidly. Recent announcements made by the EC and CNAM may be helpful to companies in identifying their compliance priorities in the coming months.
DSA noncompliance can, in the most severe cases, be sanctioned with fines of up to 6 percent of the global turnover of a service provider. Under the OSA, Ofcom will have the power to impose fines of up to 10 percent of a service provider’s qualifying worldwide revenue. Regulators also have a host of other enforcement powers at their disposal.
Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex digital regulatory and privacy compliance in the UK and EU. For more information, please contact Cédric Burton, Laura De Boel, Tom Evans, Chris Olsen, or Laura Brodahl of the firm’s data, privacy, and cybersecurity practice.
Sonia Mjati contributed to the preparation of this Wilson Sonsini Alert.
[1] https://www.cnam.ie/coimisiun-na-mean-opens-review-of-online-platforms-compliance-with-eu-digital-services-act/.