On October 27, 2022, the Digital Services Act (DSA) was published in the Official Journal of the European Union, ushering in a new era in the regulation of digital services. (See Wilson Sonsini’s DSA Fact Sheet.)
The DSA applies to providers of digital services, including those based outside the EU that provide services to users in the region. At more than 100 pages, the legislation imposes a raft of obligations on these companies, with some of the most burdensome relating to content moderation, online advertising, and trader transparency.
Most companies will have until February 17, 2024, to comply with the DSA. However, for very large online platforms (VLOPs) and very large online search engines (VLOSEs), the DSA will apply four months after their designation by the European Commission (EC), which could take place as early as the first half of 2023.
What Is the DSA?
The DSA complements the Digital Markets Act (DMA), which enters into force on November 1, 2022. Together, the DSA and the DMA are designed to regulate digital services in the EU, operating alongside existing and forthcoming laws:
- Content Moderation/Notice and Take-Down. The DSA will apply in parallel to other pieces of legislation that impose moderation and transparency requirements. The regulation on terrorist content online1 recently became applicable and requires online platforms to remove terrorist content within one hour of receiving a removal order from a competent authority. Other related draft laws include regulations on political advertising2 and child sexual abuse material.3
- Ecommerce. The DSA builds on, but does not replace, the existing eCommerce Directive. For example, the DSA adds to the trader transparency requirements in the eCommerce Directive.
For further background information on the development of the DSA, please see previous Wilson Sonsini Client Alerts.4 For more information about the DMA and its obligations, see the Wilson Sonsini Client Alert here.
Who Does the DSA Apply To?
The DSA applies to “intermediary services,” a catch-all term that includes providers of conduits, caching services, and hosting services (including online platforms and search engines). The obligations to which a company will be subject vary according to the company’s nature and size. Online platforms (such as social media platforms) are subject to more extensive obligations than other intermediary service providers. VLOPs and VLOSEs are online platforms or search engines that have more than 45 million users in the EU and will be designated as such by the EC. These businesses are subject to the highest degree of regulation.
Companies that are not established in the EU but fall within the scope of the DSA because they offer services to individuals or companies in the region will need to designate a representative.5 This representative could be held individually liable for cases of noncompliance with the DSA, independently of any liability and legal actions against the intermediary service provider.6
Core obligations. The core obligations to which intermediary service providers are subject include:
- Acting on orders against illegal content and orders to provide information. These orders, issued by national judicial or administrative authorities, may require content to be removed, or information about individual recipients of a service to be provided.
- Implementing and enforcing enhanced terms and conditions. Terms and conditions (T&Cs) must be sufficiently detailed, particularly in relation to any content moderation activities, including algorithmic decision-making, and must provide details of complaint procedures that are available. Services that are primarily used by children will need to accommodate their level of understanding. Any significant changes must be communicated to users.7
- Publishing yearly reports. Companies must publish reports on content moderation at least once a year. The report must be made available to the public in a machine-readable format and must be easily accessible. The information to be provided depends on the nature of the provider’s business.8
- Implementing a notice and takedown mechanism. Any company that stores or shares information (a “hosting service” for the purpose of the DSA) must implement a means for users to flag illegal content online.9 After a notice has been reviewed, the notifier will usually need to be informed of the outcome and the affected user informed of why the action was taken.10
Online platforms. In addition to the core obligations outlined above, online platforms will be required to:
- Comply with new advertising requirements. The DSA prohibits targeted advertising based on the profiling of sensitive data11 or children’s data.12 In addition, online platforms will need to present clear information about each ad they display, including who paid for the ad and the main variables that determine who sees it.13
- Disclose information about the use of recommender systems. Online platforms that use fully or partially automated systems to recommend content must detail in their T&Cs how content is recommended, including which criteria are most significant for determining what information is presented. Online platforms will also need to disclose any options for the user to modify these parameters.14
- Refrain from using dark patterns. The DSA prohibits online platforms from designing and organizing their interface in a way that seeks to shape user behavior in a particular way, e.g., by giving prominence to certain choices when asking a user to make a decision.15
- Offer a complaint mechanism, adopt measures against misuse, and provide information about out-of-court dispute settlements. Online platforms will be required to offer a complaint mechanism for notifiers and affected users to challenge decisions made in relation to content or their accounts (such as decisions to remove or disable access to content).16 If a user repeatedly provides clearly illegal content, or frequently submits unfounded notices or complaints, the platform must issue a warning to that individual. If the warning is ignored, the platform can suspend the user, or refuse to process their complaints or notices, for a reasonable period.
- Comply with Know Your Customer requirements. Online marketplaces will need to collect background information from traders before permitting them to use their service. Traders will need to provide information such as payment account details before they can offer goods on the online marketplace.17
- Notify customers of illegal products. Online marketplaces must take reasonable steps to check official online databases to ensure that the products and services on offer on their platforms are not illegal. If they become aware of any illegal products or services, they will need to notify the purchasers directly or, where that is not possible, provide public notice.18
- Comply by design. Online marketplaces will also need to design their interface to allow traders to comply with their obligations and clearly identify the products and services that they offer to customers in the EU.19
VLOPs and VLOSEs. VLOPs and VLOSEs are required to comply with extensive obligations under the DSA, including the following additional requirements:
- Advertising transparency requirements for VLOPs. VLOPs will be required to create an accessible ad search function on their user interface. Users should be able to consult this search tool, using multicriteria queries, to discover information including the content of the ad, how long it was presented, whether the ad was specifically targeted at one group and, if so, the criteria for excluding groups from viewing the ad. The information about each ad will need to remain available for one year.20
- Recommender system requirements for VLOPs. VLOPs that use recommender systems will need to offer users at least one option for a non-personalized service, i.e., a recommender system that is not based on profiling of personal data.21
- T&C requirements. VLOPs and VLOSEs will also need to provide users with a concise, easily accessible, machine-readable summary of their T&Cs and publish the T&Cs in the official language of each EU country where they offer services.22
Intermediary services will be regulated by national regulators, coordinated by one Digital Services Coordinator in each Member State.23 VLOPs and VLOSEs will be regulated by the EC.24 The DSA also creates a new “European Board for Digital Services” to help ensure consistent enforcement of the DSA across the EU.25 Fines for noncompliance are high and will reach a maximum of six percent of a company’s annual worldwide turnover.26
1. Regulation (EU) 2021/784 of the European Parliament and of the Council of April 29, 2021, on addressing the dissemination of terrorist content online (Text with EEA relevance), found at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A32021R0784.
2. Proposal for a Regulation of the European Parliament and of the Council on the transparency and targeting of political advertising, found at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021PC0731.
3. Proposal for a Regulation of the European Parliament and of the Council laying down rules to prevent and combat child sexual abuse, found at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=COM%3A2022%3A209%3AFIN&qid=1652451192472.
4. European Commission Proposes New Rules for Digital Platforms, January 12, 2021, found at https://www.wsgr.com/en/insights/european-commission-proposes-new-rules-for-digital-platforms.html; EU Parliament and Council Take Next Steps to Advance Major New Rules for Digital Platforms, found at https://www.wsgr.com/en/insights/eu-parliament-and-council-take-next-steps-to-advance-major-new-rules-for-digital-platforms.html; EU Reaches Political Agreement on Additional New Rules for Digital Platforms in the Digital Services Act, found at https://www.wsgr.com/en/insights/eu-reaches-political-agreement-on-additional-new-rules-for-digital-platforms-in-the-digital-services-act.html.