In a decision with far-reaching implications for federal administrative law, the United States Supreme Court issued its long-awaited ruling in Loper Bright Enterprises v. Raimondo (Loper Bright).1 The Supreme Court’s six-Justice majority held that the Administrative Procedure Act (APA) requires courts to exercise independent judgment in determining whether an agency has acted within its statutory authority, even where the statute at issue is ambiguous. In so holding, the Court overruled its 1984 decision in Chevron U.S.A., Inc. v. Natural Resources Defense Council, which for the last four decades had governed thousands of cases involving federal agency interpretations of ambiguous laws.
Continue Reading “Chevron is overruled”: How Loper Bright Will Change the Regulatory Law Landscape

On June 18, 2024, the California Attorney General and the Los Angeles City Attorney (collectively, “the People”) announced a settlement with Tilting Point Media LLC (Tilting Point). The settlement resolves allegations that Tilting Point violated the Children’s Online Privacy Protection Act (COPPA), the California Consumer Privacy Act (CCPA), and the Privacy Rights for California Minors in the Digital World Act (Digital Privacy for Minors Act).
Continue Reading Video Game App Developer Agrees to Pay $500,000 for Children’s and Minors’ CCPA, COPPA, and Ads Violations

On May 9, 2024, Maryland Governor Wes Moore signed HB 603, the Maryland Age-Appropriate Design Code (Maryland AADC). The Maryland AADC builds on Maryland’s Online Data Privacy Act, signed into law the same day, which requires companies to provide certain protections for the personal data of consumers whom they know or have reason to know are children under the age of 13.1 The Maryland AADC layers on additional requirements for “covered entities” and expands the definition of “child” to include individuals under the age of 18.
Continue Reading Maryland Passes Age-Appropriate Design Code

Despite national efforts over the past several decades, child sexual abuse material (CSAM) and online child sexual exploitation unfortunately remain prevalent. In 2023, the National Center for Missing and Exploited Children (NCMEC) received over 35.9 million reports of suspected CSAM,[1] an increase of more than 20 percent over the previous three years. Notably, NCMEC’s 2023 report highlighted concern about the significant increase in reports involving generative artificial intelligence, noting that the Center received 4,700 reports of CSAM or other sexually exploitative content related to these technologies.
Continue Reading New Minor Safety Obligations for Online Services: REPORT Act Expands Child Sexual Exploitation Reporting Requirements

On April 7, 2024, Representative Cathy McMorris Rodgers (R-WA) and Senator Maria Cantwell (D-WA) announced that Congress will once again consider a comprehensive federal data privacy bill that, if passed, would dramatically alter the privacy landscape across the United States.
Continue Reading Congress Proposes New Comprehensive Privacy Legislation: The American Privacy Rights Act

On April 3, 2024, the UK Information Commissioner’s Office (ICO) released a statement setting out its priorities for protecting children’s privacy online. The priorities reflect the ICO’s strategy for the next phase of implementing its Children’s code of practice (also known as the “AADC”) and signal a focus by the regulator on the operations of social media and video-sharing platforms (platforms). The ICO will look at platforms’ default settings for children’s profiles, their recommender systems, and how they obtain consent to the processing of children’s data. The statement also indicates that the ICO will conduct audits of EdTech providers to identify privacy risks and potential noncompliance with applicable legislation.
Continue Reading UK Privacy Regulator Details Next Stages of Its Strategy to Protect Children Online

On March 25, 2024, Governor Ron DeSantis signed Florida’s HB 3. The law requires social media platforms to prohibit users under 14 years old from creating accounts and to obtain parental consent for account registrants who are 14 or 15 years old. The law also imposes age verification requirements on online services that knowingly distribute a significant amount of content “harmful” to minors.
Continue Reading State Social Media Law Patchwork Emerging: Florida Passes Law to Restrict Minors’ Use of Online Services