On June 7, 2024, the New York legislature passed the Stop Addictive Feeds Exploitation (SAFE) for Kids Act (SAFE Act or the Act) and the New York Child Data Protection Act (CDPA), both aimed at protecting children online. The SAFE Act prohibits covered social media companies from providing individuals under 18 (minors) with “addictive feeds” (as defined in the SAFE Act) and overnight notifications, absent parental consent. The CDPA is intended to complement the SAFE Act by limiting the extent to which providers of internet websites, online and mobile applications, and connected devices (services) can collect, use, share, and sell minors’ personal data. If signed into law by Governor Hochul, the SAFE Act and CDPA would create new, onerous requirements for entities doing business in New York. The key provisions of each act are highlighted below.

Continue Reading: New York Legislature Passes a Pair of Bills to Protect Children’s Privacy Online

On May 16, 2024, the U.S. Securities and Exchange Commission (SEC) announced that it had adopted final amendments to its Regulation S-P (the Rule or Amended Rule), which governs “covered financial institutions’” treatment of consumers’ nonpublic personal information, to ensure that these entities implement incident response programs and notify consumers when their information has been compromised. Brokers, dealers, investment companies, investment advisers, crowdfunding portals, and transfer agents registered with the SEC or another appropriate regulatory agency are all considered covered institutions (CIs) under the Amended Rule.

Continue Reading: SEC Expands Security and Breach Notification Requirements for Investment Firms

On May 21, 2024, the Council of the European Union (the Council) formally signed off on the latest draft of the European Union’s (EU) Artificial Intelligence Act (AI Act) (see the press release here). This marks the final seal of approval from the EU legislators. The text will officially become law once it is signed by the Presidents of the European Parliament and of the Council and published in the Official Journal of the EU. This could take place within the next two to four weeks. However, the law will have phased effective dates, with the first obligations (i.e., the rules on prohibited AI systems) becoming effective at the end of this year.

Continue Reading: EU AI Act Is Now Adopted

On May 9, 2024, Maryland Governor Wes Moore signed HB 603, the Maryland Age-Appropriate Design Code (Maryland AADC). The Maryland AADC builds on Maryland’s Online Data Privacy Act, which was signed into law the same day and requires companies to provide certain protections for personal data of a consumer when the company knows or has reason to know the consumer is a child under the age of 13. The Maryland AADC layers on additional requirements for “covered entities” and expands the definition of “child” to include individuals under the age of 18.

Continue Reading: Maryland Passes Age-Appropriate Design Code

On May 17, 2024, Governor Jared Polis signed the Colorado Artificial Intelligence Act (SB 24-205) (CAIA), regulating the development, deployment, and use of artificial intelligence (AI) systems. Colorado is the first state to enact comprehensive AI legislation. The law becomes effective February 1, 2026.

Continue Reading: Colorado Passes First-in-Nation Artificial Intelligence Act

On April 26, 2024, the Federal Trade Commission (FTC) announced a Final Rule that amends the Health Breach Notification Rule (HBNR or Rule) to significantly broaden the FTC’s enforcement power in the area of digital health. Under the Final Rule, many developers of everyday health and wellness apps (Developers) will now constitute “health care providers” subject to the HBNR. The consequences of failing to comply could be steep: violations of the Rule may subject a company to civil penalties of $51,744 per violation. Below, we provide a summary of the Final Rule and highlight some of the key challenges it presents.

Continue Reading: FTC Final Rule Officially Broadens Health Breach Notification Rule, Targets Health and Wellness Apps

Despite national efforts over the past decades, child sexual abuse material (CSAM) and online child sexual exploitation are still unfortunately prevalent. In 2023, the National Center for Missing and Exploited Children (NCMEC) received over 35.9 million reports of suspected CSAM.[1] This is more than a 20 percent increase over the previous three years. Notably, NCMEC’s 2023 report highlighted concern about the significant increase in reports involving generative artificial intelligence, noting that the Center received 4,700 reports of CSAM or other sexually exploitative content related to these technologies.

Continue Reading: New Minor Safety Obligations for Online Services: REPORT Act Expands Child Sexual Exploitation Reporting Requirements