On April 28, 2025, Congress passed the “TAKE IT DOWN Act.” In addition to criminalizing the intentional publication of non-consensual intimate imagery, including computer-generated intimate imagery (collectively, NCII), the bill requires “covered platforms” to develop a process for removing NCII within 48 hours of a valid report. Covered platforms are those that primarily provide a public forum for user-generated content. The term does not include ISPs, email providers, online services that consist primarily of non-user-generated content, or services for which chat, comment, or interactive functionality is directly related to the provision of non-user-generated content. The bill now awaits President Trump’s signature and is expected to be signed, given its bipartisan support and an endorsement from the First Lady.

The bill’s key provisions are summarized below.

Criminal Liability for the Knowing Publication of NCII

The bill prohibits any person from using an interactive computer service to knowingly publish an authentic “intimate visual depiction” (a depiction of intimate body parts or certain sexual activities) of an identifiable individual who is an adult if: i) the intimate visual depiction was obtained or created under circumstances in which the person knew or reasonably should have known the identifiable individual had a reasonable expectation of privacy; ii) what is depicted was not voluntarily exposed by the identifiable individual in a public or commercial setting; iii) what is depicted is not a matter of public concern; and iv) the publication is intended to cause harm or causes harm, including psychological, financial, or reputational harm, to the identifiable individual. For computer-generated intimate visual depictions (“digital forgeries”), the test is similar, except for the first prong: to establish liability for a digital forgery, the depiction must have been published without the consent of the identifiable individual.

With respect to NCII involving minors, the law sets forth stricter criminal prohibitions. The bill makes it unlawful to publish NCII of a minor with the intent to “abuse, humiliate, harass, or degrade the minor or arouse or gratify the sexual desire of any person.”

Certain exceptions apply to, for example, disclosures to law enforcement, reporting as part of a professional obligation, disclosure to assist the identifiable individual, or publication of one’s own images. Child pornography is also explicitly excluded and remains subject to existing criminal prohibitions. The law explicitly states that the fact that an individual: i) provides consent for the creation of an image; or ii) voluntarily shares it with another individual does not constitute consent to its publication.

Civil Liability for Failing to Comply with Notice and Removal Obligations

The bill also requires covered platforms to set up a notice-and-takedown mechanism for NCII. In particular, the bill requires that, within one year of the date of enactment, covered platforms establish a process through which an identifiable individual or their agent can notify the platform of an intimate visual depiction of the individual that was published without their consent and request its removal. Reports must be in writing, with a physical or electronic signature, and include: i) enough information to locate the depiction; ii) a statement of the individual’s “good faith belief” that the depiction was not consensual; and iii) the reporter’s contact information.

The bill requires covered platforms to provide clear and conspicuous notice of this removal process, which must explain, in easy-to-understand language, the platform’s responsibilities and an individual’s ability to request removal.

Upon receipt of a valid removal request, covered platforms must, within 48 hours, remove the intimate visual depiction and make reasonable efforts to identify and remove any known identical copies. The bill gives platforms a safe harbor for content removal, immunizing them from liability claims by content posters based on a good faith removal pursuant to the law.

The bill does not create a new private right of action, but instead provides the Federal Trade Commission with enforcement authority over failures to comply with the notice and removal obligations, which would constitute an unfair or deceptive act or practice under the Federal Trade Commission Act.

Next Steps

This bill imposes significant obligations on covered platforms to establish what is effectively a new notice-and-takedown regime, with an additional “reasonable efforts” obligation to identify and remove “known identical copies.” While the takedown obligation does not take effect until one year after enactment, complying with these provisions will require significant work for covered platforms. All companies that host user-generated content should take several immediate steps to determine whether and how they would need to comply:

  • Determine whether you qualify as a covered platform and, if so, for which of the services you offer.
  • If you’re covered, determine how you will notify consumers about the removal process. Where will you include the notice? How will you ensure it’s clear, conspicuous, and easy to understand?
  • Consider and prepare for the volume of requests you’ll likely receive, especially given that the bill requires removal within 48 hours. Do you have adequate resources, operating procedures, escalation procedures, and training to effectuate the bill’s requirements? What efforts would be required for you to identify identical copies of reported NCII?

For more information or advice about the developments mentioned above, please contact Maneesha Mithal or another member of the firm’s Data, Privacy, and Cybersecurity practice, or Brian M. Willen or Ian F. Sprague from the Internet Strategy and Litigation practice.

Taylor Stenberg Erb contributed to the preparation of this post.