LawFlash

TAKE IT DOWN Act Targets Deepfakes: Are Online Platforms Caught in the Crosshairs?

June 9, 2025

The TAKE IT DOWN Act, recently signed into federal law, criminalizes the distribution of nonconsensual intimate imagery and requires covered online platforms to implement a notice-and-removal process by May 19, 2026.

On May 19, 2025, President Donald Trump signed the TAKE IT DOWN Act (the Act) into law. The Act, which stands for the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, was presented to the president after achieving near-unanimous support in Congress. The Act serves as the federal response to the dissemination of nonconsensual intimate images (NCII), an issue that, until now, has largely been addressed at the state level.

WHAT THE ACT DOES

The TAKE IT DOWN Act makes two material changes to federal law. First, the Act amends Section 223 of the Communications Decency Act to criminalize the knowing publication of NCII. Second, the Act requires covered online platforms to create a notice-and-removal system by May 19, 2026.

CRIMINAL PROVISIONS

Regarding its criminal provisions, the Act makes it unlawful for an individual to use an “interactive computer service” to “knowingly publish” either an “intimate visual depiction” or a “digital forgery” (i.e., deepfake) of an identifiable individual.

The Act’s criminal provisions apply differently to the publication of NCII of adults compared to minors. The dissemination of NCII involving adults, for example, is not unlawful under the Act if it involves depictions voluntarily exposed by the identifiable individual in a public or commercial setting, relates to a matter of “public concern,” or—in the case of digital forgeries—was published with consent.

The same exceptions do not apply to the dissemination of NCII involving minors, however. Instead, criminal liability will attach for dissemination of materials involving minors if the defendant knowingly publishes with the intent to "abuse, humiliate, harass or degrade the minor" or "arouse or gratify the sexual desire of any person." The Act also makes it unlawful to intentionally threaten any individual—adult or minor—with a violation of the Act’s criminal provisions.

The Act allows exceptions for publications constituting a good-faith disclosure to law enforcement or "for a legitimate medical, scientific, or educational purpose."

Violators of the Act’s criminal provisions may face fines and/or imprisonment of up to three years. 

NOTICE-AND-REMOVAL OBLIGATIONS

Separately, the Act’s notice-and-removal provisions require “covered online platforms,” by May 19, 2026, to implement a system that allows individuals (or their representatives) to notify the platform of NCII published without the individual’s consent and request its removal. In addition to creating the notice-and-removal system, the Act requires platforms to provide users with a “clear and conspicuous” notice about the platform’s obligations under the Act, including how users can utilize the platform’s notice-and-removal system.

Which Platforms Are Covered?

Importantly, the Act defines a covered online platform as a website or app that serves the public and either “primarily provides a forum for user-generated content” (i.e., social media platforms) or “publishes, curates, hosts, or makes available content of nonconsensual intimate visual depictions in the regular course of their trade or business.”

The Act specifically excludes the following from the definition of a covered platform: internet service providers, email providers, and websites or apps that primarily publish their own content, rather than user-generated content, even if they offer chat or comment features related to that content.

When Must Platforms Remove Content?

The Act requires the platform to remove the NCII “as soon as possible” after being notified of its publication, but no later than 48 hours after receiving notice. The Act also obligates the platform to make “reasonable efforts” to identify and remove any copies of the NCII.

The Act states that individuals seeking to have content removed must submit a notice to the platform consisting of a signed written statement from the individual (or the individual’s representative) that (1) identifies the depiction and provides information sufficient for the platform to locate it and (2) provides information indicating that the depiction was published without the individual’s consent.
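For covered platforms thinking through implementation, these requirements reduce to an intake record and a deadline calculation. The Python sketch below is purely illustrative and is not derived from the statute’s text: the RemovalNotice fields and the validate_notice and removal_deadline helpers are hypothetical names for one way a platform might model a takedown request and the 48-hour removal window.

from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Statutory outer bound: removal "as soon as possible," no later than 48 hours after notice.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalNotice:
    """Hypothetical record of a takedown request submitted under the Act."""
    requester_name: str                  # the individual or an authorized representative
    signed_statement: bool               # a signed written statement accompanies the request
    depiction_identifier: Optional[str]  # information sufficient to locate the depiction (e.g., a URL)
    consent_statement: Optional[str]     # information indicating the publication was nonconsensual
    received_at: datetime                # when the platform received the notice

def validate_notice(notice: RemovalNotice) -> bool:
    """Check the two statutory elements: identification of the depiction and lack of consent."""
    return (
        notice.signed_statement
        and bool(notice.depiction_identifier)
        and bool(notice.consent_statement)
    )

def removal_deadline(notice: RemovalNotice) -> datetime:
    """Latest permissible removal time: 48 hours after receipt of a valid notice."""
    return notice.received_at + REMOVAL_WINDOW

# Example usage (illustrative values only)
notice = RemovalNotice(
    requester_name="Jane Doe",
    signed_statement=True,
    depiction_identifier="https://example.com/post/123",
    consent_statement="The depiction was published without my consent.",
    received_at=datetime.now(timezone.utc),
)
if validate_notice(notice):
    print("Remove content no later than:", removal_deadline(notice))

An actual compliance workflow would also need to track the “reasonable efforts” obligation to locate copies of the reported content, which the sketch above does not attempt to model.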

What Are the Consequences of Non-Compliance?

The Act states that the failure to comply with these notice-and-removal obligations may constitute "a violation of a rule defining an unfair or deceptive act or practice" (UDAP violation) under the Federal Trade Commission Act (FTCA). The Act does not explicitly create a private cause of action, but instead grants the Federal Trade Commission (FTC) authority to enforce these notice-and-removal obligations. Violations of the Act may expose platforms to penalties under the FTCA, which may include fines, injunctive relief, and consumer redress.

To further encourage platforms to comply with removal obligations, the Act includes a safe harbor provision, which provides immunity for platforms that remove content under the good faith belief that the content constituted NCII, such that the creator of the at-issue content will have no recourse against the platform for the erroneous removal.

POTENTIAL ISSUES

Because the Act imposes obligations on how platforms moderate content, platforms may question how Section 230 of the Communications Decency Act—which immunizes platforms from civil liability for hosting third-party content in certain circumstances—affects civil liability under the Act. The Act does not explicitly address its interaction with Section 230. It is therefore unclear at this time whether Section 230 will serve as a viable defense to FTC enforcement actions under the Act, or whether the Act effectively displaces Section 230 for purposes of the Act’s enforcement.

Another issue is whether the Act will survive challenges from free speech advocates. Because the Act seeks to moderate speech (images or other visual depictions) on the basis of its content—while precluding uploaders from seeking redress against platforms for the removal of lawful content—it is foreseeable that challengers will argue the Act’s prohibitions on speech are too broad to survive First Amendment scrutiny.

Separate from concerns as to how the Act will be litigated in FTC enforcement actions, it remains unclear how alleged victims may utilize the Act’s notice-and-removal obligations in lawsuits against platforms related to the publication of NCII. While the Act does not create a private cause of action, plaintiffs may attempt to establish liability against online platforms that previously enjoyed clear immunity under Section 230 of the CDA, arguing that a platform’s failure to adhere to the Act’s notice-and-removal requirements resulted in harm to an alleged victim.

PREPARATION

The notice-and-removal obligations will go into effect on May 19, 2026. In preparation for this deadline, companies should consider the following steps:

  • The company should consult with counsel to determine whether it is a “covered platform” under the Act.
  • If a company falls within the Act’s definition of a covered platform, the company should take steps to create and implement a notice-and-removal system.
  • The company should allocate sufficient resources for the creation, maintenance, and operation of this system, as well as the training of its employees on the company’s system and obligations under the Act.
  • The company should have a clearly delineated policy for determining whether content should be removed. This policy should take into consideration the Act’s safe harbor provision, which encourages companies to err on the side of caution in removing content of questionable legality, as well as the potential penalties under the FTCA for failure to remove unlawful content.
  • The company should draft a clear and conspicuous notice informing users of the company’s obligations under the Act and explaining how users can utilize the company’s notice-and-removal system.

Contacts

If you have any questions or would like more information on the issues discussed in this LawFlash, please contact any of the following:

Authors
Ashley R. Lynam (Philadelphia)
Jacob A. Sand (Philadelphia)
Patrick S. Smith (Philadelphia)