On May 19, 2025, U.S. President Donald J. Trump signed into law the TAKE IT DOWN Act ("Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act"), enacted by the 119th Congress with overwhelming bipartisan support. The Act establishes a new legal framework to combat the online publication of non-consensual intimate visual depictions, whether authentic or artificially generated through deepfake technology or AI tools.
This legislation represents a decisive federal response to the increasing misuse of image-generation technologies, particularly AI-assisted digital forgeries and their viral dissemination across online platforms.
The TAKE IT DOWN Act criminalizes the intentional online disclosure of non-consensual intimate visual depictions of identifiable individuals, covering both authentic imagery and "digital forgeries" created or altered through AI or other technological means.
The statute further distinguishes between content involving adults and that involving minors, with enhanced criminal penalties applicable in the latter case.
The Act introduces several new federal offenses under the Communications Act of 1934 (47 U.S.C. § 223), as amended. These include the knowing publication of non-consensual intimate visual depictions of adults, the publication of such depictions involving minors, and threats to publish such material. Penalties include criminal fines and imprisonment, with terms of up to two years for offenses involving adults and up to three years for offenses involving minors.
Courts are further required to impose mandatory restitution (Sec. 2, §223(h)(8)) and forfeiture of proceeds and instrumentalities used in the commission of these offenses (Sec. 2, §223(h)(7)).
The Act also creates new compliance obligations for "covered platforms," defined broadly to include public websites, services, or applications that "primarily provide a forum for user-generated content" (Sec. 4(3)(A)).
Pursuant to Section 3 of the Act, covered platforms must establish, within one year of enactment, a notice-and-removal process through which an identifiable individual (or a person acting on their behalf) may request removal of a non-consensual intimate visual depiction; remove the reported content within 48 hours of receiving a valid request; and make reasonable efforts to remove known identical copies of that content.
Failure to comply with these obligations constitutes a deceptive or unfair trade practice, enforceable by the Federal Trade Commission (FTC) under the Federal Trade Commission Act (Sec. 3(b)).
The law provides a liability shield for platforms acting in good faith when disabling access to reported content (Sec. 3(a)(4)).
The Act includes carefully defined exceptions, ensuring that its provisions do not reach disclosures made reasonably and in good faith, such as those to law enforcement, in legal or administrative proceedings, for medical education or treatment, or by an individual of an intimate depiction of themselves.
Crucially, publication consent must be explicit: “the fact that the identifiable individual provided consent for the creation of the intimate visual depiction shall not establish that the individual provided consent for the publication” (Sec. 2, §223(h)(5)(A)).
While broadly supported across the political spectrum and by advocacy groups such as SAG-AFTRA, IBM, and the National Center for Missing & Exploited Children, the Act has sparked First Amendment concerns from civil liberties organizations including the Electronic Frontier Foundation, Center for Democracy & Technology, and others. Key criticisms concern the breadth of the takedown mechanism, the short 48-hour compliance window (which critics argue will incentivize over-removal of lawful speech), the absence of safeguards against bad-faith removal requests, and potential tension with end-to-end encrypted services.
Nonetheless, the TAKE IT DOWN Act marks a major legislative step in addressing the legal vacuum surrounding AI-generated intimate imagery and sets a federal precedent for regulating deepfake technologies in online environments.