The TAKE IT DOWN Act: A federal response to non-consensual intimate imagery becomes law

Introduction

The rapid advancement of AI and proliferation of media platforms have led to an alarming increase in the circulation of non-consensual intimate imagery (NCII). These images—often shared without the subject’s consent—can include AI-generated "deepfakes" that depict individuals in sexually explicit scenarios.

In response, Congress passed the TAKE IT DOWN Act (the Act), formally titled the "Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act." With overwhelming bipartisan support (409–2 in the House, unanimous in the Senate), the Act criminalizes the publication of NCII and empowers victims by providing a swift content-removal mechanism. More than 120 organizations supported the Act, including Google, TikTok, Amazon, the National Center for Missing and Exploited Children (NCMEC), and Meta. On May 19, 2025, President Trump signed the Act into law. We analyze the Act’s scope and nuances below.

Who does the Act apply to?

The Act applies to “covered platforms,” which is defined as a:

[W]ebsite, online service, online application, or mobile application that[:]

(i) Serves the public; and

(ii) (I) Primarily provides a forum for user-generated content, including messages, videos, images, games, and audio files; or

(II) for which it is in the regular course of trade or business of the website, online service, online application, or mobile application to publish, curate, host, or make available content of nonconsensual intimate visual depictions.

The term, however, does not include email, ISPs, online services that consist primarily of non-user-generated content, or services for which any chat, comment, or interactive functionality is incidental to, directly related to, or dependent on the provision of non-user-generated content.

Key definitions
  • “Consent”: An “affirmative, conscious, and voluntary authorization made by an individual free from force, fraud, duress, misrepresentation, or coercion.”
  • “Deepfake”: A “video or image that is generated or substantially modified using machine-learning techniques or any other computer-generated or machine-generated means to falsely depict an individual’s appearance or conduct within an intimate visual depiction.”
  • “Intimate Visual Depiction”: Follows the existing definition provided by 15 U.S.C. § 6851(a)(5).
  • “Identifiable Individual”: An individual (i) “who appears in whole or in part in an intimate visual depiction;” and (ii) “whose face, likeness, or other distinguishing characteristic ... is displayed in connection with such intimate visual depiction.”

Key provisions

Generally, the Act:

  • Empowers and protects victims of NCII. The Act also protects those acting in good faith to assist victims of NCII, such as medical professionals or law enforcement.
  • Criminalizes the publication of NCII. The Act makes it a federal offense to knowingly publish, or threaten to publish, intimate images without the subject’s consent. This encompasses both authentic and AI-generated content. For NCII involving adults, the Act provides a four-prong test to determine culpability: (1) the intimate visual depiction was obtained or created under circumstances in which the person knew or reasonably should have known the identifiable individual had a reasonable expectation of privacy; (2) what is depicted was not voluntarily exposed by the identifiable individual in a public or commercial setting; (3) what is depicted is not a matter of public concern; and (4) publication of the intimate visual depiction is (i) intended to cause harm or (ii) causes harm, including psychological, financial, or reputational harm, to the identifiable individual. With respect to NCII involving minors, the Act imposes a stricter standard: intent to either (1) “abuse, humiliate, harass, or degrade the minor” or (2) “arouse or gratify the sexual desire of any person” is sufficient.
  • Mandates notice and removal obligations for covered platforms. Covered platforms, as defined above, must “establish a process” to remove reported NCII within one year of the Act’s enactment. The Act requires covered platforms to provide a “clear and conspicuous” notice of the removal process on the platform that is “easy to read[,]” “in plain language[,]” and “provides information regarding the responsibilities of the covered platform under” the Act. Takedown requests must be “in writing[,]” include “an identification of the intimate visual depiction of the identifiable individual[,]” and include a “good faith” statement that the image is “not consensual[.]” Upon receipt of a valid request, a covered platform must remove the intimate visual depiction “as soon as possible, but not later than 48 hours” after notification by the victim or their representative, and must make “reasonable efforts” to remove the offending material and “any identical copies of such depiction[.]”
  • Imposes civil liability on non-compliant platforms. Covered platforms that fail to reasonably comply with valid takedown requests in a timely manner can face an enforcement action by the FTC, unless protected under certain exemptions.
  • Empowers the Federal Trade Commission (FTC) to enforce violations of the Act. While the Act does not create a private right of action, the FTC is empowered to enforce compliance, treating violations as deceptive trade practices. Criminal penalties for publishing NCII include fines and imprisonment: up to two years for offenses involving adults and up to three years for offenses involving minors.

What does this mean for you?

Covered platforms should proactively assess how they will comply with the obligations outlined in the Act. First, organizations need to determine whether they fall within the scope of the Act as a covered platform. If covered, organizations will then need to implement the notice and removal processes, including authentication and documentation requirements, within the prescribed one-year window. Additionally, covered platforms may consider updating their community guidelines or platform policies to address the prohibition of NCII and deepfakes.

Conclusion

Now law, the Act sets a new federal standard under the supervision of the FTC, aiming to bridge gaps in existing state privacy and digital safety laws, including by standardizing criminal and civil penalties. Nonetheless, the Act raises questions surrounding enforcement, the role of platforms in monitoring and regulating user content (particularly smaller entities that may lack the infrastructure to meet such obligations), and the balance between privacy and free speech rights.

For more legislative updates on data privacy law from McDonald Hopkins, please subscribe to receive our publications or view the links below for recent updates on state data privacy legislation. If you have questions about your company’s compliance with cyber regulations, concerns about vulnerability to a ransomware attack or other breach, or if you want to learn more about proactive cybersecurity defense, contact a member of McDonald Hopkins’ national data privacy and cybersecurity team.
