
“Take It Down Act” Now Law: Federal Crackdown on Revenge Porn and Deepfake Exploitation Begins


With the signature of President Donald Trump, the Take It Down Act has officially become federal law, marking the most sweeping legislative response yet to the growing threat of non-consensual intimate imagery and AI-generated deepfakes. The law imposes criminal penalties, expands victim protections, and establishes new compliance mandates for websites and platforms that host user-generated content.

President Trump Signs "Take It Down Act," Sparking Debate Over Online Privacy and Free Speech

Key Provisions of the Take It Down Act

The new law, formally titled the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, introduces:

Criminal liability for anyone who knowingly publishes or threatens to publish intimate images (real or AI-generated) without consent, under circumstances where the subject had a reasonable expectation of privacy.

Enhanced penalties when minors are involved or when content is published with intent to harm, abuse, or coerce.

    • Up to 2 years in prison for adult-related violations.
    • Up to 3 years in prison for offenses involving minors.
    • Threatening to publish such content carries up to 18 months in prison when adults are depicted, and up to 30 months when minors are involved.

Lack of consent and intent to harm are the core factors in determining criminality.

By May 19, 2026, platforms that host user-generated content must:

Establish a plain-language takedown process for victims or their authorized agents.

Accept electronically or physically signed requests with:

    • Information sufficient to identify the content.
    • A good-faith statement asserting lack of consent.
    • Contact information for response.

Remove the flagged content within 48 hours of a valid request.

Make reasonable efforts to remove duplicate content, including in direct messages and affiliated services.

No counter-notification process is provided; once content is removed, the law establishes no mechanism for reinstatement.
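For platforms building intake tooling, the request requirements above can be sketched as a simple validator. This is a hypothetical illustration, not legal guidance: the field names and the 48-hour deadline calculation are assumptions drawn from the summary of the statute, not from any official schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical intake record mirroring the statute's required elements:
# a signed request, content identification, a good-faith statement of
# non-consent, and contact information for a response.
@dataclass
class TakedownRequest:
    signature: str               # electronic or physical signature of victim/agent
    content_locator: str         # information sufficient to identify the content
    good_faith_statement: str    # good-faith assertion of lack of consent
    contact_info: str            # contact information for response
    received_at: datetime = field(default_factory=datetime.utcnow)

    def is_valid(self) -> bool:
        """A request is actionable only if every required element is present."""
        return all([self.signature.strip(),
                    self.content_locator.strip(),
                    self.good_faith_statement.strip(),
                    self.contact_info.strip()])

    def removal_deadline(self) -> datetime:
        """Flagged content must come down within 48 hours of a valid request."""
        return self.received_at + timedelta(hours=48)
```

A compliance pipeline would reject requests where `is_valid()` is false, and queue valid ones for removal before `removal_deadline()`.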

Platforms that fail to comply may face FTC enforcement under unfair or deceptive practice provisions. While the law offers limited safe harbor protections for platforms acting in good faith, it does not shield them from liability for hosting or failing to remove the illegal content.

The law was co-sponsored by Sen. Ted Cruz (R-TX) and Sen. Amy Klobuchar (D-MN), and received bipartisan support. It was also championed by First Lady Melania Trump, who lobbied for stronger protections against digital exploitation.

“This is a major victory for victims of online abuse,” said Klobuchar. “These images can ruin lives and reputations. Now, victims will be able to have this material removed from platforms, and law enforcement can hold perpetrators accountable.”

Cruz added, “Predators who weaponize new technology to post this exploitative filth will now rightfully face criminal consequences.”

The law was partly inspired by a high-profile case involving a 14-year-old victim whose AI-generated explicit image remained online for nearly a year.

Pushback and Concerns

Digital rights groups, including the Electronic Frontier Foundation (EFF) and the Cyber Civil Rights Initiative, have expressed concern over the law’s breadth, vagueness, and potential for abuse:

The lack of due process, such as a counter-notice system, may lead to over-removal of lawful content.

The 48-hour compliance window may not provide platforms enough time to verify the legitimacy of claims.

Automated moderation tools, often imprecise, may flag legal content, including:

  • Journalistic imagery (e.g., protests or public nudity).
  • Law enforcement alerts involving exposed offenders.
  • Commercially produced, consensual pornography.

EFF warned that smaller platforms may be especially vulnerable, opting to remove flagged content preemptively rather than risk legal consequences, which raises broader concerns about free speech and censorship.

Any platform that allows users to upload, share, or distribute images should now review its compliance policies, takedown systems, and moderation processes in light of the Take It Down Act. Legal experts strongly recommend:

  • Consulting with First Amendment and tech counsel.
  • Updating community guidelines and terms of service.
  • Preparing backend systems to detect, track, and remove flagged content swiftly.

Failure to act could result in federal enforcement, litigation, and criminal liability.
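The "detect, track, and remove" recommendation implies maintaining an index of removed material so duplicates and re-uploads can be flagged, as the law's "reasonable efforts" duplicate-removal duty suggests. Below is a minimal sketch of that idea using exact SHA-256 hashes; the class and method names are hypothetical, and real deployments generally rely on perceptual hashing (e.g., industry tools like PDQ) because exact hashes miss re-encoded or cropped copies.

```python
import hashlib

class RemovedContentIndex:
    """Hypothetical index of removed images, used to flag identical copies
    and re-uploads across a service. Exact-match only; a production system
    would use perceptual hashing to catch altered duplicates."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def register_removal(self, image_bytes: bytes) -> str:
        """Record a removed image's digest and return it."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        self._hashes.add(digest)
        return digest

    def is_known_copy(self, image_bytes: bytes) -> bool:
        """Check whether uploaded bytes match previously removed content."""
        return hashlib.sha256(image_bytes).hexdigest() in self._hashes
```

An upload pipeline would call `is_known_copy()` at ingestion time and route matches straight to the existing takedown workflow.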

The Take It Down Act marks a turning point in how U.S. law treats digital privacy and consent, setting a precedent with real consequences for both offenders and the platforms that enable them.

