Producing and distributing AI-generated pornography may soon carry far greater legal risk in the United States. Not all AI porn, to be clear: it's the nonconsensual explicit imagery that is about to get costly.
The U.S. Senate on January 13 passed the DEFIANCE Act by unanimous consent, clearing the way for a final vote in the House. The bipartisan legislation would allow victims of nonconsensual AI-generated explicit imagery to sue those responsible for creating and distributing the content, with statutory damages of up to $150,000 per violation.
The bill is co-sponsored in the House by Rep. Alexandria Ocasio-Cortez and backed by a bipartisan group of senators. It now heads to the House of Representatives; an earlier version passed the Senate in 2024 but stalled in the House before final approval.

If enacted, the DEFIANCE Act would establish a new federal civil cause of action for victims of what the legislation calls “intimate digital forgeries.” These include AI-generated or digitally altered sexually explicit images and videos that depict identifiable individuals without their consent and appear authentic.
Under the bill, victims would be able to sue individuals who knowingly produce, distribute, solicit, or possess such material with the intent to distribute. Remedies include statutory damages; actual damages, including profits earned from the content; and court-ordered injunctions requiring removal and cessation of distribution.
Notably, the legislation focuses liability on individual bad actors rather than broadly targeting platforms or AI developers, preserving existing legal protections such as Section 230 while expanding avenues for victims to seek relief.
The bill’s renewed momentum follows mounting concern over the rapid spread of sexually explicit deepfakes. A December 2024 report from the American Sunlight Project documented more than 35,000 instances of nonconsensual AI-generated intimate imagery targeting 26 members of Congress.
According to the report, 25 of the 26 targeted lawmakers were women, making women in Congress roughly 70 times more likely than men to be targeted. Nearly one in six female members of Congress was affected. The researchers declined to name specific victims to avoid amplifying the abusive content.
The findings were widely cited by national media outlets and lawmakers as evidence that existing laws had failed to keep pace with generative AI tools capable of producing realistic sexual imagery at scale.
The DEFIANCE Act builds on protections included in the 2022 reauthorization of the Violence Against Women Act, which addressed nonconsensual intimate images involving real photographs and videos.
Claims would be subject to a long statute of limitations, generally up to 10 years, with provisions that pause the clock until a victim discovers the violation or reaches adulthood.
Plaintiffs would need to show that the conduct occurred in or affected interstate or foreign commerce and that the defendant knew or recklessly disregarded the lack of consent.
The legislation complements the Take It Down Act, passed in 2025, which criminalized the posting of nonconsensual intimate imagery and imposed rapid takedown requirements on platforms. Together, the two laws would give victims both criminal enforcement tools and a private right to sue.
Public reaction to the Senate vote has been mixed but intense. Supporters argue the bill provides long-overdue protections for victims of digital sexual abuse and creates meaningful consequences for those who weaponize AI tools.
Critics, including some online activists and tech policy groups, have called for broader liability that would extend to AI platforms and model developers. Others have raised concerns about free speech implications and the challenges of proving intent in civil court.
For now, the bill’s narrow focus has helped maintain bipartisan support, which was underscored by its passage through the Senate without objection.
Whether the House will move quickly remains uncertain. If it does pass and is signed into law, the DEFIANCE Act would mark the first major federal civil remedy specifically tailored to the harms caused by AI-generated nonconsensual explicit imagery, significantly raising the stakes for anyone producing or trafficking in such content.
There's a lot of strikingly realistic AI-generated imagery out there; some of it is convincing enough that you have to look twice to tell whether the person depicted is real. But as with real pornography, the key to generating AI porn is CONSENT.
Consent will always be central to any production of intimate images, and that's what this new bill is about.
We'll keep you updated as new information about this bill and those like it becomes available.