DEFIANCE Act: Deepfake Porn Victims Can Sue for $150K+

What the DEFIANCE Act Does
DEFIANCE stands for the Disrupt Explicit Forged Images and Non-Consensual Edits Act. It amends federal law to create a civil cause of action for any identifiable individual harmed by nonconsensual sexually explicit deepfakes, whether AI-generated or manually edited.
Under the bill, victims can sue anyone who:
- Knowingly produces an intimate digital forgery (an AI-generated deepfake) or an intimate image forgery (a manually edited image).
- Distributes, displays, or discloses such images.
- Solicits or receives such images with intent to distribute.
- Possesses the images with intent to distribute or sell.
The remedy is entirely civil, not criminal. Victims file lawsuits; no prosecution is required.
Damages: $150K Base, $250K Enhanced
The bill sets statutory damages (meaning victims don't have to prove actual harm):
- $150,000 per violation as the baseline award.
- $250,000 per violation if the deepfake is connected to sexual assault, stalking, harassment, or abuse.
These are liquidated damages, meaning courts award them once a violation is proven, without requiring the victim to quantify losses. Victims can also recover actual damages (medical bills, therapy costs, lost wages) and may be awarded punitive damages if the defendant's conduct was egregious.
Prevailing plaintiffs also recover attorneys' fees and litigation costs, a critical safeguard that lets victims sue without fear of ruinous legal bills.
10-Year Statute of Limitations
Victims have 10 years from discovery of the violation to file suit, or 10 years from their 18th birthday if they were minors when the images circulated. This matters because many deepfake victims don't immediately discover they've been targeted; a victim might learn of the images only years later through a tip or a social media alert.
Court Orders: Takedowns and Injunctions
Beyond money, courts can issue:
- Permanent injunctions ordering the defendant to stop distributing or displaying the images.
- Deletion orders requiring the defendant to destroy the deepfakes and comply with takedown requests.
- Temporary restraining orders and preliminary injunctions to stop harm while the case proceeds.
The Press Conference: Paris Hilton and AOC Join Forces
On January 22, 2026, the bill's House sponsors, Representatives Alexandria Ocasio-Cortez (D-NY) and Laurel Lee (R-FL), held a press conference on Capitol Hill. Paris Hilton joined them.
Hilton opened up about her own experience: "When I was 19 years old, a private, intimate video of me was shared with the world without my consent. People called it a scandal. It wasn't; it was abuse."
She revealed that over 100,000 deepfake images of her are currently circulating online.
AOC, speaking as a survivor of sexual assault, connected the bill to her own trauma: "As a survivor of sexual assault myself, this resurfaces trauma for so many people across the country."
AOC also revealed that she has been targeted by Grok users who generated nonconsensual images of her using the AI image tool. She urged Speaker Mike Johnson to bring the bill to the House floor immediately.
Lee, the Republican cosponsor from Florida, emphasized the bipartisan nature of the issue. Deepfake abuse crosses party lines and affects women across the political spectrum.
The Grok Crisis: Why Now?
The DEFIANCE Act didn't come out of nowhere. On December 20, 2025, Elon Musk announced that Grok, xAI's ChatGPT-like assistant integrated into X (formerly Twitter), could generate and edit images. Almost immediately, users began weaponizing the tool to create nonconsensual deepfake pornography.
In just nine days, researchers at The New York Times and the Center for Countering Digital Hate found that Grok generated 1.8 million sexualized images of women. The chatbot also created approximately 23,000 sexualized images of children during an 11-day period.
Users would respond to a photograph with simple requests like "put her in a bikini," and Grok would publicly reply with a generated image, sometimes of real people with no consent. The tool was indiscriminate, targeting celebrities and private individuals alike.
After weeks of criticism, xAI limited image generation to paid X Premium subscribers, which critics argued simply monetized nonconsensual abuse rather than stopping it. The crisis galvanized Congress and pushed the DEFIANCE Act toward a Senate vote just weeks later.
Where Is the Bill Now? House Gridlock
The Senate passed the DEFIANCE Act unanimously, a rare bipartisan victory in a polarized Congress. But the House is a different story.
As of April 2026, the bill remains "held at the desk" in the House, meaning leadership hasn't scheduled it for a floor vote. Representatives Ocasio-Cortez and Lee have introduced a House companion bill and gathered more than ten bipartisan cosponsors, but momentum has stalled.
At the January press conference, Lee indicated optimism that the bill would move to the House Judiciary Committee "soon" and then to the floor. Yet three months later, no vote has occurred. House leadership has not publicly explained the delay, though observers speculate that other legislative priorities (appropriations, defense bills) are consuming floor time.
For the bill to become law, it must pass the House and be signed by the President. With Senate passage already secured, the pressure now falls on House leadership.
Related Federal Legislation: The TAKE IT DOWN Act
Congress has been moving on deepfake and nonconsensual imagery on multiple fronts. In May 2025, President Trump signed the TAKE IT DOWN Act (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks) into law.
The TAKE IT DOWN Act is a criminal statute, not a civil remedy. It makes it a federal crime to knowingly distribute nonconsensual intimate imagery, real or AI-generated. Platforms must remove such content within 48 hours of receiving a valid takedown notice and make reasonable efforts to find and delete copies.
Covered platforms must implement a compliant notice-and-takedown process by May 19, 2026, or face FTC enforcement; noncompliance is treated as an unfair or deceptive trade practice.
The DEFIANCE Act complements TAKE IT DOWN by giving individual victims a private right to sue for money damages, while TAKE IT DOWN criminalizes distribution and puts obligations on platforms.
State Deepfake Laws: A Patchwork Landscape
By January 2026, 47 states had enacted some form of deepfake legislation. Most state laws focus on:
- Political deepfakes: 28 states require clear disclaimers on AI-generated political ads and videos.
- Intimate imagery laws: 40+ states have criminalized nonconsensual pornography (deepfakes and real images alike).
- Civil remedies: Some states (Texas, California, Virginia) have added civil causes of action for deepfake victims.
The DEFIANCE Act would establish a federal baseline, giving victims a right to sue in federal court regardless of their state of residence. This is critical for victims in states without robust state-level protections and for cases involving multistate distribution.
Who Does the DEFIANCE Act Protect?
The bill protects "identifiable individuals": anyone depicted in a deepfake whose identity is recognizable. This includes:
- Real people: Private citizens, celebrities, activists, journalists.
- Minors: Especially important given that Grok created thousands of sexualized images of children.
- Non-U.S. citizens: The text does not limit protection to Americans; international victims could potentially sue in U.S. courts if the defendant is located in the U.S. or if distribution occurred within U.S. borders.
The plaintiff does not need to prove the deepfake is realistic or that they suffered emotional harm. The statute itself presumes harm; statutory damages apply automatically upon proof of a violation.
What About AI Companies? The Liability Debate
A key question remains unanswered: Can victims sue the AI company itself, like xAI or OpenAI, or only the human who created the image?
The DEFIANCE Act's text targets "individuals" who produce, distribute, or possess deepfakes. It does not explicitly create liability for the AI platform or tool provider. Observers note this reflects a compromise: some lawmakers wanted to hold AI companies responsible; others argued that would stifle innovation and shift liability away from bad actors.
However, the TAKE IT DOWN Act does create platform liability. Social media sites must remove nonconsensual intimate imagery (NCII) or face FTC enforcement. And emerging civil lawsuits against xAI (filed by three girls alleging Grok generated child sexual abuse material from their photos) suggest courts may eventually extend liability to AI tool providers.
The DEFIANCE Act as currently written focuses on individual perpetrators. If the House passes it, expect debate on whether to expand scope to AI companies.
Deepfake Porn by the Numbers
Deepfake pornography is the dominant form of nonconsensual deepfake abuse:
- 96–98% of all deepfakes online are pornographic.
- 99–100% of deepfake pornography victims are female.
- 464% growth in deepfake porn from 2022 to 2023.
- 4,000+ female celebrities cataloged on the top deepfake porn websites.
- 2.2% of respondents in a 10-country survey reported personal victimization; 1.8% reported perpetration.
Among minors:
- 1 in 8 teens aged 13–20 personally know someone targeted by deepfake nudes.
- 1 in 10 teens aged 13–17 know a victim.
- 1 in 17 teens have been targets themselves.
The statistics underscore why the DEFIANCE Act is urgent. Deepfake pornography is no longer a rare, shocking novelty; it is an epidemic.
Sources and References
- S.1837, 119th Congress (2025–2026): DEFIANCE Act of 2025 (congress.gov)
- Text of S.1837, 119th Congress (2025–2026): DEFIANCE Act of 2025 (congress.gov)
- Paris Hilton, AOC Advocate for Law Against AI Deepfake Revenge Porn(hollywoodreporter.com)
- They sold my pain for clicks: Paris Hilton urges lawmakers to act on nonconsensual deepfakes(19thnews.org)
- The TAKE IT DOWN Act: A Federal Law Prohibiting the Nonconsensual Publication of Intimate Images (congress.gov)
- Deepfake Statistical Data (2023–2025)(views4you.com)
- Grok sexual deepfake scandal - Wikipedia(wikipedia.org)