The Minnesota Senate voted unanimously, 65 to 0, to pass a law targeting apps that use artificial intelligence to “undress” or sexualize real people’s images without their consent. The nudification app ban now heads to Governor Tim Walz, who is expected to sign it. If he does, enforcement begins in August 2026.
At a glance
- Nudification app ban: Developers of apps, websites, or software designed to strip clothing from images of real people face lawsuits with punitive damages, plus fines of up to $500,000 per fake image, levied by the state attorney general.
- Unanimous Senate vote: The Minnesota Senate passed the measure 65–0, following an equally swift passage in the House — a rare show of bipartisan unity on an AI regulation issue.
- Victim-driven law: Survivor Molly Kelley spent two years advocating for the legislation after a man used nudifying tools to create fake images of more than 80 women in his social circle.
How the law came to be
State Senator Erin Maye Quade introduced the bill after a CNBC investigation revealed that a man had used a commercial nudifying service called DeepSwap to generate fake naked images of dozens of women he knew. The man apologized but did not identify all his victims. Because there was no proof he shared the images, existing laws — including revenge porn statutes — offered little protection. The women found themselves in a legal void.
That experience pushed Kelley and others to campaign for a law that targets the problem at its source: the tools that make it trivially easy to harm someone.
“These images don’t exist without a third-party involvement and some sort of machine learning model,” Kelley told 19th News.
National nonprofit RAINN, which runs the National Sexual Assault Hotline, helped draft the bill. To avoid unintended consequences for legitimate software, RAINN consulted with tech companies during the drafting process. The final law exempts general-purpose tools, like Photoshop, that require significant technical skill to misuse. The target is apps built specifically to create sexualized images of people without their consent.
What the law actually does
Under the new measure, anyone harmed by a nudifying app can sue the developer for damages, including punitive damages. Minnesota’s attorney general can also levy fines of up to $500,000 per fake image. Any fines collected go directly to fund services for survivors of sexual assault, domestic violence, and child abuse.
Offending products can be blocked from operating in Minnesota entirely.
The law’s scope is deliberately wide. It covers websites, apps, and any software service designed to nudify images — and it applies whether or not the victim ever consented to having their photograph taken in the first place. That matters because most victims of nudifying apps are ordinary people: colleagues, classmates, neighbors.
“Companies that make this technology available for free online and in app stores will no longer be allowed to enable predators who abuse and victimize adults and children with the click of a button,” Maye Quade said in a statement.
Why survivors made the difference
Maye Quade credited the women who came forward — testifying in committee, speaking to reporters, and engaging with law enforcement — with making the legislation possible.
“Their power, brilliance, and advocacy is why we passed this bill today,” she said.
That kind of survivor-led legislative work mirrors a broader pattern in research on online harm: laws move fastest when victims organize and put names and faces to abstract policy problems. In this case, two years of grassroots effort produced a unanimous vote in both chambers of a state legislature — a result that would have seemed unlikely when these women first discovered what had happened to them.
Real limits remain
The law faces genuine enforcement challenges. DeepSwap, the service used against the Minnesota women, operates overseas — at times claiming bases in Hong Kong and Dublin. Holding foreign developers accountable from a single U.S. state will be difficult. Advocates say a federal law would be far more effective.
There is also uncertainty about whether federal deregulation efforts could eventually preempt state-level AI rules, which would weaken the law’s reach. And while some U.S.-based platforms, including services that have appeared in major app stores despite violating those stores’ terms, could face lawsuits under the new measure, no law can fully keep pace with how quickly new nudifying tools appear online.
Still, Minnesota’s unanimous vote sends a clear signal that the legal consensus is shifting. Other states are watching.
Read more
For more on this story, see: Ars Technica
For more from Good News for Humankind, see:
- Alzheimer’s risk cut in half by drug in landmark prevention trial
- Renewables now make up at least 49% of global power capacity
- The Good News for Humankind archive on technology
About this article
- 🤖 This article is AI-generated, based on a framework created by Peter Schulte.
- 🌍 It aims to be inspirational but clear-eyed, accurate, and evidence-based, and grounded in care for the Earth, peace and belonging for all, and human evolution.
- 💬 Leave your notes and suggestions in the comments below — I will do my best to review and implement where appropriate.
- ✉️ One verified piece of good news, one insight from Antihero Project, every weekday morning. Subscribe free.