Minnesota has made history as the first state in the U.S. to pass a law banning nudification apps—AI tools that generate fake nude images of real individuals without their consent.

Under the new legislation, developers, websites, apps, software providers, and any other services designed to "nudify" images face significant legal and financial consequences. Victims of these deepfake abuses can sue for extensive damages, including punitive damages, and the state may block access to offending products within Minnesota.

The law empowers the Minnesota Attorney General to impose fines of up to $500,000 per flagged AI-generated fake nude image. All fines collected will fund victim support services, including programs for survivors of sexual assault, domestic violence, child abuse, and other crimes.

Legislative Process and Enforcement Timeline

On Wednesday, the Minnesota Senate approved the bill unanimously, 65–0, following its swift passage in the House last week, as reported by 19th News. Governor Tim Walz is expected to sign the bill into law, with enforcement set to begin in August 2024.

Key Provisions of the Law

  • Bans the creation, distribution, and hosting of AI nudification tools that target real people without their consent.
  • Allows victims to sue for damages, including punitive damages, against developers and platforms.
  • Authorizes the state to block access to violating services within Minnesota.
  • Imposes fines up to $500,000 per violation, with proceeds funding victim support programs.
  • Takes effect in August 2024, pending the governor’s signature.