In Scottsdale, Arizona, two women have filed lawsuits against men accused of creating AI-generated porn influencers using their Instagram photos without permission. The cases underscore rising concerns about deepfake technology and its misuse on social media platforms.
The first plaintiff, identified only as MG in court documents to protect her identity, was living a typical life as a twentysomething in Scottsdale a little over a year ago. She worked as a personal assistant and earned extra income by waiting tables on weekends. Like many of her peers, MG maintained an Instagram account where she occasionally shared Stories and photos—capturing moments like enjoying matcha drinks, lounging by the pool with friends, or attending Pilates classes.
“I never really cared to pop off and become popular on social media,” MG stated in the lawsuit. “I just used it the way most people did when it first came out, to share their lives with the people closest to them.” At the time, her account had just over 9,000 followers—a modest but engaged audience.
Last summer, MG received a direct message from a follower. The person asked if she was aware that photos and videos of a woman who looked exactly like her were circulating on Instagram. MG clicked the link and discovered multiple Reels featuring what appeared to be her face superimposed onto a body nearly identical to her own: a scantily clad woman with tattoos in the same locations as MG’s.
The second plaintiff, identified as Kylie in the lawsuit, also discovered AI-generated pornographic content using her likeness without her consent. According to court filings, the content was distributed across social media platforms, including Instagram, where it gained traction among users.
The lawsuits allege that the defendants—two men from Scottsdale—used publicly available photos from the plaintiffs’ Instagram accounts to train AI models. The AI then generated pornographic content featuring the plaintiffs’ faces and bodies, which was shared widely online. The plaintiffs argue that this unauthorized use of their images violated their privacy and publicity rights and caused them emotional distress.
Legal experts note that this case is part of a growing trend where deepfake technology is being misused to create non-consensual pornographic content. The rise of AI tools has made it easier for individuals to manipulate images and videos, often with harmful consequences for the victims.
“This is a clear violation of trust and privacy,” said one attorney representing the plaintiffs. “Our clients never consented to the use of their images in this way, and the emotional toll this has taken is significant.”
The lawsuits seek damages for emotional distress, invasion of privacy, and reputational harm. They also call for stricter regulations on the use of AI-generated content and stronger protections for individuals’ digital likenesses.