MG was working as a personal assistant in Scottsdale, Arizona, supplementing her income by waiting tables on weekends and posting the occasional Instagram photo of matcha and Pilates to her followers. Nothing special. Then a follower DM’d her a link to an account posting sexually explicit videos of a woman with her face, her tattoos, her body — videos she had never made. Millions of people had already seen them.
She is one of three women suing three Phoenix men — Jackson Webb, Lucas Webb, and Beau Schultz — in a case that lays bare the machinery of a new kind of exploitation: crowd-sourced AI porn built from ordinary women’s social media feeds, packaged and sold as a business opportunity.
The defendants, according to a 143-page complaint filed in Maricopa County in January, operated a platform called AI ModelForge. For $24.95 a month on the marketplace Whop, subscribers received a playbook: scrape photos of women from Instagram and Tinder, feed them into a generative AI tool called CreatorCore, and produce sexually explicit content to sell on subscription platforms like Fanvue.
The instructions were precise. A screenshot cited in the lawsuit reads: “PRO TIP: Micro-influencers (5k-50K followers) are perfect because they have professional-quality photos but aren’t famous enough to cause legal issues.”
As of August 2025, the platform had more than 8,000 subscribers who had generated over 500,000 images and videos using photos of more than 20,000 women, according to the complaint. One Instagram reel featuring an AI likeness of a plaintiff generated more than 16 million views and over $50,000 in just over a month.
Social media posts attributed to the defendants boasted of making “$199,000 in 3 months.” One caption read: “You found the most unethical way to make money online.”
The Law Arrives Late
The lawsuit lands in a legal landscape still catching up. The Take It Down Act, signed by President Trump in May 2025, makes publishing nonconsensual AI-generated sexual content a federal crime and requires platforms to remove it within 48 hours of a victim’s request. But the takedown mandate doesn’t take effect until May 2026 — a full year after the law was signed. The Arizona Republic found no widely reported federal criminal prosecutions under the law since it was enacted.
Arizona amended its revenge-porn statute to cover AI-generated images. But state Senator JD Mesnard, who sponsored the original law, suggested to The Arizona Republic that the technology might actually help some women, since victims of real leaked photos can now claim the images are fake.
MG might find that cold comfort. She told WIRED she tried repeatedly to get Instagram to remove the content, but the platform declined, saying the posts didn’t violate its guidelines — the AI-generated images weren’t technically impersonating her account. “It’s my face, my tattoos, on a different outfit on a slightly different body,” she said. “These are real women being transformed, not just a random AI-generated person.”
An Instagram spokesperson told WIRED the platform has “extremely strict policies” around nonconsensual intimate imagery and said the flagged accounts were under review. TikTok, by contrast, said the accounts cited in the suit had been removed for violating community guidelines.
The Platform as Playbook
What makes this case unusual is the business model. The defendants didn’t just generate nonconsensual porn — they industrialized the process and sold it as a course. The lawsuit names 50 John Does, subscribers who allegedly used the platform to do the same thing to other women.
Nick Brand, the Kansas City attorney representing the plaintiffs alongside Cristina Perez Hasano, put it plainly: “These boys aren’t just using generative AI to disrobe women — they’re selling the ability to do so to other men and boys.”
Ben Zhao, a University of Chicago computer science professor who develops anti-AI tools, told The Arizona Republic that without laws imposing real penalties on the platforms enabling this content, the problem will only worsen. He pointed to international laws against AI-generated child sexual abuse material, which have made tech companies proactive about removal, and suggested similar enforcement for nonconsensual adult deepfakes would have a real deterrent effect.
The business has since rebranded. AI ModelForge’s Linktree now directs to “TaviraLabs,” a Telegram group with more than 18,000 members advertising itself as “the #1 AI Influencer coaching community.”
MG told WIRED she lives in constant fear that people she knows will see the images. But she wants other women to understand the scope of the threat. “I’m not someone famous. I’m not someone special,” she told The Arizona Republic. “This can happen to anybody.”
Sources
- These Men Allegedly Profit Off Teaching People How to Make AI Porn Influencers — WIRED
- Their Instagram photos were turned into sexually explicit images. Now they’re suing — AZCentral (Arizona Republic)
- Arizona lawsuit alleges AI porn scheme used photos of Kansas City native, other women — KCTV Kansas City