Arizona Lawsuit Targets Men Accused of Selling AI Porn Courses

Key Takeaways

- Three Phoenix men allegedly created AI-generated explicit content using photos of real women without consent
- The defendants allegedly sold courses for $24.95/month teaching others to create similar AI influencers
- The lawsuit claims the courses included instructions on selecting victims who 'can't defend themselves'
From Pool Photos to AI Porn Without Consent
MG led an ordinary life in Scottsdale, Arizona. She worked as a personal assistant, waited tables on weekends, and posted occasional Instagram photos of matcha runs and Pilates classes. Her 9,000 followers were mostly friends and acquaintances. She never sought internet fame.
Last summer, a follower sent her a DM with a disturbing question: Did she know that photos and videos of a woman who looked exactly like her were circulating online? MG clicked the link and found multiple Instagram Reels showing what appeared to be her face on a scantily clad body, with tattoos in her exact placements.
“If you didn't know me well, you could very well think they were images of me. It was kind of like this reality check that I don't have any control over my own image.”
— MG, plaintiff
What horrified MG more: the images were not just circulating randomly. According to her recently filed complaint, they were being used to advertise AI ModelForge, a platform that allegedly teaches men how to generate their own AI influencers using photos of real women.
The Alleged Business Model
MG is one of three plaintiffs in a lawsuit filed in January in Arizona against Jackson Webb, Lucas Webb, and Beau Schultz. The suit also names 50 John Does as additional defendants.
According to the complaint, the three Phoenix men operated a multi-layered scheme. First, they allegedly scoured the internet for photos of young women. Then they used AI software called CreatorCore to train models that could generate realistic images and videos of fictional people who looked exactly like the real women. The resulting content was allegedly sold on Fanvue, a subscription platform.
But the alleged scheme went further. For $24.95 per month on the platform Whop, the men allegedly sold courses teaching other men how to replicate the process. The lawsuit claims these courses included detailed tutorials and a playbook for selecting targets.
“They provided a whole playbook, including instructions on how to pick the right person so that it's not someone who can defend themselves, so they all had instructions on what type of women to use and where to get their pictures. It was disgusting on every single level.”
— MG, plaintiff
How the AI Training Allegedly Worked
The lawsuit describes a systematic approach. The defendants allegedly taught subscribers to use CreatorCore to train AI models on photos of unsuspecting women. The resulting AI-generated content was then posted to Instagram and TikTok as if the fictional influencers were real people.
The complaint alleges the courses included "Blueprints" that guided students through the entire process. The goal was to create AI influencers that looked indistinguishable from real people, built from the likenesses of women who never consented.
The lawsuit suggests this was not a casual hobby. It was allegedly structured as a business, complete with monthly subscription fees, training materials, and a customer base of at least 50 men named as John Does.
The Legal Landscape for AI-Generated Explicit Content
This case arrives as lawmakers scramble to address non-consensual AI imagery. Existing revenge porn laws often require the images to be real. Deepfake-specific legislation exists in some states but remains inconsistent nationally.
Arizona, where this lawsuit was filed, passed a law in 2024 criminalizing the distribution of non-consensual deepfake pornography. But criminal prosecution requires state action. Civil lawsuits like this one allow victims to pursue damages directly.
The challenge for plaintiffs is proving damages and establishing clear legal theories that courts will accept. The men have not yet responded publicly to the lawsuit's allegations.
The Broader Problem of AI Image Abuse
MG's case illustrates a growing category of harm. AI image generation tools have become cheap, accessible, and increasingly realistic. Anyone with a few photos can create convincing synthetic content. Platforms struggle to detect and remove such material.
What makes this lawsuit distinct is the alleged instruction element. The defendants are not accused merely of creating harmful content. They are accused of teaching others how to do it systematically, including guidance on victim selection. If the allegations are true, this represents a deliberate effort to scale abuse.
MG's Instagram account was modest. She had 9,000 followers. She shared everyday moments with friends. That was enough. Her photos became training data for AI that generated content she never agreed to, used to advertise courses she never knew existed, purchased by men she never met.
Frequently Asked Questions
What is AI ModelForge accused of doing?
According to the lawsuit, AI ModelForge allegedly used photos of real women without consent to create AI-generated explicit content and sold courses teaching others to do the same for $24.95 per month.
Is creating AI-generated explicit content of someone illegal?
Laws vary by state. Arizona passed a law in 2024 criminalizing non-consensual deepfake pornography distribution. Federal legislation remains limited, and enforcement is inconsistent.
How were the victims' photos allegedly obtained?
The lawsuit claims the defendants scraped photos from public social media accounts, then used AI software to train models that could generate realistic synthetic images.
Who are the defendants in this lawsuit?
The named defendants are Jackson Webb, Lucas Webb, and Beau Schultz, all from Phoenix, Arizona. The lawsuit also names 50 John Does who allegedly purchased the training courses.
What damages are the plaintiffs seeking?
The lawsuit seeks to hold the defendants accountable for using the plaintiffs' likenesses without consent. Specific damage amounts are determined as cases proceed through litigation.
Source: Artificial Intelligence Latest feed / Ej Dickson
Manaal Khan
Tech & Innovation Writer