The Dark Side of AI: How Bots Are Fueling a Monetized Abuse Ecosystem

A recent analysis of 2.8 million Telegram messages reveals a shocking truth: AI-powered bots are being used to create and sell non-consensual intimate images. These bots can turn ordinary photos into synthetic nude images, and the abuse is being monetized through affiliate programs and subscription-based archives. The researchers behind the study are calling for stricter regulations to combat this growing problem.
Key Takeaways
- AI-powered bots are being used to create non-consensual intimate images on Telegram
- The abuse is being monetized through affiliate programs and subscription-based archives
- Researchers are calling for stricter regulations to combat this growing problem
In This Article
- The Problem: AI-Powered Abuse
- The Monetary Aspect: How Abuse Is Being Monetized
- The Role of Telegram: Enabling the Ecosystem
- The Solution: Calling for Stricter Regulations
- The Future: A Call to Action
The Problem: AI-Powered Abuse
AI-powered bots can now take an ordinary photo and turn it into a synthetic nude image in seconds. This is no longer hypothetical: the study's analysis of 2.8 million Telegram messages shows these bots are being used at scale to create and sell non-consensual intimate images.
- The bots use advanced algorithms to manipulate images, making it difficult to distinguish between real and fake content
- The abuse is not limited to adults; child sexual abuse material is also being shared and sold
The Monetary Aspect: How Abuse Is Being Monetized
How do these bots make money? Through affiliate programs and subscription-based archives. Users pay a one-time fee or a monthly subscription to access archives of non-consensual intimate images, while affiliate programs pay a commission to those who promote the bots and archives, making the operation lucrative for everyone involved.
- The archives contain non-consensual intimate images, including child sexual abuse material
- Payments are made through various channels, including PayPal, cryptocurrencies, and other services
The Role of Telegram: Enabling the Ecosystem
How does Telegram enable this ecosystem? The answer lies in its premium features and design. The platform lets users organize channels into folders, automate access control, and reopen closed groups under the same name, which makes it easy to share and access non-consensual intimate images and to rebuild communities after takedowns.
- Telegram's premium features are generating significant revenue, with $292 million in 2024 alone
- The platform has been a hub for these kinds of bots since the technology first emerged
The Solution: Calling for Stricter Regulations
So, what can be done to combat this growing problem? The researchers behind the study are calling for stricter regulations, including the banning of nudifying tools across the EU and the classification of Telegram as a Very Large Online Platform under the Digital Services Act.
- The researchers want to see mandatory safeguards against synthetic non-consensual intimate imagery
- The AI Act should include provisions to prevent the misuse of AI in creating and sharing non-consensual intimate images
The Future: A Call to Action
As we move forward, it's essential to recognize the gravity of this issue and take action. We need to work together to prevent the misuse of AI in creating and sharing non-consensual intimate images. This requires a collective effort from policymakers, tech companies, and individuals to create a safer and more responsible online environment.
- We need to raise awareness about the issue and its consequences
- We need to support researchers and organizations working to combat this problem
“AI has lowered the technical barrier so far that the number of potential victims is growing dramatically”
— Maximilian Schreiner, AI Forensics
Final Thoughts
The use of AI-powered bots to create and sell non-consensual intimate images is a growing problem that demands immediate attention. Creating a safer and more responsible online environment starts with raising awareness, supporting the researchers and organizations working to combat this abuse, and pressing regulators and platforms to act.
Sources & Credits
Originally reported by The Decoder — Maximilian Schreiner
Huma Shazia
Senior AI & Tech Writer