Teen sues AI tool maker over fake nude images
A New Jersey teenager has filed a major lawsuit against the company behind an artificial intelligence (AI) “clothes removal” tool that allegedly created a fake nude image of her.
The case has drawn national attention because it shows how AI can invade privacy in harmful ways. The lawsuit was filed to protect students and teens who share photos online and to show how easily their images can be exploited by artificial intelligence tools.
Sign up to receive my FREE CyberGuy report
Get my best tech tips, urgent security alerts, and exclusive offers delivered right to your inbox. Plus, you’ll get instant access to my Ultimate Guide to Surviving Scams, free when you join my CYBERGUY.COM newsletter.
META LEAKED DOCUMENTS SHOW HOW AI CHATBOTS HANDLE CHILD EXPLOITATION
How fake nude images were created and shared
When she was 14 years old, the plaintiff posted some photos of herself on social media. A classmate used an AI tool called ClothOff to remove her clothes in one of those images. The doctored photo kept her face, making it look real.
The fake image spread quickly through group chats and social media. Now 17, she is suing AI/Robotics Venture Strategy 3 Ltd., the company that operates ClothOff. A Yale Law School professor, several students, and a trial attorney brought the case on her behalf.

A New Jersey teenager is suing the creators of an artificial intelligence tool that created a fake nude image of her. (iStock)
The lawsuit asks the court to order the removal of all fake images and to prevent the company from using them to train AI models. It also seeks removal of the tool from the internet and financial compensation for emotional harm and loss of privacy.
The legal fight against the abuse of deepfakes
US states are responding to the rise of AI-generated sexual content. More than 45 states have passed or proposed laws to criminalize deepfakes without consent. In New Jersey, creating or sharing misleading AI media can result in prison sentences and fines.
At the federal level, the Take It Down Act requires companies to remove non-consensual images within 48 hours of a valid request. Despite the new laws, prosecutors still face challenges when developers live abroad or operate through hidden platforms.
APPARENT AI ERRORS FORCE TWO JUDGES TO RETRACT SEPARATE RULINGS

The lawsuit aims to stop the spread of deepfake “nudify” applications and protect the privacy of victims. (iStock)
Why legal experts say this case could set a national precedent
Experts believe this case could change the way courts view AI liability. Judges must decide whether AI developers are liable when people misuse their tools, and whether the software itself can be considered a harmful instrument.
The lawsuit highlights another question: How can victims prove harm when no physical act occurred, but the harm appears real? The outcome may define how future deepfake victims will seek justice.
Is ClothOff still available?
Reports indicate that ClothOff may no longer be accessible in some countries, such as the United Kingdom, where it was blocked after a public backlash. However, users in other regions, including the United States, still appear to be able to access the company’s web platform, which continues to promote tools that “remove clothes from photos.”
On its official website, the company includes a brief disclaimer addressing the ethics of its technology. It says: “Is it ethical to use AI generators to create images? The use of AI to create ‘deep nude’ style images raises ethical considerations. We encourage users to approach this with an understanding of responsibility and respect for the privacy of others, ensuring that use of the app for stripping is done with full awareness of the ethical implications.”
Whether fully operational or partially restricted, ClothOff’s continued presence online raises serious legal and moral questions about whether AI developers should be allowed to keep such image-manipulation tools available at all.

This case could set a national precedent for holding AI companies accountable for misuse of their tools. (Kurt “CyberGuy” Knutsson)
Why this AI lawsuit is important for everyone online
The ability to create fake nude images from a simple photo threatens anyone with an online presence. Teenagers face special risks because AI tools are easy to use and share. The lawsuit draws attention to the emotional harm and humiliation caused by such images.
Parents and educators worry about how quickly this technology is spreading in schools. Lawmakers are under pressure to modernize privacy laws. Companies hosting or enabling these tools must now consider stronger safeguards and faster takedown systems.
What does this mean for you?
If you become the target of an AI-generated image, act quickly. Save screenshots, links and dates before the content disappears. Request immediate removal from the websites hosting the image. Seek legal help to understand your rights under state and federal laws.
Parents should talk openly about digital safety. Even innocent photographs can be misused. Knowing how AI works helps teens stay alert and make safer online decisions. Stricter AI rules that prioritize consent and accountability may also be required.
Take my quiz: How safe is your online security?
Do you think your devices and data are really protected? Take this quick quiz to see where you stand digitally. From passwords to Wi-Fi settings, you’ll get a personalized breakdown of what you’re doing well and what you need to improve. Take my quiz here: Cyberguy.com.
Kurt’s Key Takeaways
This lawsuit is not just about a teenager. It represents a turning point in the way courts handle digital abuse. The case challenges the idea that artificial intelligence tools are neutral and asks whether their creators share responsibility for the harm. We must decide how to balance innovation with human rights. The court’s ruling could influence how future AI laws will evolve and how victims will seek justice.
If an AI tool creates an image that destroys someone’s reputation, should the company that created it face the same punishment as the person who shared it? Let us know by writing to us at Cyberguy.com.
Copyright 2025 CyberGuy.com. All rights reserved.
Kurt “CyberGuy” Knutsson is an award-winning technology journalist with a deep love for technology, gear and gadgets that improve lives. He has contributed to News and News Business, beginning mornings on “News & Friends.” Have a tech question? Get Kurt’s free CyberGuy newsletter, share your voice, a story idea or a comment at CyberGuy.com.


