A 17-year-old has filed a lawsuit against ClothOff, an app she says has left her living in constant fear. The teen, who remains anonymous as a minor, was among the earliest victims of AI-generated fake nude images used for bullying. Her complaint accuses ClothOff of making it easy to create and share child sexual abuse materials and nonconsensual intimate images of adults.
How ClothOff Operates and Spreads
According to Ars Technica, ClothOff can turn an ordinary Instagram photo into a fake nude in three clicks. The app is free to use, with premium content priced from $2 to $40 and payable by credit card or cryptocurrency. The lawsuit alleges that ClothOff is affiliated with at least 10 other services built on the same technology.
Developers can access ClothOff’s technology through an API that allows mass production of harmful images without oversight. The complaint states that ClothOff and its apps generate 200,000 images daily and have reached at least 27 million visitors since launching. The platform also allows users to create galleries of fake images and does not mark photos as fabricated.
Telegram Bots Promote the Service
The teen also alleges that Telegram helps promote ClothOff through automated bots that have attracted hundreds of thousands of subscribers. A Telegram spokesperson told The Wall Street Journal that nonconsensual pornography and tools to create it violate the platform’s terms of service and are removed when discovered. Telegram has reportedly already removed the ClothOff bot.
Teen Seeks Court Intervention and Damages
The lawsuit asks the court to shut down ClothOff's operations, block all associated domains, and delete stored images. The teen also seeks punitive damages for the intense emotional distress she has suffered. She was 14 years old when a high school boy used ClothOff to generate fake nudes from her Instagram photo. The boy faced no charges.
The teen expects to spend the rest of her life monitoring in case the images resurface. She fears the photos could be seen by friends, family, future employers, or the public. Her complaint notes that she has no way of knowing how many people may have posted the images online.
The lawsuit follows prior litigation filed by San Francisco City Attorney David Chiu that targeted ClothOff and 15 other similar apps. About 45 states have criminalized fake nudes, and the Take It Down Act now requires platforms to remove nonconsensual intimate images within 48 hours of reports.