Teen sues app that made fake nude photos from Instagram
A 17-year-old has filed a lawsuit against ClothOff, an app she says has left her living in constant fear. The teen, who remains anonymous as a minor, was among the earliest victims of AI-generated fake nude images used for bullying. Her complaint accuses ClothOff of making it easy to create and share child sexual abuse material and nonconsensual intimate images of adults.

How ClothOff Operates and Spreads

According to Ars Technica, ClothOff can turn an ordinary Instagram photo into a fake nude in three clicks. The app is free to use and offers premium content for $2 to $40 in credit card or cryptocurrency payments. The lawsuit alleges that ClothOff is affiliated with at least 10 other services using the same technology.

Developers can access ClothOff’s technology through an API that allows mass production of harmful images without oversight. The complaint states that ClothOff and its apps generate 200,000 images daily and have reached at least 27 million visitors since launching. The platform also allows users to create galleries of fake images and does not mark photos as fabricated.

Telegram Bots Promote the Service

The teen also alleges that Telegram helps promote ClothOff through automated bots that have attracted hundreds of thousands of subscribers. A Telegram spokesperson told The Wall Street Journal that nonconsensual pornography and tools to create it violate the platform’s terms of service and are removed when discovered. Telegram has reportedly already removed the ClothOff bot.

Teen Seeks Court Intervention and Damages

The lawsuit asks the court to shut down ClothOff’s operations, block all associated domains, and delete stored images. The teen also seeks punitive damages for intense emotional distress. She was 14 years old when a high school boy used ClothOff to generate fake nudes from her Instagram photo. The boy faced no charges.

The teen expects to spend the rest of her life monitoring for the resurfacing of these images. She fears the photos could be viewed by friends, family, future employers, or the public. Her complaint notes that she has no idea how many people may have posted the images online.

The lawsuit follows prior litigation filed by San Francisco City Attorney David Chiu that targeted ClothOff and 15 other similar apps. About 45 states have criminalized fake nudes, and the Take It Down Act now requires platforms to remove nonconsensual intimate images within 48 hours of reports.
