As of December 2025, ClothOff remains a highly contentious AI-powered "nudify" platform, accessible primarily via clothoff.net, with companion apps for Android, iOS, and macOS, plus a Telegram bot. It uses neural networks (including GANs and diffusion models) to digitally remove clothing from uploaded photos, generate realistic nude images, or create short "undress" videos with customizable elements such as body types, poses, face swaps, and effects. Features include free basic trials, premium "VIP Coins" for enhanced quality and speed, multi-image processing, and an API for adult-content generation.
The site asserts strong privacy protections—no data storage, automatic deletion of uploads, and technical blocks on processing minors' images—while claiming donations to AI victim support organizations like ASU Label. However, ClothOff
faces intense ethical and legal scrutiny for enabling non-consensual deepfake pornography and child sexual abuse material (CSAM).
A pivotal federal lawsuit, filed on October 16, 2025, in New Jersey (Jane Doe v. AI/Robotics Venture Strategy 3 Ltd.), accuses the platform of facilitating the creation and distribution of hyper-realistic fake nudes of a minor generated from her social media photos, and invokes the TAKE IT DOWN Act to seek mandatory removals, damages, and a potential shutdown. Supported by Yale Law School clinics, the case underscores harms such as bullying and emotional distress, with whistleblower reports linking the platform's operations to former Soviet Union regions. Despite stated prohibitions on illegal use, investigators and advocates argue the tool inherently promotes abuse, fueling global calls for tighter AI regulation even as the platform continues to operate with millions of users.