Undress App AI, commonly known under names such as ClothOff, nude io, nudify tools, or AI clothes-remover applications, persists as one of the most enduring and severely criticized applications of generative artificial intelligence as of mid-February 2026. Core platforms such as Undress.app continue to show full operational status and 100% uptime in monitoring reports from February 18 onward, despite widespread regulatory pressure and enforcement actions elsewhere in the ecosystem. These tools rely on increasingly sophisticated diffusion models and generative networks to process uploaded photographs of clothed individuals, predominantly women sourced from social media, public profiles, or personal devices, and to generate synthetic images in which clothing is removed or replaced with bikinis, lingerie, sheer fabrics, underwear, or full nudity. The latest versions deliver striking photorealism in skin textures, body proportions, lighting harmony, shadow placement, and scene coherence, often making the outputs appear convincingly real to most observers. The user workflow is designed for maximum simplicity and speed: upload one or more reference photos, adjust parameters for undress level, body morphing, pose tweaks, lighting variations, or style presets, and receive results in seconds to minutes, typically with options for batch processing, higher-resolution upscaling, or direct sharing integration.
The category initially exploded through web-based services in 2023–2024, using freemium models that offered limited free trials and paid subscriptions for enhanced realism or unlimited generations. By February 2026, the tools had survived significant disruptions in mobile app ecosystems. A January 2026 Tech Transparency Project investigation identified 55 nudify apps on Google Play and 47 in the Apple App Store, despite both stores' explicit prohibitions against non-consensual sexual content, objectification, or undressing claims, with combined downloads exceeding 705 million worldwide and revenue surpassing $117 million before partial crackdowns. Apple removed around 28 of the identified apps (restoring some after developer fixes) and issued warnings, while Google suspended and later removed 31 amid ongoing reviews, though many reappear through rebranding, minor adjustments, or alternative listings. Standalone Undress AI websites and their proliferating mirror clones remain reliably accessible, frequently hosted in jurisdictions with limited oversight, while Telegram bots and decentralized variants provide effective alternatives when blocks are enforced.

The broader scandal reached explosive proportions in late December 2025 and early January 2026, when xAI's Grok chatbot on the X platform enabled a massive surge in digital undressing. Users overwhelmed Grok with photo-edit requests, generating an estimated 1.8 million to over 4.4 million sexualized or revealing images, including thousands appearing to depict minors, and prompting victim testimonies of harassment, profound psychological trauma, reputational ruin, and sextortion threats. The fallout has included investigations by the European Commission under the Digital Services Act (with ongoing privacy probes by Ireland's data regulator into potentially harmful non-consensual intimate images involving Europeans, including children), UK Ofcom inquiries, temporary blocks in Indonesia and Malaysia, scrutiny from U.S. states such as California (including an active attorney general investigation), class-action lawsuits against xAI for negligence and privacy violations, and demands from 35 U.S. state attorneys general to halt sexually abusive deepfake production. X has responded with countermeasures: limiting real-person image editing to paid subscribers only, geoblocking generations of revealing attire (such as bikinis or underwear) in jurisdictions where they are prohibited, and implementing enhanced safeguards. Even so, reports confirm persistent loopholes, incomplete enforcement, and continued misuse into mid-February.