100-Plus “Nudify” Apps Found On Apple’s App Store And Google Play
Dozens of AI “nudify” apps remain available on Google Play and Apple’s App Store, even as concerns mount over tools that can generate nonconsensual sexualized images, according to a new report by the Tech Transparency Project (TTP).
A breakdown in platform moderation
TTP identified 55 nudify apps on Google Play and 47 on Apple’s App Store, many of which can digitally remove clothing from images of women or depict them as partially or fully nude. The report estimates these apps have been downloaded more than 705 million times and generated around $117 million in revenue, meaning Apple and Google have profited from in‑app purchases and subscriptions tied to image‑based abuse.
After TTP and CNBC contacted the companies, an Apple spokesperson said Apple had removed 28 of the apps the group identified, and Google has also removed or suspended several titles, according to CNBC. Even so, the scale of the downloads and revenue suggests these tools circulated widely through mainstream app marketplaces before enforcement occurred.
Demand outpaces policy
Both Apple and Google have policies that explicitly ban apps that undress people or generate sexualized images from user photos, including so‑called “prank” tools. Yet the report found that many nudify apps hide behind generic branding and vague descriptions, such as “avatars,” “HD videos,” or “dress‑up fun,” while offering templates and prompts clearly aimed at producing sexualized content.
TTP also points out that Apple and Google take up to a 30% cut of in‑app spending, creating a direct revenue stream from apps closely linked to nonconsensual sexualization. That commercial relationship, the report suggests, helps explain why profitable nudify tools persist even when they appear to violate stated rules.
App Store and Google Play: the real enforcement chokepoint
Public debates about AI image harms often focus on model developers or specific tools, such as Grok’s AI image editor. TTP’s findings instead shift attention downstream, to the app marketplaces that determine which tools reach mass audiences and how they are monetized.
That framing echoes earlier reporting: Apple and Google faced similar scrutiny after a 404 Media investigation in 2024, and TTP’s report arrives amid calls to remove X and its Grok AI assistant from app stores after Grok generated millions of nonconsensual sexualized images, largely of women and children.
The report’s central conclusion is structural: as long as nudify apps can rebrand, adjust templates, or obscure functionality behind generic language, existing app-store review systems will continue to struggle to keep them out.
Image credit: Rawpixel