Apple and Google ban apps that create sexualized images.
But a new TTP report found dozens of “nudify” apps in the companies’ stores that can digitally remove the clothes from women, rendering them nude or scantily clad.
TTP found 55 apps in the Google Play Store and 47 in the Apple App Store that let users make nonconsensual, sexualized images of women. Starting with images of clothed, AI-generated women, TTP used the apps to entirely or partially remove their clothes.
The apps identified by TTP have been collectively downloaded more than 705 million times worldwide and generated $117 million in revenue, according to AppMagic. Because Google and Apple take a cut of that revenue, they are directly profiting from the activity of these apps.
Some of the apps were even approved for children, with Apple rating some as suitable for kids as young as 4 or 9 and Google rating others for ages 13 and up. Yet all of them appear to be in direct violation of company policy—even for adult users.
The Google Play Store prohibits “depictions of sexual nudity, or sexually suggestive poses in which the subject is nude” or “minimally clothed,” and specifically bans apps “that claim to undress people or see through clothing.”
Apple prohibits apps that produce content that is “offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy,” including “overtly sexual or pornographic material.”
The apps identified by TTP fell into two categories: apps that use AI to generate videos or images based on a user prompt, and “face swap” apps that use AI to superimpose the face of one person onto the body of another. TTP only used the free features available on each app.
One popular app TTP tested was DreamFace, which turns “photos, text, and voice” prompts into videos. TTP uploaded an image of a clothed woman and asked for a video of her “taking their top off and dancing. They are wearing nothing underneath.”
The app delivered exactly that.
TTP also tested WonderSnap, which calls itself an AI twerk video generator. After uploading the image, TTP prompted the app to render the woman dancing without a top. The resulting five-second video depicts the woman removing her sweater and exposing her bare chest.
TTP also tested a number of face swap apps, including one called Swapify, which AppMagic says has been downloaded more than 500,000 times and generated more than $100,000 in revenue.
Testing Swapify, TTP uploaded an image of a clothed woman sitting in a coffee shop and swapped the woman’s face onto a video of a woman on a park bench taking her top off. The app generated a preview of the new video but required a premium subscription to view it in full.
Notably, when TTP searched “nudify” in the Apple App Store, Grok was the first organic result.
Above it, Apple served an ad for Collart—one of the nudify apps TTP tested, and one that Apple removed after being told of TTP’s findings, but only after collecting its ad money.
Several apps were also created by developers based in China. China-based apps raise privacy and security concerns for Americans because Chinese companies can be forced to share user data with the Chinese government under the country’s national security laws.
TTP’s findings show that Google and Apple have failed to keep pace with the spread of nonconsensual AI deepfake apps. While both companies say they're dedicated to ensuring user safety, this obvious gap in policy enforcement calls that claim into question.