Apple and Google Keep Hosting ‘Nudify’ Apps That Use AI to Create Deepfake Porn

Apple and Google Face Criticism Over Deepfake Apps

A recent report reveals that Apple and Google continue to host “nudify” apps, which enable users to produce AI-generated deepfake pornography involving real individuals, despite both companies having strict policies against such content.

The Tech Transparency Project found that users can easily surface these apps by searching terms like “nudity” and “undressing” in both the Apple App Store and Google Play Store. The apps let users alter images to depict real people, including celebrities, nude or partially nude.

Notably, these companies are not merely hosting the apps; they are actively steering users toward them through search suggestions and advertisements. The report identified 18 such apps on the Apple App Store and 20 on the Google Play Store, with autocomplete suggesting additional nudify apps as users type related terms.

The scale of the issue is significant. According to market research firm AppMagic, the flagged apps have been downloaded roughly 483 million times and have generated approximately $122 million in revenue. Efforts by the Tech Transparency Project have led to the removal of some apps and prompted policy changes for others, a spokesperson for the group confirmed.

This isn’t a new issue. Earlier this year, both Apple and Google took down apps flagged by the group, but researchers found that numerous similar apps had re-emerged on both platforms within months. The ongoing cycle has drawn mounting scrutiny from political figures around the world, who have grown increasingly vocal about the need to curb the spread of such apps.

Some of the identified apps used explicitly sexual names and imagery, while others were marketed more subtly yet still made it easier to create sexual content than traditional photo-editing tools do. Subscription pricing was common among these apps.

Both companies claim to uphold policies that ban these types of apps. Apple’s App Store Guidelines specifically forbid “overtly sexual or pornographic material,” while Google’s policies prohibit “apps that demean or objectify people,” including those that claim to undress or see through clothes, even if marketed as pranks or entertainment.

Following a Bloomberg investigation, Apple removed 15 apps identified in the report, including PicsVid AI Hot Video Generator, which featured templates of women in suggestive poses. Apple also warned the developers of six other apps that they risked removal for potential policy violations, and said the remaining apps cited did not breach its guidelines, noting that it rejects many submissions and removes other offending apps.

Google stated that several apps mentioned have been suspended for policy violations and that an investigation is ongoing. “When violations of our policies are reported, we investigate and take appropriate action,” the company commented.

Regulators are increasingly pushing tech companies to take stronger measures. Last year, President Trump signed the Take It Down Act, which criminalizes the distribution of non-consensual intimate images and obligates social media platforms to remove such content. The British government also plans to introduce legislation that would allow the prosecution of tech executives who fail to take down these images.

Wynton Hall, a social media director and author, pointed to troubling uses of artificial intelligence, including its role in generating child sexual abuse material. He noted that AI can quickly learn from real photos to produce new images depicting children in harmful contexts.
