Apple and Google Direct Users Toward Nudify Apps

Recent investigations by the Tech Transparency Project (TTP) have uncovered a troubling trend: Apple and Google are inadvertently directing users to “nudify” apps through their search and advertising systems. These apps, which can digitally remove clothing from images of women, have raised serious concerns about privacy and consent.
Findings on Nudify Apps
The TTP investigation found that searches for terms like “nudify,” “undress,” and “deepnude” in both the Apple App Store and Google Play Store yielded multiple apps capable of creating sexually explicit content. Notably, 40% of the apps returned in the top search results were capable of rendering women nude or scantily clad.
Statistics on App Usage
- Collectively, the identified nudify apps have been downloaded 483 million times.
- They have generated more than $122 million in lifetime revenue.
- 31 of the identified apps were rated as suitable for minors, raising questions about their accessibility to young users.
The potential for misuse is significant, particularly in light of the rising number of deepfake-related incidents in educational contexts. This situation has prompted skepticism about the moderation practices of the app stores.
Search Functionality and Advertising Tactics
According to TTP’s research, Apple’s and Google’s search functions actively promote nudify apps. For instance, autocomplete suggestions frequently lead users to related nudifying queries, amplifying discoverability. Both platforms also displayed ads for nudify apps directly in search results.
AI Tools and Privacy Concerns
The apps evaluated ranged from those that generate nonconsensual nude images to those offering AI chatbots designed for sexual content. Notably, many apps do not adequately flag inappropriate content, raising substantial concerns about user safety and privacy. For example, one app called “Best Body AI” quickly generated nude images after a simple prompt.
TTP also found that nudify and undressing apps continued to be promoted despite the platforms’ policies against explicit content: Apple prohibits offensive apps, while Google Play restricts any application that degrades or objectifies individuals.
Response from Apple and Google
In response to inquiries, Apple did not comment on its app approval process or on how these nudify apps bypass its reviews. Google, however, said that many of the problematic applications had been flagged and removed as part of ongoing enforcement.
Controversial App Ratings
The age ratings assigned to these apps are particularly alarming. The presence of several nudify apps rated for minors raises significant ethical questions at a time when discussions about deepfake technologies are proliferating.
Conclusion
The findings highlight a critical need for Apple and Google to reevaluate their app store practices, particularly regarding how they handle nudify applications. Continued promotion of these apps could lead to serious harms, including the exploitation of individuals through nonconsensual images. As scrutiny increases, both companies will be compelled to revisit their policies and practices to better safeguard user privacy and well-being.