Meta Platforms has filed a lawsuit in Hong Kong against Joy Timeline HK, the developer of CrushAI, an app that uses artificial intelligence to generate nude or sexually explicit images of individuals without their consent. The legal action seeks to bar Joy Timeline HK from advertising the app on Meta’s platforms, including Facebook and Instagram.
Meta’s Enforcement Measures Against AI Nudity Apps
Meta has been actively removing advertisements, Facebook pages, and Instagram accounts promoting CrushAI whenever they are detected. The company also blocks links to websites hosting such apps and restricts search terms such as “nudify,” “undress,” and “delete clothing” so the apps cannot be found through its platforms.
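Meta has not published how this search restriction works internally. Purely as an illustration, the sketch below shows one simple way a platform could screen queries against a blocklist of restricted terms; the term list and function names are hypothetical, not Meta’s actual system.

```python
# Illustrative sketch only: a toy keyword filter for restricted search terms.
RESTRICTED_TERMS = {"nudify", "undress", "delete clothing"}

def is_search_allowed(query: str) -> bool:
    """Return False if the query contains any restricted term."""
    normalized = query.lower()
    return not any(term in normalized for term in RESTRICTED_TERMS)

print(is_search_allowed("best photo editor"))     # True
print(is_search_allowed("nudify app download"))   # False
```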
Repeated Violations and Legal Action
According to Meta, Joy Timeline HK has made multiple attempts to bypass Meta’s ad review process, continuing to place ads even after repeated removals for violating platform policies. The lawsuit follows these persistent violations, reinforcing Meta’s commitment to protecting users from non-consensual AI-generated content.
Advancing AI Detection and Industry Collaboration
Meta is enhancing its enforcement methods by developing new technology designed to identify and remove misleading ads, even when they do not explicitly contain nudity. The company is also implementing matching technology to detect and eliminate copycat advertisements more efficiently.
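The details of Meta’s matching technology have not been disclosed. As a rough, hypothetical illustration of the general idea, the sketch below flags ad copy that closely resembles previously removed ads using simple text similarity; production systems would more likely rely on perceptual image hashing and learned embeddings.

```python
# Illustrative sketch only: naive near-duplicate matching of ad copy.
from difflib import SequenceMatcher

# Hypothetical examples of ad text that was previously removed.
REMOVED_AD_COPY = [
    "remove clothes from any photo instantly with ai",
]

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial edits don't evade matching."""
    return " ".join(text.lower().split())

def looks_like_copycat(ad_text: str, threshold: float = 0.8) -> bool:
    """Flag an ad whose copy closely matches a previously removed ad."""
    candidate = normalize(ad_text)
    return any(
        SequenceMatcher(None, candidate, removed).ratio() >= threshold
        for removed in REMOVED_AD_COPY
    )

print(looks_like_copycat("Remove clothes from ANY photo instantly with AI!"))  # True
```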
Additionally, Meta plans to share intelligence on these apps with other tech companies, enabling broader industry action against AI-generated explicit content.
Disrupting Coordinated Inauthentic Activity
Meta has applied its network disruption tactics—typically used against coordinated inauthentic activity—to identify and remove accounts promoting nudity apps. Since early 2025, Meta’s expert teams have conducted in-depth investigations, exposing and dismantling four separate networks attempting to advertise such services.