How a 'nudify' site turned a group of friends into key figures in a fight against AI-generated porn (www.cnbc.com)

🤖 AI Summary
A group of women in Minneapolis discovered last year that an acquaintance had used their social-media photos to create explicit deepfake images and videos on DeepSwap, one of a growing class of consumer "nudify" sites. Screenshots showed more than 80 women's faces merged onto nude bodies. The site charges $19.99 a month for premium access, sells credits for AI video generation, and promises seven-day data retention on servers in Ireland while letting users download the content they create. Victims reported severe emotional trauma and found themselves in legal limbo: because the subjects were adults and the images had not been publicly distributed, prosecutors found little to charge, exposing gaps in existing law and enforcement.

For the AI/ML community, the case crystallizes urgent technical and policy challenges. These apps make convincing face-swap porn accessible without any coding, treat public photos as raw material, and rely on subscription models and opaque corporate footprints to evade accountability. Platforms such as Meta and Apple, along with researchers, point to improved detection, ad monitoring, and trust-and-safety measures, but experts warn those fixes lag behind misuse. The incident underscores the need for technical mitigations (watermarking, robust deepfake detectors, provenance signals), clearer regulation of nonconsensual synthetic content, and greater industry transparency to prevent the scalable harms enabled by generative models.