🤖 AI Summary
Meta asked a US district court to dismiss a lawsuit from Strike 3 Holdings that accuses the company of illegally torrenting about 2,400 adult films from Meta corporate IP addresses, including allegedly masked activity routed through a "stealth network" of 2,500 hidden IPs, in order to secretly train an adult version of its Movie Gen model. Strike 3 sought more than $350 million in damages after detecting downloads spanning seven years beginning in 2018. In its motion, Meta called the claims "guesswork and innuendo," noted that Strike 3 has been labeled a "copyright troll," and argued there is no evidence Meta directed or was even aware of the downloads. Meta also noted that the timeline predates its multimodal and generative video research and that company rules prohibit generating adult content, undermining the premise that the files would have been used for model training.
For the AI/ML community the dispute spotlights two enduring technical and legal issues: data provenance and corporate responsibility for network activity. If the allegations hold up, they would illustrate how difficult it is to trace illicit data collection within large organizations; if they don't, Meta's defense still raises questions about how courts should weigh IP-level evidence, employee "personal use" claims, and the relevance of model-training timelines. The case may influence how teams document dataset sources, implement access controls and auditing, and approach risk assessments for multimodal and video training, while also shaping precedent around copyright exposure for models trained on potentially unvetted or illicitly obtained material.