🤖 AI Summary
Denmark has proposed amending its Copyright Act to grant individuals copyright over their own likenesses and is pushing the measure across the EU during its Council presidency, with the rules potentially taking effect as soon as winter 2025. The move is presented as a pragmatic route to give victims stronger takedown leverage on platforms that often fail to act, and to curb harmful deepfakes. Denmark’s campaign signals growing European momentum to regulate digital impersonation, and officials are considering tougher remedies — including criminalisation — and consent-first rules for online identity use.
The proposal, however, involves major legal and technical tradeoffs that matter to AI/ML practitioners. Lawyers warn that recasting faces as copyrighted “works” blurs copyright (transferable, economic) with personality rights (inalienable, dignity-based), and could commodify bodies in ways that complicate consent and liability. In practice, platform notice-and-takedown under the DSA has proven unreliable, so advocates emphasize complementary technical standards: robust watermarking, auditable provenance metadata, and better deepfake detection. For model builders and dataset curators, this foreshadows stricter provenance and consent requirements, plus potential legal risk in training or distributing models that reproduce identifiable likenesses. Transparent labeling, consent workflows, and watermark/provenance support therefore become essential design considerations.
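To make the consent-workflow and provenance-metadata point concrete, here is a minimal sketch of how a dataset curation pipeline could attach an auditable consent/provenance record to each sample containing an identifiable likeness. The record structure, field names (e.g. `LikenessRecord`, `consent_reference`), and the hash-based sample ID are illustrative assumptions, not part of any proposal or established standard.

```python
# Minimal sketch (illustrative, not an established standard) of attaching
# consent and provenance metadata to training samples with identifiable people.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import hashlib
import json


@dataclass
class LikenessRecord:
    sample_id: str           # content-addressed ID (SHA-256 of the media bytes)
    source_url: str          # where the sample was collected from
    contains_likeness: bool  # whether an identifiable person appears
    consent_obtained: bool   # whether documented consent exists for training use
    consent_reference: str   # pointer to the consent artifact (e.g. a signed form)
    collected_at: str        # ISO 8601 timestamp of collection


def make_record(media_bytes: bytes, source_url: str,
                contains_likeness: bool, consent_reference: str = "") -> LikenessRecord:
    """Build a provenance/consent record for one training sample."""
    return LikenessRecord(
        sample_id=hashlib.sha256(media_bytes).hexdigest(),
        source_url=source_url,
        contains_likeness=contains_likeness,
        consent_obtained=bool(consent_reference),
        consent_reference=consent_reference,
        collected_at=datetime.now(timezone.utc).isoformat(),
    )


if __name__ == "__main__":
    record = make_record(b"<image bytes>", "https://example.com/photo.jpg",
                         contains_likeness=True, consent_reference="consent-form-0421")
    # Serialize alongside the sample so downstream audits can verify consent.
    print(json.dumps(asdict(record), indent=2))
```

Storing such records alongside the data (rather than reconstructing consent after the fact) is the kind of design choice that regimes like the Danish proposal would reward: takedown requests, audits, and dataset filtering can then be resolved by lookup instead of re-review.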