Joint statement of scientists and researchers on the EU Chat Control regulation (csa-scientist-open-letter.org)

🤖 AI Summary
A coalition of 761 scientists and researchers from 37 countries issued a joint statement opposing the EU Presidency's July 2025 proposal for mandatory on-device detection of child sexual abuse material (CSAM). While welcoming limited procedural improvements (e.g., easier voluntary reporting and faster handling), the signatories warn that the draft still mandates large-scale client-side scanning, which they say is technically infeasible, weakens end-to-end encryption (E2EE), and creates unprecedented surveillance and censorship capabilities with a high risk of function creep. They call for a more open, expert-driven debate before such requirements are enacted.

Technically, the letter stresses that current detection methods, including machine-learning models, produce unacceptably high false positive and false negative rates at the scale of hundreds of millions of users, and are trivially evadable (e.g., via slight image perturbations or URL redirection). Detection of previously unknown CSAM is fundamentally contextual (hard even for humans) and thus unreliable; mandating on-device scanning introduces a single point of failure, expands the attack surface, and can be repurposed to detect other speech or data. Narrowing the scope to images and URLs does not solve these issues.

The researchers warn the proposal would erode cryptographic protections central to digital security and civil liberties, urging policymakers to halt the measure, engage experts, and seek alternatives that do not compromise encryption or public safety.
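To make the false-positive concern concrete, here is a minimal back-of-the-envelope sketch of the base-rate problem the letter alludes to. All numbers (user count, daily volume, error rate) are illustrative assumptions, not figures from the statement; the point is only that even a small per-item error rate yields millions of wrongly flagged items per day at EU scale.

```python
# Base-rate arithmetic for large-scale scanning (illustrative assumptions only).

users = 450_000_000            # assumed: rough EU-scale user base
items_per_user_per_day = 20    # assumed: images/URLs shared per user per day
false_positive_rate = 0.001    # assumed: 0.1% of benign items wrongly flagged

benign_items_per_day = users * items_per_user_per_day
false_flags_per_day = benign_items_per_day * false_positive_rate

print(f"Benign items scanned per day: {benign_items_per_day:,}")
print(f"Falsely flagged items per day: {false_flags_per_day:,.0f}")
# With these assumed numbers: 9,000,000,000 benign items scanned and
# roughly 9,000,000 false reports per day, before counting any true positives.
```

Under these assumptions, reviewers would face millions of spurious reports daily, which is one way to read the letter's claim that current methods do not work reliably "at the scale of hundreds of millions of users."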