🤖 AI Summary
On October 14 the EU will vote on Proposal 11596/25 ("Chat Control"), a regulation that would oblige a wide range of providers (hosting, messaging, app stores, ISPs and search engines) to scan for, report and remove child sexual abuse material. Although framed as child protection, the text (notably Articles 7 and 10.1) requires providers to scan private traffic, including scanning "prior to transmission" on end-to-end encrypted services, which effectively mandates client-side inspection and blanket monitoring of private chats. That breaks with earlier parliamentary limits on mass scanning and creates surveillance infrastructure that could later be repurposed for broader monitoring of democratic discourse.
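To make the "prior to transmission" requirement concrete: on an end-to-end encrypted service the plaintext exists only inside the client, so compliance would mean a scan hook that runs before encryption. Below is a minimal sketch of what such a hook could look like, assuming a simple exact-hash blocklist (real systems would use perceptual hashing or classifiers; every name here is hypothetical, not taken from the proposal):

```python
import hashlib
from typing import Callable

# Hypothetical blocklist of known-bad SHA-256 digests that a mandated
# client-side scanner would ship with or fetch from an authority.
BLOCKLIST: set[str] = set()

def report_to_authority(payload: bytes) -> None:
    """Stand-in for the mandated reporting channel."""
    print(f"reported {len(payload)} bytes")

def send(payload: bytes,
         encrypt: Callable[[bytes], bytes],
         transmit: Callable[[bytes], None]) -> None:
    """Scan the plaintext *before* E2E encryption: this is the
    client-side inspection point the summary refers to."""
    digest = hashlib.sha256(payload).hexdigest()
    if digest in BLOCKLIST:
        report_to_authority(payload)  # flagged content is reported, not sent
        return
    transmit(encrypt(payload))        # clean content proceeds as normal

# Demo with trivial stand-ins for the real cipher and network layer.
send(b"hello", encrypt=lambda p: p[::-1], transmit=print)
```

The point of the sketch is architectural: the scan only works if it sits between the user and the cipher, which is exactly why critics call it a break with end-to-end encryption rather than an addition to it.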
Technically, the proposal is unworkable for federated, open systems like XMPP and Matrix. Modern messaging separates in-band metadata from out-of-band binary transfers (peer-to-peer streams or HTTPS download links), so servers never see many of the files they would be required to scan. Open protocols allow dozens of client implementations and cross-server federation, so no single operator can force every client to perform the mandated pre-encryption scans. Bad actors can trivially evade detection by sharing URLs, pre-encrypting files (e.g. with PGP or OpenSSL) or using modified clients. Compliance costs and complex Annex XIV scoring favor centralized incumbents with closed ecosystems, so the rule would cripple European decentralized alternatives while leaving actual criminals free to circumvent the controls. The vote is therefore a pivotal choice between child-safety intentions and the preservation of a viable, privacy-preserving European messaging ecosystem.
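The pre-encryption evasion in particular is trivial to demonstrate: a file encrypted before it ever reaches the messaging client looks like fresh random bytes on every send, so any hash-based scan on the transport path matches nothing. A minimal sketch using symmetric encryption from the `cryptography` package (the scanner stand-in is hypothetical):

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

def scanner_sees(data: bytes) -> str:
    """Stand-in for any hash-based scan on the transport path."""
    return hashlib.sha256(data).hexdigest()

plaintext = b"file a bad actor wants to share"
key = Fernet.generate_key()  # exchanged out of band, invisible to the server
f = Fernet(key)

# Fernet uses a fresh random IV per call, so the same file produces a
# different ciphertext (and therefore a different hash) on every send.
c1, c2 = f.encrypt(plaintext), f.encrypt(plaintext)
assert scanner_sees(c1) != scanner_sees(c2)

# The recipient, holding the key, recovers the original file intact.
assert Fernet(key).decrypt(c1) == plaintext
```

Nothing here requires a modified client or any protocol change, which is why the summary argues the scanning obligation burdens compliant operators without actually catching determined offenders.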