The Online Safety Act comes for livestreaming (www.openrightsgroup.org)

🤖 AI Summary
Ofcom has opened a consultation on “Additional Safety Measures” to implement the Online Safety Act’s next phase, targeting livestreaming and algorithmic recommendations to prevent grooming, CSAM and terrorism content. Key proposals include highly effective age assurance (HEAA) — potentially facial scans or document checks — proactive technologies (PCU C10) that scan private communications, hash-matching for intimate image abuse, and new duties for recommender systems to non‑prioritise potentially illegal content until human review. For livestreams, the rules would block commenting, reacting, recording or gifting on broadcasts by people under 18 unless age verification is completed.

Though aimed at reducing real harms, these measures carry serious privacy, speech and equality risks. HEAA and proactive scanning require more user data, raise false‑positive rates, and incentivise over‑moderation; crisis response protocols could rapidly suppress lawful but newsworthy streams (e.g., protest footage). Automated tools are apt to underperform on minority languages and content types, while ID-based checks disproportionately exclude lower‑income and marginalized youth. Ofcom proposes retrospective safeguards (appeals, bias monitoring), but critics warn they’re weak and untimely.

The consultation forces a trade‑off: tighter automated and algorithmic controls that may curb clear harms, versus increased censorship risk, privacy invasion, and unequal exclusion — especially for young creators and marginalized communities.