Czech police forced to turn off facial recognition cameras at the Prague airport (edri.org)

🤖 AI Summary
Czech police were forced to switch off a real-time facial recognition system at Václav Havel Airport in Prague in August 2025 after civil-rights group IuRe (an EDRi member) and the Czech Data Protection Authority (DPA) found it violated personal data rules and the EU AI Act. The camera system, active since 2018, converted facial contours into numerical “bio‑indexes” and compared them on the fly against a database of wanted or missing persons. IuRe’s complaint (first raised in 2021) and a subsequent DPA inspection—made public in 2025—showed the deployment lacked the explicit legal basis and the judicial approvals now required by the AI Act for biometric surveillance. Beyond the airport, the police continue questionable processing via a “Digital Personal Image Information System” that compares photos of unknown people against ~20 million ID/passport images, enabling retrospective identification (e.g., of the deceased or demonstrators).

The case underscores a broader lesson for the AI/ML community: EU regulation is now enforceable and can halt live biometric deployments, but systemic gaps remain where operational practice outpaces law. Technical teams and policymakers must prioritize legal compliance, transparency, data minimization, and independent oversight to prevent function‑creep and protect civil liberties when building or deploying face recognition systems.