The Legal Case Against Ring's Face Recognition Feature (www.eff.org)

🤖 AI Summary
Amazon’s Ring announced “Familiar Faces,” a face-recognition option for its home cameras that will scan every person who comes into view, extract a “faceprint” (numerical measurements of facial features), compare it to a user-tagged watchlist, and potentially retain untagged biometric data for up to six months. Amazon says the feature will be off by default, unavailable in Illinois, Texas, and Portland, and not currently used for model training, but it won’t promise the default setting won’t change. Ring also plans related tools like “search party” to find lost pets, which could be repurposed for tracking people. Biometric data are stored on Amazon’s servers and, per the company, protected by security controls.

The rollout raises major legal and privacy concerns: many U.S. state laws require affirmative opt‑in consent before collecting biometric identifiers, and it is effectively impossible for Ring to obtain consent from every bystander (delivery workers, neighbors, canvassers, children). The technology’s susceptibility to errors and racial bias, the permanence of biometric identifiers, and Ring’s police partnerships amplify the risks of mass surveillance and misuse.

Legal precedents underscore Ring’s exposure: Google’s Nest and Facebook/Meta settled large lawsuits over indiscriminate face scanning, while a growing body of state-level privacy law (e.g., Washington, Colorado, Maryland) creates regulatory avenues to challenge or constrain the feature. Regulators and courts will likely be the key checks on whether “Familiar Faces” can be deployed lawfully and responsibly.