🤖 AI Summary
The UK regulator Ofcom fined Itai Tech Ltd £50,000 (plus a £5,000 penalty for failing to respond to an information request) after finding that its AI-powered nudification site Undress.cc did not implement the “highly effective” age assurance required by the Online Safety Act. The service, which creates fake nudes from real photos, reportedly blocked UK IPs only after Ofcom opened an investigation. This is the second fine under the Act (the first went to 4chan) and comes as Ofcom expands probes into roughly 76 pornography providers; fines can reach £18 million or 10% of global turnover, and regulators can impose access blocks or service restrictions.
For the AI/ML community, the ruling is a clear signal that generative-image tools that create sexual content face strict regulatory scrutiny and must integrate robust, pre-access age verification. Ofcom explicitly rejects self-declaration or basic card checks and lists acceptable approaches such as photo ID matching, facial-age estimation, mobile-operator verified checks, and open-banking verification — systems that must be technically accurate, reliable and fair, and applied before any explicit content is shown. That raises technical and ethical trade-offs: implementing biometric or ID-based checks improves compliance but introduces privacy, security and bias risks (especially for facial-age models), while simpler measures won’t suffice. Developers and platforms should prioritize verifiable, privacy-preserving age assurance and auditability to avoid enforcement action.