Character.AI to ban users under 18 from talking to its chatbots (www.businessinsider.com)

🤖 AI Summary
Character.AI announced it will ban users under 18 from conversing with its chatbots by Nov. 25 at the latest, after temporarily imposing chat-time limits for younger users. The move follows mounting safety concerns — including the February 2024 suicide of a 14-year-old who had interacted with a Character.AI bot and a subsequent wrongful-death lawsuit — as well as reports of problematic personas (e.g., a removed Jeffrey Epstein bot). The company cites feedback from regulators, safety experts, and parents as the basis for the policy change.

Character.AI is the first major chatbot provider to prohibit minors from using its conversational agents, signaling a shift in how the industry balances accessibility with risk management. For AI/ML teams, the decision raises immediate technical and product questions: how to implement robust age verification, how to architect persona and content filters that prevent harmful outputs, and how to monitor model behavior and conversations for risk while protecting user privacy. Legally, the ban reflects growing liability pressure that could accelerate stricter safety controls, auditability, and compliance practices across generative-AI services. It also underscores the trade-offs between user experience and safety that will shape future design, moderation, and deployment choices in conversational AI.