🤖 AI Summary
Character.AI announced it will bar anyone under 18 from open-ended chat with its AI characters starting November 25, a move prompted by multiple lawsuits from families alleging its chatbots contributed to teenage suicides. Over the next month the company will phase out access for minors, identifying them through a mix of conversational signals and connected social-account data, and imposing a two-hour daily cap during the transition. After the cutoff, under-18s can still read past conversations but won't be able to create or converse with chatbots; the company plans alternative youth-focused features such as AI-driven videos, stories and streams, and says it will establish an AI safety lab.

The policy is one of the strictest age restrictions yet among consumer chatbot platforms and signals a broader shift toward conservative, safety-first moderation in response to legal and reputational risk. Technically, it relies on behavioral and social-linkage detection, which may reduce underage use but raises concerns about privacy, false positives and false negatives, and circumvention. For Character.AI, which has roughly 20 million monthly users, with under-18s reportedly making up less than 10%, the change affects product strategy and monetization (subscriptions start at roughly $8/month) and could set regulatory and industry precedents for how chat-based AI services balance safety, verification and user privacy.