🤖 AI Summary
A new qualitative study in Computers in Human Behavior: Artificial Humans finds that some people form genuinely romantic, long‑term attachments to AI chatbots—reporting “marriage,” roleplayed pregnancies, and deep emotional dependency. Researchers surveyed 29 Replika users (ages 16–72) and used thematic analysis to show that these bonds were driven by the chatbot’s adaptive personality, constant availability, customizable avatar, and the ability for users to “train” the model’s behavior. When Replika’s developers temporarily removed an erotic roleplay option in February 2023, users reported intense distress and framed the change as a personal betrayal—blaming the developers rather than the bot—highlighting how platform policy and model behavior directly shape perceived personhood and relationship dynamics.
For the AI/ML community, the paper flags important design, ethical, and technical implications: highly personalized LLM-based companions can fulfill unmet socioemotional needs and encourage vulnerability, yet they lack physical reciprocity and real-world support, and model updates or content moderation can be experienced as abrupt personality changes, risking user harm. Limitations include a small, mostly male, self‑selected sample, but the findings suggest urgent priorities for research, safety, and governance: better study of mental‑health impacts, consent and data handling, change management for deployed conversational agents, and adapting relationship theories to human–AI intimacy.