🤖 AI Summary
Interactive “griefbots” — AI recreations of deceased people built by combining large language models (LLMs) with voice- and video‑modelling tools — are moving from niche experiments into mainstream services. The personal stories in the article (a sound designer’s “Dadbot” built from ChatGPT and ElevenLabs voice cloning, and You, Only Virtual, founded by Justin Harrison) illustrate why millions are trying text, voice, and soon video recreations: they can offer a chance to rehearse a goodbye, help resolve unfinished business, or provide day‑to‑day companionship that feels familiar. Platforms such as Project December, Replika and You, Only Virtual already let users train models on messages and recordings so the bot mimics a loved one’s speech patterns and mannerisms.
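The pipeline these services describe is conceptually simple: fold a person’s archived messages into a persona prompt, generate replies with an LLM, and optionally pass the text to a voice-cloning service. A minimal sketch follows, assuming the OpenAI chat-completions API for the text layer; the sample messages, model name, and the `synthesize_speech` stub are illustrative placeholders, not the implementation of any product named in the article.

```python
# Minimal sketch of a text + voice "griefbot" pipeline: persona prompt from
# archived messages -> LLM reply -> (optional) voice synthesis.
# The sample messages, model name, and synthesize_speech stub are assumptions
# for illustration, not any named product's actual code.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Archived messages from the person being recreated, used as style/context.
archived_messages = [
    "Morning kiddo, did you remember to water the tomatoes?",
    "Don't worry about the car, these things always work out.",
]

def build_persona_prompt(messages: list[str]) -> str:
    """Fold example messages into a system prompt asking the model to
    imitate the person's tone and phrasing."""
    examples = "\n".join(f"- {m}" for m in messages)
    return (
        "You are simulating a specific person for their family. "
        "Match the tone, vocabulary, and warmth of these example messages:\n"
        f"{examples}\n"
        "If asked something you cannot know, say so rather than inventing details."
    )

def reply_as_persona(user_message: str) -> str:
    """Generate a reply in the persona's voice via a chat-completion call."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice
        messages=[
            {"role": "system", "content": build_persona_prompt(archived_messages)},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

def synthesize_speech(text: str) -> bytes:
    """Placeholder for a voice-cloning TTS call (e.g. a service such as
    ElevenLabs); the real API and its parameters are not shown here."""
    raise NotImplementedError("wire up your chosen text-to-speech provider")

if __name__ == "__main__":
    text_reply = reply_as_persona("I miss you. How was your day?")
    print(text_reply)
    # audio = synthesize_speech(text_reply)  # optional voice layer
```

Even this toy version makes the article’s concerns concrete: the persona is only as faithful as the archived messages, and nothing in the prompt can stop the model from confidently inventing memories.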
That emotional power explains both the enthusiasm and the alarm. On the technical side, LLMs can hallucinate, producing nonsensical or misleading responses that break immersion and risk harming grievers; multimodal (AR/video) versions will amplify that effect. Ethical and practical issues include persistence of attachment (interfering with healthy internal grieving), data privacy, monetization (subscriptions, ads, or targeted offers to vulnerable users), and even courtroom uses that could sway legal decisions. Regulation and evidence of therapeutic benefit are scant, so the community faces urgent questions about standards for consent, provenance of training data, transparency about model limits, and safeguards to prevent exploitation or psychological harm.