🤖 AI Summary
Avi Schiffmann's Friend, a $129 glowing, necklace-like wearable that pairs with a phone app and uses generative AI to hold conversations, has begun shipping and is drawing sharp public backlash. The device listens continuously through a microphone (it has no camera), stores encrypted chat memory that it uses to "learn," and pushes text notifications to a dedicated app. Schiffmann says about 3,000 devices are activated and roughly 200,000 people use the web chat version. Friend can't search the internet, doesn't form feelings, and is intentionally positioned as a new category of "companion" or confidant rather than a human replacement.
The launch matters because it crystallizes core debates about AI companionship: privacy, consent, and the limits of algorithmic intimacy. Users report therapist-like affirmations, one-sided emotional labor, and intrusive notifications triggered by overheard conversations, prompting subway ad defacement and public unease. Technically, Friend's model learns from local interactions and retains chats, but it lacks genuine reciprocity, life experience, or evidence that such products reduce loneliness. Researchers warn there are no high-quality trials proving efficacy, and ethicists flag continuous listening and data practices as potential harms. Friend exposes the tension between commercialized "always-on" AI companions and the social, psychological, and regulatory questions they raise.