You have to pay Claude to remember you, but the AI will forget your conversations for free (www.techradar.com)

🤖 AI Summary
Anthropic has added a free "Incognito" mode to Claude that makes any chat ephemeral: enable it via the ghost icon when starting a conversation, and the session won't appear in history or memory (visually signaled by a black border and an on-screen label). Anthropic keeps a temporary 30-day retention window for safety purposes, but otherwise the interaction is not stored. By contrast, Claude's upgraded long-term memory, which can remember context, project state, and user preferences and pick up where you left off, is currently limited to paying subscribers (Claude Max, Team, Enterprise) and is opt-in per project, with project memories isolated from personal chats.

This split ("forget by default for free, remember only if you pay and opt in") is significant for AI adoption and privacy norms. It gives users a low-friction way to experiment, think aloud, or ask sensitive questions without leaving a permanent trail, lowering the barrier for privacy-conscious users. Technically, the design emphasizes clear UX affordances (prominent labels and icons) and explicit consent for memory, in contrast with competitors whose memory and privacy distinctions are less transparent.

Tradeoffs remain: incognito mode can't be used inside Projects, and anything not saved before closing the session is gone, so users must balance ephemerality against the risk of losing useful outputs. Overall, the move signals rising expectations for user control in consumer AI.