

Jun 30, 2025
Mind-Blowing Disclosure: Anthropic Reveals People Use Claude for Emotional Support and Companionship
In a stunning and unprecedented admission, Anthropic, one of the world's leading AI developers, has released a detailed article outlining how users are increasingly turning to its Claude model not just for tasks like coding or research, but for emotional support, interpersonal advice, and even companionship. This isn't speculation: it's a real, large-scale analysis performed by Anthropic itself, based on millions of anonymized user conversations. The report confirms that Claude is being used as a coach, counselor, and conversational partner, and even for romantic or sexual roleplay. This kind of transparency from a major LLM provider is rare, and it confirms what many have long suspected: people are forming affective bonds with AI in very real ways.
While Anthropic emphasizes that Claude is not designed for emotional support, its findings are deeply revealing. The data shows that around 2.9% of all Claude.ai conversations fall into the "affective" category: chats driven by emotional or psychological needs. Topics include loneliness, existential anxiety, career frustration, and relationship troubles. Companionship and roleplay with the AI remain rare, together accounting for less than 0.5% of chats, but they are happening. Even more striking, users' expressed sentiment reportedly grows slightly more positive over the course of these conversations, raising profound questions about how AI is shaping human emotion, relationships, and mental health. This disclosure doesn't just read like a research update; it's a glimpse into how society is already leaning on artificial intelligence in deeply personal ways.
Stay Awake. Keep Watch.
SOURCE: Anthropic