More and more attention is being paid to how people are turning to AI chatbots for emotional support, sometimes even striking up relationships with them. It often seems as though this behavior is common.
A new report by Anthropic, the company behind the popular AI chatbot Claude, reveals a different reality: in fact, people rarely seek out companionship from Claude, and turn to the bot for emotional support and personal advice far less often than you might think.
“Companionship and roleplay combined comprise less than 0.5% of conversations,” the company highlighted in its report.
Anthropic says its study sought to surface insights into the use of AI for “affective conversations,” which it defines as personal exchanges in which people talk to Claude for coaching, counseling, companionship, roleplay, or relationship advice. Analyzing 4.5 million conversations that users had on the Claude Free and Pro tiers, the company found that the vast majority of Claude usage relates to work or productivity, with people mostly using the chatbot to create content.

That said, Anthropic found that people do use Claude more often for interpersonal advice, coaching, and counseling, most frequently seeking guidance on improving their mental health, on personal and professional development, and on studying communication and interpersonal skills.
However, the company notes that help-seeking conversations can sometimes shift toward companionship-seeking in cases where the user is facing emotional or personal distress, such as existential dread or loneliness, or when they find it hard to form meaningful connections in their real lives.
“We also noticed that in longer conversations, counseling or coaching conversations occasionally morph into companionship, despite that not being the original reason someone reached out,” Anthropic wrote, noting that extensive conversations (with more than 50 human messages) were not the norm.
Anthropic also highlighted other insights, such as the fact that Claude rarely pushes back on users’ requests, except when its programming prevents it from crossing safety boundaries, like providing dangerous advice or supporting self-harm. Conversations also tend to become more positive over time when people seek coaching or advice from the bot, the company said.
The report is certainly interesting: it does a good job of reminding us, once again, how much and how often AI tools are being used for purposes beyond work. Still, it is important to remember that AI chatbots, across the board, remain very much a work in progress: they hallucinate, are known to readily provide wrong information or dangerous advice, and, as Anthropic itself has acknowledged, may even resort to blackmail.

