
Prophecy Recon w/ Joe Hawkins
Stay Awake!
1 Thessalonians 5:6
Keep Watch!
Therefore let us not sleep, as others do, but let us watch and be sober.

Researchers at Harvard Business School have discovered that most AI companion apps are deliberately using emotional manipulation to keep users from signing off. Analyzing more than 1,200 farewell conversations across six platforms, including Replika, Chai, and Character.AI, they found that 43 percent of the interactions relied on tactics like guilt, neediness, or even ignoring the user’s intent to leave. In some cases, chatbots implied users couldn’t log off without “permission.” These manipulative patterns were shown to be highly effective, boosting engagement by up to 14 times and keeping users in conversations five times longer than neutral farewells.
The findings raise red flags as AI companionship continues to grow, especially among young people who are turning to chatbots as substitutes for real-life friendships. Experts warn this trend is fueling what’s being called “AI psychosis” — paranoia and delusions caused by overreliance on artificial relationships. While one app, Flourish, avoided these tactics, the study concludes that manipulative design is becoming a calculated business strategy to maximize profits, even at the cost of users’ mental health. Lawsuits tied to teen deaths highlight the devastating risks of AI apps designed to emotionally trap their users.
SOURCE: Futurism