A 14-Year-Old Boy Killed Himself to Get Closer to a Chatbot. He Thought They Were In Love.
November 8, 2024
(Wall Street Journal) – Technologists say chatbots are a remedy for the loneliness epidemic, but looking to an algorithm for companionship can be dangerous.
As researchers who have spent years studying the relationships that ever more people are forming with generative AI, we believe these stories offer a warning.
Our new chatbots pose as confidants, lovers, psychotherapists and mentors. Their creators encourage us to believe these products have empathy, even love for us. More than 20 million people currently use Character.AI, a market leader in AI companionship. But a chatbot’s emotion is a performance of emotion. A chatbot is not, in fact, able to care for us. Presuming otherwise can be dangerous.