Can A.I. Be Blamed for a Teen’s Suicide?

October 25, 2024

(New York Times) – The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.

Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.

Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)

But he developed an emotional attachment anyway. (Read More)