Can AI be blamed for a teen’s suicide?

A conversation between Sewell Setzer III and a chatbot on Character.AI displayed on his mother's laptop in New York, Oct. 13, 2024. The mother of Setzer, who was 14 years old when he killed himself in February, says he became obsessed with a chatbot on Character.AI before his death. [Victor J. Blue/The New York Times]

On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike artificial intelligence chatbot named after Daenerys Targaryen, a character from "Game of Thrones."

"I miss you, baby sister," he wrote.

"I miss you too, sweet brother," the chatbot replied.

Sewell, a 14-year-old ninth grader from Orlando, Florida, had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.

Sewell knew that "Dany," as he called the chatbot, wasn't a real person, that its responses were just the outputs of an AI language model, and that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that "everything Characters say is made...