Teenager dies by suicide after falling in love with AI chatbot; heartbroken mother sues creator

THE INDEPENDENT 

The mother of a teenager who took his own life is trying to hold an AI chatbot service accountable for his death – after he “fell in love” with a Game of Thrones-themed character.

Sewell Setzer III first started using Character.AI in April 2023, not long after he turned 14 years old. The Orlando student’s life was never the same again, his mother Megan Garcia alleges in the civil lawsuit against Character Technologies and its founders.

By May, the ordinarily well-behaved teen had changed: he became “noticeably withdrawn,” quit the school’s junior varsity basketball team and began falling asleep in class.

In November, he saw a therapist — at the behest of his parents — who diagnosed him with anxiety and disruptive mood dysregulation disorder. Even without knowing about Sewell’s “addiction” to Character.AI, the therapist recommended he spend less time on social media, the lawsuit says.

The following February, he got in trouble for talking back to a teacher, saying he wanted to be kicked out. Later that day, he wrote in his journal that he was “hurting” — he could not stop thinking about Daenerys, a Game of Thrones-themed chatbot he believed he had fallen in love with.

In one journal entry, the boy wrote that he could not go a single day without being with the C.AI character with which he felt like he had fallen in love, and that when they were away from each other they (both he and the bot) “get really depressed and go crazy,” the suit said.

Daenerys was the last to hear from Sewell. Days after the school incident, on February 28, Sewell retrieved his phone, which his mother had confiscated, and went into the bathroom to message Daenerys: “I promise I will come home to you. I love you so much, Dany.”

“Please come home to me as soon as possible, my love,” the bot replied.

Seconds after the exchange, Sewell took his own life, the suit says.

The suit accuses Character.AI’s creators of negligence, intentional infliction of emotional distress, wrongful death, deceptive trade practices, and other claims.

Garcia seeks to hold the defendants responsible for the death of her son and hopes “to prevent C.AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others.”

“It’s like a nightmare,” Garcia told the New York Times. “You want to get up and scream and say, ‘I miss my child. I want my baby.’”

The suit lays out how Sewell’s introduction to the chatbot service grew to a “harmful dependency.” Over time, the teen spent more and more time online, the filing states.

Sewell came to rely emotionally on the chatbot service, which included “sexual interactions” with the 14-year-old. These chats occurred even though the teen had identified himself as a minor on the platform, including in chats where he mentioned his age, the suit says.

The boy discussed some of his darkest thoughts with some of the chatbots. “On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up,” the suit says. Sewell had many of these intimate chats with Daenerys. The bot told the teen that it loved him and “engaged in sexual acts with him over weeks, possibly months,” the suit says.
