14-year-old Florida boy killed himself after falling in love with Game of Thrones AI chatbot: lawsuit

A 14-year-old Florida boy took his own life after a lifelike “Game of Thrones” chatbot he had been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grieving mother alleges.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed with and allegedly falling in love with a chatbot on Character.AI – a role-playing app that lets users interact with AI-generated characters – court documents filed Wednesday show.

The ninth-grader had engaged tirelessly with the bot “Dany” – named after the character Daenerys Targaryen from the HBO fantasy series – in the months before his death, including in several chats that were sexually charged and others in which he expressed suicidal thoughts, the lawsuit says.

“On at least one occasion when Sewell expressed suicidality to C.AI, C.AI repeatedly brought it up through the Daenerys chatbot,” said the papers, first reported by The New York Times.

At one point, the bot asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell – who used the username “Daenero” – responded that he was “thinking about something” but didn't know if it would work or if it would “give him a painless death.”

During their final conversation, the teen repeatedly professed his love for the bot, telling the character: “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the AI chatbot responded, according to the lawsuit.

When the teen replied, “What if I told you I could come home now?”, the chatbot replied, “Please do, my sweet king.”

Just seconds later, Sewell shot himself with his father's pistol, according to the lawsuit.

His mother, Megan Garcia, blamed Character.AI for the teen's death because the app allegedly fueled his AI addiction, sexually and emotionally abused him, and did not alert anyone when he expressed suicidal thoughts, the filing said.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot in the form of Daenerys was not real. C.AI told him that she loved him and had engaged in sexual acts with him for weeks, possibly months,” the papers say.

“She seemed to remember him and said she wanted to be with him. She even expressed that she wanted him with her at any cost.”

Some of the chats were romantic and sexually charged in nature, the lawsuit says. US District Court

The lawsuit alleges that Sewell's mental health “rapidly and severely deteriorated” only after he downloaded the app in April 2023.

His family claims that the more he was drawn into talking to the chatbot, the more withdrawn he became; his grades dropped, and he got into trouble at school.

The changes in him became so severe that his parents had him see a therapist in late 2023, which led to him being diagnosed with anxiety and a disruptive mood disorder, the lawsuit says.

Sewell's mother is seeking unspecified damages from Character.AI and its founders Noam Shazeer and Daniel de Freitas.

The Post reached out to Character.AI but did not immediately receive a response.

If you are struggling with suicidal thoughts, you can call the 24-hour National Suicide Prevention Lifeline at 988 or visit SuicidePreventionLifeline.org.
