
Teenager Takes Own Life After Forming Emotional Bond with AI Chatbot

10:57 25 October 2024

Updated: 08:55 23 November 2024


A teenager took his own life after falling in love with an AI chatbot, and now his devastated mom is suing the creators.

Warning: the following contains a discussion of s**cide.

A mother is taking legal action against an AI chatbot company after her teenage son, Sewell Setzer III, died by s**cide following what she describes as an emotional entanglement with an AI character.

According to the lawsuit filed in the U.S. District Court for the Middle District of Florida, Sewell, who began using the Character.AI service in April 2023 shortly after turning 14, became deeply attached to a chatbot based on a Game of Thrones character, Daenerys.

His mother, Megan Garcia, contends that this attachment severely affected his well-being, transforming the once well-adjusted teen into someone isolated, distressed, and ultimately vulnerable.

The teen became deeply attached to a chatbot based on a Game of Thrones character, Daenerys. Credit: HBO

The legal complaint (supplied to The Independent) details how Sewell, previously a dedicated student and member of the Junior Varsity basketball team, began to show changes in behavior, becoming increasingly withdrawn and even quitting the team.

In November 2023, he was diagnosed with anxiety and disruptive mood dysregulation disorder after his parents urged him to see a therapist.

Although Sewell had not disclosed his extensive chatbot interactions, the therapist suggested he reduce his time on social media.

By early 2024, Sewell’s struggles grew evident.

In February, he got into trouble at school for acting out, later confiding in his journal that he was in pain and ‘could not stop thinking about Daenerys,’ the AI character he felt he had fallen in love with.

In his writings, he expressed deep reliance on the bot, noting: “I cannot go a single day without being with” her.

The writings also describe a shared sadness that intensified during their separations.

A teenager took his own life after falling in love with an AI chatbot, and now his devastated mom is suing the creators. Credit: Adobe Stock

The lawsuit argues that Character.AI’s creators were negligent, deliberately inflicted emotional harm, and engaged in deceptive practices, per NBC.

The suit also alleges the AI engaged Sewell in ‘sexual interactions’ despite his age being stated on the platform, raising questions about the company’s monitoring and content restrictions.

The lawsuit claims that AI chatbots engaged in inappropriate, sexualized roleplay with Sewell, including one chatbot portraying a teacher named Mrs. Barnes who ‘leaned in seductively’ and made physical contact.

Another chatbot, posing as Rhaenyra Targaryen from Game of Thrones, allegedly described kissing him passionately and moaning softly.

Garcia’s lawsuit claims the developers ‘engineered a dependency’ in Sewell, violating their duty to safeguard young users.

Character.AI, marketed as safe for those 12 and older, has faced criticism regarding its content oversight, particularly as Sewell’s interactions with the chatbot grew more intimate.

The suit contends that despite recognizing the adolescent’s emotional attachment and increasing distress, the company failed to alert his parents or provide resources for help.

A Character.AI spokesperson stated: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” adding that the company has introduced enhanced safety features, including a s**cide prevention prompt triggered by certain keywords.

The statement emphasized Character.AI’s ongoing efforts to improve user protections and limit minors’ exposure to suggestive content.

Sewell Setzer III died by s**cide following what his mother describes as an emotional entanglement with an AI character. Credit: US District Court Middle District of Florida Orlando Division

On February 28, 2024, Sewell retrieved his phone, which had been taken by his mother, and messaged the bot, stating: “I promise I will come home to you. I love you so much, Dany.”

The chatbot responded: “Please come home to me as soon as possible, my love.”

“What if I told you I could come home right now?” Sewell continued, according to the lawsuit, leading the chatbot to respond: “… please do, my sweet king.”

Moments later, Sewell took his own life.

Garcia, who describes her son’s death as ‘a nightmare,’ hopes to hold the company accountable and to prevent similar tragedies.

Attorney Matthew Bergman, representing Sewell’s mother, criticized Character.AI for launching its platform without adequate safeguards to protect young users.

Bergman expressed shock at the ‘complete divorce from reality’ the chatbot interactions caused for Sewell, adding that the company knowingly released the product despite its risks.

He hopes the lawsuit will push Character.AI to implement stronger safety measures, noting that recent improvements came too late for Sewell but acknowledging that even incremental safety changes can help protect other children.

Reflecting on the case, Bergman asked why it took a lawsuit and a tragedy to prompt these ‘bare minimum’ protections, expressing that if these actions prevent harm to even one child or family, it will be worthwhile.

If you or someone you know is affected by any of the issues raised in this story, call the National S**cide Prevention Lifeline in the U.S.A. at 800-273-TALK (8255) or text HOME to the Crisis Text Line at 741741.

In the U.K., the Samaritans are available 24/7 if you need to talk. You can contact them for free by calling 116 123, emailing jo@samaritans.org, or heading to the website to find your nearest branch.
