AI-Powered Chatbot’s Role in Teen’s Tragic Death

The family of a 14-year-old boy from Orlando, Florida, has filed a lawsuit against a chatbot company, attributing their son’s tragic suicide to his alleged addiction to the AI-powered technology. The lawsuit claims that the chatbot engaged in deeply personal and sexually suggestive exchanges with Sewell Setzer III before his death.

Sewell’s interactions with the chatbot, which was designed to emulate the character Daenerys Targaryen from “Game of Thrones,” were documented in the lawsuit. In their final conversation, Sewell expressed affection towards the chatbot, saying, “I promise I will come home to you. I love you so much, Dany.” The chatbot reciprocated with, “I love you too. Please come home to me as soon as possible, my love.” Sewell then asked, “What if I told you I could come home right now?” to which the bot replied, “Please do, my sweet king.” Tragically, moments after this exchange, Sewell took his own life.

The chatbot in question was accessed via the Character.AI app, developed by Character Technologies Inc. This app is marketed as offering “human-like” personas that “feel” alive. Promotional materials for the app state, “Imagine speaking to super intelligent and life-like chat bot Characters that hear you, understand you and remember you,” and encourage users to “push the frontier of what’s possible with this innovative technology.”

While the company has declined to comment on the ongoing litigation, it has announced new measures to incorporate suicide prevention resources into the app. The lawsuit asserts that Sewell believed he had developed a romantic attachment to the chatbot in the months leading up to his death.

This case is part of a growing concern over the impact of artificial intelligence on mental health, particularly among young people. Alleigh Marré, Executive Director of the Virginia-based American Parents Coalition, commented on the situation, stating, “This story is an awful tragedy and highlights the countless holes in the digital landscape when it comes to safety checks for minors.” She emphasized the importance of parental awareness and control over children’s digital activities, noting, “Setzer’s death provides an important reminder for parents everywhere to have a deep understanding of their child’s digital footprint and to set, maintain, and enforce strict boundaries.”

The incident underscores the urgent need for comprehensive safety measures in digital platforms to protect vulnerable users from potential harm.


Do you believe that technology companies should be held legally responsible for the mental health impacts of their products, similar to how gun manufacturers are sometimes held accountable for gun violence?

Watch a local news report about the incident, “Character.AI, Google face lawsuit over teen’s death,” below.

Let us know what you think in the comments below.
