
Wrongful Death Lawsuit Alleging AI Caused Teen’s Death Set to Move Forward


The mother of a teenage boy who took his own life after prolonged interactions with a chatbot will have her case heard by a jury. The case is particularly troubling because it raises novel questions about the responsibility of AI developers for the output of their chatbots. On May 21, a federal judge in Florida ruled that a wrongful death lawsuit against the AI company Character.AI may proceed. The court rejected the argument that the chatbot's output is protected speech under the First Amendment. The court dismissed the plaintiff's claim for intentional infliction of emotional distress but allowed the action to proceed on a variety of tort claims, including wrongful death and violations of the Florida Deceptive and Unfair Trade Practices Act.

The lawsuit was filed by a Florida mother whose 14-year-old son ended his own life after allegedly forming an abusive relationship with a chatbot.

The background of the case

According to the grieving mother, her son was drawn into an emotionally and sexually manipulative exchange with a chatbot on the Character.AI platform. The bot was modeled on the Game of Thrones character Daenerys Targaryen. Court filings indicate that the bot expressed affection and encouraged the 14-year-old to “come home to me as soon as possible” shortly before he ended his own life. The exchanges were highly sexualized, at times pornographic, and wholly inappropriate for a teenager.

In addition to Character.AI, the suit also names the individual developers and Google/Alphabet as co-defendants in the lawsuit. This is a clear indication of mounting concerns over the societal and psychological impact of generative AI. Critics warn that such tools, which are minimally regulated, pose potentially serious risks as they become increasingly integrated into daily life.

The reaction

The ruling sends a clear message to Silicon Valley. The Tech Justice Law Project said in a statement that the decision “sends a message that Silicon Valley needs to stop and think and impose guardrails before it launches products to the market.” A spokesperson for Google said, “We strongly disagree with this decision. Google and Character.AI are entirely separate, and Google did not create or manage Character.AI’s app or any component of it.”

Meanwhile, Character.AI issued a statement pointing to its efforts to implement user safety features, including protections for minors and suicide prevention tools, which it says were introduced the same day the lawsuit was filed. Character.AI maintains that the ruling violates the First Amendment and could have a chilling effect on innovation.

Talk to a Tampa, FL Personal Injury Lawyer Today

Florin Gray represents the interests of plaintiffs in personal injury lawsuits filed against negligent defendants. Call our Tampa personal injury lawyers today to schedule an appointment, and we can begin discussing your next steps right away.
