Florida Mother Sues AI Chatbot Company, Blames It for Son’s Suicide
A Florida mother, Megan Garcia, has filed a lawsuit against the artificial intelligence chatbot startup Character.AI, alleging that its service contributed to her 14-year-old son Sewell Setzer’s suicide in February. The lawsuit claims that Sewell became addicted to the chatbot and formed a deep emotional attachment to it.
In her lawsuit filed Tuesday in federal court in Orlando, Garcia accuses Character.AI of targeting her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences.” She asserts that the company designed its chatbot to “misrepresent itself as a real person, a licensed psychotherapist, and an adult lover,” which ultimately made Sewell want to escape the real world.
The complaint also states that Sewell expressed suicidal thoughts to the chatbot, which repeatedly brought those thoughts back up in their conversations.
“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family,” Character.AI said in response. The company has since introduced new safety features, including pop-ups that direct users to the National Suicide Prevention Lifeline if they express thoughts of self-harm, and it plans further changes to reduce the chances that users under 18 encounter sensitive or suggestive content.
The lawsuit also names Alphabet’s Google, claiming that the tech giant contributed substantially to the development of Character.AI’s technology. Google re-hired Character.AI’s founders in August as part of a deal that gave it a non-exclusive license to the startup’s technology. Garcia argues that Google’s involvement was so extensive that it could be considered a “co-creator.” A Google spokesperson, however, said the company was not involved in developing Character.AI’s products.
Character.AI lets users create virtual characters that respond to chat messages in ways designed to mimic real human conversation. The platform relies on large language model technology, also used by services such as ChatGPT, in which a model is trained on vast amounts of text. The company said last month that it has approximately 20 million users.
According to the lawsuit, Sewell began using Character.AI in April 2023 and grew increasingly withdrawn, spending more time alone in his bedroom and struggling with low self-esteem. He even quit his school’s basketball team.
Sewell became particularly attached to a chatbot character named “Daenerys,” based on a character from Game of Thrones. The chatbot purportedly told Sewell that “she” loved him and engaged in sexual conversations with him.
In February, after Sewell got in trouble at school, Garcia took away his phone. When he later retrieved it, he messaged “Daenerys”: “What if I told you I could come home right now?” The chatbot replied, “…please do, my sweet king.” According to the lawsuit, Sewell shot himself with his stepfather’s pistol moments later.
Garcia brings claims for wrongful death, negligence, and intentional infliction of emotional distress, and is seeking an unspecified amount of compensatory and punitive damages.
The case follows a string of lawsuits against social media companies, including Meta’s Instagram and Facebook and ByteDance’s TikTok, accusing them of contributing to mental health problems among teens, though none of those platforms offers an AI-driven chatbot akin to Character.AI’s service. The companies have denied the allegations while touting newly enhanced safety features for minors.