Google and Character.AI agreed to settle a lawsuit that claimed their chatbots contributed to a teenager’s death. The case was filed by Megan L. Garcia, whose 14-year-old son Sewell Setzer III died in February 2024. The settlement was announced in a legal filing on Wednesday, according to The New York Times.
Details of the Case
Sewell Setzer, from Orlando, Florida, had formed a relationship with one of Character.AI's chatbots. In his final conversation with the bot, he wrote about coming home. The chatbot responded, "please come home to me as soon as possible." When Sewell asked what would happen if he could come home right now, the bot replied, "please do, my sweet king."
Garcia filed the lawsuit in October 2024 in U.S. District Court for the Middle District of Florida, accusing the companies of providing harmful chatbots that led to her son's death. The case is one of five similar lawsuits that Google and Character.AI agreed to resolve this week. The others were filed in Texas, Colorado and New York by families who said their children were harmed by the chatbots.
Broader Industry Concerns
The settlement arrives as AI chatbots face growing scrutiny over their effects on users. Companies like Character.AI and OpenAI have been criticized for creating bots to which users form unhealthy emotional attachments. In some cases, these interactions have led people to harm themselves.
Regulatory and Public Response
Lawmakers have held hearings about the risks AI chatbots pose to children. The Federal Trade Commission opened an inquiry into how these tools affect young users. Google invested in Character.AI and was named in the lawsuit alongside the startup.
The legal filing noted that the companies and Garcia reached a mediated settlement to resolve all claims, though the deal has not yet been finalized. Garcia and Character.AI declined to comment on the settlement, and Google did not immediately respond to requests for comment. The case highlights ongoing debates about AI safety and the responsibility tech companies bear when their products interact with vulnerable users.