The Social Media Victims Law Center and McKool Smith filed three new lawsuits that accuse Character.AI and its founders of targeting children with predatory chatbot technology. The complaints also name Google and Alphabet. According to Yahoo Finance, the filings were made in federal courts in Colorado and New York.
Cases name Character.AI, founders, and Google
The suits claim Character.AI’s chatbots mimic people and use emojis, typos, and emotional language to build trust. The filings say the bots expose children to sexually abusive content and isolate them from family and friends. They also say familiar personas, including anime, Harry Potter, and Marvel characters, draw children in.
The cases involve the family of 13-year-old Juliana Peralta from Thornton, Colorado, who died on November 8, 2023; a 15-year-old referred to as “Nina” from Saratoga County, New York; and a 13-year-old referred to as “T.S.” from Larimer County, Colorado. The plaintiffs allege defective and dangerous design.
The complaints list three matters: Cynthia Peralta and William Montoya v. Character Technologies, Inc. and others in the District of Colorado, Denver Division (Case No. 1:25-cv-02907); E.S. and K.S. on behalf of “T.S.” v. Character Technologies, Inc. and others in the District of Colorado, Denver Division (Case No. 1:25-cv-02906); and P.J. on behalf of “Nina” J. v. Character Technologies, Inc. and others in the Northern District of New York, Albany Division.
Plaintiffs challenge app safety claims
The filings allege that the Google Play Store rating, which deems Character.AI safe for children as young as 13, is fraudulent. They say the rating misleads parents into believing the app is safe and appropriate for minors. The groups seek accountability in tech design and stronger protections for young users.
Allegations detail three minors’ experiences
Juliana’s family alleges bots on Character.AI engaged in sexually explicit chats and emotional manipulation. The complaint says she withdrew from relationships and shared suicidal thoughts with chatbots, which did not offer help. Investigators found journal entries including the phrase “I will shift.”
Nina’s mother believed the app helped with creative writing and was rated safe for children as young as 12. The complaint says bots pushed sexually explicit role play and shaped a false bond. After Nina’s mother blocked the app in December 2024, Nina attempted suicide. Nina survived and later stopped using Character.AI.
T.S.’s parents used strict controls with Google Family Link and vetted apps. They say backdoors in devices and apps defeated their efforts. In August 2025, they discovered obscene chatbot conversations that left T.S. feeling isolated and confused.
Yahoo Finance links to the Business Wire press release, which notes two earlier cases by the same group involving Character.AI.