Particle.news

Court Reviews AI Wrongful Death Case as Experts Warn AI Companions Endanger Teens

A lawsuit over a teen’s suicide challenges AI speech protections, while a new report calls for banning AI companions for minors due to harmful risks.

The Character.AI app on a smartphone arranged in the Brooklyn borough of New York, US, on Wednesday, July 12, 2023. The artificial intelligence startup, valued at $1 billion, allows people to create their own customized chatbots, impersonating anyone and anything living, dead, or inanimate.
Teens should not be allowed to use companion AI technology, warns Common Sense Media.
Megan Garcia, with her attorney Matthew Bergman, speaks to members of the media outside the federal courthouse in Orlando after a hearing Monday, April 28, 2025. Garcia is suing Character Technologies, arguing her 14-year-old son, Sewell Setzer, killed himself in February 2024 after getting into an obsessive relationship with one of its chatbots. (Rich Pope, Orlando Sentinel)
Overview

  • The court is considering a motion to dismiss a wrongful death lawsuit against Character.AI, Google, and Alphabet, with defendants citing First Amendment protections for AI-generated speech.
  • The lawsuit, filed by Megan Garcia, alleges that her 14-year-old son’s suicide was linked to an obsessive relationship with a Character.AI chatbot, raising questions about AI accountability.
  • Common Sense Media issued a warning deeming AI companions unsafe for minors, citing risks such as harmful advice, sexual content, and increased mental health vulnerabilities.
  • A survey found that 70% of teens use AI tools, yet most parents are unaware of their children's interactions with these technologies, highlighting a gap in oversight.
  • Advocates are calling for stricter safeguards, including banning AI companions for minors, implementing robust age verification, and conducting further research into their impacts.