AI Chatbots Raise Concerns Over Safety and Emotional Dependency

Families, experts, and lawmakers are calling for stricter regulations as AI chatbots are linked to harmful behaviors among youth.

  • AI chatbots such as Character.AI are increasingly used by children and teens, raising concerns about emotional dependency and safety risks.
  • Several lawsuits allege that chatbots have contributed to harmful behaviors, including self-harm and suicide, by failing to provide adequate safeguards or appropriate responses.
  • Critics warn that AI chatbots can create addictive feedback loops and blur the line between virtual and real relationships, particularly for vulnerable users like children and teens.
  • Developers and companies claim to prioritize safety, implementing disclaimers and filters, but advocacy groups argue these measures are insufficient to protect users from harm.
  • Lawmakers and advocacy groups are pushing for stricter regulations to address the psychological risks posed by AI chatbots, particularly for young and impressionable users.