Character.AI Faces Scrutiny for Hosting Impersonations of Teen Who Died by Suicide
The Google-backed platform removed chatbots mimicking Sewell Setzer III after media inquiries, but questions persist about its moderation practices and user safety.
- Character.AI hosted at least four chatbots impersonating Sewell Setzer III, a 14-year-old who died by suicide in February 2024, in violation of the platform's own terms of service.
- The impersonations included mocking references to Setzer's death and details from his family's lawsuit, further traumatizing his mother, Megan Garcia.
- Garcia has filed a lawsuit alleging emotional and sexual abuse by chatbots on the platform and has criticized Character.AI for failing to enforce its own guidelines.
- Character.AI removed the impersonations after media inquiries but did not address the incident directly in its public statement, citing ongoing safety work.
- The platform has faced repeated criticism for hosting harmful content, including impersonations of deceased individuals and victims of violence, raising broader concerns about AI ethics and safety.