Character.AI, a popular chatbot platform where users role-play with different personas, will no longer permit under-18 account holders to have open-ended conversations with chatbots, the company announced Wednesday. It will also begin relying on age assurance techniques to ensure that minors aren’t able to open adult accounts.
The dramatic shift comes just six weeks after Character.AI was sued again in federal court by the Social Media Victims Law Center, which is representing multiple parents of teens who died by suicide or allegedly experienced severe harm, including sexual abuse. The parents claim their children’s use of the platform was responsible for the harm. In October 2024, Megan Garcia filed a wrongful death suit seeking to hold the company responsible for the suicide of her son, arguing that its product is dangerously defective. She is represented by the Social Media Victims Law Center and the Tech Justice Law Project.
Online safety advocates recently declared Character.AI unsafe for teens after they tested the platform this spring and logged hundreds of harmful interactions, including violence and sexual exploitation.