December 25, 2025, 2:45 p.m. EST

Character.AI is banning users under 18 from engaging in conversations with its chatbots after facing scrutiny over how young people interact with its virtual companions.

The California-based startup announced on Wednesday that the change would take effect by November 25 at the latest and that it would limit chat time for users under 18 ahead of the ban.

It marks the first time a major chatbot provider has moved to bar young people from its core chat experience, and comes against a backdrop of broader concerns about how AI is affecting the millions of people who use it each day.

Founded in 2021, Character.AI hosts virtual avatars that can take on the persona of real or fictional people.

A Character.AI spokesperson told Business Insider that it was “taking extraordinary steps for our company and the industry at large.” They added that over the past year, the startup has invested in creating a dedicated experience for users under 18.

Character.AI said in a blog post that it was making the change after receiving feedback from “regulators, safety experts, and parents.” The startup also said it would roll out age-gating technology and establish an AI safety lab to research future safeguards.

In February 2024, 14-year-old Sewell Setzer III died by suicide after talking with one of Character.AI’s chatbots. That October, his mother, Megan Garcia, filed a civil lawsuit blaming the company for her son’s death and alleging negligence, wrongful death, and deceptive trade practices. Character.AI said it does not comment on pending litigation.

Earlier this month, Character.AI took down a chatbot based on pedophile Jeffrey Epstein, following a story from the Bureau of Investigative Journalism.

OpenAI is also facing a lawsuit filed by the parents of a young person who died by suicide after talking to its chatbot. The suit alleges ChatGPT “actively helped” 16-year-old Adam Raine explore suicide methods over several months before he died on April 11. OpenAI previously told Business Insider that it was saddened by Raine’s death and that ChatGPT includes safeguards.

Earlier this week, OpenAI said that about 0.15% of its more than 800 million weekly users, or more than a million people, send messages to ChatGPT about suicide in a given week. The company said in a blog post that it has updated ChatGPT to “better recognize and support people in moments of distress.”

If you or someone you know is experiencing depression or has had thoughts of harming themself or taking their own life, get help. In the US, call or text 988 to reach the Suicide & Crisis Lifeline, which provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations. Help is also available through the Crisis Text Line — just text “HOME” to 741741. The International Association for Suicide Prevention offers resources for those outside the US.


