## 🎯 Key Takeaways
- Major Policy Change: Character.AI will ban users under 18 from open-ended conversations starting November 25, 2025
- Multiple Lawsuits: The platform faces several lawsuits linking it to teenage suicides, including Sewell Setzer III and Juliana Peralta
- Age Verification: Users may need to upload government-issued ID through third-party service Persona
- Legislative Response: Bipartisan senators introduced the GUARD Act specifically targeting AI platform age verification
- Divided Community: Users express mixed reactions, from outrage about privacy to acknowledgment of addiction issues
## New Restrictions for Minors
Character.AI, a platform where users can chat with AI-powered characters, is implementing strict age restrictions. Minors will lose access to open-ended chat conversations, though limited, age-appropriate AI features may remain available.
To enforce these new rules, the platform will deploy automated tools and third-party verification services. Users may need to upload government-issued identification documents through verification service Persona—a measure that has raised privacy concerns among adult users.
## Tragic Background: Multiple Teen Suicides
The policy change doesn't come out of nowhere. In February 2024, 14-year-old Sewell Setzer III died by suicide after months of intense interactions with an AI chatbot modeled on the Game of Thrones character Daenerys Targaryen. His mother, Megan Garcia, filed a groundbreaking lawsuit against Character.AI in October 2024, accusing the platform of "sexually abusing" her son and contributing to his mental breakdown.
Garcia's case was just the beginning. On September 15, 2025, a second lawsuit was filed on behalf of 13-year-old Juliana Peralta from Thornton, Colorado, who also died by suicide after intensive use of the platform. Similar lawsuits have since been filed by parents in Texas, Colorado, and other states.
A federal judge in Orlando recently ruled that lawsuits against Character.AI and Google can proceed, rejecting the companies' First Amendment defense.
> "For our family and other affected families, no policy change can reverse our loss." - Megan Garcia, mother of Sewell Setzer III
## Divided User Reactions
On the Reddit forum r/CharacterAI, reactions range from anger to ambivalent support. Outraged users declared "it is officially over" and called the change "INSANE." Adult users expressed concerns about uploading identification documents, citing recent data breaches at platforms like Discord and the Tea app.
Particularly striking is the ambivalence among minor users themselves. One teenager described 15 hours of daily usage and wrote: "It kinda keeps me alive." Yet others acknowledged the platform's addictive nature; one young user called it "addictive as hell and mentally damaging."
Some minors blame their peers for "inappropriate" platform use rather than the company itself—a response that experts recognize as typical of addictive behavior.
## Government Oversight and Legislation
The Federal Trade Commission (FTC) is currently investigating seven companies, including OpenAI and Character.AI, to better understand how their chatbots affect children.
On October 28, 2025, a bipartisan coalition of U.S. senators—including Mark Warner (D-VA), Josh Hawley (R-MO), Richard Blumenthal (D-CT), Chris Murphy (D-CT), and Katie Britt (R-AL)—introduced the GUARD Act (Guidelines for User Age-Verification and Responsible Dialogue Act) of 2025, specifically targeting age verification on AI platforms.
## Privacy Dilemma for Adult Users
Long-time adult users face a dilemma. While many support age restrictions in principle, they doubt the enforcement will be effective. More importantly, numerous users fear that the risk of identity theft outweighs the app's value to them.
The requirement to upload government-issued identification through third-party service Persona has sparked particular concern, especially in light of recent high-profile data breaches affecting other platforms.
## 'Too Late' According to Bereaved Families
Megan Garcia responded that Character.AI's new measures come "too late." For her family and other affected families, no policy change can reverse their loss.
The lawsuits continue to proceed through the courts, with significant implications for the entire AI companion industry.
## Sources and More Information
- Futurism - Character.AI Users in Full Meltdown After Minors Banned From Chats
- NBC News - Mom who sued Character.AI over son's suicide says platform's new teen policy comes 'too late'
- Rolling Stone - Character.ai Was Sued Over a Teen's Suicide
- CNBC - Character.AI to block romantic AI chats for minors a year after teen's suicide
- CNN Business - More families sue Character.AI developer over teens' suicide