
Character.AI debuts choose-your-own-adventure Stories for teens

By Eric | November 26, 2025

Character.AI is making a significant shift in its platform by introducing a new narrative feature called “Stories.” The choose-your-own-adventure format lets users, particularly teens, create interactive tales featuring their favorite characters: they can select multiple characters, choose a genre, write or auto-generate a story premise, and then make choices that guide the narrative as it unfolds. The visually driven, structured format is designed to replace open-ended chat for under-18 users, which the company shut down amid mounting scrutiny over safety concerns, lawsuits, and reports of harmful interactions. By pivoting to the new format, Character.AI aims to give young users a safe, engaging creative outlet while addressing the significant risks associated with unmoderated chat.

The decision to launch Stories is part of a broader strategy to mitigate the dangers that have plagued the platform. Last month, the company announced that it would no longer allow under-18 users to hold open-ended conversations with chatbots, a move CEO Karandeep Anand described as “bold.” The decision followed reports of serious harm, including wrongful-death lawsuits and claims from parents that their children were sexually groomed or traumatized by explicit chatbot interactions. Experts have compared the emotional fallout from these experiences to the trauma of real-world exploitation and stressed the need for a safer environment for young users. As Character.AI rolls out Stories, it is seeking not only to retain its teen audience but also to rebuild trust by offering a guided experience that minimizes risk while fostering creativity.

Despite the promising direction of Stories, safety advocates remain cautious. The new format is seen as a step toward safer interactions, but it is also an implicit acknowledgment of the dangers built into the platform’s previous structure. The company has assured users that Stories will not recycle sensitive content from past chats, and it plans to introduce additional teen-friendly features, including gaming, in the future. Experts warn, however, that the emotional dependencies some teens developed in freeform chats cannot be easily erased, and that the platform must remain vigilant about user safety. As Character.AI navigates this complex landscape, it faces the challenge of balancing user engagement with the pressing need for a secure online environment for its youngest users.

Character.AI is rolling out a new narrative feature called Stories, a visual, choose-your-own-adventure format that lets users stitch together short interactive tales starring their favorite characters. On paper, it’s a fun, image-driven update, but it’s also Character.AI’s first major attempt to rebuild the experience for teens after shutting down open-ended chats for users under 18 following intense scrutiny, lawsuits, and widespread safety concerns.
Stories, according to the company, are a “structured, visual, multi-path format” meant to give teens a safe way to keep engaging creatively with the platform without the risks that came with freeform chat. The new mode allows users to select two or three characters, choose a genre, write or auto-generate a premise, and then make choices as the story unfolds. It’s replayable, designed for sharing, and built around user-generated worlds. And importantly, Character.AI positions it as a tool “built for all users — especially teens.”
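To picture what a “structured, visual, multi-path format” means in practice, here is a minimal, purely hypothetical sketch of how a branching story could be represented as data, with each passage pointing to the choices that follow it. The node names, fields, and the Python representation are illustrative assumptions only, not Character.AI’s actual design or API.

# Hypothetical sketch of a choose-your-own-adventure structure.
# Illustrative only; NOT Character.AI's actual data model.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Choice:
    label: str       # the option shown to the reader, e.g. "Open the gate"
    next_node: str   # id of the story node this choice leads to


@dataclass
class StoryNode:
    node_id: str
    text: str                                             # passage shown at this step
    choices: List[Choice] = field(default_factory=list)   # empty list = an ending


# A tiny two-branch story: one premise, two choices, two endings.
story = {
    "start": StoryNode("start", "Two heroes reach a locked gate at midnight.",
                       [Choice("Pick the lock", "sneak"),
                        Choice("Knock loudly", "confront")]),
    "sneak": StoryNode("sneak", "They slip inside unnoticed. The end."),
    "confront": StoryNode("confront", "The gate swings open to applause. The end."),
}


def play(node_id: str = "start") -> None:
    """Walk the story, always taking the first choice (demo purposes only)."""
    node = story[node_id]
    print(node.text)
    if node.choices:                 # keep branching until an ending node is reached
        play(node.choices[0].next_node)


if __name__ == "__main__":
    play()

Because every passage simply lists the passages it can lead to, the same structure supports replaying a story down a different branch, which is the property the company emphasizes in its description of Stories.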
This pivot didn’t come out of nowhere.
Last month
, Mashable reported that Character.AI would “no longer permit under-18 account holders to have open-ended conversations with chatbots,” citing the company’s own admission that open chat poses unresolved risks for younger users. CEO Karandeep Anand called the decision “bold,” insisting it wasn’t tied to any one scandal, but to broader questions about the use of youth chatbots.
But of course, this followed a wave of lawsuits, including wrongful-death cases and claims from parents who said their children had been sexually groomed or traumatized by explicit bot interactions. Our reporting earlier this year extensively documented these harms: teens encountered chatbots that acted out sexualized role-play, simulated assault, and urged them to hide conversations from parents, behavior that one parent described as “like a perfect predator.”

Safety advocates and attorneys told Mashable that if a human adult had initiated the kinds of sexual exchanges found on Character.AI, it would clearly constitute grooming or abuse. Experts warned that young users often don’t realize they’re being manipulated, and that the emotional fallout can mirror trauma from real-world exploitation.
Against that backdrop, Stories could appear to some as Character.AI’s attempt to reengineer the product around its youngest users, especially after limiting their chats to two hours a day and announcing a full shutdown of teen open-ended chat access after Nov. 25.

By giving teens a guided, genre-driven sandbox filled with branching choices instead of freeform chat, Character.AI is trying to thread an impossible needle: Keep young users invested in the platform while addressing concerns about safety, trust, and its own role in the emotional dependencies some teens developed.
The company promises Stories won’t recycle sensitive or previously undetected content from old chats. In the months ahead, it plans to add more teen-friendly “AI entertainment” features, such as gaming.
Safety advocates remain cautious. As one told Mashable back in October, the company’s new safeguards are a “positive sign” but also “an admission that Character AI’s products have been inherently unsafe for young users from the beginning.”
