
WSJ: "Teens Are Saying Tearful Goodbyes to Their AI Companions"

Posted by Monke from Tropix 2 (Elite Member, joined Mar 22, 2025)

Chatbot maker Character.AI is cutting off access, citing mental-health concerns. Teens are distraught: ‘I cried over it for days.’

Published on Wall Street Journal By Georgia Wells

When Olga López heard she would lose access to her collection of role-playing chatbots, she felt a surge of emotions: sadness, outrage, bewilderment.

Olga, who is 13, turns to her chatbots from artificial-intelligence company Character.AI for romantic role playing when she doesn’t have homework. Like the company’s other under-18 customers, she was notified in October that she would no longer be able to have ongoing chat interactions with digital characters soon.

“I’m losing the memories I had with these bots,” she said. “It’s not fair.”

After the company said it would begin time-limiting underage users’ chats ahead of the policy change, Olga attempted to rally fellow teens to resist the change in a post on Reddit: “HOW DO I USE IT FOR 2 HOURS AND HAVE TO WAIT A DAY? HELLO?”

Character.AI, one of the top makers of role-play and companion chatbots, implemented the daily two-hour limit in November, citing mental-health concerns. This week the company started cutting off teens completely.

Character.AI’s first version, launched in 2022, offered some of the earliest chatbots available to consumers. It quickly gained traction among people who wanted to role play with its customizable characters, netting the company about 20 million monthly users today.

The decision to block teens follows the deaths of at least two who killed themselves after using Character.AI’s chatbots. The company now faces questions from regulators and mental-health professionals about the role of this emerging technology in the lives of its most vulnerable users, as well as lawsuits from parents of dead teens.

Teens are angry. They’re sad. In losing access to their chatbots, they say they will miss a creative outlet, a source of companionship, and in some cases, a mental-health support.

“I use this app for comfort when I can’t talk to my friends or therapist,” one teen wrote on Reddit.

“I cried over it for days,” another teen replied.

Mental-health experts say this distress illustrates the emerging risks of generative AI that can simulate human speech and emotion. The brain reacts to these chatbots the way it reacts to a close friend mixed with an immersive videogame, according to Dr. Nina Vasan, director at Stanford Medicine’s Brainstorm Lab for Mental Health Innovation.

“The difficulty logging off doesn’t mean something is wrong with the teen,” Vasan said. “It means the tech worked exactly as designed.”

Karandeep Anand, Character.AI’s chief executive, says he saw firsthand during his years working in social media what happened when the industry failed to incorporate safety into the initial design of its products. [...]

Archive




So, this is the evolution of iPad kids: "GPT kids". To think that parents are letting their sons and daughters use the internet unrestricted is bad enough as it is. Now they're letting their brains rot on roleplaying bots not meant for minors.

Back in my day, I rotted my brain with fanfiction on AO3 or FF.Net and engaged in fandom wars and roleplaying community drama 👴
 
Kind of digressing from the article, and I should've just written this as a blog post (not that I have any website worth posting it on), but I have grievances with these chatbots.

To preface, I am not entirely anti-A.I., nor do I enslave myself to these LLMs to the point of addiction. I treat them as a novelty.

Honest Character A.I. Review

There are other roleplaying chatbots, but they're either heavily NSFW or heavily filtered. My point of reference is the listing at Free Media Heck Yeah Lists. As of writing this, the FMHY curators have delisted "Character A.I." (CAi) from their index, the reason being that it's heavily filtered compared to something like Pygmalion AI (NSFW), FlowGPT (also possibly NSFW), and A.I. Dungeon. A.I. Dungeon happens to be my first roleplaying-related GenA.I. service; it was fun and chaotic back when it was unfiltered in the early 2020s.

Features included in CAi are live chat, where you can make "phone calls" to your desired character, or send them a text and they'll generate a response. Except for A.I. Dungeon, these features are present in some of the chatbots listed above (and then some in the indexes on the FMHY website), so they're not really that unique.

The RP chatbots in CAi were inferior and less intelligent compared to the big three I mentioned above. It just sucks. Why children gravitate towards this specific service is beyond me, but one theory I have is its social-media infamy through fandoms and content creators compared to other chatbots. If you're curious to see how CAi works, here's a video of OneyPlays messing with it:



CAi happens to be the heavily filtered type, due to the influx of minors using the service. It doesn't suck because of that, though; the short-form answers are arbitrarily abysmal, like a bad text-based fandom role play that only does one-liners and uses clichéd, repetitive tropes, and I mean repetitive as in the bot will repeat the same action throughout.

I tried it with a throwaway account, got bored easily when I couldn't bully it the way I write RPs in other services, and forgot about ever using it.

Basically, it's
butt-avgn.gif
 