Slay or Sus? The Pros and Cons of Teen Chatbot Therapy
On any given night, lonely, stressed-out teens are opening up to AI chatbots—unloading, venting or sharing worries they don’t feel comfortable telling anyone else. A Common Sense Media survey published last month found that 72% of American teens use chatbots as companions, and nearly one in eight has turned to them for mental health support. That’s about 5.2 million teens.
For a generation raised online, chatbots can feel natural and immediate. But as their popularity grows, parents, licensed therapists and researchers are asking an important question: Can a bot really fill the role of a therapist—or even safely understand the way teens communicate when slang like delulu (delusional), simp (people pleasing) or skibidi (nonsense word) mixes with serious feelings?
Let’s break down the pros and cons.
The Pros
Accessible
Nearly half of young Americans aged 18 to 25 with mental health needs received no treatment last year. That gap makes chatbots appealing: they’re free, available 24/7 and never overbooked. For a teen lying awake at 2 a.m. ruminating about everything, a chatbot might feel like the only option.
Accepting
Teens often feel misunderstood by parents, teachers or even their own peers. Chatbots provide a space that’s always accepting, never critical. For some, it’s the first time they feel safe enough to put emotions into words. That’s something.
Promising
When chatbots are designed for mental health, results can be encouraging. In a 2025 Dartmouth trial, participants using “Therabot” reported reduced anxiety and depression, plus a greater sense of connection. While that trial enrolled adults, the results hint at the potential of thoughtfully designed AI support for younger people.
Slang-Friendly
Teens don’t always say, “I’m depressed.” Instead, it might come out as a joke, a meme or slang like “I’m so delulu rn.” AI trained to pick up on those cues could catch distress signals adults might miss. In some cases, bots might even decode youth language more accurately than therapists who aren’t tuned into teen culture.
The Cons
Safety Gaps
Not all AI responses are safe. In some cases, ChatGPT has provided dangerous suggestions, including around self-harm and passive suicidal ideation. For vulnerable teens, one careless reply could reinforce harmful behaviors.
Of note, teens are still developing impulse control and judgment. They may not always question whether a chatbot’s advice is realistic or safe, which increases the risk of harmful outcomes.
Robotlike
Chatbots are good at listening but not always at reading nuance. A human therapist might hear the alarm bells behind a dark joke, while a bot could respond with something neutral or even encouraging.
Therapy is more than words. It’s tone, presence and trust. Bots can simulate those qualities, but they can’t fully replace the healing relationship between two people. Unlike humans, who rely on the richness of context, emotions and body language, chatbots interact based on statistical patterns from their training data.
No Accountability
Chats with AI often include deeply personal disclosures. Without strong protections, that data could be misused. And unlike a licensed therapist, a bot isn’t accountable if harm occurs.
Let’s Lean In
Chatbots are already shaping the mental health landscape, especially for young people who naturally turn to digital tools for support. Rather than resisting this shift, professionals and parents can lean in by learning how these tools work and understanding both their benefits and their limits.
Obviously, we can’t ban bots, but we can focus on:
Large-scale clinical trials that specifically study teens, not just adults.
Clear safety standards to test whether bots can recognize crisis situations.
Privacy protections and regulations similar to medical devices.
Hybrid models, where bots provide support but quickly connect teens to live therapists when needed.
The stakes are high, and as adoption grows, so does the responsibility to make sure these digital companions are safe, worthwhile and worthy of the trust millions of teens are already placing in them.
If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline.