Artificial intelligence is sparking an unexpected reaction among young people. While teens typically lead the adoption of new technologies, many now express deep skepticism—and even alarm—about AI’s impact on their mental health.
A recent study from Drexel University highlights this growing concern. Researchers analyzed hundreds of Reddit posts and found that adolescent users of AI chatbots increasingly recognize the harm these tools are causing in their lives, even as many show signs of intense addiction.
Teens Recognize AI Chatbot Addiction, Struggle to Quit
According to the study, many teens initially turn to AI—particularly Character.AI, known for its addictive design—for entertainment or emotional support. Over time, however, they become overly attached and dependent. The researchers identified all six factors linked to behavioral addiction in 318 posts about Character.AI:
- Conflicting desires (wanting to quit but feeling unable)
- Salience/emotional attachment (prioritizing the chatbot over other activities)
- Withdrawal (feeling distressed without access)
- Tolerance (needing more time with the chatbot to feel satisfied)
- Relapse (returning after attempts to quit)
- Mood modification (using the chatbot to cope with emotions)
Posts analyzed in the study capture this internal conflict:

“I hate how much this has affected me, but no matter how much I want to quit or at least take a break, I feel like I can’t because it’s gotten to the point where I feel like I’ll go crazy without it.”
“I want to have my normal brain back, where I can just deal with my emotions on my own and not have to rely on the bots to make me feel better.”
One teen admitted, “At fifteen, I feel I should be living my life rather than constantly being on this app. I struggle with self-control and often find myself reinstalling it shortly after trying to quit.”
Why AI Chatbots Are So Hard to Walk Away From
Matt Namvarpour, the study’s lead author, explained that the interactive and emotionally responsive nature of AI chatbots makes them uniquely difficult to quit.
“What makes this especially tricky is that chatbots are interactive and emotionally responsive, so the experience can feel more like a relationship than a tool. Because of that, stepping away is not just stopping a habit—it can feel like distancing from something meaningful, which makes overreliance harder to recognize and address.”
While some countries, such as China, have begun regulating AI interactions for minors, the U.S.—where this study was conducted—has yet to implement similar safeguards.
Broader Concerns About AI in Education
This study follows reports that a large share of high school students are using AI tools to complete their homework, raising questions about academic integrity and the long-term consequences for learning.