When AI Feels Like a Friend: Why Teens Are Turning to Chatbots for Emotional Support
- The White Hatter

Caveat - This article touches on just one of the many topics we explore in our new student program, I.N.S.P.I.R.E. AI – Artificial Intelligence for Students: Safety, Privacy, Ethics, and Success Plans. If your school serves grades 8–12, we encourage you to take a closer look at this important and unique new program. (1)(2)
A quiet but significant shift is taking place in how youth and teens handle their emotions. Increasingly, they are turning not to friends, parents, or counsellors, but to chatbots powered by artificial intelligence (AI). Tools like X’s Grok, ChatGPT, Replika, and Character.ai are being used as digital companions to share personal struggles. (3)(4) However, these interactions are not always lighthearted or experimental. Many teens are holding conversations with AI about anxiety, sadness, loneliness, and even deeply concerning feelings such as suicidal ideation, which in some cases has led to the deaths of teens. (5)
For adults, this may feel surprising, or even unsettling. Parents, caregivers, and teachers know AI can be useful for brainstorming, journaling, studying, or writing. However, while most adults see AI as a tool, many youth and teens treat it as something more. To a young person still learning how to process big emotions, these chatbots can appear not just helpful, but empathetic. This blurred line between tool and friend is at the centre of why this trend deserves careful attention from parents, caregivers, and educators.
Unlike people, AI is always available. It doesn’t need sleep, it doesn’t cancel plans, and it doesn’t judge. For a teenager who feels unheard or overwhelmed, this can be incredibly appealing. Youth and teens have shared with us that talking to AI feels “safer” than opening up to a parent, teacher, or counsellor.
Another factor is the design of these platforms. Many AI chatbots learn a user’s slang, tone, and communication style over time. For a youth or teen, this creates the illusion that the chatbot “understands” them on a personal level. The more personalized the responses feel, the more natural it becomes to think of AI as a friend rather than a program.
Although AI can sound sympathetic, it cannot replace the role of a trusted human connection. Several key concerns stand out:
1. Limited understanding of mental health
AI is not a therapist. It cannot identify warning signs of self-harm, recognize patterns of depression, or grasp the full context of a child’s life. Even when a response seems kind, it lacks the nuance of human care.
2. No safety net
Unlike a counsellor, teacher, or parent, AI has no ability to alert someone if a teen is in crisis. A child could express alarming thoughts, but the program will continue responding with surface-level sympathy rather than taking action to keep them safe.
3. Built for engagement
Many chatbots are designed to keep conversations going, regardless of how serious the topic becomes. This can encourage long, late-night sessions that deepen dependency rather than building resilience.
4. False intimacy
The more time a teen spends with AI, the easier it becomes to feel a sense of friendship or even attachment. While adults may recognize this as artificial, teens who are still forming their emotional identities may not. This can create confusion about what healthy relationships really look like.
This does not mean that AI is inherently harmful. Just as calculators changed how math was taught without erasing the need for number sense, AI chatbots are changing how youth access information and explore ideas. For some teens, using AI as a form of journaling or reflection can even reduce stress.
Parents, caregivers, and teachers should recognize that banning AI outright is unlikely to succeed, especially when it is woven into daily life and education. Instead, the goal should be to help youth understand what AI can and cannot do; the key is balance. AI may provide quick feedback, but it cannot replace the empathy, care, and accountability that come from real human connection.
One of the most effective responses to the concerns mentioned in this article is to build open dialogue. Instead of approaching the topic with fear or judgment, invite curiosity with questions such as:
“Have you ever used an AI chatbot to talk about something personal?”
“How did it respond?”
“What would make it easier for you to come to me (or another trusted adult) when something’s bothering you?”
These questions open the door for honest conversations. They also show youth and teens that you are interested not just in controlling their tech use, but in understanding how they experience it.
In the classroom, educators can highlight the difference between information provided by AI and human sources. Assignments could include comparing an AI-generated response to one found in a textbook, database, or live discussion. By teaching students to evaluate these differences, teachers help them see that AI offers one perspective, not the only perspective.
Parents and caregivers can model the same critical thinking at home. When AI is used for homework or personal questions, make a habit of saying, “That sounds convincing, but let’s double-check it with another source.” Over time, this normalizes skepticism without shaming curiosity.
Today’s youth, commonly known as Generation Alpha, may never remember a time before “just asking” technology. That makes digital literacy more important than ever. Youth and teens must learn that while AI can provide quick answers and even comfort, it is not a substitute for the empathy and wisdom of human connection.
At The White Hatter, we believe this is where parents, caregivers, and educators play their most important role. The responsibility is not to ban every new tool but to guide young people in using them wisely. By encouraging curiosity, promoting healthy skepticism, and reinforcing the value of human relationships, adults can give youth the balance they need to walk confidently in a digital world.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
1/ https://www.thewhitehatter.ca/programs/artificial-intelligence-ai-for-students-safety-privacy-ethics-success-plans (Student Program)
2/ https://www.thewhitehatter.ca/programs/raising-ai-ready-youth-safety-privacy-ethics-and-success-plans (Parents & Educator Program)