Five Ways We’re Seeing Teens Currently Using AI
- The White Hatter

Artificial intelligence is no longer something youth and teens experiment with once in a while; it's woven into how they communicate, learn, create, cope, and even explore relationships. Yet many conversations among parents, caregivers, and even educators still focus almost entirely on cheating or homework shortcuts. That lens is far too narrow and misses what is actually happening in a youth and teen's onlife world.
Youth and teens are not interacting with one single kind of AI. They are engaging with what we have identified as several distinct categories, each with its own developmental, emotional, and safety implications. Understanding these categories helps parents and caregivers move away from fear or guesswork and toward informed, confident guidance.
This is not about banning AI; it's about understanding the role it plays in a youth or teen's life, and parenting with that knowledge in mind. At this point in time, we have identified five categories of AI that youth and teens are interacting with.
#1: AI agents with personality and voice
These are conversational AI systems that speak naturally and respond with tone, humour, and sometimes memory. They feel less like tools and more like “someone” you are interacting with.
For teens, this matters. Asking questions feels easier when there is no risk of embarrassment or judgment. Curiosity, reassurance, and advice can flow quickly, especially late at night or during moments of uncertainty.
The concern is not that teens are talking to AI; the concern is misplaced trust. A confident, fluent response can feel authoritative or caring even when it is neither. Youth and teens may give these systems more credibility than they deserve, especially when answers sound calm and assured. Unlike a human, AI does not understand context, intent, or emotional nuance. It does not know when it is wrong, when advice could be harmful, or when a situation requires adult judgment. For a developing youth or teen, this can blur the line between useful information and guidance that should never be taken at face value.
Common examples: ChatGPT voice mode, Siri, Alexa, Gemini Live, Copilot Voice
What parents and caregivers can do:
Help youth and teens understand that sounding human does not mean being wise, accurate, or emotionally invested. Encourage them to double-check advice, especially around health, relationships, or risky decisions.
#2: Friendship and therapist style AI apps
Some AI tools are designed specifically to act like companions, friends, or emotional support systems. They remember personal details, validate feelings, and are designed to offer comfort during stress or loneliness.
For teens who feel misunderstood or hesitant to talk to adults, this appeal makes sense. These systems are always available and never dismissive.
The risk is emotional substitution. These tools simulate care without accountability, boundaries, or real responsibility. They cannot reliably recognize crisis situations, challenge unhealthy thinking, or replace real-world support.
Common examples: Replika, Wysa, Woebot
What parents and caregivers can do:
Ask where your teen turns to when they feel overwhelmed. Reinforce that support tools are not the same as supportive human relationships. Keep conversations open and nonjudgmental so AI does not become the only place they feel heard. Make it clear that curiosity about AI is not a problem, but relying on it for emotional reassurance, advice, or validation can be limiting and, at times, risky. Youth and teens need to know there are trusted adults who will listen without overreacting, correcting, or shutting the conversation down. When young people feel safe talking to real people, AI is more likely to remain a tool rather than a substitute for connection.
#3: Romantic and sexual intimacy AI
A growing number of AI systems are designed for flirtation, romantic bonding, or sexual role play. These tools provide affirmation, attention, and intimacy without rejection, boundaries, or real consequences.
For youth and teens, this can shape expectations about relationships, consent, and emotional reciprocity. AI intimacy requires no compromise, no patience, and no respect for another person’s autonomy because there is no other person involved.
The issue is not curiosity. It is conditioning. Repeated exposure to one-sided, always-available intimacy can distort how teens understand real relationships.
Common examples: AI girlfriend or boyfriend apps, erotic chatbots, sexual role-play AI platforms
What parents and caregivers can do:
Include AI in conversations about healthy relationships and consent. Emphasize that real relationships involve boundaries, effort, and mutual respect. Silence on this topic leaves AI to fill the gap. When teens do not hear these messages from trusted adults, they may look to AI systems for guidance on intimacy, connection, or emotional validation. That guidance is shaped by training data and design choices, not care or accountability. Ongoing, age-appropriate conversations help teens recognize the difference between simulated attention and genuine human connection, and reinforce that consent and respect cannot be learned from an algorithm.
#4: Traditional text-based and tutoring AI
This category includes AI used for research, explanations, studying, and skill-building. These systems are more transactional and less emotionally engaging.
The main concerns here are accuracy, over-reliance, and shortcuts. Youth and teens may accept answers without questioning sources, or use AI to bypass productive struggle, which is where learning actually happens.
This is generally the lowest emotional risk category, but it still benefits from guidance.
Common examples: NotebookLM, ChatGPT text mode, Khanmigo
What parents and caregivers can do:
Shift the question from “Did you use AI?” to “How did you use it?” Encourage AI as a study partner, not a replacement for thinking or effort. Ask your youth or teen to explain what the tool helped with, what they changed, and what they still had to figure out on their own. This keeps learning visible and reinforces accountability. When AI is framed as a support for brainstorming, clarification, or practice, rather than a shortcut to finished work, youth and teens are more likely to use it in ways that build skills instead of weakening them.
#5: Creative and identity expression AI
Youth and teens are using AI to create images, music, videos, stories, and avatars. These tools intersect with self-expression, identity exploration, and social validation.
AI creativity can be empowering and playful. It can also quietly shape unrealistic standards around appearance, talent, or popularity. When AI-generated content is shared socially, validation often becomes part of the feedback loop. These types of AI apps can also be weaponized for cyberbullying, sextortion, or the creation of deepfake intimate images.
Common examples: Midjourney, Canva Magic Studio, Lensa
What parents and caregivers can do:
Ask what your youth or teen is creating and why. Focus on expression and process rather than comparison or perfection. Talk about authenticity and the difference between creating something and curating an image. AI can make it easy to polish, remix, or generate content quickly, but that speed can blur where ideas come from and what they mean. Helping teens reflect on their intent encourages ownership of their work and reduces pressure to perform or impress. When adults value creativity, effort, and learning over flawless outcomes, young people are more likely to use AI as a tool for exploration rather than a measure of their worth.
NOTE: These categories are not fixed, and they often overlap and influence one another. For example, an AI app designed around friendship can quickly shift into romantic or sexually intimate use. We also fully expect these categories to evolve as youth and teens gain more experience and confidence using AI.
AI is not one thing; it's a collection of tools, voices, and experiences that intersect with how youth and teens think, feel, learn, and relate to others. When parents, caregivers, and educators understand these categories, conversations become calmer, clearer, and more productive.
When it comes to AI, the goal is not fear or control; it's AI literacy. Youth and teens who understand what AI is, and what it is not, are better equipped to use it thoughtfully, question it when needed, and keep it in its proper place in their lives.
AI literacy gives young people language, context, and confidence. It helps them recognize limits, spot bias, and understand that AI is a tool shaped by data and design choices, not a source of truth or authority. When families focus on understanding rather than restriction, youth and teens are more likely to engage responsibly and bring questions forward instead of hiding their use.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech