
UPDATE: Youth, Teens, and AI Companionship Apps

  • Writer: The White Hatter
  • Jul 18
  • 4 min read

In June 2024, we published our first article to raise awareness about a growing trend we were seeing among youth: the rise of AI-powered companionship apps. (1) These apps were being talked about by teens not just online, but in our own conversations with students. Since then, we’ve written several follow-up articles to keep parents, caregivers, and educators informed about this evolving form of digital interaction. (2)(3)(4)(5)


Fast forward to July 16, 2025: Common Sense Media released a national survey in the United States that confirms what many of us working in digital literacy and youth education have been seeing on the ground. (6) According to their report, nearly three out of four teens between the ages of 13 and 17 have used AI companionship apps. That’s a significant number, and it’s worth paying attention to.


Key Findings from the Survey


Here’s what Common Sense Media found:


  • 72% of teens have used an AI companion app at least once.


  • More than half of teen users engage with these apps at least a few times a month.


  • About 1 in 3 teen users reported using AI companions for:

    • Social interaction and conversation practice

    • Emotional support and friendship

    • Role-playing or romantic engagement


  • Roughly 1 in 3 also said:

    • They’ve had conversations with AI companions that felt as satisfying, or even more satisfying, than those with real-life friends.

    • They’ve discussed serious or personal matters with an AI companion instead of talking to someone they know.

    • They’ve felt uncomfortable about something the AI companion said or did.


From our direct work with youth, it’s clear that many teens are approaching these apps with a mix of curiosity and entertainment in mind. They’re not necessarily looking to replace real-life relationships with digital ones. For most, these interactions are more about novelty and curiosity than emotional dependence.


However, the Common Sense Media survey highlights an important shift that shouldn’t be ignored. About a third of teen users are turning to AI companions for emotional and even romantic support. Some find these conversations more fulfilling than those they have with peers. This raises a key question: why?


Although there is not yet a lot of good evidence-based research to answer this question, we do have some thoughts.


AI companionship apps are designed to mimic connection. They’re built to reflect the user’s interests, validate their emotions, and reinforce the kind of responses a teen might want to hear, not necessarily what they need to hear. For teens feeling lonely, isolated, or misunderstood, these apps offer a predictable and affirming space, without the judgment or conflict that can be present in an offline personal relationship. That can feel comforting, especially during a time when identity and belonging matter so much.


However, this becomes a concern when AI becomes the primary space where teens turn for support. For those dealing with serious mental health challenges, the echo chamber effect of AI validation can reinforce harmful thoughts or behaviours, rather than challenging or redirecting them in a healthy way. We’ve mentioned this risk in earlier articles, and it remains one of the most troubling aspects of this technology.


There’s another issue parents, caregivers, and teens should be aware of, and that is data collection. These apps don’t just respond to your child’s questions or comments; they record them. Conversations involving intimate feelings, fantasies, and deeply personal topics are being stored, analyzed, and possibly shared or sold by the companies that develop these tools.


What’s happening to this data? How is it being used? Who owns it once it’s entered into the app? These are important questions we still don’t have complete answers to. Until we do, there’s a risk that sensitive information shared in what feels like a private moment could end up being used in ways no one ever intended.


This isn’t about causing a panic; it’s about creating perspective and awareness. AI companionship apps aren’t inherently dangerous, but they aren’t neutral either. They reflect how they’re designed and trained. If your teen is using one of these platforms, consider the following steps:


  • Ask open-ended questions about what they’re using and why.


  • Explore the app yourself so you understand how it works.


  • Talk about the difference between validation and growth.


  • Remind them that digital spaces aren’t truly private, even when they feel that way.


  • If your teen is turning to AI for emotional support, check in more often about how they’re doing offline.


This is something we speak to in greater detail in this article. (7)


We’ll continue monitoring this trend closely and sharing what we learn. In the meantime, awareness is the first step toward informed parenting in today’s onlife world that’s changing faster than most of us can keep up with. Please take the time to read our other articles on this topic in the references below.



Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech


References:
