
When AI Becomes a “Friend”: What A Parent Recently Shared With Us

Writer: The White Hatter

Caveat - We are sharing this experience with the parent’s consent. To protect their family’s privacy, all identifying details, including name, gender, and location, have been intentionally withheld.


Very recently, we delivered our increasingly requested parent program, “Raising AI-Ready Youth: Safety, Privacy, Ethics, and Success Plans”, to a large group of parents and caregivers (1). As is often the case after our sessions, the presentation carried over into meaningful conversations at home. One of those conversations led a parent to contact us later that week to share an experience that, based on what they learned during the presentation, caused them to take a closer look.


In their call to us, the parent described their young teen as quiet, introverted, and more comfortable spending time alone than in large social groups. Over the past few months, the parent had noticed changes in their teen’s behaviour: increased secrecy around devices, longer hours online (often late at night), and a reluctance to talk about what they were doing or who they were talking to.


Like many parents and caregivers, their first fear was one we hear often. The parent was worried their child might be interacting with an online sexual predator. That concern is understandable and should never be dismissed. However, after attending our AI presentation, the parent approached the situation differently: instead of leading with fear, they led with curiosity.


The parent asked their teen a simple question. Had they heard about AI chat tools or AI “friends”? The parent explained that they wanted to understand how young people were using them, and what they learned from their teen surprised them.


Their teen had been using the free version of ChatGPT and Snapchat’s My AI (2) to create conversational companions. These were not tools used occasionally for homework or curiosity; they were consistent, ongoing interactions, with conversations that happened late into the night and felt personal, supportive, and emotionally meaningful to their teen. In the words of the parent, it felt like their teen had friends who were always there. In fact, the parent shared with us that the teen said they felt their AI friend really knew them.


As the parent reflected on what they were hearing and what they had learned from us about AI companionship apps, it became clear that something deeper was happening. Their teen was not simply using AI as a tool, they were forming what psychologists refer to as an “anthropomorphic parasocial bond.”


In simple terms, this is a one-sided emotional relationship in which a person feels connected to something that does not truly know them but is designed to appear human. These AI systems communicate in natural language, respond with empathy, remember details, and adapt to the user’s emotional tone. Over time, it becomes easy to attribute human traits, intentions, and even care to the technology. This is not a failure of intelligence or judgment on the part of youth and teens; it’s a predictable human response to technology that is intentionally designed to feel relational.


In 2025, Common Sense Media in the United States released research that should give parents pause. They found that 72 percent of teens aged 13 to 17 reported having used an AI companionship app. More concerning, many teens reported finding conversations with AI more satisfying than conversations with real-life friends (3).


For some youth, particularly those who feel socially isolated, anxious, or disconnected, AI companionship can feel easier than human relationships. There is no fear of rejection, no awkward pauses, no conflict, and no judgment, and the AI is always available, attentive, and responsive. That design is not accidental.


These platforms are engineered to provide simulated social rewards. Personalization, emotional mirroring, and constant availability make the experience feel safe and validating. Over time, those simulated rewards can begin to replace real human connection rather than support it.


Every youth and teen seeks attention, validation, affection, and belonging. That is not a flaw, it’s a fundamental part of being human. When those needs are met through healthy relationships with family, peers, romantic partners, and mentors, young people learn how to navigate emotions, boundaries, and disagreement. When those needs are primarily met through AI, something different happens.


AI companionship doesn’t challenge, set boundaries, or experience fatigue or frustration, and it doesn’t model reciprocity. The relationship is entirely one-way, even though it feels mutual. For a developing brain, that can quietly reshape expectations of what relationships should feel like.


As psychologist Dr. Zak Stein has warned:


“If your kid had a new best friend that you never got to meet that was massively empowered by some corporation, that they hung out with till all hours of night because they were in bed with them, that they told things they never told you, do you have a problem with that if that was a kid? It’s literally a commodity they’re interacting with instead, and it seems to not worry us as much, and we actually might think it might be a good thing because it stops them from being lonely. It’s actually an abusive relationship that they’re trapped in with a corporate entity that has hacked their attachment.”


That framing is uncomfortable, but important and true! These are not neutral tools; they are products developed by companies with commercial interests, data incentives, and engagement metrics.


In this case, the parent acted early, and because the parent approached the situation with knowledge rather than panic, their child felt safe enough to talk openly. That allowed the family to intervene before the attachment deepened further. The teen is now receiving professional support to help rebuild healthy social connections and understand what they were experiencing through the use of their companionship apps.


Not every family has access to that level of support, and that reality matters. As access to mental health services becomes more strained, some families may feel tempted to turn to AI therapy or emotional support bots as a substitute. That raises serious questions about privacy, consent, data use, and the appropriateness of placing emotional care into the hands of unregulated technology. That is a separate conversation, but one parents and caregivers should approach with extreme caution, something we have spoken to in a previous article (4).


This article is not about blaming parents and caregivers or shaming youth and teens; it’s about awareness. Ask open-ended questions about how your child is using AI, learn what platforms they are engaging with and why, and talk about the difference between tools and relationships. Set reasonable boundaries around nighttime use and private spaces, and most importantly, keep the door open.


Youth and teens are not broken because they find comfort in AI. They are responding to technology that is intentionally designed to meet unmet needs. Our role as parents and caregivers is to understand that design, guide our youth and teens through it, and help them build resilient, real-world connections alongside emerging technologies.


We are grateful to the parent who reached out to thank us and to share their experience. Stories like this remind us why digital literacy education matters. Not just for safety, but for understanding how technology shapes emotions, attachment, and identity.


As we have said for years, knowledge alone is not enough. It is the understanding and thoughtful application of that knowledge that gives families real power!


NOTE - We have also included links to other articles that we have written on AI companionship apps, which every parent, caregiver, and educator should read, and which take a deep dive into what these apps are and how they operate (5)(6)(7)(8)(9).



Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech



References: