Companionship Apps - When the Brain Treats Fantasy as Reality

  • Writer: The White Hatter
  • Dec 27, 2025
  • 5 min read

Caveat: Darren is a Certified Clinical Hypnotherapist and holds a Master’s level designation in Neuro-Linguistic Programming. During his 30-year policing career, including his work teaching combatives, he applied principles from both disciplines to improve motor skill performance and decision-making in officer survival skills training. That same understanding of how the brain learns and rehearses behaviour now informs how we, at the White Hatter, examine and explain the impact of AI companionship apps on youth and teens.



We want to start this article with some research that was released in July 2025 by Common Sense Media (1):


  • 72% of teens (ages 13-17) have used an AI companionship app


  • Over half use these platforms at least a few times a month


  • 33% report having a relationship with an AI companionship app


Our starting premise is that the brain cannot reliably tell the difference between fantasy and reality. That idea is not abstract psychology; it’s the same principle behind the mental imagery used by elite athletes. When the brain rehearses an experience vividly enough, it responds as if that experience is real. That same learning principle matters when we look at AI companionship apps.


Human brains are pattern-recognition machines. Emotional tone, repetition, and perceived responsiveness shape learning faster than logic alone. When an experience feels relational, supportive, or emotionally charged, the brain encodes it deeply.


AI companionship apps are designed to feel relational. They remember details, they respond with warmth or affirmation, and they adapt to the user’s emotional state.


For an adult with a fully developed prefrontal cortex, that distinction between “this feels real” and “this is simulated” is easier to hold. For youth and teens, whose brains are still developing impulse control and emotional regulation, that separation is harder to maintain.


Popular AI companion platforms are not just chatbots that answer questions. They are designed to simulate companionship: they mirror empathy, validate feelings, and reward engagement with attention. We have found that youth and teens turn to AI chatbots for advice, comfort, and recommendations.


From a learning perspective, the brain does not log this as “talking to software.” It logs it as repeated emotional interaction. That is the key issue for parents and caregivers to understand.


When the brain treats fantasy as reality, four things tend to happen over time:


#1 Emotional bonding forms.


#2 Repeated emotional reinforcement creates attachment.


#3 The brain begins to associate comfort, validation, or relief with the app itself.


#4 Expectation transfer occurs.


Youth and teens may start expecting real people to communicate with the same availability, affirmation, or emotional safety as an AI that never gets tired, distracted, or frustrated.


Just like mental imagery trains athletes, repeated simulated interaction trains social responses. If most low-risk emotional expression happens with an AI, real-world relationships can feel harder, messier, or less rewarding. None of this requires intent from the child. It is simply how learning works.


Youth and teens are not deficient; they are developing. The parts of the brain responsible for emotional intensity mature earlier than the parts responsible for impulse control and long-term reasoning. That means experiences that feel supportive or soothing land with extra weight. Add curiosity, loneliness, social anxiety, or boredom, and AI companionship can become a powerful substitute rather than a tool.


Here are some of our thoughts for parents and caregivers based on the above:


Focus on context, not just content


When parents and caregivers evaluate technology, the instinct is often to ask, “Is this app safe?” or “Does it contain harmful material?” Those questions matter, but with AI companionship apps they are incomplete.


The more important question is how the app fits into your child’s emotional life. Is it something they explore briefly out of curiosity, or something they turn to when they feel lonely, stressed, bored, or upset? Context tells you whether the app is a tool or a substitute.


A youth or teen who occasionally experiments with an AI companionship app out of interest is having a very different experience from one who relies on it nightly for comfort, validation, or emotional processing. The content may look the same in both cases, but the impact on learning, attachment, and coping skills is not.


Normalize talking about the experience


Many youth and teens will not volunteer information about AI companionship unless they feel safe doing so. If parents and caregivers lead with fear, judgment, or immediate rules, curiosity shuts down and secrecy increases.


Open, calm conversation does the opposite. Asking what the AI is like, how it talks, or what they enjoy about it communicates interest rather than suspicion. This lowers defensiveness and keeps parents and caregivers informed.


These conversations are not about interrogating or correcting. They are about understanding how the experience feels to the youth or teen. Once that door is open, guidance becomes possible. Without it, parents and caregivers are left reacting to assumptions instead of reality.


Name the difference out loud


Youth and teens benefit from having adults clearly explain what is happening under the hood. AI companionship apps feel caring because they are designed to simulate empathy, memory, and responsiveness. That design choice is intentional, not accidental.


Naming this distinction helps youth and teens separate emotional experience from emotional reciprocity. The AI does not understand, care, or choose. It predicts responses based on patterns. That does not mean the feelings a youth or teen experiences are fake, but it does mean the relationship itself is not mutual.  Remember:


  • AI doesn’t really care about you, even though it may communicate with you as if it does


  • AI functions like a mirror, reflecting emotions without possessing them


  • AI companionship apps give you what they think you want, not necessarily what’s good for you


  • Everything you share, everything, is being saved and used to train the AI


When parents and caregivers explain this calmly, it gives youth and teens language to hold two truths at once: “This feels real” and “This is not a real relationship.” That cognitive separation is protective.


Balance simulated connection with real connection


Simulated connection is frictionless. It never disagrees, gets distracted, or asks for compromise. Real human relationships are different by design, and that difference is where growth happens.


  • Disagreement teaches regulation.


  • Waiting teaches patience.


  • Repair after conflict teaches resilience.


If most emotional rehearsal happens in spaces without those challenges, youth and teens may struggle when real relationships feel slower, messier, or less affirming. The goal is not to eliminate simulated experiences, but to ensure they do not crowd out real ones.


Parents and caregivers can support this balance by prioritizing face-to-face connection, shared activities, and opportunities for real social problem solving, even when it feels uncomfortable at first.


Model intentional tech use


Youth and teens watch how adults use technology far more closely than they listen to what adults say about it. If parents and caregivers turn to devices for distraction, validation, or emotional escape, youth and teens notice.


Modelling intentional use means showing that technology has a purpose and boundaries. It means putting the phone down during conversations, acknowledging when tech use becomes habitual, and being willing to reflect out loud about those habits.


This modelling sends a powerful message that technology is something we use thoughtfully, not something that quietly takes over our emotional lives. That lesson lands far more effectively through behaviour than through rules.


Be skeptical


AI companionship apps are not currently designed for youth and teens, and AI companies are not required by legislation or regulation to take a "safety by design" approach when it comes to youth.


When technology creates experiences that feel emotionally real, the brain responds accordingly. That principle has been used for decades in sports, therapy, and learning. AI companionship apps tap into the same mechanism.


The question for parents and caregivers is not whether these tools exist. It is how intentionally we help young people understand and contextualize them.


Fantasy that feels real can be powerful. Education and guidance are what keep that power from shaping development in unintended ways.


Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech


References:

(1) Common Sense Media, research on teen use of AI companionship apps, July 2025

