Artificial Intelligence Gives Advice, But It Doesn’t Care About the Outcome Of That Advice
- The White Hatter

In homes across Canada and beyond, artificial intelligence (AI) is quickly becoming part of everyday life. Youth and teens are using AI tools to get help with homework, seek advice about friendships and romantic relationships, find guidance on mental health, and even get answers to deeply personal questions they may not feel comfortable asking an adult.
With AI, information is more accessible than ever, and needed support appears to be just a prompt away with the use of an AI companion. However, there is an important reality that parents, caregivers, and educators need to understand…
AI gives advice based on what it is prompted with, but it doesn’t wait around for the results, and that distinction matters more than most people realize.
AI systems are designed to respond to input. They analyze patterns, generate language, and provide answers that sound helpful, informed, and often very confident. In many cases, the responses can be impressively accurate or supportive. However, AI does not have a stake in the outcome of the advice it gives.
AI does not:
Check back to see what happened after the advice was followed
Adjust its guidance based on real world consequences
Feel concern if something goes wrong
Take responsibility for harm caused by misunderstanding or misuse
AI participates in the moment of decision, but not in the aftermath of that decision, and that gap is where real risks can emerge.
It is important to remember that youth and teens are in a stage of life where they are still developing judgment, emotional regulation, and critical thinking skills. When a youth or teen turns to AI for advice, several things can happen:
1/ Confidence Can Be Misread as Credibility
AI is designed to communicate clearly and often sounds polished, direct, certain, and extremely confident. For adults, that tone can be useful, however, for youth and teens, it can easily be mistaken for authority. The challenge is that AI does not “know” in the way a human expert does. It generates responses based on patterns, not lived experience or situational awareness.
This can lead youth and teens to place a level of trust in the answer that it hasn’t actually earned. Even when the information is partially correct, it may lack nuance, relevance, or appropriateness for that specific moment in a youth or teen’s life. Teaching youth and teens that “confidence in delivery” is not the same as “accuracy in advice” is an important AI literacy skill.
2/ Advice Built on Partial Information
AI can only work with what it is given. Unlike a parent, caregiver, teacher, or trusted adult who can ask follow up questions, read body language, or recognize when something feels off, AI relies entirely on the prompt it receives. It lacks the human element of connection that is so important in interpersonal communication.
If a youth or teen leaves out key details, whether intentionally or unintentionally, the response they receive may be based on an incomplete or skewed version of reality. The result is guidance that may sound reasonable but doesn’t fully apply to the situation.
This is where real conversations matter. Humans can pause, clarify, and dig deeper; AI can’t. AI fills in the gaps, and sometimes those gaps are exactly where the most important context lives.
3/ No Follow Through When Things Go Sideways
In the onlife world, advice is part of a relationship. When something doesn’t go as planned, there is someone to turn back to. Someone who can help make sense of what happened, adjust the approach, and support the next step forward.
AI doesn’t operate that way; instead, it provides an answer and then moves on. If a young person follows that advice and the outcome is negative, there is no built in support system to help them process the consequences.
With AI, there is no accountability, no shared responsibility, and no ongoing guidance. That absence matters, especially for youth and teens who are still learning how to navigate mistakes, repair relationships, and build resilience through lived experience.
4/ When Artificial Support Replaces Real Connection
For some youth and teens, especially when dealing with sensitive or uncomfortable topics such as sex, AI can feel like an easier option than talking to a real person. There is no fear of judgment, no risk of embarrassment, and no immediate emotional reaction to adapt and react to.
While AI may lower the barrier to asking questions, it can also create a subtle shift where it becomes a stand in for meaningful human connection. Over time, this can reduce opportunities for youth and teens to practice difficult conversations, build trust, and experience the empathy that comes from real relationships where conflict is a reality that needs to be negotiated.
AI can simulate understanding through language, but it does not truly understand. It has no lived experience, no emotional investment, and no ability to care, and that distinction is extremely important. Helping youth and teens recognize that AI can support learning but cannot replace human connection is one of the most important guardrails we can provide as parents, caregivers, and educators.
One of the biggest risks is that AI can create the feeling of support without providing the substance of support. A teen might ask:
“What should I do if my friend is ignoring me?”
“How do I deal with my anxiety?”
“I want to have sex. How do I approach this with my partner?”
“Should I send this message?”
AI can generate responses that feel thoughtful, measured, and even well balanced, and in many cases, the advice it provides can sound very similar to what a reasonable person might say.
However, there are important limitations that often go unseen in the moment. AI does not know the people involved in a situation or the dynamics between them. It does not understand the history that may be shaping the issue, whether that’s past conflict, trust, or unresolved emotions. It cannot read tone, pick up on subtle cues, or recognize the kind of emotional nuance that a parent, caregiver, or trusted adult might immediately notice in a conversation.
AI does not remain present once the advice is given, and it does not follow up, check in, or help a youth or teen work through the outcome of a decision. The AI interaction is transactional. A question is asked, an answer is given, and then it ends.
For youth and teens who are still developing their ability to think critically and navigate complex social situations, this can create a false sense of certainty. The advice may feel complete, even when the situation itself is not. What many of these moments actually require is ongoing dialogue, reflection, and support, something that only real human relationships can provide.
Having said all of this, it’s also important to remember that AI is not inherently harmful. In fact, it can be incredibly useful when used appropriately where it can:
Help explain complex ideas
Support learning and creativity
Offer starting points for problem solving
The conversation around artificial intelligence should not be about removing it from a youth or teen’s life; that approach is neither practical nor productive. AI is already embedded in the digital environments our youth and teens are growing up in, from search tools to apps to the platforms they use every day. Trying to eliminate it often leads to avoidance rather than understanding, and avoidance does little to prepare them for the reality they are already living in.
What matters more is how we position it in their minds. Youth and teens need to understand what AI is, what it does well, and where its limitations begin. That framing helps shift a youth and teen’s perspective from passive acceptance to active evaluation. Instead of seeing AI as something to rely on, they begin to see it as something to use with intention and awareness.
At its core, AI should be understood as a tool. It can assist with learning, offer ideas, and help explore questions, but it is not a substitute for judgment, experience, or human connection. AI does not know them, does not share their values, and is not invested in their outcomes. When we help our youth and teens see AI in this light, we are not restricting them; we are equipping them.
One of the most effective ways to support youth and teens in an AI driven world is to keep communication open and ongoing. Start by making conversations about AI a normal part of everyday life. This doesn’t need to feel like an interrogation; a simple, curious approach works best. Ask what they’re using AI for, what kinds of questions they ask, and whether they’ve ever received an answer that didn’t sit right with them. These conversations create insight and signal that you are interested, not just monitoring.
It’s also important to help youth and teens understand the difference between information and judgment. AI can be very good at providing facts, explanations, and suggestions. What it cannot provide is lived experience, personal values, or an understanding of context in the way a human can. Helping your youth or teen recognize that distinction builds a foundation for better decision making, especially in more complex or emotional situations.
Encouraging a “second opinion” mindset is another key part of AI literacy education. Just as we teach youth and teens to question what they see online or on social media, we should be guiding them to think critically about AI generated advice. When a youth or teen brings something forward, ask questions that prompt reflection. Who else could you talk to about this? What might be missing from that response? This approach helps them slow down and consider multiple perspectives before acting.
Reinforce that you are a safe and reliable source of support. Youth and teens need to know that they can come to you, even when the topic is uncomfortable or when they’ve made a mistake. If AI becomes the first place they go for advice, that’s not necessarily the problem. The goal is to ensure it’s not the only place. Your presence provides something AI cannot: understanding, accountability, and ongoing support.
As a parent, caregiver, or educator, model what thoughtful use looks like. If you use AI in your own life, be open about how you engage with it. Talk through how you assess its responses, where you see value, and where you remain cautious. When youth and teens see that you don’t accept everything at face value, they learn to approach these tools with the same level of awareness and critical thinking.
AI can provide answers quickly and often convincingly, but it cannot provide accountability, relationship, or genuine care. It does not stay present when decisions lead to unexpected consequences, nor does it help a youth or teen work through mistakes or rebuild after something goes wrong. Those moments, the ones where learning, growth, and resilience are shaped, still depend on human connection.
As our youth and teens grow up in an increasingly connected and AI influenced world, that reality does not change. Technology may evolve, but the need for trusted adults does not. Parents, caregivers, educators, and mentors remain the steady presence who can offer perspective, context, and support over time, not just in a single moment of advice.
What youth and teens need most is not unlimited access to information, but meaningful connection. They need guidance that understands their world, support that continues beyond a single answer, and someone who remains present when things don’t go as planned.
POST SCRIPT - The concepts explored throughout this article form the foundation of our student program, “I.N.S.P.I.R.E AI – Artificial Intelligence for Students: Safety, Privacy, Ethics, and Success Plans (Ages 14+)”, which we deliver to schools and youth organizations. This program is designed to move beyond theory and provide practical, real world guidance that helps youth and teens navigate AI with awareness, intent, responsibility, and confidence. https://www.thewhitehatter.ca/programs/artificial-intelligence-ai-for-students-safety-privacy-ethics-success-plans
We also offer a parent, caregiver, and educator program that you can find here https://www.thewhitehatter.ca/programs/raising-ai-ready-youth-safety-privacy-ethics-and-success-plans
If you are a school, youth group, or parent group interested in our AI educational programs, connect with us at the White Hatter via email at contact@thewhitehatter.ca. We offer these programs both in person and through our high production live virtual broadcast studio https://www.thewhitehatter.ca/studio , allowing us to deliver them in real time to schools and organizations anywhere in the world with an internet connection.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
