Young People Want a Say: Why Youth Must Be Included in Today’s Tech and AI Policy Decisions
- The White Hatter

Parents, caregivers, and lawmakers across Canada are scrambling to keep up with rapid changes in technology and artificial intelligence. Much of the public debate focuses on what adults believe young people need, fear, or should be protected from. What's often missing is the one voice that deserves a central seat at the table: youth and young adults.
Today’s teens are the most digitally engaged generation in history. They navigate complex online environments every day, yet they are rarely invited into conversations about the laws, regulations, and safety standards being designed for them. That’s a problem, because young people are not only capable of understanding these issues, they have a strong appetite to be included. Many are already thinking critically about AI, algorithms, data practices, and what healthy digital literacy should look like.
Despite the stereotype that youth are careless online, many are acutely aware of the challenges tied to AI and digital platforms. They talk about cognitive off-loading, the tendency to rely so heavily on technology that it becomes harder to think independently. They recognize how problematic algorithms can shape their feeds, influence their self-perception, and steer their choices. These aren't abstract ideas to them. Why? Because they live inside these systems every day.
Young people also know that digital literacy has to evolve beyond basic media awareness. They want education that teaches them how AI works, what data it uses, how decisions are made behind the scenes, and when a tool’s output should or shouldn’t be trusted. They understand that healthy skepticism is a skill, and they want support in building it.
Today's teens did not grow up during the early days of Facebook, Instagram, or YouTube. However, they've watched the fallout from that era: misinformation, surveillance advertising, algorithmic bias, and the harm that can occur when tech moves faster than regulation. Many have told us here at the White Hatter that they don't want to repeat those mistakes as AI and immersive technologies become central to their lives.
This awareness is shaping a new generation of expectations. Young people want stronger protections, clearer rules, and more transparency from tech companies. They want oversight that keeps them safe without shutting them out of today's onlife world.
One of the strongest messages we have heard loud and clear from teens is that regulation, innovation, and access can co-exist. They aren't asking for a digital lockdown; instead, they want safer access. They want the ability to use new tools, explore creativity, and take part in the learning and social opportunities that technology provides. At the same time, they want laws that curb exploitation, strengthen privacy, provide recourse when harm occurs, and require platforms to act responsibly.
We believe this is an important distinction. Too often, adult conversations fall into one of two extremes: total restriction or complete freedom. Young people aren't asking for either. They want thoughtful legislation that protects them without shutting off opportunities or stunting innovation. They want rules that serve them, not rules that sideline them.
So what do we here at the White Hatter believe is the missing piece in most of these discussions? The voice of youth, teens, and young adults, who are the true experts when it comes to their own use of technology.
For any meaningful progress to happen, youth, teens, and young adults must be invited into the policy process. That can take many forms:
- Student advisory panels on provincial or federal digital policy
- Youth consultations for school districts shaping AI or tech guidelines
- Direct participation in legislative committee hearings
- Co-design of curriculum focused on AI literacy and digital rights
- Partnerships between youth organizations, educators, and policymakers
When teens are asked what they need, they offer practical ideas. They want AI tools held to safety standards. They want clear explanations of how algorithms work. They want privacy protections that don't depend on parents reading 60-page terms of service. They want online reporting tools that are easier to use, and they want tech companies held accountable when harm happens. Young people want to take part in these adult conversations that are, and will continue to be, shaping the world they are growing up in.
If we want technology to support healthy development, education, and social connection, youth need a genuine role in shaping the rules. They are the ones navigating these systems, and they are the ones who will live with the consequences of our choices.
The next era of tech policy, especially in AI, cannot be something adults design for youth without designing it with them.
Parents and caregivers can support this by encouraging their teens to speak up, get involved in school or community consultations, and learn the basics of digital rights and AI literacy. Educators can create space for student voices in conversations about classroom tech. Legislators can ensure that youth engagement is a required part of any regulatory process tied to digital safety or AI governance.
Young people want safe, fair, and transparent digital systems. They want innovation, but not at the cost of their well-being or access. Most importantly, they want to be heard.
It’s time we listened.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech