
The UK Says “No” To Age Gating Legislation. Other Countries Should Take Notice - Why We Agree With The UK Decision.

  • Writer: The White Hatter
  • 8 min read


Caveat - The UK has decided not to move forward with social media age gating laws, a decision that may offer an important lesson for other countries, including Canada. While protecting youth online matters, restricting access alone does little to address the real drivers of harm. This article explains why thoughtful legislation that targets platform design and accountability may be far more effective than simple age limits.


Over the past several months we have been asked, both publicly and privately, about our position on age gating legislation aimed at restricting youth access to social media and other online platforms here in Canada. Our answer has remained consistent: we do not believe that age gating legislation, by itself, is an effective way to protect youth and teens online, nor do we believe it meaningfully holds large technology companies accountable for the design and operation of their platforms.


At the same time, it is important to be clear about something else. We are not opposed to legislation. In fact, we strongly believe that thoughtful, well constructed legislation is needed to address some of the systemic problems that exist within the digital ecosystem. We have previously published two documents outlining a legislative framework that we believe would more effectively address these challenges (1)(2). Our concern is not with legislation itself; it is with whether the age gating legislation being proposed actually solves the problem it claims to address.


Unfortunately, when we have raised these concerns in discussions with some advocates who strongly support age gating legislation, those concerns have sometimes been dismissed or brushed aside. The conversation is often framed as a simple binary choice, either support strict age restrictions to protect youth and teens, or accept that nothing will change and leave young people vulnerable. In reality, the situation is far more complex than that.


One thing we have learned through more than three decades of work in youth and teen digital safety education is that legislation is difficult to pass. Political negotiations, lobbying pressures, legal reviews, and competing interests all shape what eventually becomes law. However, something many people underestimate is that once legislation is passed, it can become even harder to change.


Laws can remain in place for years, sometimes decades, even when their limitations become clear. Modifying them often requires new legislative cycles, renewed political battles, and significant public debate. That reality means that when governments decide to regulate something as complex and influential as the digital platforms used by millions of young people, it is essential to get the framework as right as possible from the beginning.


No legislation will ever be perfect. Good policy evolves over time as technology and society change. However, poorly designed legislation can create unintended consequences that are difficult to correct once they are embedded into law.


If the goal is to protect youth and teens, safeguard privacy, and hold technology companies accountable, then the legislative approach needs to be carefully constructed with those outcomes in mind. That is something we strongly support.


Parents and caregivers are not imagining the problems. For many years, large technology companies have operated in regulatory free environments that allowed them to design platforms largely around engagement and growth. Their business models rely heavily on advertising revenue, user attention, and data collection. That reality has shaped how many social media platforms are built.


Features such as endless scrolling, algorithmic recommendation systems, engagement feedback loops, and personalized content feeds are not accidental. They are deliberately engineered to keep users interacting with platforms for longer periods of time. In some cases, these systems have contributed to negative outcomes for certain youth and teens, including exposure to harmful content, harassment, manipulation, and other forms of digital harm.


Parents and caregivers are justified in asking whether companies that influence the online environments used by billions of people should face stronger oversight. Frustration with the lack of accountability in the technology sector is understandable. There is growing global recognition that the current regulatory framework for large platforms is inadequate. However, frustration and emotion should not determine public policy.


When public concern around a problem increases, governments often feel pressure to respond quickly. In those moments, there can be a strong temptation to adopt policies that appear decisive and easy to communicate politically. In our view, age gating legislation often falls into that category.


At first glance, restricting youth or teen access to certain platforms until a specific age may seem like a straightforward solution. It can be presented as a protective measure, and it creates the impression that governments are taking strong action. The challenge is that the effectiveness of this approach is far from clear.


Youth and teens have historically demonstrated a remarkable ability to bypass technological restrictions. Anyone who has worked closely with youth and technology understands that determined teens often find ways around barriers through alternate accounts, shared credentials, VPN services, or other workarounds, a pattern already clearly visible in countries that have implemented age gating legislation. If legislation can be easily bypassed, it risks becoming more symbolic than practical.


Even more concerning is that age gating legislation may allow technology companies to shift responsibility away from themselves. Once a platform implements a legally required age verification system, it can argue that it has fulfilled its legal obligations. Any remaining problems can then be framed as failures of parents, youth, teens, or enforcement systems rather than the platform itself. In other words, the architecture shaping the digital environment, which actually causes the harm, remains unchanged.


Many of the risks young people face online are not simply the result of their presence on digital platforms. They are often connected to how those platforms are designed. Algorithmic recommendation systems, engagement maximizing feedback loops, personalized content targeting, and other design features influence what users see, how long they remain engaged, and how information spreads. If these structural elements contribute to harm, restricting youth access does not address the root of the problem. It addresses the symptom rather than the cause.


Meaningful accountability for large technology companies would likely require examining platform design, transparency around algorithms, data-use practices, and the incentives driving engagement based business models. These conversations are more complex than simply setting an age limit. They require technical expertise, regulatory understanding, and careful policy design. However, they are also the conversations most likely to produce lasting change.


Recently, the United Kingdom made the decision not to proceed with proposed age gating legislation. In our view, this decision may serve as an important circuit breaker in what has increasingly begun to resemble what we like to call a policy contagion. Over the past several years, similar legislative proposals have begun appearing across multiple jurisdictions. Often, once one country introduces a policy framework, others follow quickly. In some cases, these policies spread faster than the evidence supporting their effectiveness.


The UK’s decision introduces something that has largely been missing from the global conversation, a pause. Much like a circuit breaker in an electrical system, the purpose is not to permanently shut down the system. Instead, it interrupts the flow long enough to prevent overload and allow people to examine what is actually happening.


In policy terms, that pause creates space for governments, researchers, educators, and families to ask important questions:


  • Are age-verification systems technically effective?


  • What privacy trade-offs do they require?


  • Will they meaningfully reduce harm to young people, or do they simply shift responsibility away from companies and onto families?


These are not small questions. They involve the privacy rights of citizens, the development of youth, the design of global online systems, and the responsibilities of governments to regulate industries that shape modern life.


There is a phrase sometimes used by some in policy discussions: “doing something imperfectly is better than doing nothing perfectly.” There is truth in that idea. Waiting indefinitely for perfect solutions can lead to inaction while real problems persist. However, when legislation may shape the digital environment for an entire generation of young people, we should be cautious about adopting solutions that primarily serve a political purpose rather than a practical one.


Legislation that appears strong but fails to address underlying issues can create the illusion of progress while leaving core problems untouched. It may even delay the development of more effective policy by allowing governments to claim that the issue has already been solved. Parents and caregivers deserve more than symbolic action.


The goal of legislation should not be to remove youth from the digital world. Today’s youth and teens are growing up in what many researchers describe as an “onlife” environment, where the boundaries between online and offline life are deeply interconnected. Digital tools support learning, creativity, communication, and social development.


The real objective should be ensuring that the systems shaping those experiences are designed with the well-being of young users at the forefront, which should involve policy approaches focused on:


  • transparency around algorithmic systems


  • stronger data protection for minors


  • accountability for harmful platform design practices


  • meaningful oversight of recommendation systems


  • clearer reporting and enforcement mechanisms when harm occurs


These types of policies focus on shaping the environment rather than simply restricting access to it.


The conversation about youth and technology often becomes emotionally charged, especially when concerns about safety, privacy, and mental health are involved. Parents and caregivers understandably want action. However, effective policy requires more than urgency; it requires careful thinking about what actually produces meaningful outcomes for young people.


If governments are going to regulate the digital systems shaping our children’s lives, the goal should not simply be to pass legislation quickly. The goal should be to pass legislation that genuinely improves the digital environments our children are growing up in. Because once those laws are in place, changing them is rarely easy, and that is why getting it right the first time matters.


We hope Canadian legislators take a close look at the United Kingdom’s recent decision and treat it as an opportunity to pause and reflect. Moments like this can provide valuable space to step back from momentum driven policymaking and consider whether the proposed solutions truly address the problems they are meant to solve.


Age gating legislation may appear to be a straightforward response to concerns about youth safety online, but simple solutions often struggle to address complex systems. The digital environments shaping young people’s lives are influenced by platform design, economic incentives, algorithmic recommendation systems, and the broader social context in which youth are growing up. Restricting access based solely on age does little to address these deeper structural issues.


Rather than rushing to implement legislation that focuses primarily on limiting youth participation, Canadian policymakers have an opportunity to pursue a more thoughtful and comprehensive approach. Effective legislation should focus on accountability for platform design, transparency in how algorithms operate, stronger privacy protections, and safeguards that reflect how children and adolescents actually develop and interact with technology.


Public policy works best when it is built carefully, with a clear understanding that once legislation is enacted it can be difficult to revise or undo. That reality makes it even more important to ensure that the laws being proposed today are not simply symbolic responses to public pressure, but meaningful frameworks capable of improving the digital environments that youth, teens, and adults all share.


The United Kingdom’s decision may serve as a useful signal that taking the time to get legislation right is not a failure of action. In many cases, it is the most responsible action a government can take. Canada now has an opportunity to follow that example by focusing on thoughtful, evidence-informed policy that addresses the real drivers of online risk rather than relying on solutions that may sound decisive but fail to create lasting change.



Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech



References


