
The “Doing Something Is Better Than Doing Nothing” Trap in Technology and Social Media Legislation

  • Writer: The White Hatter
  • 5 min read

Supporters of youth age gating laws often begin with a comparison that feels immediately persuasive. They draw a direct line between tobacco regulation and youth cellphone or social media use. Phones are framed as addictive, and social media platforms are framed as dangerous. The conclusion feels intuitive. If cigarettes required regulation to protect young people, then social media should be treated the same way. 


We understand why this argument resonates. Parents and caregivers feel overwhelmed, educators are concerned, and legislators are under pressure to respond to real stories of harm. Developmental research shows that adolescents are still building impulse control, emotional regulation, and judgment. At the same time, technology companies profit from engagement systems that were never designed with youth or teen well-being as the primary goal. In that context, calls for clear age limits can feel not only reasonable, but necessary.


We agree on one core point: technology and social media companies need regulation and accountability. Where we diverge is on the assumption that age based access bans, on their own, meaningfully address the sources of harm.


At The White Hatter, we spend a great deal of time slowing these conversations down, not because harm does not exist, and not because urgency is misplaced, but because history shows that fast, symbolic solutions built on simplified logic often create new problems while leaving deeper ones untouched. This is something Darren saw all the time in policing, around issues such as homelessness and drug addiction.


To explain why, we need to look closely at the analogy often used to justify age bans: smoking.


For decades, the dangers of smoking were well known. Public pressure mounted, governments were expected to act, and tobacco companies were forced to respond. What emerged, after hearing upon hearing of government oversight, was a compromise between legislators and the tobacco industry that appeared decisive without fundamentally changing the system. Cigarettes were rebranded as “light,” “low tar,” or “mild.” On paper, this looked like progress, and for politicians, a “win” in the eyes of their constituents. In practice, it delayed meaningful reform and cost lives. These products did not reduce harm, they reshaped perception.


Smokers believed they were making safer choices, and many postponed quitting altogether. The real drivers of harm remained intact: nicotine addiction, engineered delivery systems, and aggressive marketing. Regulation focused on surface characteristics while the core business model continued unchanged.


Human behaviour surrounding smoking adapted in predictable ways. People inhaled more deeply or smoked more cigarettes to achieve the same effect. The risk did not disappear, it simply travelled a different path and continued to place stress on the healthcare system and take lives.


Most importantly, once a partial solution existed, urgency faded. The public was reassured that the problem was being handled, which meant stronger, more comprehensive measures were delayed for years.


So why does this matter for youth age gating laws?


Age based bans on social media access follow a similar pattern. They feel decisive, they signal concern, and they allow governments to tell their constituents that something has been done. But signalling is not the same as protecting.


Supporters of age gating often argue that even imperfect measures are better than none, and that argument deserves to be taken seriously. When youth and teens are being harmed, doing nothing is not acceptable.


The problem is that age bans do not operate in a vacuum. Youth and teens who are motivated to connect do not simply stop connecting when barriers appear, they adapt. We already see this happening in real time in countries where age bans have been implemented:


  • Teen migration to private group chats and encrypted platforms


  • Use of shared accounts or adult credentials


  • False birthdates becoming normalized


  • Use of VPNs and other workarounds


  • Risk shifting into spaces with weaker moderation and fewer safeguards (1).


The danger does not disappear, it just becomes less visible, harder for parents and caregivers to see, and harder for platforms and educators to support.


One of the most concerning side effects of age gating legislation, in our opinion, is the sense of closure it can create, a false sense of safety.


When a law is framed as “protecting kids,” it can unintentionally signal to families that the problem has been solved upstream, and that perception matters. Many of the most serious harms we encounter do not originate on public feeds. They occur through private messaging, peer networks, compromised accounts, and off-platform interactions that escape most of the age gating legislation being adopted. An age cutoff does not meaningfully address those realities.


Just as “light” cigarettes did not make smoking safe, age bans do not make online spaces safe. They mainly make risk easier to overlook and easier to rationalize.


Another parallel worth examining is where responsibility ends up. With “light” cigarettes, responsibility quietly shifted to the consumer, where the messaging was to choose the safer option and manage your own risk.


With age gating laws, some say responsibility shifts to the companies. We would argue that the shift is in fact bidirectional, landing on families and youth as well: prove your age, navigate the system correctly, and stay out of trouble.


Meanwhile, the structural issues that shape risk remain largely untouched, such as:


  • Engagement driven algorithms


  • Dark patterns designed to maximize time on platform


  • Weak default privacy settings


  • Inconsistent reporting tools


  • Poor or no moderation specific to issues surrounding predation/exploitation, pornography, child sexual abuse material, and graphic violence, to name a few.


These are design choices, not youth failures. Effective policy must address them directly.


Sixteen is not a switch that flips maturity on overnight. Some younger teens have strong judgment, supportive adults, and healthy boundaries. Some older teens do not. Blanket bans flatten these differences and remove space for guided, supervised participation. Such bans also remove parental choice as to what they think is in the best interest of their individual child’s development and needs (2).


We recognize the urgency driving these conversations. Parents and caregivers are asking for help now, and legislators are under pressure to do something now. But urgency should raise the bar for policy, not lower it. History shows that symbolic action can delay effective action.


Real progress in tobacco regulation did not come from partial fixes that felt reassuring. It came when policy finally addressed the full system such as marketing restrictions, product transparency, clear warnings, public education, accountability tied to evidence, and yes, age limits as one component among many.


Youth online safety deserves the same seriousness. We at The White Hatter have published an article on what we believe Canadian legislation should look like and why (3)(4).


We do not need rushed legislation designed to signal action rather than deliver results. Laws built for optics may offer short term reassurance and political wins, but they often replace thoughtful problem solving with emotional comfort. When frustration or fear drives policy, complexity is flattened, unintended consequences are ignored, and the root causes of harm remain untouched.


What is needed instead is legislation grounded in evidence and informed by how these systems actually function. Effective policy must focus on the absence of safety by design principles and the lack of meaningful accountability for technology and social media vendors. Without addressing product design, algorithmic incentives, and enforcement mechanisms, new laws risk shifting harm rather than reducing it. Real protection comes from holding companies responsible for the environments they build, not from symbolic measures that make adults feel safer without making youth safer.


We get it, doing something feels good. However, doing the right thing to hold technology and social media companies accountable actually protects kids.



Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech



References:








