
When Governments Gain Influence Over Online Platforms, Parents Should Pay Attention

Writer: The White Hatter


Caveat: Parents and caregivers are increasingly calling for stronger government oversight and well-crafted legislation to hold large technology companies accountable. That is a position we share at The White Hatter. Meaningful regulation, when done well, can address real harms and push platforms toward safer design choices. At the same time, oversight that happens behind closed doors carries its own risks. When decisions about data control, content moderation, and platform governance are made without transparency or public scrutiny, unintended consequences can follow. We may be seeing early signs of that dynamic emerging around the world. That is why we felt it was important to write this article and to invite parents and caregivers into a broader conversation about how accountability, transparency, and democratic oversight must go hand in hand when it comes to our kids' future.



Parents and caregivers are often told that the greatest risk to their children online comes from strangers, predators, or harmful content. Those risks are real and deserve attention. However, there is another issue that receives far less discussion, yet has profound implications for families, youth, and democratic societies. That issue is who controls the digital platforms where information flows.


When governments gain direct or indirect control over social media platforms, even through intermediaries or proxy arrangements, they gain the ability to influence what is seen, what is amplified, and what quietly disappears. This is not a hypothetical concern; it is something we are watching unfold in real time.


In the United States, TikTok’s operations recently shifted into a majority American-owned joint venture after federal pressure required the platform to separate from its previous Chinese ownership or face an outright ban. Under this new arrangement, U.S. user data is stored on American servers, and decision-making authority over trust, safety, and content moderation now sits with the new U.S.-controlled entity, whose owners align very closely with the current U.S. administration.


From a privacy and national security standpoint, many parents and caregivers understandably see this as a positive step. Data sovereignty matters. Foreign state access to youth data should concern all of us.


At the same time, however, this change introduces a different question that deserves equal scrutiny. When a government has close ties to the entities controlling moderation policies, it gains influence over what narratives are allowed to circulate freely and which may face friction, suppression, or delay.


In the days following the TikTok transition, some users reported difficulty uploading or sharing certain types of politically sensitive content, including videos involving immigration enforcement activity. Whether the cause was technical, procedural, or policy-based, these moments highlight how fragile open information ecosystems can be when power is centralized.


The key point for parents and caregivers is not whether a specific claim is true or false. The point is understanding that platform control always comes with narrative influence. As our friend and colleague Matthew Johnson with MediaSmarts Canada says, “Media have social and political implications.”


History offers sobering examples of how governments act when they want to control information. During periods of civil unrest, authoritarian regimes have repeatedly restricted or shut down internet access entirely to prevent images, videos, and first-hand accounts from reaching the outside world. Iran’s use of internet blackouts during recent protests is one of the clearest modern examples. By limiting access, authorities were able to reduce global visibility into events on the ground and disrupt the ability of citizens to organize, document, and communicate.


These tactics are not limited to any one country or ideology. The mechanism is always the same: if you control the platform, you control the flow of information. This is something that for-profit big tech social media companies also understand and use to their advantage. Democratic societies are not immune to this risk. The tools may be subtler, but the leverage exists all the same.


For parents and caregivers, this issue goes beyond politics. It touches core digital literacy principles that affect how young people understand the world. When platforms quietly shape visibility, youth and teens may believe they are seeing an unfiltered reality when they are not, especially now that AI increasingly shapes what appears in their feeds. This can influence opinions, trust in institutions, and civic engagement without users ever realizing it is happening.


Young people are especially vulnerable to this because they tend to place a high level of trust in the platforms they use daily as their news source. If something does not appear in their feed, many assume it simply is not important or did not happen. This is why teaching critical thinking, source evaluation, and an understanding of how platforms operate is just as important as discussing online safety risks.


At The White Hatter, we often remind families that protection and control are not the same thing. Strong safety-by-design standards, transparency requirements, and independent oversight can protect users without turning platforms into narrative gatekeepers. When protection shifts into control, accountability becomes harder to trace and challenge.


Parents and caregivers should be cautious of any solution that promises safety by concentrating power, whether that power sits with corporations, foreign governments, or domestic ones.


Technology itself is not the enemy, and neither is regulation. The real risk lies in unexamined power. When any government, through direct ownership or proxy influence, gains leverage over the digital spaces where billions of people communicate, parents and caregivers should pause and pay attention.


As we write this article, we are reminded of reading George Orwell’s 1984 in high school. At the time, many of us questioned why it was required reading and what relevance it really had. Looking back now, the reason is much clearer. The book was never just about the past or fiction. It was meant to help us recognize how power, control of information, and the shaping of narratives can quietly take hold if we stop paying attention. The central thesis of Orwell’s 1984 is that unchecked state power, when combined with mass surveillance, control of information, and psychological manipulation, destroys individual freedom and the ability to think independently.


Orwell argues that control over truth is the most powerful form of control. By rewriting history, limiting language, and monitoring behaviour, a totalitarian government can shape what people believe, remember, and even perceive as reality. In this world, truth becomes whatever those in power say it is.


At its core, 1984 was a warning. It shows how societies can slide into oppression not only through force, but through fear, compliance, and the gradual normalization of control over information and thought.


This warning becomes especially relevant when governments gain influence over the social media platforms that shape how information moves today.


In 1984, control did not rely solely on violence or visible force. It relied on people slowly adjusting to limits on what could be said, shared, or even remembered. Over time, those limits became normal. When information was missing, altered, or discouraged, citizens learned to stop questioning it.


Social media now plays the role that state-controlled media once did in Orwell’s world. It is where people learn about current events, form opinions, organize, and express dissent. When governments influence how these platforms operate, even indirectly, they gain the ability to shape public perception without ever issuing a formal order.


The danger is rarely a single dramatic act; it is the quiet accumulation of small changes: content that becomes harder to find, posts that are deprioritized or shadow banned, topics that trigger extra friction or delay. Over time, users adapt, share less, question less, and assume that what they see represents the full picture.


For families and young people, this matters deeply. If control over information becomes normalized, critical thinking erodes. Truth begins to feel subjective, and silence is mistaken for safety. This is exactly the dynamic Orwell warned about. Oppression does not always arrive loudly; sometimes it arrives disguised as protection, convenience, or order.


That is why transparency, independent oversight, and public accountability are essential when governments intersect with social media. The goal should never be control of thought, but protection of people, because history, past and present, shows us what happens when that line becomes blurred.





Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech 
