
Conflicts of Interest in Digital Safety: Scrutiny Should Go Both Ways

  • Writer: The White Hatter
  • 4 min read



Recently, Irish digital literacy expert Bianca Garibaldi wrote a thoughtful post about conflicts of interest within youth and teen safety organizations, particularly when those organizations partner with the very social media platforms they are expected to scrutinize.


In her words:


“Personally, I can only place real trust in organisations that maintain true independence — those that do not attend platform-sponsored events and do not rely financially on the same social media companies they should be scrutinising.”


We agree with that position. Independence and transparency matter, and credibility depends on both.


Where this conversation needs to go further is in recognizing that conflicts of interest do not only exist in one direction.


We are now seeing a rapidly growing economy built around age-gating legislation and school cellphone bans. That growth has created new financial incentives, new commercial relationships, and new influence pathways that deserve the same level of scrutiny being applied to NGOs, or even to PhDs who receive research funding from large technology companies. Questions about limitations and funding should be applied to everyone, not just one side or the other.


If conflicts of interest are relevant in one corner of this debate, they are relevant everywhere. Consistency matters if accountability is the goal.


When governments and school districts began moving toward cellphone bans and age-based access restrictions, the public framing focused almost entirely on child protection. That framing is powerful. Once a policy is positioned as protecting children, it often bypasses deeper questions about cost, effectiveness, governance, and unintended consequences. What followed was predictable: a policy vacuum created a market opportunity.


Prohibitions and age gating laws do not enforce themselves. Schools, platforms, and governments still need systems to determine who is allowed, who is not, and how compliance is monitored. That requirement immediately created demand for commercial products and services such as:


  • Age verification software


  • Biometric and facial analysis tools


  • Identity management and data storage systems


  • Third-party compliance vendors


  • Monitoring software and enforcement infrastructure


  • Physical phone restriction tools such as locking pouches and storage systems


These are not public services. They are commercial products, often sold on subscription or contract models. The tighter and broader the regulation, the larger and more durable the market becomes.


School phone bans are often framed as removing technology from classrooms. In practice, they rarely do that. Instead, they replace one set of devices with an ecosystem of paid enforcement solutions.


School districts that prohibit phones now routinely purchase:


  • Locking pouches or magnet-sealed cases


  • Phone lockers and charging cabinets


  • Monitoring and enforcement hardware


  • Staff training programs and implementation contracts


  • Discipline and compliance tracking software


None of this is cost free. Each ban requires physical infrastructure, ongoing maintenance, and administrative oversight. Vendors actively market these products directly to schools, often framing bans as incomplete or unworkable without their proprietary solutions.


It is important to be clear here. Some implementation costs are unavoidable in any policy decision. Schools need systems to function. The concern is not that money is spent, but how financial incentives begin to shape narratives about what solutions are presented as necessary, urgent, or inevitable.


This is why understanding conflicts of interest often goes beyond simple funding. However, not all conflicts of interest are the same. In governance and ethics frameworks, distinctions matter.


There is a difference between:


  • Financial dependency, where income relies directly on a specific policy outcome


  • Incentive alignment, where professional success or business growth benefits from certain narratives prevailing


  • Reputational or access-based incentives, such as visibility, speaking opportunities, or influence tied to a particular stance


None of these are inherently unethical. Many professionals operate in mixed environments where expertise, compensation, and advocacy overlap. The issue is not the existence of incentives, but whether they are acknowledged and whether scrutiny is applied consistently.


Concern for children is real, and online harm exists. Families, educators, and young people experience it every day. Many people working in digital literacy and online safety are thoughtful, ethical, and deeply committed to reducing risk. At the same time, when protection becomes a market, incentives inevitably shift.


Messages that are simple, emotionally compelling, and easily packaged tend to travel faster. Nuanced discussions about platform design, algorithmic accountability, developmental differences, and evidence-based harm reduction are harder to sell and less profitable. Fear, certainty, and simplicity convert better than complexity. This does not mean motives are bad; it means incentives matter.


In medicine, research, education policy, and public procurement, conflicts of interest are evaluated using consistent standards. Funding sources, commercial relationships, and secondary benefits are disclosed because they help decision-makers contextualize recommendations.


Selective scrutiny undermines trust. When financial ties are treated as disqualifying in one part of the digital literacy and internet safety ecosystem but ignored in another, credibility erodes. This is not about fairness for its own sake. It is about evidence weighting, policy integrity, and public confidence.


Age gating requirements and school cellphone bans did more than change rules. They reshaped who benefits financially from youth safety policies and how influence is distributed across the ecosystem.


Parents and policymakers are often told these measures are neutral or cost-free. They are not. Every restriction creates demand. Every demand creates vendors. Every vendor introduces incentives that shape which solutions are amplified and which are sidelined.


Recognizing that economic layer does not weaken child protection efforts. It strengthens them by allowing decisions to be made with clearer eyes.


Accountability must apply everywhere, including ourselves


If we are serious about accountability, scrutiny cannot stop with NGOs or researchers whose funding comes from big tech. The same expectations must apply across the entire digital literacy and online safety landscape, including for-profit companies, non-profits, coalitions, consultants, and individual experts who shape public opinion and policy.


That means being willing to examine financial relationships, sponsorships, paid partnerships, product affiliations, and indirect benefits wherever they exist. Not selectively, and not only when it is politically convenient. Transparency does not undermine good work; it strengthens it.


As part of that standard, we believe disclosure should be normalized. For clarity, we do not receive payment, sponsorship, or gifts in kind for promoting any platform, product, or restriction-based solution here at the White Hatter. We believe that level of transparency should be routine across this space.


Balanced oversight does not weaken child protection; it is one of the few ways to ensure that policies and education remain grounded in evidence rather than influence, and that trust is built through consistency rather than assumption.


Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
