Big Tech, Parents, Caregivers, Legislation, & Real-World Harm: Would We Accept This in Any Other Industry?
- The White Hatter

Imagine walking into a hotel chain where reports show that child exploitation frequently happens in the rooms. Now imagine a shopping mall where sexual predators openly operate without being stopped. In any other consumer-facing industry, this would be a scandal. Regulators and law enforcement would step in, and businesses would face lawsuits, closures, and arrests.
Now consider this: these kinds of harms and crimes are happening online every day, on platforms run by some of the biggest tech companies in the world, and yet those same companies continue to operate with little to no accountability.
The Australian eSafety Commissioner recently stated:
“No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises.” (1)
The Commissioner is absolutely right. No other industry could get away with this.
If a hotel knowingly failed to prevent child abuse on its property, it would be shut down, and the owners would be arrested and sued civilly. If a taxi service allowed known sexual offenders to drive without background checks, it would face legal consequences. If a toy company marketed directly to kids while turning a blind eye to dangers in its supply chain, it would be blasted in headlines and sued.
In the tech world, however, the response is often different. Companies say they are working on it and offer vague promises about AI tools and moderation systems. Yet reports of grooming, sextortion, livestream abuse, and other forms of exploitation remain clear and present threats.
This double standard would never fly in any other industry. So why do we accept it here?
Let’s be clear: no platform can prevent 100% of harm. The issue isn’t just that harm occurs; it’s that some platforms are failing to take reasonable, proven steps to reduce that harm, despite having the money, talent, and influence to do so. Most are being willfully blind, while some are taking small steps but are not all in!
Consider this:
Platforms often allow underage users in violation of their own terms of service.
Reports of child abuse material can sit for days or weeks before being acted on.
Livestream features are still being abused for real-time exploitation.
Encryption tools are being rolled out without adequate safeguards to protect children.
At the same time, this is not just a big tech issue. Parents and caregivers have a crucial role to play given we are the first line of defence when it comes to our children’s safety both online and offline. (2) However, what happens when that line of defence is also willfully blind?
For more than a decade, there has been no shortage of public information highlighting the risks some youth face online, especially when they are given internet-connected devices that aren’t age or developmentally appropriate, with no supervision or limits. Still, we continue to see children handed fully functioning smartphones with no parental filters, monitoring, or boundaries in place. In doing so, some families are offering those who wish to do harm a direct line to their children.
Here’s an example to support that statement. Research in Australia by the eSafety Commissioner found (3):
Among children aged 8–12 who used social media, 36% had their own account, and 77% of those accounts were set up with help from a parent or caregiver.
Another 54% used social media through a parent’s or caregiver’s account.
As a parent or caregiver, when you give a child this kind of access to social media platforms, bad things can happen. In these circumstances, the consequences can be severe and, sadly, not all that surprising.
Many parents and caregivers in Canada may not realize that they can be held civilly liable for their children’s actions online. Parental responsibility and oversight online matters and is recognized in Canadian law. Some recommendations made by a Canadian law firm to decrease parent and caregiver liability include (4):
· Monitoring their child’s internet use;
· Educating their child about online ethics and laws;
· Using parental control software;
· Responding promptly to complaints or warnings;
· Documenting any efforts they make to supervise or intervene.
Without proper safeguards, some families may expose their children and themselves to greater risk. This isn’t about blame, it’s about parent and caregiver responsibility, and equipping them with the knowledge and tools they need. However, let’s be clear, lack of parental involvement does not excuse tech companies from doing their part. Platform negligence is still negligence.
We are watching closely what is happening in the UK, and soon in Australia, as they implement age-gating legislation in an attempt to protect youth. However, if we truly want to protect youth and teens online, age gating and other nanny legislation only target the symptoms, not the cause. It is our opinion that we need instead to enact “Corporate Accountability Legislation” (5), combined with digital literacy and internet safety education for both parents and their children. Legislation should:
Mandate Transparency
Tech platforms must be legally required to operate with full transparency when it comes to how they handle abuse-related content. This includes releasing regular, detailed reports that outline how many incidents of abuse or exploitation were flagged, what actions were taken, how quickly those actions occurred, and how often content was successfully removed or allowed to persist. Right now, the public is largely in the dark about how these systems work, or don’t. Transparency would not only create public pressure for improvement but would also allow researchers, advocates, and policymakers to identify gaps and push for meaningful reforms. If companies can show us how quickly they remove spam or copyrighted material, they can do the same for child protection.
Enforce Safety-by-Design
Protecting youth and teens online shouldn’t be something tech companies try to retrofit into existing systems once harm is already happening. Safety by design means building child protection measures into the platform from the ground up, before a single user signs on. This includes age-appropriate design features, effective content moderation tools, default privacy settings for minors, and real-time reporting mechanisms that are easy to access and use. Rather than reacting to abuse, companies should be expected to prevent it from happening in the first place. This isn’t a technological hurdle; it’s a prioritization issue. When profit is placed above protection, children are put at risk.
Impose Real Penalties
Accountability only works when there are meaningful consequences. If a company fails to address child exploitation on its platform, it shouldn’t be allowed to continue business as usual. Financial penalties should be significant enough to matter, not just a cost of doing business. For repeat or severe violations, governments should consider placing restrictions or temporary suspensions on platform operations until compliance is met. The message must be clear, if you provide a digital space where children are present, you are responsible for their safety. Failing to act should come with real and enforceable consequences.
Support Effective Legislation
Good policy is possible. We can create laws that compel tech companies to protect children without eroding the privacy rights of adults or stifling legitimate expression. Striking that balance takes careful, informed lawmaking, not reactionary bans or vague regulations. What is needed are clear standards for child safety, independent oversight bodies, and mechanisms for public reporting and appeals. Parents and advocates should support legislation that holds platforms accountable while ensuring that digital spaces remain open, inclusive, and respectful of fundamental rights. It’s not about choosing between safety and freedom, we can and must protect both.
This isn’t about banning technology or painting the internet as the enemy. It’s about holding tech companies to the same standard of responsibility we expect from any other business that serves the public, especially when children and teens are involved.
Parents and caregivers, your voice counts. You have every right to demand better, ask tough questions, and push back against the status quo of big tech, and support thoughtful legislation that protects both safety and privacy for all.
At the same time, parents and caregivers need to understand that they play the most direct and important role in keeping their children safer online. If we are the ones providing the tools and access, then it’s also our responsibility to make sure our kids are educated on how to use them, and that we stay actively involved in overseeing that use.
Parents need to parent, companies need to be held accountable, governments need to legislate, and collectively, as a society, we need to stop pretending that what happens online is somehow less serious than what happens offline. The stakes are just as high. Sometimes, they are even higher.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
4/ https://collettreadllp.com/know-your-rights/are-parents-liable-for-crimes-their-kids-commit-online/