From Digital Rabbit Holes to Digital Black Holes: How Algorithms Are Changing the Way Youth and Teens Access Content Online
- The White Hatter
- Apr 25
Updated: May 8

Caveat: We want to thank Rick Lane of Iggy Ventures LLC, who used the analogy of “black holes” to describe algorithms.
When the internet first began making its way into homes, the concern for many parents and caregivers was simple: “What if my child stumbles onto something they shouldn’t?” Back then, the online world felt like a vast but disorganized library. A youth or teen might be researching for a school project, click the wrong link, and suddenly find themselves on a webpage filled with content not meant for young eyes. It was often accidental, like tripping and falling down a rabbit hole.
Fast forward to today, and the onlife landscape looks very different. The internet is no longer a passive place waiting to be discovered; it’s now a dynamic, curated environment that seeks out youth and teens just as much as they seek it out. The path to inappropriate, misleading, or emotionally manipulative content is rarely accidental; it’s often by design, for financial profit.
That’s the difference between falling down a digital rabbit hole by accident, and being pulled into a digital black hole by design.
In the early days of youth and teen internet use, falling into inappropriate or problematic content was more a matter of bad luck or poor search filters. It wasn’t that platforms wanted youth and teens to find explicit material; it was more that protections simply hadn’t caught up.
However, today’s internet operates differently. Social media and video-sharing platforms run on extremely powerful algorithms designed not just to show content, but to keep users engaged for as long as possible, using techniques commonly known as “dark patterns”. (1) These algorithms learn what a person likes, how long they watch, what they scroll past, and what they linger on, and then, by design, they feed more of that to the user. The longer a youth or teen stays on a platform, the more personal data is gathered and then monetized to the benefit of these social media platforms. That’s their business model.
For youth and teens, especially those whose brains are still developing, this creates an environment that feels less like exploration and more like exploitation.
Unlike the rabbit hole, which you fall into by accident, a black hole pulls you in by design, with an immense gravitational force that becomes impossible to escape once you’re close enough. Algorithms function in much the same way. Once a youth or teen watches one video about extreme fitness, toxic beauty standards, conspiracy theories, or even mild sexualized content, the algorithm doesn’t just show them more of the same, it amplifies and intensifies the exposure.
It doesn’t ask whether the content is age-appropriate, balanced, or healthy. It only asks, “Will this keep them watching?”
What This All Means for Parents and Caregivers
As a parent or caregiver, this evolution from digital rabbit holes to digital black holes means we need to change how we think about digital literacy and online safety:
It’s not just about blocking content, it’s about understanding the mechanisms that surface it.
Many parents and caregivers rely on content filters and parental controls to keep their children safe online. While those tools can be helpful, they only address part of the issue. Today’s digital platforms don’t just passively host content; they actively deliver it through powerful algorithms designed to predict and influence behaviour. These algorithms aren’t filtering content based on what’s good or bad for your child, they’re surfacing content based on what keeps your child’s attention the longest. That means even if a platform is technically "safe," your child can still be nudged toward problematic patterns of content consumption. Understanding how and why content is being shown, rather than just blocking what’s visible, is a critical step in supporting your child’s digital well-being.
It’s not always about your child seeking something out, sometimes it’s about something seeking them out.
There’s a common misconception that if a youth or teen encounters harmful or inappropriate material online, it must be because they went looking for it. But in reality, today’s digital platforms are often the initiators of exposure. Through endless scrolling, autoplay features, and algorithmic recommendations, platforms are constantly pushing content toward users, even when they haven’t asked for it. This means your child can start watching an innocent video and, within minutes, be shown increasingly extreme or inappropriate material without ever having typed a single search query. The idea that kids “stumble across” content is outdated; more often than not, that content is being pushed directly to them.
It’s not enough to tell kids to “be careful”; they need tools to recognize when they’re being manipulated by technology designed to keep them hooked.
We often tell youth and teens to be cautious online, but vague warnings don’t prepare them for the very real psychological tactics baked into digital platforms. Today’s apps and websites are designed using principles from behavioural psychology, things like variable rewards, infinite scrolling, and notification triggers keep users coming back. Youth and teens aren’t just consuming content; they’re being targeted by systems that study their behaviour in real time. To truly protect youth and teens, we need to teach them how to recognize when they’re being nudged, manipulated, or emotionally triggered by digital design. Equipping them with this awareness gives them the power to make conscious, informed decisions instead of falling into patterns driven by tech that’s built to exploit their attention.
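For readers who want to see how simple the “variable reward” tactic really is, here is a toy sketch. This is not any platform’s real code; the app name, the 30% reward chance, and the function names are all made up for illustration. It only shows the pattern: the payoff for checking an app is unpredictable, which is the same reinforcement schedule that makes slot machines so compelling.

```python
import random

def open_app(rng):
    """Each app open either delivers something 'rewarding' or nothing at all."""
    return rng.random() < 0.3   # hypothetical ~30% chance of a rewarding hit

def count_rewards(opens, seed=42):
    """Count how many of `opens` checks paid off, with a fixed seed for repeatability."""
    rng = random.Random(seed)
    return sum(open_app(rng) for _ in range(opens))

# Because the reward is unpredictable, every single open *might* be the lucky
# one, and that uncertainty, not the content itself, is what builds the habit.
print(count_rewards(100))
```

The point to share with a teen is that the unpredictability is deliberate: a guaranteed reward (or a guaranteed nothing) would be far less habit-forming than a random one.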
Talk about algorithms
One of the most powerful tools a parent can provide their child is understanding. Start by explaining what an algorithm is in simple terms: it’s a set of rules that decides what shows up in their social media feed or on YouTube. These algorithms learn from what we watch, like, or scroll past, and they use that data to predict what we’ll want to see next. Let your child know that the internet is not random; it’s tailored, and often that tailoring is meant to keep them online for as long as possible, not necessarily to help or inform them. When youth and teens understand that their online experiences are being shaped by invisible systems, they’re more likely to approach digital content with a critical eye.
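If it helps to make this concrete for an older teen, the feedback loop described above can be sketched in a few lines of code. To be clear, this is a toy model, not how any real platform works; the categories, watch times, and scoring rule are invented for illustration. It simply shows how a feed that chases attention can lock onto a single interest after one lingering view.

```python
# Toy model of an engagement-driven feed: it scores categories by past watch
# time and always recommends the one that held attention longest.

CATEGORIES = ["fitness", "comedy", "news", "gaming", "beauty"]

def recommend(watch_history):
    """Pick the category with the most accumulated watch time so far."""
    totals = {c: 0.0 for c in CATEGORIES}
    for category, seconds_watched in watch_history:
        totals[category] += seconds_watched
    # Note what the feed optimizes for: predicted attention, not balance.
    return max(totals, key=totals.get)

def simulate(steps=10):
    """Show how one lingering view tilts every later recommendation."""
    history = [(c, 1.0) for c in CATEGORIES]   # start with a neutral mix
    history.append(("fitness", 30.0))          # one video held their attention
    shown = []
    for _ in range(steps):
        pick = recommend(history)
        shown.append(pick)
        history.append((pick, 20.0))           # each view reinforces the next pick
    return shown

print(simulate())  # every recommendation converges on "fitness"
```

Running this, all ten recommendations come back “fitness”: the loop never asks whether the content is healthy, only what held attention before, which is the narrowing effect worth naming for your child.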
Watch together
Spending time online with your child might not be your idea of a good time, but it can be an incredibly effective way to foster digital awareness. Watching a few TikToks, YouTube shorts, or even scrolling through an Instagram feed side by side can lead to natural, judgment-free conversations about what they’re seeing. Ask your child: “Why do you think this video was recommended?” or “How does this content make you feel?” The goal isn’t to criticize their interests but to encourage them to think about why certain types of content are being shown. This shared activity can also build trust, making it more likely that your child will come to you if they encounter something troubling online.
Don’t rely solely on parental controls
While parental controls are a helpful tool, especially for younger children, they are not a silver bullet. The problem isn’t always overtly explicit content; more often, it’s about exposure to repetition of ideas that shape values, beliefs, and behaviours over time. For instance, a child might repeatedly see videos glorifying extreme dieting, toxic masculinity, or get-rich-quick schemes. None of these are likely to be flagged by filters, but the pattern of exposure can still be harmful. That's why having regular, open conversations is far more effective in the long term. Your child needs guidance to recognize problematic content and question its intent, rather than simply being shielded from it.
Encourage variety
One of the best defences against algorithmic manipulation is content diversity. Encourage your child to follow a wide range of creators and topics such as educational channels, positive role models, content from different cultures, or even lighthearted humour that’s age-appropriate. The more varied their online diet, the less likely it is that they’ll get funnelled into an echo chamber of extreme or unhealthy content. Think of it like a balanced nutritional diet, but for their brain. This doesn’t mean banning certain content, but rather promoting a healthier mix so they have multiple perspectives and aren’t stuck in one digital lane.
Be involved without spying
It’s a delicate balance, but it’s crucial to be a presence in your child’s digital life without becoming a source of fear or resentment. Youth and teens are more likely to open up about their online experiences if they don’t feel like they’re being constantly monitored or judged. Let them know you’re there to help, not to punish. Create a space where they can talk about what they’re watching, who they’re following, and even the weird or uncomfortable stuff they might encounter. When a youth or teen trusts that their parent will listen first and react later, they’re much more likely to come forward when they need support.
We need to stop thinking of youth and teen online experiences as a series of accidental clicks. Today’s platforms are engineered to be captivating, and they’re very good at what they do. As parents, our role isn’t to fear the internet, but to help our kids understand and navigate it with awareness and critical thinking.
As the onlife world continues to evolve, so too must our approach to parenting within it. The shift from digital rabbit holes to algorithmic black holes represents a fundamental change in how youth and teens interact with content online. What was once accidental has now become intentional, and driven by technology designed to influence, predict, and profit from our attention. Simply telling youth and teens to "be careful" no longer equips them for what they're actually facing. Today’s youth and teens aren’t just navigating content; they’re navigating a system built to guide their clicks, shape their interests, and keep them scrolling.
This means our role as parents and caregivers can’t stop at installing parental controls or setting screen use limits. We need to be proactive co-navigators in their digital lives, helping them understand how algorithms work, what dark patterns look like, and why certain content appears more often than others. It’s about creating digital resilience, not digital paranoia.
By being present, asking questions, watching together, and fostering critical thinking, we help our youth and teens become active participants in their online experiences rather than passive consumers. We give them the tools to spot manipulation, to question intent, and to recognize when something is trying to pull them in too deeply.
Ultimately, our goal isn’t to scare our kids away from technology, but to prepare them for it, to shift the narrative from control to curiosity, from fear to understanding, or, as we like to say, to “pave the way”. When youth and teens understand the system, they can learn how to move through it on their own terms, with confidence, clarity, and intention.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References: