
Two Emerging Patterns in Online & Offline Grooming and Exploitation We Have Witnessed

  • Writer: The White Hatter
  • 5 min read



As online rules change, so do exploitation tactics. Drawing from recent real-world cases we have been involved in, this article explores two emerging patterns we are seeing in both online and offline grooming. From the shift toward encrypted messaging, gaming spaces, and AI-driven platforms, to the growing use of older teens as recruiters, these trends challenge common assumptions about risk and reveal why literacy, conversation, and early trust matter more than ever.


Observation #1:


We are seeing, repeatedly and first-hand, a change in tactics surrounding online and offline grooming and exploitation. In the past week alone, we have assisted several families where the same tactics were being used.


For years, many online predators relied on mainstream social media and messaging apps to initiate and escalate contact. Platforms such as Snapchat, Instagram, and WhatsApp were commonly used to move conversations from public spaces into private, often disappearing message environments where grooming could continue with less visibility.


We are now observing a move away from these familiar platforms toward services that offer stronger end-to-end encryption, fewer moderation controls, and less transparency for parents, caregivers, and investigators. One messaging platform that appears repeatedly in recent disclosures and cases we have been involved in is “Telegram.” This pattern is also being seen by a trusted youth outreach worker we spoke with before writing this article, who works closely with young people at risk of exploitation.


From an exploitation perspective, this shift is logical. Fully encrypted messaging platforms make monitoring and reporting more difficult. Messages, images, and videos can be exchanged in ways that leave limited digital evidence. For families, this can mean fewer warning signs. For law enforcement, it often means longer, more complex investigations.


We also believe this shift is, in part, an unintended byproduct of age gating laws that focus primarily on large, well-known social media platforms. While this legislation may reduce exposure in certain spaces, it can also create gaps. Those gaps are increasingly being used to reach and connect with youth and teens in less regulated environments.


We are increasingly observing that initial contact is shifting toward platforms that sit outside many of the new age gating frameworks being introduced, or already implemented, in countries outside Canada. These include community based chat spaces such as Discord, multiplayer online games with integrated voice and text chat, and emerging AI driven applications like Sora. We suggest that this shift was predictable. When access becomes restricted in one space because of age gating legislation, those intent on harm simply redirect their efforts to environments where barriers are lower and youth presence remains high.


These environments are not inherently dangerous. Many youth and teens use them daily for play, creativity, learning, and social connection. The concern is not the platform itself, but how it can be misused as a gateway to exploitation.


In gaming spaces, group chats, and interest based communities, interactions often feel informal and low risk. Youth connect over shared games, hobbies, fandoms, or creative interests. Defences are naturally lower in these contexts. Conversations feel organic and relationships develop slowly.


By the time a young person is invited to continue a conversation on an encrypted messaging app, the interaction may already feel familiar and safe to them. From the outside, the move can look sudden. From the youth’s perspective, it can feel like a natural next step with someone they already trust. This staged approach makes detection harder and reinforces why focusing only on platform access misses a key part of the picture.


Observation #2:


Another troubling pattern we are seeing involves the recruitment of youth into the commercial sex trade beginning offline and later moving online through subscription based platforms. What often starts as in-person exposure or relationship building is then normalized, monetized, and scaled in digital spaces.


Rather than targeting younger teens directly, sexual exploiters are increasingly focusing on older teens first. These youth may be groomed, pressured, or manipulated into participating, and over time may be encouraged to act as recruiters themselves. Once involved, they are often pushed to introduce younger teens to the exploiter, frequently framed as friendship, mentorship, or a low risk way to make money.


This shift challenges many common assumptions parents and caregivers hold about what exploitation looks like. Today, it does not always involve a stranger or an obvious controller. It may involve a peer, a trusted friend, or an older teen who appears confident and successful. In some cases, that teen is not simply an intermediary but is actively involved in recruiting others into the sex trade and controlling them.


For some teen girls, this pathway is presented as a lifestyle rather than exploitation, one that promises attention, validation, and financial reward while downplaying the long term risks (1). Understanding how these recruitment patterns operate is critical for parents and caregivers who want to recognize early warning signs and have informed, protective conversations with their children before harm escalates.


At The White Hatter, our experience continues to show that online safety is not achieved by chasing the latest platform or app. It is built through digital literacy, context, and ongoing, age appropriate conversations with young people.


What we are seeing reinforces an important reality for families. When regulations or platform rules change, those who intend harm adapt quickly. Risk does not disappear; it shifts. Encrypted messaging apps are typically the end point of a grooming process, not the beginning. Initial contact often happens in gaming environments, group chats, or newer AI driven spaces that feel familiar and low risk to youth, and therefore attract less adult attention.


This is particularly concerning when it comes to sexual exploitation. In some cases, older teens are being targeted and manipulated, not only as victims themselves, but as a means to reach younger youth. These dynamics highlight why conversations about online safety must focus on behaviour, boundaries, and critical thinking, rather than relying solely on platform names, age limits, or bans.


Age gating laws may reduce certain risks, but they are not a complete solution. We believe, and early evidence is now showing, that they can also unintentionally reshape where and how harm occurs, pushing risky behaviour and exploitative tactics into spaces that receive less public attention and oversight. When access to one platform is restricted, those who intend harm do not stop. They adapt, often moving into private messaging, gaming environments, group chats, or newer digital spaces that fall outside many regulatory frameworks.


This is why protection cannot rely on legislation alone. Laws are not the cure-all many parents and caregivers believe them to be. Real safety comes from helping young people understand how grooming works, including how trust is built slowly, how boundaries are tested, and how manipulation can be disguised as friendship, support, or opportunity. Youth who can recognize these patterns are better equipped to pause, question, and seek help when something feels off.


Equally important is ensuring that young people feel safe bringing concerns to trusted adults early. Fear of punishment, device removal, or shame and embarrassment, especially when it comes to sexual exploitation, can delay disclosure and allow situations to escalate. Ongoing, non-judgmental conversations give youth the confidence to speak up sooner, when intervention is most effective. As the digital landscape continues to evolve, informed guidance and open communication remain the most reliable forms of protection for families.


Related Article: “When Missing Youth are not Just Missing: Exploitation Comes In Many Forms”




Digital Food For Thought


The White Hatter


Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech



References:


