When Innocent Images and Video Are Turned Into Sexualized Content
- The White Hatter

CAVEAT - this article includes some sexualized language that some may find offensive or disturbing.
Parents and caregivers are often told to watch for obvious warning signs online, such as explicit content, risky behaviour, and direct contact from people they have never met face-to-face. What receives far less attention is how ordinary, age-appropriate images of youth and teens are being taken and repurposed in sexualized ways without their knowledge or consent. This is not a theoretical concern; it's something teens themselves have been reporting to us consistently for years.
In 2020, we wrote about how a teen follower contacted us to describe what they were seeing in underground online spaces sometimes referred to by youth as “TikTok porn groups.” These were not hosted on TikTok. Instead, they existed in hidden forums such as subreddits and private messaging channels.
The process was consistent:
TikTok videos created by youth were downloaded without consent and hypersexualized
The videos were paired, using what is commonly known as a "duet," with separate, faceless footage of someone masturbating
The combined content was shared privately as pornified material
"TikTok Duet"

In most cases, the youth in the original videos were completely unaware their content had been taken. The original posts were not sexual. The harm came from how they were morphed and reused.
At the time, we confirmed that this content existed and that the behaviour was ongoing, though largely confined to underground spaces.
Last year we wrote about how we assisted two teens whose bikini photos from their Instagram accounts were screen captured and altered using AI-powered deepfake “nudification” apps (1). The resulting fake nude images were then used for sextortion.

This is a critical shift for parents and caregivers to understand. Artificial intelligence has removed many of the technical barriers that once limited image manipulation. Today, altering an image no longer requires advanced skills; it often requires only an app and intent. The teens involved did not create explicit material; someone else did, using their likeness.
Last week, an older teen alerted us to a trend circulating among some peer groups that they refer to as “cum tributes” or “face targets.”
We should also note that this trend was independently raised by an outreach worker we spoke with recently, someone we deeply respect and who works directly with at-risk youth, further confirming that this is not an isolated or unknown issue.
While the language may be unfamiliar to parents and caregivers, the behaviour follows a recognizable pattern:
A still image or screen recording of a teen girl is taken from a self-published TikTok video, Instagram account, or another social media platform. This also includes pictures and videos of children posted by parents and caregivers.
The teen girl’s username is often visible.
Someone records themselves anonymously masturbating to that image and ejaculating onto the picture or video.
The video is shared privately through apps such as Signal or Telegram.
This content can be posted publicly or circulated in private or encrypted spaces, making detection and reporting more difficult. Here are two pictures, which we have heavily blurred, that we found on TikTok within seconds of searching.
"Cum Tribute"

"Face Target"

When we verified what this teen shared, we found that this behaviour has already been investigated and documented in mainstream media, including an in-depth article by Glamour Magazine (2) that, in our view, did not receive the attention it deserved.
This reinforces what the teen reported to us and aligns with documented patterns of exploitation. It is also important to clarify that the teen who raised this did not indicate that peers at his own school were creating this content, only that videos of this nature are not uncommon to encounter, or to see being shared, on many popular social media platforms.
A critical point for parents and caregivers is this: an image does not have to be sexual to be sexualized. A swimsuit photo, dance video, casual selfie, or a "get ready with me" clip can be copied, misused, and reposted publicly. This also includes pictures of their kids that parents and caregivers post on their own social media platforms. As an example, here's a picture, which we have blurred, of two young teens in bikinis that was posted to a parent's social media feed and then commented on by a participant in that feed.

This issue sits at the point where everyday, socially accepted online sharing is taken out of context and used as a tool for harm. The exploitation does not stem from what a young person posted, but from how others chose to misuse it. Addressing this reality requires informed adults, empowered youth, and honest, ongoing conversations about some of the challenges of today's onlife world.
At The White Hatter, we do not frame digital safety as risk elimination. We frame it as risk reduction through awareness, preparation, and strong parental relationships. If an image or video is posted, it can be copied, altered, and shared far beyond its original audience. Pretending otherwise does not protect young people. Honest, informed conversation does.
This is why youth, teens, and even parents and caregivers need to pause and think more critically about the types of images and videos they share. Not because they have done anything wrong, but because the online environment no longer guarantees context, consent, or control once content leaves their device, especially now, given AI. A photo that feels harmless today can be stripped of its original meaning tomorrow. A video meant for friends can be saved, screen recorded, or manipulated by someone the creator has never met, whether a pedophile or a peer, and sometimes by those they do know, love, and trust.
Reconsidering what to post is not about fear or restriction; it's about understanding how images can be repurposed in ways that have nothing to do with intent and everything to do with opportunity. In an onlife world shaped by anonymity, private sharing spaces, and AI tools, visibility itself can carry risk.
When youth and teens understand that reality, they are better equipped to make informed choices. When parents and caregivers approach these conversations with calm, clarity, and respect, youth are more likely to listen and engage. Digital safety is not built through silence or control. It is built through shared understanding, thoughtful decision making, and relationships strong enough to navigate uncomfortable truths together.
Knowledge, and the understanding and application of that knowledge, is power!
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References:
1/ https://www.thewhitehatter.ca/post/deepnudes-undressing-ai-generated-intimate-image-abuse-material














