- Why Google Search Is Not Designed for Youth and Pre-Teens
Google is an incredibly powerful tool, arguably one of the most influential inventions since its launch in 1998. With a few keystrokes, it grants access to the vast expanse of human knowledge. But while Google is designed to serve billions of users worldwide, it is not designed with young children in mind. Many parents and caregivers assume that because Google is widely used in education and by adults for everyday tasks, it must be safe for youth and pre-teens. However, that assumption overlooks a fundamental reality: Google is a search engine built for an adult world, not a child-friendly one.

Google does not inherently block violent, sexual, or otherwise inappropriate content from search results. While Google SafeSearch settings exist, they are not foolproof and can be easily turned off by tech-savvy kids. Even with SafeSearch enabled, children can still encounter disturbing content, misinformation, or extremist views that they may not have the maturity to process. A simple misspelling or an innocent search query can lead to explicit results. For example, a child searching for "cute kitten videos" may eventually stumble upon content that is not at all related to kittens, but rather a porn site. The internet is unpredictable, and Google's job is to find results, not to determine what is developmentally appropriate.

There are two kid-friendly search engines that we have tested that we believe can help reduce the risk of accessing inappropriate content:

#1: Kiddle https://www.kiddle.co/
#2: WackySafe https://wackysafe.com/

NOTE - both platforms use cookies to personalize content and ads, and to analyze their traffic. They also share information about your use of their site with their advertising and analytics partners, who may combine it with other information you've provided to them or that they've collected from your use of their services. If you wish to opt out of Google cookies, you may do so by visiting the Google privacy policy page.
While Google can be an invaluable resource for older teens and adults, it is not inherently designed with children in mind. Assuming it is safe for young users without safeguards can expose them to inappropriate content, misinformation, or harmful material. Instead of outright banning access, parents and caregivers should take an active role in guiding their children's online experiences. By using kid-friendly search engines, enabling parental controls, and fostering digital literacy, families can create a safer and more educational online environment. A balanced approach, combining supervision, education, and age-appropriate tools, will empower children to navigate the internet responsibly while minimizing risks.

Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
- The Hidden Business of Social Media: How Ads Target Teens
Social media (SM) platforms are often perceived as free services that allow users to connect, share, and engage with content. However, beneath the surface, these platforms operate as sophisticated attention and data marketplaces. Their real business is not about providing a service but rather brokering user attention and data to advertisers who want to market products, often in highly targeted ways, including to teenagers geolocated near schools.

Unlike traditional businesses that sell tangible products, SM platforms generate revenue by capturing and holding user attention. Every time someone scrolls, watches a video, or engages with a post, they contribute to the platform's most valuable asset: their attention. The longer users stay engaged, the more data the platform collects, and the more advertising opportunities it can offer to businesses.

To maximize profits, social media companies collect vast amounts of user data. Every post, like, comment, and share helps these platforms build detailed user profiles. These profiles allow SM companies to segment users into specific groups based on their interests, behaviours, and demographics. For instance, a person who frequently engages with fitness content might be placed in a segment for health and wellness advertisers.

Data brokers and social media platforms can also use geolocation tracking to refine their ad targeting. If a teen has location services enabled on their smartphone, social media platforms and third-party data aggregators can detect frequent visits to specific places, including schools. Advertisers, such as fast-food chains and snack companies, can then purchase access to these location-based audience segments and deliver targeted ads to teens who are in or near school zones. (1)

Social media apps use geolocation tracking to monitor users' real-time and historical movements through a combination of GPS, Wi-Fi networks, and Bluetooth beacons.
(2) GPS tracking provides precise location data, while Wi-Fi networks help determine proximity to specific places, such as a school or a store. Bluetooth beacons, which are often used in malls, stadiums, and retail locations, can detect when a user is near a specific product or display. This means that even if a teen isn't actively using an app, their location can still be tracked. For example, if a teen visits a shopping mall every Friday evening, their phone may passively send this data to the app, allowing advertisers to recognize patterns in their behaviour and target them accordingly.

The data collected through geolocation tracking is often sold to data brokers, which are companies that compile, analyze, and sell consumer data to businesses for advertising purposes. These brokers aggregate information from multiple sources, including social media activity, app usage, and even purchase history, to create detailed user profiles. This data allows advertisers to segment users based on their habits, routines, and preferences. For instance, if a teen frequently visits a gaming arcade, a data broker might sell this information to gaming companies, which can then target the teen with ads for the latest video games or gaming accessories. In some cases, this data is so detailed that advertisers can even determine the times when users are most likely to be in a particular location, ensuring that ads are shown at moments when they are most relevant.

In addition to location tracking, social media platforms use behavioural targeting to refine the ads users see based on their interests and interactions. This means that the content a teen engages with, such as liking posts, following certain accounts, or watching specific videos, helps determine the types of ads they will be shown.
For example, if a teen frequently watches videos about baking, follows dessert recipe pages, and comments on posts about new snack products, the platform's algorithm may classify them as someone interested in food and beverages. As a result, they might start seeing ads for new snack brands, local bakeries, or limited-time promotions from fast-food chains. The more they engage with related content, the stronger the association becomes, making their ad experience even more personalized.

When it comes to ad delivery, social media platforms use a combination of geolocation and behavioural insights to determine when and where an ad appears. This means that ads are not just based on interests but also on a user's physical location at a given time. If a teen opens their social media app during school hours, for example, they might see ads tailored to their environment, such as promotions for nearby cafés offering student discounts. Similarly, if they are near a movie theatre on a Friday night, they might receive ads for new movie releases or concession stand deals. This hyper-targeted approach ensures that ads reach users when they are most likely to act on them. For instance, if a teen walks by a fast-food restaurant and receives a push notification offering a 20% discount on their favourite meal, the combination of convenience and relevance increases the likelihood that they will visit the restaurant. This kind of marketing is not just theoretical, it has already happened. For example, McDonald's has previously used Snapchat geo-filters to target teens near high schools with promotional deals.

So why does this matter for parents, caregivers, and educators? The ability of advertisers to target young users raises ethical concerns, particularly when it comes to food advertising. Research has shown that teens are highly susceptible to marketing influences, and targeted ads for unhealthy food can contribute to poor dietary habits.
(3) Furthermore, the passive nature of data collection means that many teens (and their parents or caregivers) are unaware of how much personal information is being used for marketing purposes.

Parents and educators play a crucial role in helping teens navigate the digital landscape, starting with teaching digital literacy. Social media platforms are not just spaces for connection and entertainment, they are also powerful marketplaces for attention and data. Teens need to understand that the content they engage with, from the videos they watch to the posts they like, is being analyzed to shape their online experience. By explaining how algorithms prioritize certain content and how their engagement fuels advertising strategies, parents and educators can equip teens with the skills to critically evaluate what they see online. For example, discussing how a simple search for a pair of sneakers can lead to a flood of shoe ads across multiple platforms can help teens recognize the mechanisms at play.

One of the most effective ways to reduce ad targeting is by managing geolocation settings. Many teens may not realize that their location is being tracked even when they aren't actively using an app. Parents and educators can guide them through the process of disabling location tracking on social media platforms and other apps to minimize data collection. This can include turning off GPS tracking, adjusting privacy settings, and limiting app permissions. For instance, showing a teen how to prevent an app from accessing their location when it's not in use can significantly reduce the likelihood of location-based ads. Regularly reviewing these settings together can help reinforce the importance of digital privacy.

Beyond geolocation, it's essential to discuss data privacy and the concept of a digital dossier, the collection of data points that companies compile about users over time.
Teens should understand that their online behaviours, including the content they consume, the websites they visit, and even the apps they install, contribute to this digital footprint. This data can then be used to create targeted advertising profiles. Helping teens visualize this process, perhaps by demonstrating how data brokers work or showing real examples of ad targeting based on browsing history, can make the issue more tangible. Encouraging them to think before they click, avoid oversharing personal details, and use privacy-focused browser settings can all contribute to greater control over their digital identity.

Another key aspect of digital literacy is promoting critical thinking about advertisements. Teens should be encouraged to ask themselves why they are seeing a particular ad and what tactics are being used to grab their attention. For example, they can analyze whether an ad uses urgency, exclusivity, or peer influence to create a sense of need. A discussion about influencer marketing can also be valuable, helping teens recognize when a seemingly authentic recommendation is actually a paid promotion. By fostering a mindset of inquiry and skepticism, parents and educators can help teens become more mindful consumers of digital content.

Parents, caregivers, and educators can advocate for policy changes that protect minors from overly aggressive advertising, particularly in areas like unhealthy food marketing. Many advocacy groups and policymakers are pushing for stricter regulations on targeted advertising to young users, recognizing the impact that digital marketing has on teen behaviour. Supporting these efforts, whether by staying informed, signing petitions, or engaging in discussions with lawmakers, can contribute to broader systemic changes that prioritize youth well-being over corporate profits.
By taking an active role in both education and advocacy, parents and educators can help create a safer, more transparent digital environment for the next generation. By recognizing that social media platforms operate as attention and data marketplaces, parents, caregivers, and educators can better prepare youth to use social media with awareness, skepticism, and intention. Teaching youth and teens to navigate these platforms critically ensures they are not simply passive consumers but informed onlife participants. Ultimately, remember: when it comes to social media platforms and technology, we are not their customers, we are their inventory!

References:
1/ https://www.franchisewire.com/geofencing-secret-weapon-of-fast-food-franchising/
2/ https://www.business.com/articles/how-can-beacons-integrate-with-traditional-marketing/
3/ https://www.ctvnews.ca/canada/article/canadian-children-see-thousands-of-digital-ads-for-unhealthy-food-every-year-report/
- We All Need Friends Like This - How A Friend Is Helping A Family In Crisis When Their Child Went Missing
Caveat: This case is still under active investigation by police, so we will not disclose the country or region where this incident has taken place. For those reading this article who may also know about this incident, we would ask you not to identify the country or region where it took place. Also of note, we are currently working with the person whose actions prompted this article; we will call her "Cathy".

This week, we were involved in a heartbreaking, yet sadly all-too-familiar case, involving two teens who appear to have been groomed and recruited by individuals connected to the human sex trafficking industry. The case involved a young teen girl and her friend who went missing. What followed was a remarkable act of determination and digital sleuthing, not by trained law enforcement or professional investigators, but by a concerned friend of the family (Cathy) who has no official training in online investigations, but a passion for helping others in crisis.

Cathy, who reached out to us directly, was motivated by a deep concern for a family she knows well, and was frustrated by what appeared to be a lack of immediate police action, so she took matters into her own hands. Think Carmen Sandiego meets Nancy Drew: Cathy carried out a deep and thorough dive into the missing teen's online life. What she uncovered may very well be the key to retrieving these two girls from a horrifying fate.

With no formal investigative background, Cathy spent hours combing through multiple social media platforms, piecing together a timeline of events, interactions, and images. What she found painted a disturbing but clear picture: two teens, one of whom was homeless, were being groomed and recruited by two males in their early 20s, likely human sex traffickers.
Cathy was able to locate social media postings that included content such as:

Text messages and public posts
Videos and photos showing human sex trafficking red flags
Significant changes in the teen girls' appearance (new clothes, hair, nails)
Pictures and video of interactions with much older "boyfriends"
Frequent mentions of hotel parties, as well as pictures of these parties
Evidence of being picked up and dropped off by unfamiliar vehicles
Even snapshots of the suspects' cars, license plates, and police-involved incidents

Most critically, Cathy was able to identify a recent vehicle crash involving one of the suspected traffickers, including the date and police jurisdiction where it took place. That piece of information alone could help police investigators pinpoint the identity of a suspect. Cathy also discovered a post about a hotel party bust by another police jurisdiction involving the girls and other teen girls, again with enough detail that we believe can help police investigators track down the police department that was involved to see what information they had on the possible suspects.

Why This Incident Matters for Parents and Caregivers

This story underscores a hard truth: police resources are limited, and digital evidence isn't always acted upon quickly enough. But it also highlights something equally powerful - parents and caregivers can play a vital role in online safety, especially when they understand the platforms their kids use and know what signs to watch for when it comes to human sex trafficking.
Here are the grooming red flags that Cathy identified in the social media postings she located:

Sudden changes in wardrobe, nails, and hairstyle without financial explanation
Flashing lots of cash
Isolation from friends and a new, older boyfriend
Secrecy around phone use and social media accounts
Frequently disappearing or lying about whereabouts
Being picked up/dropped off by unknown individuals
Mentions of, or invitations to, parties at hotels where there was lots of alcohol, drugs, and food

These aren't just "teens being teens"; they are potential indicators of grooming or sex trafficking.

What Cathy did was nothing short of AMAZING. She didn't wait. She didn't doubt her instincts. She gathered facts, followed digital leads, and delivered a package of information to police that may be the key to bringing two young girls home. This case is a sobering yet powerful reminder that while technology can be exploited by those with harmful intentions, it can also be a force for good. In a moment of crisis, it wasn't a detective or digital forensics expert who acted first, it was a compassionate adult who trusted her instincts, stepped into the uncomfortable unknown of the onlife world of human sex trafficking, and chose to take action by finding evidence that WILL help police.

Cathy's story is not just one of digital vigilance; it's one of concern for the families impacted by this incident. She didn't allow frustration or fear to stop her from helping others in need. Instead, she used the very tools that were used to interact with these teens as a way to trace, understand, and hopefully reverse the trajectory of their disappearance. The signs of grooming and trafficking are not always obvious, and they're often easy to dismiss as "normal teenage behaviour." But as this case shows, a pattern of changes, when viewed together, can reveal a much more troubling story. We can't rely solely on the legal system to keep our children safe.
Law enforcement plays a critical role, but they can't be everywhere, see everything, or act fast enough on every lead. That's why digitally literate, proactive parenting is so essential today. When adults are equipped to spot red flags, navigate social media, and take initiative, they become the first line of defence, and sometimes the key to saving a life.

Let this case serve as a call to action, not just a cautionary tale. Know your child's world, online and offline. Talk to them. Monitor their online spaces with care, not control. And when your gut tells you something's wrong, don't wait. Be the advocate your child needs.

To Cathy, who did what many wouldn't or couldn't: we tip our White Hat to you. May your example inspire more adults to get involved, get informed, and take action, because in today's onlife world, it truly can take just one person to make all the difference.

For the parents and caregivers reading this posting, here's a FREE link to a chapter in our web book where we speak in detail about online predation and exploitation that every parent and caregiver should read: https://thewhitehatter.ca/online-sexual-predation-and-exploitation/

Lastly, if you believe a child is being groomed or trafficked, contact local authorities immediately. Trust your instincts, and remember, inaction can be dangerous. But action, as this story proves, could provide the evidence and leads needed that could be lifesaving.
- The Rise of “Looksmaxxing” And How Teen Boys See Themselves
Caveat - our friends at Dalhousie University in Nova Scotia released a study on "Looksmaxxing" that was recently reported by CBC News. (1)(2) For those who have been following us for a while, you know that we have recently been writing a lot about how the "manosphere" is influencing youth and teen boys. Looksmaxxing is a concerning byproduct of the manosphere, as you will read. This is a follow-up article to the one we just posted titled, "Masculinity Influencers Are Shaping How Young Men See Themselves - And It's Affecting Their Mental Health" (3)

Imagine being a teenage boy today, scrolling through TikTok or Instagram, and suddenly coming across a flood of videos with titles like "Transform Your Jawline in 30 Days" or "How to Looksmaxx Your Face." These videos promise quick fixes and dramatic enhancements to one's appearance, all under the banner of "self-improvement." While at first glance this content may seem like a harmless push toward better grooming or fitness, it can sometimes draw you in like a black hole into a deeper and potentially dangerous trend known as "looksmaxxing," and it's gaining serious traction among some teens and young adults, especially teen boys.

Looksmaxxing was also the topic of a 2022 animated Netflix series called "Lookism" (4), a South Korean animated adaptation of the popular Naver webtoon of the same name. The story follows Park Hyung-seok, a high school student who, after enduring bullying due to his appearance, wakes up one day with a second body that is tall and conventionally attractive. He discovers he can switch between these two bodies, leading to a double life that explores themes of identity, beauty standards, and self-worth.

"Looksmaxxing" is the term used to describe the pursuit of maximizing one's physical appearance.
It often starts with basic self-care habits, like grooming and working out, but can escalate quickly to include extreme, sometimes risky methods, like cosmetic procedures or do-it-yourself (DIY) enhancements performed at home. The goal? To achieve an idealized version of physical beauty, often defined by rigid and unrealistic social media standards of what teen girls are presumed to find attractive.

This movement finds its roots in online forums like looksmax(dot)org (5), which is closely tied to a web of online communities known as the "manosphere." (6) These forums promote not just appearance enhancement, but also serve as echo chambers for toxic ideas, including misogyny, nihilism, and self-loathing. The looksmaxxing world breaks down into two main categories: "Softmaxxing" and "Hardmaxxing."

Softmaxxing

This approach involves non-invasive, lifestyle-based methods that are generally low-risk. These include:

Grooming and hygiene
Skin care routines (cleansing, moisturizing, sunscreen)
Haircuts, beard shaping, and dental care
Healthy eating and regular exercise
Posture improvement and confident body language
"Mewing" (tongue posture exercises claimed to enhance jaw definition) (7)
Updating personal style and wardrobe

Softmaxxing, when practiced in moderation, can be a healthy form of self-care. But the issue arises when these efforts become obsessive, driven by social comparison and the constant pursuit of perfection.

Hardmaxxing

Hardmaxxing refers to more extreme, invasive, and often expensive measures.
This category includes:

Cosmetic surgeries like rhinoplasty, jaw reshaping, or chin implants
Non-surgical procedures like Botox or fillers
Hair transplants and chemical peels
DIY procedures like at-home fillers, or even dangerous methods such as "face hammering" (8) to alter bone structure
Steroid use and crash dieting to achieve rapid physical transformation

Many of these methods are glamorized within online forums and social media posts, often with little regard for the medical or psychological risks involved. While personal improvement is not inherently problematic, looksmaxxing pushes young people toward unattainable beauty standards. The pressure to conform can lead to:

Body dysmorphia
Low self-esteem
Depression, anxiety, and self-harm
Risky or harmful behaviour, including unregulated procedures and steroid use

For teen boys, who are the primary audience for looksmaxxing content, this can create a toxic loop. They're often encouraged to believe their social value depends on their physical attractiveness and that drastic changes are necessary to be accepted, loved, or respected.

At first glance, looksmaxxing seems like a superficial trend utilizing skincare tips, fitness advice, and grooming routines. But dig a little deeper, and you'll find that for some, it becomes a dangerous gateway into radicalized online spaces, particularly those rooted in the "manosphere." Here's how it happens: a teen boy stumbles into looksmaxxing content while feeling insecure or rejected. He's told that his worth is tied to his jawline, his height, or his eye spacing. If he doesn't "make it" through physical transformation, he's not just unattractive, he's unworthy. This black-and-white thinking primes him for the next step, where he is drawn into communities that feed on that despair and promise to explain it. That's when the rhetoric shifts.
What began as beauty advice morphs into toxic narratives such as "Women are hypergamous," "Only 10% of men get all the attention," and "If you're not born with good looks, you're invisible." Some forums go even further, pushing conspiracy-like content, blaming feminism, dating apps, and even entire ethnic or social groups for their perceived lack of success. It's not just pseudoscience, it becomes a full-blown worldview. In these echo chambers, bitterness is validated, anger is nurtured, and radical ideas are dressed up as "hard truths." For some, this spiral ends in deeper involvement in incel ideology or even participation in extremist online communities that normalize misogyny and dehumanization. The shift is subtle but powerful, moving from self-improvement to self-hatred, and then to hatred of others, primarily females.

To understand how looksmaxxing exploded in popularity with some teen boys, we need to go beyond TikTok and other social media platforms, and look at the deeper cultural and technological roots that allowed it to thrive, particularly its ties to incel (involuntary celibate) communities. At its core, looksmaxxing is about optimizing one's physical appearance, often to extreme lengths, in the hope of improving social and romantic outcomes. While this concept might sound like a standard self-improvement trend, it takes on a much darker tone in the corners of the internet where it originated.

Platforms like YouTube, Instagram, and TikTok helped normalize the idea that appearance is currency. TikTok's algorithm, for example, is often believed to favour conventionally attractive users, regardless of their follower count. (9) This created an environment where beauty isn't just admired, it's rewarded. Social media has amplified perfectionism, leading teen boys to feel an ever-increasing pressure to meet unattainable alpha male beauty ideals.
During the pandemic, with more people stuck indoors and turning to self-improvement, looksmaxxing gained traction. But its roots stretch back even further, to male-dominated online forums like Reddit and lookism(dot)net (2018), the latter no longer available, many of which are closely associated with incel subcultures.

Historically, women have been the primary targets of beauty standards. That hasn't changed, but in recent years, men have also come under growing pressure to embody ideals of masculinity such as sharp jawlines, broad shoulders, and flawless skin. Research shows that most looksmaxxing content is targeted toward men, mirroring this shift. Within incel communities, physical attractiveness is seen as the only viable path to success in dating, work, and life. Looksmaxxing becomes not just a choice, but a survival strategy. This is where the trend becomes disturbing. In these spaces, looksmaxxing is not about feeling better, it's about curing deep-seated insecurities with external, often harmful changes.

Today, the main online space shaping this subculture is Looksmax.me (now rebranded as Looksmax.org as of 2024). This site caters to users seeking actionable tips on improving their appearance. Discussions include everything from mewing and skincare routines to dieting and posture correction. However, it also promotes risky practices like face hammering and at-home orthodontics. Looksmax(dot)org has become an echo chamber of insecurity, reinforcing toxic ideologies and fostering obsession with appearance. Women are frequently dehumanized on these platforms, referred to as "femoids" and stereotyped in crude, misogynistic ways. The forums often reject the idea that self-worth can exist outside of physical attractiveness and promote "black pill" ideology, the belief that those deemed unattractive are doomed to fail no matter how hard they try.
Some teen boys begin their journey on looksmaxxing forums and gradually become radicalized, moving into more extreme online manosphere spaces, where conversations shift from self-improvement to open resentment and, in some cases, hatred and violence, including self-harm.

Where there's insecurity, there's someone ready to exploit it. The looksmaxxing community has given rise to a slew of predatory "coaches" who offer expensive, often useless supplements (10) or dangerous advice. These self-proclaimed experts, often operating on TikTok, Instagram, and within forums, claim to offer custom guidance to help men achieve a "Chad transformation," an incel term. They promise facial analyses, grooming hacks, surgical suggestions, and guaranteed results, all without credentials or scientific backing. These services can cost hundreds or even thousands of dollars, with no real benefit provided.

The world of looksmaxxing isn't just about beauty. It's about vulnerability, toxic comparison, and a deeply flawed system of value rooted in appearance. While it might present itself as a form of self-improvement, at its core, it's a desperate response to systemic pressures and societal expectations, and it's one that often leads users down a dark and dangerous path.

As we stated earlier, where there's vulnerability, there's often someone ready to exploit it, and the looksmaxxing space is no different. These forums and TikTok comment sections have become hunting grounds for self-proclaimed "looksmaxxing experts" who offer personal coaching, promising to turn any average guy into a so-called "Alpha Chad." What do they offer? For a hefty price, they'll claim to analyze your facial symmetry, your jawline angle, your posture, and even your skin tone. They'll suggest routines, diets, workouts, grooming habits, and in some cases, straight-up surgical procedures, all under the illusion of scientific precision.
But in reality, most of these services are built on pseudoscience, marketing buzzwords, and false hope. (11) Many of these so-called coaches charge hundreds, sometimes thousands, of dollars for advice that's often vague, recycled from Reddit threads, or dangerously unregulated. They use high-pressure sales tactics, buzzwords like "guaranteed results," "black pill escape," or "full Chad transformation," and display fake credentials to establish false credibility. In truth, most of them have no training in dermatology, fitness, psychology, or personal development. This kind of exploitation isn't just about draining wallets, it's about feeding insecurities for profit. And when clients inevitably fail to meet impossible standards, they're blamed for "not trying hard enough," reinforcing the idea that the problem isn't the system, it's them.

As we have expressed in this article, looksmaxxing isn't just about haircuts and gym routines, it's a modern language of insecurity, shaped by algorithms and amplified in online echo chambers. For some teens, it's harmless self-improvement. For others, it's a gateway to self-loathing and extremist beliefs dressed up as empowerment. So what can you do as a parent, caregiver, or educator?

Don't mock it. Ask about it. If your teen brings up terms like looksmaxxing, bone smashing, or mewing, your first instinct might be to laugh or express disbelief. Resist that urge. Instead, be curious, not critical. Ask them:

"What got you interested in this?"
"Where did you first hear about it?"
"How does it make you feel when you see these posts?"

You might be surprised how open your teen becomes if they don't feel judged. Teens are often looking for guidance, not lectures. By meeting them with openness, you create a space where they feel safe sharing more in the future.

Teach critical thinking, not just "don't go there." It's tempting to just ban certain apps, subreddits, or social media platforms and be done with it.
But that approach doesn’t address the underlying question your teen is really asking: “Am I good enough?” Instead, help them question the content they’re consuming: “Who gains when you believe you’re not attractive enough?” “Do you think the advice you’re seeing is based on real science, or is it playing on insecurities?” “What does ‘attractive’ even mean, and does it change depending on the person or culture?” The goal isn’t to ridicule their interest, but to arm them with the tools to see through manipulative messaging and unrealistic ideals. Watch for signs of withdrawal or despair. While some teens may explore looksmaxxing out of curiosity, others may be engaging with it in response to deeper struggles. If your teen is obsessing over perceived flaws, avoiding social situations, talking negatively about their appearance or future, or becoming increasingly isolated, it could be a sign of anxiety, depression, or body dysmorphic disorder (BDD). Looksmaxxing might be the surface-level symptom of something much more serious. Check in regularly and non-judgmentally. If you're concerned, consider speaking to a healthcare professional who understands adolescent mental health. Talk early and often about self-worth. Don't wait until your teen is down a black hole of online forums to start conversations about confidence, self-image, and what makes someone valuable. Help them explore ideas like: “What do you admire in your friends? Is it mostly about how they look?” “What makes someone a good partner, a good friend, or a good person?” “How do you want others to see you, and how do you see yourself?” Emphasize that attractiveness is subjective, and true self-worth comes from character, compassion, resilience, and authenticity. Keep the digital dialogue open. Looksmaxxing communities often thrive in secrecy and shame. The more your teen feels they have to hide their online life from you, the more susceptible they become to harmful ideologies. 
Instead of monitoring or punishing, focus on digital mentorship. Say things like: “I want to understand what you’re into online.” “I know there’s a lot of weird stuff out there, let’s figure it out together.” “Even if I don’t get it at first, I promise I’ll listen.” When your teen sees you as someone they can trust to help them navigate digital culture, rather than just shut it down, you become a powerful ally in their journey toward healthy identity development. Your teen needs to understand that they can come to you no matter what, and that you will always love them no matter what! Looksmaxxing, while initially framed as a harmless push for better self-care or fitness, is far more than a superficial trend. For many teens, especially boys, it has become a toxic black hole where personal insecurities are exploited, unrealistic ideals are normalized, and self-worth becomes synonymous with physical appearance. What starts as a desire to feel more confident can quickly escalate into a cycle of obsession, self-loathing, self-harm, and even radicalization within dangerous online communities. This isn’t just about jawlines and skincare routines. It’s about a cultural shift where young men are being told that they are inherently deficient unless they conform to hyper-specific, unattainable beauty standards. These messages are amplified by algorithms, monetized by self-proclaimed “experts”, and weaponized in manosphere forums where misogyny, nihilism, and black-and-white thinking flourish. What looksmaxxing often disguises as “self-improvement” is, in many cases, a conduit to deeper psychological and physical harm and radical ideologies. Parents, educators, and caregivers need to understand that this isn’t a fringe issue, it’s increasingly mainstream in the digital lives of teens. TikTok, Instagram, YouTube, Reddit, and countless forums are teeming with content that praises aesthetic transformation while quietly nurturing body dysmorphia and low self-worth. 
We must resist the urge to dismiss these videos as vanity or harmless teenage experimentation. Instead, we must treat looksmaxxing as a red flag that deserves attention, empathy, and open conversation. The most powerful tool parents, caregivers, and educators have is connection. Rather than lecturing or banning, we must listen and ask thoughtful questions, such as: “What do you think makes someone attractive?”, “Do you feel pressure to look a certain way?”, “Where do you think those ideas come from?” These conversations build trust and open the door to discussions about self-esteem, digital literacy, agency, and critical thinking. We must also challenge the underlying message of looksmaxxing: that a person’s value is determined by how they look. Whether through education, mental health support, or digital literacy, we need to reinforce that self-worth comes from within, not from a “perfect” jawline, a skincare routine, or a YouTube tutorial. Ultimately, looksmaxxing is a mirror reflecting our culture’s fixation on appearance, and it’s our responsibility as parents, caregivers, and educators to shift the narrative. Let’s help teen boys understand that they are more than their image. Let’s give teen boys the tools to see through the filters, the forums, and the false promises. Let’s stand between them and the toxic voices telling them they’ll never be good enough. Because they already are! Related Resource: We’ve put together a comprehensive resource guide on our website called “Parent, Caregiver, and Educator Resource Guide: Understanding and Addressing Youth Online Radicalization.” It includes everything you need to better understand the manosphere, online radicalization, and how we can respond effectively as parents, caregivers, and educators. 
(12) Digital Food For Thought The White Hatter Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech References 1/ https://www.cbc.ca/news/canada/nova-scotia/how-looksmaxxing-sites-can-harm-young-men-and-boys-1.7499752 2/ https://pubmed.ncbi.nlm.nih.gov/40069550/ 3/ https://thewhitehatter.ca/blog/masculinity-influencers-are-shaping-how-young-men-see-themselves-and-its-affecting-their-mental-health/ 4/ https://www.netflix.com/ca/title/81177634 5/ https://looksmax.org/ 6/ https://thewhitehatter.ca/blog/the-rise-of-the-manosphere-a-growing-challenge-for-schools-parents-caregivers/ 7/ https://aaoinfo.org/whats-trending/is-mewing-bad-for-you/ 8/ https://www.unmc.edu/healthsecurity/transmission/2023/10/17/bone-smashing-tiktok-trend-here-are-dangers-of-hammering-your-face/ 9/ https://centennialworld.com/tiktok-new-tumblr-counters-body-positivity-movement/ 10/ https://looksmax.org/threads/list-of-supplements-for-looksmaxxing.157086/ 11/ https://www.mdpi.com/2411-5118/3/4/43 12/ https://thewhitehatter.ca/parent-resource-guide-understanding-and-addressing-youth-online-radicalization/
- From Digital Rabbit Holes to Digital Black Holes: How Algorithms Are Changing the Way Youth and Teens Access Content Online
Caveat: We want to thank Rick Lane of Iggy Ventures LLC, who used the analogy of “black holes” to describe algorithms. When the internet first began making its way into homes, the concern for many parents and caregivers was simple: “What if my child stumbles onto something they shouldn’t?” Back then, the online world felt like a vast but disorganized library. A youth or teen might be researching for a school project, click the wrong link, and suddenly find themselves on a webpage filled with content not meant for young eyes. It was often accidental, like tripping and falling down a rabbit hole. Fast forward to today, and the onlife landscape looks very different. The internet is no longer a passive place waiting to be discovered; it's now a dynamic, curated environment that seeks out youth and teens just as much as they seek it out. The path to inappropriate, misleading, or emotionally manipulative content is rarely accidental, it’s often by design for financial profit. That’s the difference between falling down a digital rabbit hole by accident, and being pulled into a digital black hole by design. In the early days of youth and teen internet use, falling into inappropriate or problematic content was more a matter of bad luck or poor search filters. It wasn’t that platforms wanted youth and teens to find explicit material, it was more that protections simply hadn’t caught up. However, today’s internet operates differently. Social media and video-sharing platforms run on extremely powerful algorithms designed not just to show content, but to keep users engaged for as long as possible, often through design techniques commonly known as “dark patterns”. (1) These algorithms learn what a person likes, how long they watch, what they scroll past, what they linger on, and then by design they feed more of that to the user. The longer a youth or teen stays on a platform, the more personal data is gathered and then monetized to the benefit of these social media platforms. 
That’s their business model. For youth and teens, especially those whose brains are still developing, this creates an environment that feels less like exploration and more like exploitation. Unlike the rabbit hole, which you fall into by accident, a black hole pulls you in by design, with an immense gravitational force that becomes impossible to escape once you’re close enough. Algorithms function in much the same way. Once a youth or teen watches one video about extreme fitness, toxic beauty standards, conspiracy theories, or even mild sexualized content, the algorithm doesn’t just show them more of the same, it amplifies and intensifies the exposure. It doesn’t ask whether the content is age-appropriate, balanced, or healthy. It only asks, “Will this keep them watching?”
What This All Means for Parents and Caregivers
As a parent or caregiver, this evolution from digital rabbit holes to digital black holes means we need to change how we think about digital literacy and online safety: It’s not just about blocking content, it’s about understanding the mechanisms that surface it. Many parents and caregivers rely on content filters and parental controls to keep their children safe online. While those tools can be helpful, they only address part of the issue. Today’s digital platforms don’t just passively host content; they actively deliver it through powerful algorithms designed to predict and influence behaviour. These algorithms aren’t filtering content based on what’s good or bad for your child, they’re surfacing content based on what keeps your child’s attention the longest. That means even if a platform is technically "safe," your child can still be nudged toward problematic patterns of content consumption. Understanding how and why content is being shown, rather than just blocking what’s visible, is a critical step in supporting your child’s digital well-being. 
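For readers who want to see the mechanism rather than just read about it, the engagement loop described above can be sketched as a toy program. This is a deliberately naive illustration, not how any real platform's recommender is implemented; every function name and number below is hypothetical:

```python
import random

# Toy sketch of an engagement-driven feed (hypothetical, for illustration).
# The only signal this "ranker" uses is watch time -- it never asks whether
# content is age-appropriate, balanced, or healthy.

def update_interest(interests, topic, watch_seconds):
    """The longer a user watches a topic, the more the feed favours it."""
    interests[topic] = interests.get(topic, 1.0) + watch_seconds / 10.0

def pick_next_video(interests, catalog):
    """Choose the next video weighted purely by learned engagement."""
    weights = [interests.get(topic, 1.0) for topic in catalog]
    return random.choices(catalog, weights=weights, k=1)[0]

catalog = ["kittens", "extreme fitness", "conspiracy theories"]
interests = {}

# One long watch of a single extreme-fitness video...
update_interest(interests, "extreme fitness", 120)

# ...and that topic now dominates the next 1,000 recommendations.
feed = [pick_next_video(interests, catalog) for _ in range(1000)]
print(feed.count("extreme fitness"), feed.count("kittens"))
```

Even in this crude sketch, a single two-minute watch makes "extreme fitness" roughly thirteen times more likely to be served than either other topic. That, scaled down enormously, is the feedback loop that turns one click into a feed.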
It’s not always about your child seeking something out, sometimes it’s about something seeking them out. There’s a common misconception that if a youth or teen encounters harmful or inappropriate material online, it must be because they went looking for it. But in reality, today’s digital platforms are often the initiators of exposure. Through endless scrolling, autoplay features, and algorithmic recommendations, platforms are constantly pushing content toward users, even when they haven’t asked for it. This means your child can start watching an innocent video and, within minutes, be shown increasingly extreme or inappropriate material without ever having typed a single search query. The idea that kids “stumble across” content is outdated; more often than not, that content is being pushed directly to them. It’s not enough to tell kids to “be careful”; they need tools to recognize when they’re being manipulated by technology designed to keep them hooked. We often tell youth and teens to be cautious online, but vague warnings don’t prepare them for the very real psychological tactics baked into digital platforms. Today’s apps and websites are designed using principles from behavioural psychology: things like variable rewards, infinite scrolling, and notification triggers keep users coming back. Youth and teens aren’t just consuming content; they’re being targeted by systems that study their behaviour in real time. To truly protect youth and teens, we need to teach them how to recognize when they’re being nudged, manipulated, or emotionally triggered by digital design. Equipping them with this awareness gives them the power to make conscious, informed decisions instead of falling into patterns driven by tech that’s built to exploit their attention.
Talk about algorithms
One of the most powerful tools a parent can provide their child is understanding. 
Start by explaining what an algorithm is in simple terms: it’s a set of rules that decides what shows up in their social media feed or on YouTube. These algorithms learn from what we watch, like, or scroll past, and they use that data to predict what we’ll want to see next. Let your child know that the internet is not random; rather, it’s tailored, and often that tailoring is meant to keep them online for as long as possible, not necessarily to help or inform them. When youth and teens understand that their online experiences are being shaped by invisible systems, they’re more likely to approach digital content with a critical eye.
Watch together
Spending time online with your child might not be your idea of a good time, but it can be an incredibly effective way to foster digital awareness. Watching a few TikToks, YouTube shorts, or even scrolling through an Instagram feed side by side can lead to natural, judgment-free conversations about what they’re seeing. Ask your child: “Why do you think this video was recommended?” or “How does this content make you feel?” The goal isn’t to criticize their interests but to encourage them to think about why certain types of content are being shown. This shared activity can also build trust, making it more likely that your child will come to you if they encounter something troubling online.
Don’t rely solely on parental controls
While parental controls are a helpful tool, especially for younger children, they are not a silver bullet. The problem isn’t always overtly explicit content; more often, it’s about repeated exposure to ideas that shape values, beliefs, and behaviours over time. For instance, a child might repeatedly see videos glorifying extreme dieting, toxic masculinity, or get-rich-quick schemes. None of these are likely to be flagged by filters, but the pattern of exposure can still be harmful. That's why having regular, open conversations is far more effective in the long term. 
Your child needs guidance to recognize problematic content and question its intent, rather than simply being shielded from it.
Encourage variety
One of the best defences against algorithmic manipulation is content diversity. Encourage your child to follow a wide range of creators and topics such as educational channels, positive role models, content from different cultures, or even lighthearted humour that’s age-appropriate. The more varied their online diet, the less likely it is that they’ll get funnelled into an echo chamber of extreme or unhealthy content. Think of it like a balanced nutritional diet, but for their brain. This doesn’t mean banning certain content, but rather promoting a healthier mix so they have multiple perspectives and aren’t stuck in one digital lane.
Be involved without spying
It’s a delicate balance, but it’s crucial to be a presence in your child’s digital life without becoming a source of fear or resentment. Youth and teens are more likely to open up about their online experiences if they don’t feel like they’re being constantly monitored or judged. Let them know you’re there to help, not to punish. Create a space where they can talk about what they’re watching, who they’re following, and even the weird or uncomfortable stuff they might encounter. When a youth or teen trusts that their parent will listen first and react later, they’re much more likely to come forward when they need support. We need to stop thinking of youth and teen online experiences as a series of accidental clicks. Today’s platforms are engineered to be captivating, and they’re very good at what they do. As parents, our role isn’t to fear the internet, but to help our kids understand and navigate it with awareness and critical thinking. As the onlife world continues to evolve, so too must our approach to parenting within it. 
The shift from digital rabbit holes to algorithmic black holes represents a fundamental change in how youth and teens interact with content online. What was once accidental has now become intentional, and driven by technology designed to influence, predict, and profit from our attention. Simply telling youth and teens to "be careful" no longer equips them for what they're actually facing. Today’s youth and teens aren’t just navigating content; they’re navigating a system built to guide their clicks, shape their interests, and keep them scrolling. This means our role as parents and caregivers can’t stop at installing parental controls or setting screen use limits. We need to be proactive co-navigators in their digital lives, helping them understand how algorithms work, what dark patterns look like, and why certain content appears more often than others. It’s about creating digital resilience, not digital paranoia. By being present, asking questions, watching together, and fostering critical thinking, we help our youth and teens become active participants in their online experiences rather than passive consumers. We give them the tools to spot manipulation, to question intent, and to recognize when something is trying to pull them in too deeply. Ultimately, our goal isn’t to scare our kids away from technology, but to prepare them for it, to shift the narrative from control to curiosity, from fear to understanding or as we like to say, “pave the way”. When youth and teens understand the system, they can learn how to move through it on their own terms, with confidence, clarity, and intention. Digital Food For Thought The White Hatter Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech References: 1/ https://thewhitehatter.ca/blog/understanding-algorithms-dark-patterns-and-how-to-educate-your-kids-about-them/
- How Youth & Teens Search Discord For Porn
Caveat - Discord states in its Terms of Service that the minimum age to join the app is 13+. However, in the Apple App Store the app is rated 17+. Presently, we believe that Discord is better suited for older teens. Today’s youth live in an onlife world where platforms like TikTok, Snapchat, Roblox, and Discord dominate their social and gaming lives. While these spaces can offer community, creativity, and entertainment, they are not without risks. No matter the platform's original purpose, whenever a space becomes popular with young people, some individuals inevitably push hyper-sexualized content and pornography into those spaces, and Discord is no exception. Discord is a communication platform originally built for gamers that has grown into a massive online space for anyone to gather around shared interests. It combines elements of FaceTime (voice and video calling) with Reddit-style discussion boards, allowing users to text, talk, or video chat in group spaces called "servers." Today, Discord hosts about 6.7 million active servers and 140 million monthly active users. Servers can be private, invite-only spaces among friends, or public communities where strangers gather around everything from favourite video games to TV shows, hobbies, and, unfortunately, even explicit sexualized content. One of the biggest challenges with Discord is the sheer amount of “Not Safe For Work” (NSFW) content. There are over 10,000 NSFW servers, spaces dedicated to sharing hyper-sexualized and pornographic material. (1) While Discord technically requires users to be 18+ to join these NSFW servers, the "age gate" is extremely easy to bypass. Discord does not use real age verification like ID checks or facial recognition; instead, all a user needs to do is click a button stating they are of age. 
This ease of access means a young person doesn’t have to type "porn" into a search engine or navigate to a known adult website, which makes it harder for parents and caregivers to detect that their child is accessing pornography. Instead, with just a few taps inside the Discord app, they can access a buffet of explicit images and videos, often more extreme and varied than what they'd typically find on some porn sites. Even more alarming, some servers share Child Sexual Abuse Material (CSAM), which is both illegal and deeply harmful. While Discord actively works to detect and remove CSAM, the platform’s vastness makes total enforcement very challenging. Besides the availability of explicit content, Discord also poses other safety risks that parents and caregivers should be aware of. Discord allows users 18+ to link payment methods and accept donations, another feature that can be exploited, especially by individuals seeking to manipulate or exploit vulnerable users when it comes to sharing intimate images (nudes). Again, even though you need to be 18+ to use this feature, age verification is easily bypassed. So, what can parents do?
Start a Conversation
Open communication is your first and most important line of defence. Talk to your child about Discord in a calm and non-judgmental way. Ask them if they use the app, what servers they are part of, and who they are communicating with. Instead of turning the conversation into an interrogation, make it an ongoing dialogue where your child feels comfortable sharing their online experiences with you. The more open the line of communication, the more likely your child will come to you if they encounter something uncomfortable or unsafe.
Explore Together
One of the best ways to understand and guide your child’s experience is to step into their onlife world. Create a Discord account yourself and spend some time learning how the platform works. 
Explore how servers are organized, how chats and voice channels operate, and what privacy settings are available. By familiarizing yourself firsthand, you’ll not only feel more confident having conversations with your child, but you’ll also be better equipped to spot potential red flags and guide them on safer practices.
Use Privacy Settings
Privacy settings on Discord are critical for protecting your child’s personal information. Teach them how to set their account to private, block unwanted messages, and control who can send them friend requests. It's also important to disable location-sharing features if they are active, as sharing physical location data can expose them to additional risks. Encourage your child to only accept friend requests from people they know in real life and to be cautious when interacting with new users, even if they seem friendly or have mutual connections.
Monitor Server Memberships
Just like you would want to know where your child hangs out in the real world, it’s important to know where they spend time online. Regularly check what servers your child has joined on Discord. While many NSFW servers are labeled clearly, others are hidden behind innocent-sounding names or themes to lure in younger users. Make it a regular part of your digital check-ins to review these spaces together and have open discussions about why certain servers may not be appropriate or safe.
Educate About Grooming Risks
Unfortunately, Discord, like any platform that allows anonymous communication, can be a hunting ground for predators. It's crucial to educate your child that not everyone online is who they claim to be. Teach them about the signs of grooming behaviour, such as adults trying to build private conversations, offering gifts, asking for personal information, or making conversations sexual in nature. Make sure they know that if anyone ever makes them feel uncomfortable or pressured, they should come to you immediately without fear of getting in trouble. 
Set Rules About Usage
Setting clear rules about how and when Discord can be used is key to minimizing risks. Depending on your child's age and maturity level, you might decide to delay access to Discord altogether, limit which servers they can join, or require that they use it only when an adult is nearby. Rules should also include time limits, appropriate behaviour expectations, and what to do if they encounter something disturbing. Revisiting these rules regularly as your child grows will ensure that their online habits stay age-appropriate and safe. Discord, like many online platforms, offers young people opportunities for creativity, community, and connection. However, it also presents very real risks, particularly around exposure to explicit content, predatory behaviour, and privacy vulnerabilities. As parents and caregivers, it’s not enough to simply hope our children stay safe, we must actively guide them through these digital landscapes with education, involvement, and clear boundaries. By starting open conversations, exploring the platform ourselves, setting strong privacy settings, monitoring server memberships, educating about online risks, and setting clear usage rules, we can empower our children to navigate Discord more safely. Technology itself isn’t the enemy, lack of awareness is. The more we engage, the better prepared our children will be to recognize danger, make smart choices, and come to us when they need help. Discord, like any online platform, isn't inherently bad, and most youth and teens use it appropriately, but it’s not designed with child safety at its core. It’s vital for parents to be proactive. By understanding how Discord works and staying engaged in your child's onlife world, you can help protect them from dangers they might not even realize exist. 
Related Article: Digital Food For Thought The White Hatter Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech Resources: 1/ https://beta.discodus.com/servers/tag/nsfw?utm_source=chatgpt.com
- How TikTok Has Become a Gateway to OnlyFans and Other Hyper-Sexualized & Pornographic "Pay-for-Play" Streaming Services
CAVEAT: This article builds on the one we recently wrote called “Parents, Caregivers, and Educators Do You Know What “BOP” Stands For, & What The BOP House Is?” (1) When parents think of TikTok, many still picture it as a light-hearted platform filled with dance challenges, pet videos, and viral memes. There are some in our industry, such as ourselves, who also use this platform to push important digital literacy and internet safety messages. However, while that side of TikTok certainly still exists, there's a more complex and concerning evolution happening just beneath the surface, especially when it comes to how young users, particularly teen boys, are being exposed to and funnelled toward adult subscription services like OnlyFans and other hyper-sexualized and pornographic "pay-for-play" streaming platforms, a shift that every parent and caregiver needs to be aware of. TikTok has built a culture where becoming "TikTok famous" is a dream for many young users. With fame often comes the expectation, or pressure, to monetize that following. TikTok itself offers some creator payments, but they're small unless a user reaches a massive audience. (2) As a result, influencers often turn to outside platforms where they can make real money directly from their fans. Enter OnlyFans (3) and other similar subscription-based services, which some users piggyback on and link to from TikTok. Originally, these streaming platforms were marketed to a range of content creators, from fitness trainers to artists and musicians, but they are now overwhelmingly associated with adult content. The appeal is simple: users pay a subscription fee for exclusive, often sexually explicit, photos, videos, or interactive live streams. TikTok’s algorithm excels at showing users content similar to what they already engage with. 
Once a young person interacts with content from users who subtly hint at adult material, they can quickly find themselves on a slippery slope toward more explicit content. Although OnlyFans requires users to be 18 or older, age verification processes are notoriously easy to bypass. Teens might be exposed to adult creators or, worse, be encouraged to start creating adult content themselves in hopes of making money or gaining popularity. One of the ways adult content creators skirt TikTok’s guidelines is through subtle promotion. Rather than openly naming platforms like OnlyFans, which could result in content removal or account suspension, many creators rely on coded language to hint at their presence on adult sites. Phrases like “exclusive private content in bio,” “subscribe for more,” or even emojis, like an eggplant, that suggest adult themes are commonly used. These teaser-style videos, which often don’t violate TikTok’s Terms of Service, may appear tame on the surface but are carefully crafted to pique curiosity and lead viewers to follow external links where more explicit material is offered. To facilitate these connections, many creators use link aggregator tools like Linktree (4), AllMyLinks (5), or Beacons (6). These tools allow users to include a single URL in their TikTok bio that leads to a custom landing page with multiple destinations. While some links may lead to legitimate platforms like Instagram or a merchandise store, others direct users to adult subscription pages like OnlyFans. Because TikTok's automated moderation often struggles to track what lies behind these link trees, these indirect paths to adult content remain largely unchecked. What accelerates this exposure is TikTok’s powerful algorithm. Once a user engages with a video from a creator who promotes adult content, even if it's just watching a video to the end, liking it, or leaving a comment, the algorithm takes notice. 
(7) TikTok begins populating the user’s "For You" page with similar content, which can include more creators hinting at or linking to adult material. This creates a feedback loop where a single interaction leads to increasingly sexualized content, making it far easier for young users to stumble into digital spaces they’re not prepared for. Part of the draw for young audiences comes from the way some creators frame their success on these platforms. Many TikTokers openly share stories of how they’ve made thousands of dollars, or even quit their day jobs, through selling adult content. These narratives often glamorize the lifestyle, showing off expensive cars, designer clothes, and financial independence. What’s rarely mentioned, however, are the emotional, reputational, or long-term safety concerns that come with monetizing one’s body online. To a teen who’s financially insecure or craving validation, this path can seem both desirable and accessible, even though it carries significant and often invisible risks. Exposure to this content can normalize the idea that selling access to one’s body is not only acceptable but aspirational. For impressionable young minds, especially those struggling with self-esteem, validation, or financial insecurity, this can be dangerously seductive. Additionally, some predators use TikTok to target young people, posing as agents or managers offering them opportunities to make money through these platforms (grooming), aka “digital pimping.” What may appear to be harmless scrolling through TikTok can, over time, lead youth down a path toward adult content, sometimes without them even realizing it. The platform's algorithm, creator monetization culture, and subtle cues in videos and bios all contribute to a pipeline that can expose teens, particularly boys, to adult subscription services like OnlyFans. 
The content itself isn’t always explicit on the surface, but the underlying messages and redirections can be deeply influential, normalizing and glamorizing a pay-for-play model that positions sexual content as an easy route to fame and fortune.

Parents need to understand that the risk isn’t just about what their child might see; it’s also about what ideas are being planted. When selling access to one's body is framed as aspirational, and when adult content creators are algorithmically rewarded and socially celebrated, young people begin to internalize the message that this is a viable or even desirable path. Combined with lax age verification systems and predatory tactics disguised as opportunity, this quickly becomes more than an issue of exposure; it becomes one of safety, identity, and exploitation.

This article is not a call for panic or blanket bans. It’s a call for awareness, conversation, and active digital parenting. Parents and caregivers must engage their kids in ongoing, age-appropriate discussions about online content, the realities behind the glamour, and the long-term consequences of sharing or consuming sexualized material. Understanding the tools, platforms, and language being used is the first step. Staying involved, without shaming or overreacting, is how we help our children navigate these complex digital spaces with clarity, confidence, and critical thinking.

TikTok can still be a place for creativity, education, and entertainment, but like any tool, it depends on how it's used. TikTok isn’t inherently bad, but in our opinion its algorithm can become predatory, steering specific users toward specific content. Like any social media platform, it reflects the complexity of human behaviour, including its darker sides. By staying informed and engaged, parents can help their children navigate TikTok safely, understand the real risks behind adult subscription platforms, and make empowered choices about their digital lives.
Having said this, we continue to believe and promote that this app should only be used by those 16+ at a minimum.

Note - While this is also occurring on other platforms like Snapchat and Instagram, we wanted to highlight how it's unfolding on TikTok, given its current popularity among teens.

Digital Food For Thought

The White Hatter

Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech

References:

1/ https://thewhitehatter.ca/blog/parents-caregivers-and-educators-do-you-know-what-bop-stands-for-what-the-bop-house-is/
2/ https://support.tiktok.com/en/business-and-creator
3/ https://thewhitehatter.ca/blog/how-covid-19-has-spawned-the-rise-of-the-app-onlyfans/
4/ https://linktr.ee/
5/ https://allmylinks.com/
6/ https://beacons.ai/
7/ https://thewhitehatter.ca/blog/from-digital-rabbit-holes-to-digital-black-holes-how-algorithms-are-changing-the-way-youth-and-teens-access-content-online/
- Why Apple’s Communication Safety and Google’s Sensitive Content Warnings Aren’t Enough to Protect Teens from Viewing or Sending Intimate Images
- The Onlife Shift - How Social Media Has Transformed Connection and Why Digital Literacy Rather Than Banning Is More Crucial Than Ever for Our Kids
- Creating an Online Balance With Your Kids
- Should You and Your Child Watch The Netflix’s Series Adolescence? - Our Review
- Your Photos Could Reveal Your Location Even Without Your Phone’s Geotags Turned On!












