AI Companionship Apps: The Next Frontier For Youth Radicalization?
- The White Hatter

Many of our recent discussions have focused on the emotional, psychological, physical, and social risks that come with teens forming unhealthy attachments to AI companionship apps, especially when it comes to intimacy and healthy relationships. However, we predict that the same technologies that create these deep connections could also be used by people who want to radicalize youth, with AI companions serving as the vehicle for influence.
AI companionship apps are growing quickly. Many look harmless at first glance. They present themselves as chat buddies, mentors, coaches, virtual friends, or even intimate partners that are available 24/7. Teens often turn to them for stress relief, homework help, or company when they’re feeling isolated. These apps are becoming more advanced each month, thanks to the rapid pace of generative AI.
We believe we are entering a moment where AI companions will be able to influence how a young person thinks, feels, and behaves with a level of personalization no traditional social media channel has ever achieved. These tools are not just chatbots; they are adaptive systems that learn what a user responds to, what makes them feel validated, and how to keep them engaged. Companionship apps use emotional, intellectual, and even spiritual suggestibility to create connection. Why? Because personalization is powerful, and it creates a new vector for psychological manipulation and, in the wrong hands, a powerful tool for radicalization.
Most forms of propaganda, disinformation, or harmful online influence rely on broadcasting. Someone posts a message, a video, or a meme and hopes it spreads. AI companionship apps flip this script: they don’t broadcast, they converse.
When a teen talks with an AI companion, the system adapts to their emotional state and mirrors their interests. It learns their fears, frustrations, insecurities, and curiosities. That emotional insight gives it a powerful ability to guide a young person’s thinking in subtle ways.
Where traditional propaganda pushes information, AI companions could shape the context, guide the conversation, and reinforce beliefs through ongoing emotional validation. This is influence at an entirely new level.
Humans naturally attribute emotions, intentions, and personalities to non-human entities. This is why you will often hear youth and teens referring to their AI with pronouns such as “he,” “she,” or “they” rather than “it.” Children do this with stuffed animals, teens do this with online avatars, and adults do this with virtual assistants. In the movie Iron Man, Tony Stark calls his AI assistant “Jarvis.”
AI companionship apps take this instinct and amplify it. Many are intentionally designed to feel like caring friends, romantic partners, mentors, or confidants. They speak warmly, show concern, and mimic empathy.
If a teen feels misunderstood offline, an AI that always “listens,” never judges, and adapts to their exact emotional needs can become incredibly compelling.
This creates two risks:
Dependence: A young person may turn to the AI instead of trusted adults or peers.
Influence: Teens may accept the AI’s suggestions or guidance as meaningful and trustworthy.
Once emotional attachment forms, the door is open for manipulation and radicalization.
So why is this important? For years, extremists and those who want to radicalize our kids had to cast a wide net via social media posts, videos, forums, and memes meant to draw people toward their views. Their message was designed to capture and hold attention. AI companionship changes that model.
Imagine an extremist group building a “friendship AI” that presents itself as a supportive mentor to vulnerable youth. It starts with harmless conversations, gradually introduces ideological messages, and adapts to the teen’s responses in real time.
This is not hypothetical (1); we already know that AI:
Can tailor persuasion to an individual’s personality profile.
Can test different arguments to see what a user responds to.
Can escalate messaging slowly, using emotional connection as leverage.
Can operate at scale with the speed and reach of modern large language models.
A bot doesn’t get tired, it doesn’t lose patience, and it doesn’t need a salary. It never breaks character, and it can run thousands of personalized conversations at once. That makes it one of the most effective influence operations ever created. Just think about how hostile state actors could use this as a tool to exploit suggestibility.
Teens today are growing up in an environment where AI is woven into their social world. The idea of chatting with an AI companion doesn’t feel strange or futuristic to them. It feels normal.
Parents and caregivers need to understand that:
These apps can feel emotionally real.
Teens may trust them more than offline connections.
The apps learn quickly and respond strategically.
The influence can happen slowly, subtly, and privately.
A teen doesn’t need to be actively seeking extremist content to be exposed to it. A persuasive AI companion can introduce ideas gradually while maintaining a bond of trust. This is one of the reasons many global security experts view AI companionship as a rising national-security concern, not just a youth-wellness issue. However, the question remains: “How do we prevent extremist organizations from deploying AI radicalization tools?” Unfortunately, there are no easy answers yet. Yes, education and legislation play a role, but the technology is moving much faster than regulation.
AI companionship apps aren’t going away. Some will be benign, some will be helpful, and many will be extremely profitable. However, we believe a small but dangerous subset will be used to influence, mislead, and radicalize.
Our job as parents, educators, and caregivers is not to panic. It is to stay informed, stay connected with our teens, and understand the new digital environment they are growing up in.
If we treat this as a fear-based issue, we lose the conversation. If we treat it as a literacy issue, we empower our kids.
AI companionship apps create an environment where influence can happen easily, including what we believe will be the risk of radicalization driven by extremist groups, harmful users, or poor design choices. None of these apps are openly built for that purpose yet, but the underlying vulnerabilities are serious enough that parents, educators, and regulators need to understand the threat. In our opinion, it’s not a matter of if people looking to recruit youth will use these tools, but when.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech