UpScrolled: What Parents and Caregivers Need to Know About a Fast-Growing App, and Its Radicalization Risks
- The White Hatter


In early 2026, a new social media app called UpScrolled surged to the top of app stores in both Canada and the United States, drawing in millions of users in a matter of weeks (1). For many youth and teens, it looks and feels like a familiar mix of TikTok, Instagram, and X (formerly Twitter); however, beneath that familiar interface are some important differences that parents and caregivers need to understand.
UpScrolled is a short-form video and microblogging platform created in 2025 by tech developer Issam Hijazi. It is advertised for those who are 16 years of age and older, although there is no real age verification process. The platform was built around a core idea: no algorithmic suppression, no “shadow banning,” and a strong emphasis on free expression.
Shadow banning is a term used to describe a situation where a user’s content is quietly limited or restricted on a social media platform, often without the user being clearly told it is happening. From the outside, everything appears normal. A young person can still post, comment, and interact just as they always have. However, behind the scenes, the platform may be reducing how widely their content is shown, sometimes limiting it to only a small audience.
In practical terms, you may believe you are sharing content with a wide audience while the platform’s algorithm is quietly deciding otherwise. This can happen on platforms such as TikTok, Instagram, or X, where content visibility is largely controlled by automated systems. These systems determine what gets amplified, what gets limited, and what may not be shown at all.
There are a number of reasons why this might happen. In some cases, content may be flagged as potentially violating community guidelines, even if it is not explicitly removed. Certain hashtags or keywords can also reduce visibility. Repeated reports from other users or patterns of activity that appear unusual, such as rapid posting or commenting, may also trigger these restrictions. Importantly, these decisions are often made by algorithms, not people, and they are not always clearly explained to the user.
What makes shadow banning particularly confusing is the lack of transparency. Unlike a formal warning or account suspension, there is often no notification that anything has changed. A young person may simply notice that their posts are getting fewer views, likes, or comments, without understanding why. This can lead to frustration and self-doubt, especially for teens who may already place value on online engagement as a form of social feedback.
From a parenting perspective, this is where understanding the environment matters. Reduced engagement is not always a reflection of your child’s effort, creativity, or social standing. Often, it is a reflection of how the platform is designed to control and filter content. Helping youth and teens understand this distinction can play an important role in protecting their confidence and supporting a healthier relationship with technology.
Unlike platforms that curate content using complex algorithms, UpScrolled promotes a more chronological feed, where posts are shown largely in the order they are shared and engaged with. Its rapid growth has been fuelled in part by users who believe mainstream platforms are censoring certain viewpoints, particularly around global political issues.
At face value, this may sound appealing given that transparency and open expression are important values. However, when we look deeper, this same design philosophy can also create conditions where harmful content spreads more easily.
UpScrolled positions itself as a platform that avoids hidden manipulation and prioritizes user voice. However, when a platform reduces or struggles to scale moderation while experiencing explosive growth, content doesn’t just become more visible, it becomes less filtered. Even the platform’s own founder has acknowledged that moderation has struggled to keep pace with user growth (2). This matters because online spaces are not neutral environments. They are shaped by what is allowed to circulate, how quickly it spreads, and who amplifies it.
Radicalization online is not just about extremist ideology; it also includes any content that pulls a young person toward increasingly rigid, polarized, or harmful worldviews. That can happen across the political spectrum, within conspiracy communities, the manosphere, or through identity-based narratives. Several structural features of UpScrolled could increase this risk:
Ideology Driven Migration
Many of the people joining this platform are not discovering it by chance. They are intentionally migrating to it because they believe their perspectives were limited, filtered, or suppressed on other social media platforms. That sense of shared experience becomes a powerful draw, bringing together individuals who already see the world in similar ways.
When this happens, a pattern begins to form that researchers often describe as “ideological clustering.” Users gravitate toward others who think like they do, and the content they encounter begins to reflect and reinforce those same beliefs. As more like-minded voices enter the space, alternative viewpoints naturally become less visible, not always because they are removed, but because they are drowned out.
Over time, this environment can evolve into what is commonly referred to as an echo chamber. In these spaces, ideas are not regularly tested or challenged. Instead, they are repeated, affirmed, and strengthened through constant exposure. This repetition can gradually push beliefs further in one direction, not necessarily because they are more accurate, but because they are consistently supported without meaningful opposition.
Rapid Growth Outpacing Safety Systems
UpScrolled’s rise has been exceptionally fast, moving from a relatively unknown project to a platform with millions of users in less than a year. While that kind of growth may signal popularity, it can also place significant strain on the systems designed to keep users safe.
When platforms expand at this pace, there are often growing pains. Moderation teams may be too small to manage the volume of content being posted. Rules that exist on paper can be applied inconsistently, and harmful content may remain visible longer than intended simply because there are not enough resources to address it quickly.
While testing and evaluating this app, we found concerning material on the platform, including hate speech, conspiracy driven content, and posts that appear to support or glorify extremist ideologies. These types of content are not unique to any one platform, but their impact is closely tied to how quickly and effectively they are addressed.
This is where an important distinction needs to be made. Having community guidelines is one thing. Enforcing them in real time, at scale, and with consistency is something entirely different. In online environments, the speed and effectiveness of enforcement often matter far more than the policies themselves.
“Free Speech” Framing Without Friction
UpScrolled presents itself as a space built on freedom of expression, with very little interference in what users can post or share. For many people, that approach feels refreshing. It taps into a growing desire for transparency and open dialogue, especially among those who feel restricted on more heavily moderated platforms.
However, when barriers to posting and sharing are reduced, the same openness can also allow problematic content to circulate more easily. Misinformation can spread without being quickly challenged, extremist messaging can find an audience, and emotionally charged or polarizing content can gain traction simply because it provokes a strong reaction.
Environments like this can unintentionally create conditions where radicalization is more likely to take hold. Content can move quickly, reactions can escalate just as fast, and opposing viewpoints may struggle to gain visibility. When ideas are repeatedly reinforced without meaningful challenge, and emotional responses are continually heightened, it becomes easier for individuals to be drawn deeper into more rigid or extreme ways of thinking.
Emotional and Identity Based Content
A significant amount of the content gaining momentum on the platform is centred around global conflicts, identity, and issues tied to perceived injustice. These are important topics and not inherently harmful. However, they tend to carry a level of emotional intensity that can make them highly polarizing and, in some cases, more susceptible to manipulation.
For youth and teens, who are still developing their sense of identity and strengthening their critical thinking skills, this kind of content can feel especially powerful. It can come across as urgent, deeply personal, and framed in ways that present issues as clear cut or morally absolute. When content is experienced in this way, it can reduce space for nuance and reflection.
This is where vulnerability to influence can increase. When emotions are heightened and perspectives feel definitive, it becomes easier for young people to adopt viewpoints without fully exploring the complexity behind them.
Cross Ideological Risk (Not Just One Side)
It’s important for parents and caregivers to recognize that this is not an issue tied to any single belief system or political viewpoint. The risks associated with platforms like this are not confined to one side of the spectrum. Instead, they can emerge across a wide range of ideologies, depending on the type of content a young person is exposed to and how that content is presented.
We witnessed antisemitic narratives, extremist messaging, and conspiracy content that promotes or normalizes violence. While these examples may differ in origin or perspective, they share a common thread in how they can influence thinking and shape beliefs over time.
Radicalization, in this context, is less about a specific group and more about a process. It involves content that gradually pulls an individual toward more rigid, absolute ways of thinking, often limiting their ability to consider alternative viewpoints. As that process unfolds, balanced perspectives can become harder to access, and the space for critical reflection begins to shrink.
For many youth and teens, platforms like UpScrolled can feel different from the apps adults are more familiar with. They can come across as more authentic, less filtered, and more aligned with the idea of “real voices” being heard without interference. There is often a strong sense of community, where users feel like they are part of something shared, connected by common beliefs or experiences. That can be especially appealing during a stage of life where identity and belonging are still being shaped.
However, those same qualities can also introduce challenges that are not always immediately visible. When a platform leans heavily into one type of perspective or community, it can naturally limit exposure to a wider range of viewpoints. At the same time, content that is emotionally charged or polarizing tends to gain more attention, which can increase how often youth are exposed to it. Over time, repeated exposure to similar narratives can begin to normalize more extreme or one-sided ways of thinking, even if that was not the original intention.
What is important for parents and caregivers to understand is that this process is rarely sudden. It does not usually involve a dramatic shift in thinking overnight. Instead, it tends to unfold gradually, through small, consistent exposures that shape how a young person sees the world. Because of that, it can be difficult to notice in the moment, but its impact can build over time if left unexamined.
We see UpScrolled as part of a broader shift happening online, where platforms built around transparency and free expression can unintentionally create environments where harmful content spreads more easily.
For parents and caregivers, the purpose of this article is to bring attention to a relatively new app that, based on what we have observed, is not suited for youth and teens. Rather than reacting out of fear, our goal is to offer insight into how the platform functions, what kind of content is gaining traction, and the environment it creates for those who use it.
When we take the time to understand the structure and culture of an app, we move into a much stronger position as parents. It allows us to move beyond assumptions and make informed, thoughtful decisions about whether a platform aligns with our child’s level of maturity, critical thinking skills, and ability to navigate complex online spaces.
Ultimately, this is not about labeling an app as simply good or bad. It is about understanding the environment it creates, the types of interactions it encourages, and how those factors may influence a young person over time. With that understanding, parents and caregivers are better equipped to guide, support, and, when necessary, set appropriate boundaries around its use.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech
References