The New Mexico and LA Social Media Rulings: Some Of Our Thoughts and Concerns
- The White Hatter


The recent civil trials in New Mexico and Los Angeles involving major social media platforms like Meta and YouTube have captured significant public attention, and rightly so. For many parents, caregivers, and digital literacy advocates, the outcome is a long-awaited acknowledgment that something has not been right when it comes to youth, teens, and their experiences online.
At its core, the jury’s findings in both trials validated a concern that many families, educators, and researchers have been raising for years. The design of these platforms is not neutral and can contribute to harm, particularly for some young users, and that matters! It’s an important legal step forward in recognizing that what happens online is not just about individual choices, but also about how digital environments are built to manipulate those choices, which can sometimes cause harm, especially when social media and tech companies are willfully blind to this fact!
However, there is a risk in how we interpret this moment. If the conversation stops at assigning blame to companies alone, we will miss the bigger and more important issue. The real story is not just about who is responsible, it’s about understanding the greater ecosystem that shapes behaviour in the first place.
As developer Justin Philips put it:
“Youth and teens don’t just use social media platforms, they follow pathways shaped by design and development.”
When youth and teens log onto a platform, they are not stepping into a neutral space. They are entering an environment that has been carefully and purposefully designed to guide attention, influence decisions, and shape behaviour over time.
Features like infinite scroll, autoplay, push notifications, and recommendation algorithms are not random. They are intentional design choices built to keep users engaged for as long as possible. Here at the White Hatter, we have argued for years that problematic social media use is real, and financially exploitative design amplifies this fact.
For adults, this can be challenging; however, for youth and teens, whose brains are still developing and who are naturally more sensitive to reward, social feedback, and novelty, these pathways can be even more influential.
So, when we see some youth and teens spending hours scrolling, comparing themselves to others, or being drawn deeper into certain types of content, it is not just about willpower or discipline, it's about how the system is designed to lead them there. To quote Justin Philips one last time:
“Less drift. More scaffolding.”
It is easy, and often emotionally satisfying, to look for a clear place to assign blame.
Blame the platform.
Blame the algorithm.
Blame the child.
Blame the parent.
However, none of these, on their own, gives us a complete or helpful picture. Yes, holding social media companies accountable for how their products are designed is important when it comes to harm, and these two US-based civil cases reflect a shift in that direction. At the same time, accountability at the corporate level does not replace the role we play as parents and caregivers; both need to exist at the same time.
So, where do these two trials, and the hundreds of others that will follow, leave families?
#1 It reinforces something we have been saying for years: technology is not neutral. The platforms our kids use are designed environments, and those designs matter.
#2 It highlights the importance of helping young people understand how those environments work. Instead of only asking, “How much time are you spending online?” we should also be asking:
“What is this platform trying to get you to do?”
“How does it keep you coming back?”
“How do you feel after using it?”
These are the kinds of questions that build agency, resilience, and awareness in our kids, not just compliance.
These two trials do not negate the fact that youth and teens still need parent, caregiver, and educator guidance. They need help recognizing persuasive design, understanding how algorithms shape what they see, and learning how to set boundaries that support their well-being.
Our concern is that headlines, lawsuits, and strong opinions can create a sense that the only solution is to pull young people away from technology altogether. But that approach often overlooks reality. Technology is a part of our kids’ social world, their learning environment, and increasingly their future opportunities. The goal is not to raise youth and teens who avoid technology; the goal is to raise children who understand it. When young people begin to recognize that what they see online is not just organic, but shaped by design, they are in a much stronger position to make informed choices.
The New Mexico and LA court decisions open the door to an important shift, not just in how we think about responsibility, but in how we approach solutions. They invite policymakers to look more closely at product design, encourage companies to take greater responsibility for how their platforms function, and give parents a powerful entry point for meaningful conversations at home.
However, this is not just about holding companies accountable, which is important; it’s also about helping our youth and teens navigate a world that is being shaped, more than ever, by systems designed to capture their attention and, now with AI, their affection and emotions. Parents and caregivers are the keystone in this strategy. Not because they can control every interaction, but because they can build the awareness, critical thinking, and emotional resilience that young people need to navigate these systems for themselves.
One last thought: the New Mexico verdict came in at $375 million, while the Los Angeles case resulted in a $6 million award. Those are significant figures by most standards. However, when placed alongside the scale of these companies, the picture changes. Meta Platforms generates roughly $160 billion annually, and YouTube brings in about $60 billion. In that context, these awards represent a relatively small financial impact.
Both companies have also indicated they intend to appeal, a process that could take years to fully play out. The broader question remains whether cases like these will lead to meaningful changes in how these platforms are designed and operate. Based on past performance, we believe these social media juggernauts will play the long legal game while continuing to maximize profits for their stakeholders for as long as they can.
There is also a larger issue to consider. What happens when similar platforms are based in countries that do not recognize or enforce U.S. court decisions? That raises important questions about the limits of national rulings in a global digital ecosystem.
As Diana Graber, from Cyber Civics, has stated:
“Waiting for tech companies or legislation to protect kids isn't a strategy. Teaching them to think critically about the digital world they live in is.”
When we understand these interconnected systems, we are better equipped to guide our kids. Not with fear, not with blame, but with clarity, intention, and a focus on what actually works.
When we focus only on blame, we risk narrowing our response. Blame is rooted in the past. It asks who is responsible, but it often stops short of helping us understand what comes next.
A systems-based approach does something different. It looks at how platforms are designed, how algorithms shape behaviour, and how emerging technologies like social AI are beginning to influence not just attention, but relationships, emotions, and decision making. That forward-looking lens matters, because the environment our kids are growing up in is not standing still, it is evolving quickly.
Digital literacy is what prepares young people for that reality. It gives them the ability to recognize patterns, question design, and adapt as platforms shift in response to legal pressure, public scrutiny, and market forces. These companies will continue to adjust, pivot, and innovate. The question is whether our kids will be equipped to do the same.
Many people are drawing comparisons between these cases and the tobacco trials of the past, and there are some parallels worth noting. Like those earlier cases, these social media trials have brought internal documents into the public eye showing that companies were aware that aspects of their products could cause harm to some youth. What the documents revealed in these two social media trials is that harmful design isn’t incidental or unintended, it’s built into the model itself. Engagement isn’t just a goal, it’s the product, and in many cases young users become the resource that fuels it.
However, there is an important distinction that needs to be made. Tobacco is a substance and a drug that is inherently harmful. There is no safe level of use, and for minors, abstinence is the appropriate response. Social media is different, it’s not a substance, and for many young people it can offer real value, including connection, creativity, learning, and a sense of community. We shouldn’t conflate these two realities.
The tobacco industry eventually faced serious legal and financial consequences. However, they did not disappear; they adapted. They shifted from traditional cigarettes to new delivery methods such as vaping and oral nicotine products, opening up new markets and continuing to generate billions in revenue.
We have predicted, and are seeing, a similar pattern beginning to emerge in the tech space. As pressure builds around traditional social media platforms, there are clear signs of a pivot toward social AI. These are not just tools for content consumption, but systems designed for interaction, companionship, and influence at a much deeper emotional level.
This matters because the underlying business model has not fundamentally changed. Attention, engagement, and now emotional connection remain central. If history teaches us anything, it is that industries under pressure rarely stand still, they evolve.
The question for parents, caregivers, educators, and legislators isn’t just what these platforms have done, but where they are going next, and whether we are preparing young people to understand and navigate that shift.
Yes, these social media platforms needed to be held accountable for the role they played in contributing to harm. However, if we truly want to support our kids over the long term, we cannot stay focused only on what has already happened, we also need to prepare them for what comes next. This is why digital literacy matters more than ever, particularly in the wake of recent jury verdicts and the emergence of social AI.
Digital Food For Thought
The White Hatter
Facts Not Fear, Facts Not Emotions, Enlighten Not Frighten, Know Tech Not No Tech