“Predatory Tech”: Silicon Valley on Trial in Landmark Youth Social Media Addiction Case
Takeaways
- The ongoing trial against Meta and Google is the first major legal challenge seeking accountability for social media's addictive design and its impact on youth mental health.
- Plaintiff Kaye, who started using YouTube at six and Instagram at nine, experienced severe depression and suicidal thoughts, spending up to 16 hours a day on the platforms.
- Parents of deceased children testified, linking social media to fentanyl poisoning, 'blackout challenges,' and body image issues that led to suicide.
- Meta's internal researchers knew body dysmorphia filters harmed children, but the company took no action.
- Facebook whistleblower Frances Haugen confirmed Meta's underinvestment in child safety teams and its prioritization of engagement over user well-being, even when it was aware of negative impacts.
- Social media algorithms are designed 'engagement first,' often pushing extreme or harmful content to keep children hooked rather than the content they initially sought.
- TikTok and Snapchat settled out of court before the trial, suggesting an unwillingness to face public scrutiny alongside Meta.
Insights
1. Deliberate Design for Addiction and Harm
Attorneys argue that the social media companies intentionally built 'machines designed to addict the brains of children,' likening the platforms to 'digital casinos' that profit from addictive behavior. Internal Meta research showed that all 18 researchers who studied body dysmorphia filters knew they harmed children, yet no action was taken.
Kaye's attorney stated, 'These companies built machines designed to addict the brains of children, and they did it on purpose.' Lori Shott said, 'That 18 out of 18 researchers, internal researchers for Meta, knew that body dysmorphia filters were harming children and they did nothing is unacceptable.'
2. Underinvestment in Child Safety and Prioritizing Engagement
Facebook whistleblower Frances Haugen revealed that Meta's child safety teams were severely under-resourced. The company ran experiments showing that features like not sending alerts in the middle of the night left kids less stressed and helped them sleep better, but these features were never launched because they reduced Instagram usage by 1%.
Haugen stated, 'The team that was responsible for finding people who were distributing child abuse material... was so strapped for resources that if you'd given them a single engineer more, they probably would have accomplished 10 times as much.' She added, 'They knew that the kids said these changes, things like don't send me an alert in the middle of the night made kids less stressed, let them sleep better, and yet they didn't launch them because it also made them use, you know, Instagram 1% less.'
3. Misleading Language Around 'Addiction'
Meta executives, including Instagram CEO Adam Mosseri, deny users can be 'clinically addicted' to social media, using a precise medical definition of addiction. However, this downplays the reality of 'compulsive use,' a scientific term for behavioral dependence that changes brain chemistry and significantly impacts children's ability to focus and interact.
Haugen explained, 'Addiction is a medical term... from a medical standpoint, that behavioral dependence is not considered medically to be addiction. But when you come in there and downplay what happens with compulsive use, which is the scientific term of art around these things, it really downplays how having a generation of children who get cooked at 7, 8, 9... it changes their ability to sit still in class, to interact meaningfully face to face with their family or friends.'
4. Algorithms Push Harmful Extremes, Not User Intent
Social media algorithms are programmed for maximum engagement, often overriding a child's stated intent. For example, a child looking for 'inspirational quotes' might be fed content about 'breakup and suicide,' or a search for 'gay pride' could yield 'Westboro Baptist Church, you're going to hell' content, because these extremes are deemed more 'unlookawayable'.
Laura Marquez Garrett described, 'Children will look for uplifting speeches, inspirational quotes and they will get breakup and suicide.' She added, 'When I would look up gay pride on Instagram, I would get half gay pride and half Westboro Baptist Church, you're going to hell.'
Bottom Line
The legal system is forcing transparency from tech giants, revealing internal documents and testimonies that expose deliberate harmful design choices.
This transparency could be a catalyst for public pressure and legislative action, potentially leading to stronger regulations and liability for tech companies.
Advocacy groups and legal centers can leverage these revelations to strengthen their cases and push for comprehensive digital safety reforms, including independent audits of platform algorithms and safety features.
There is a significant market gap for 'ethical tech' platforms that prioritize user well-being and community over engagement and profit.
Current tech leaders are seen as 'lazy' and lacking innovation in creating truly safe digital spaces, leaving an unmet need for alternative platforms.
Innovators can develop and market social media platforms designed with built-in safety features, age-appropriate content filters, and user controls (e.g., slowing down algorithms) that appeal to parents and young people seeking healthier online environments.
Key Concepts
Predatory Tech
This term, used by Lori Shott, describes how big tech companies, rather than acting as neutral platforms, actively design their products with features and algorithms that exploit user vulnerabilities, particularly children's, to maximize engagement and profit, much as a predator targets its prey. It implies deliberate, harmful intent behind platform design.
Engagement-First Algorithms
This model explains that social media algorithms are primarily programmed to maximize user engagement (time spent on platform, interactions) above all other considerations, including user well-being or safety. This design choice can lead to algorithms pushing extreme, controversial, or emotionally charged content, even if it's detrimental to the user's mental health, because it is effective at keeping them 'hooked'.
Lessons
- Parents should be aware that social media platforms may actively hide their true usage or content from them (e.g., apps hidden under calculator icons).
- Advocate for independent, third-party verification of social media safety features, as companies' self-reported measures are insufficient.
- Demand better from tech companies and policymakers, pushing for regulations that prioritize child safety and well-being over engagement-driven profit models.
Notable Moments
Juliana Arnold, whose 17-year-old daughter died of fentanyl poisoning after meeting someone on Instagram, shared her story, noting the trial coincided with her daughter's birthday.
This moment humanized the devastating consequences of social media's lack of safety, connecting the abstract legal proceedings to profound personal tragedy.
Todd Miner recounted his 12-year-old son Matthew's death from the 'choking or blackout challenge' promoted on social media, emphasizing the dangerous algorithms.
It highlighted how algorithms can amplify dangerous trends, directly leading to fatal outcomes for vulnerable children, and the role of law enforcement in identifying these links.
Lori Shott described her daughter Annalie's suicide, attributing it to body image issues exacerbated by social media and to the discovery that Annalie had seen a live suicide on TikTok, an app she had hidden on her phone.
This illustrated the hidden harms and the depth of content children are exposed to, as well as the lengths children go to conceal their usage from parents.
Attorney Laura Marquez Garrett displayed a tattoo on her forearm with her children's names and 296 rays of sun, each representing a child lost to social media or AI products.
This visual representation powerfully conveyed the scale of the tragedy and the personal commitment of those fighting for accountability, making the abstract number of complaints tangible.
Quotes
"These companies built machines designed to addict the brains of children, and they did it on purpose."
"They're putting profits over our kids' lives and that's why this trial is so important because it's the first time the public and legislators are going to get the truth."
"It's not about a verdict. It's not about who's right. It's not about a win. It's about our system working the way it's supposed to. It's about transparency. It's about truth."
"It's really not big tech, it's predatory tech. You know, the predators always blame victims and we were the victims of this."
"If they just gave people a choice and said, 'Hey, we notice you're looking at more and more depressing content. Do you want to keep doing this?' You do very simple things like this. But when we look at the oligarchs that run these tech companies, we've set a norm that we are supposed to just trust them."
Related Episodes

PBS News Hour full episode, Feb. 18, 2026
"Mark Zuckerberg testifies in a landmark social media addiction trial, while the U.S. shifts its Syria strategy, Iran braces for potential strikes, and a massive sewage spill impacts the Potomac River."

Matt McCusker | This Past Weekend w/ Theo Von #652
"Comedians Matt McCusker and Theo Von explore the absurdities of modern life, from ruthless backyard gardening and extreme extermination methods to the psychological impact of social media, AI's societal shifts, and the brutal realities of historical conquests."

The 2026 Midterms. Voters Demand Results, Not Promises | #TheOtherSideOfChange
"Progressive voters, particularly young, Black, and Brown communities, are deeply frustrated with the Democratic Party's incrementalism and lack of bold action, demanding tangible results over symbolic gestures as the 2026 midterms approach."

PBS News Hour full episode, March 26, 2026
"This episode details the escalating US-Iran conflict, Germany's debate on banning social media for children, the expansion of medical aid in dying laws, and Major League Baseball's new automated ball-strike system."