Democracy Now
February 19, 2026

“Predatory Tech”: Silicon Valley on Trial in Landmark Youth Social Media Addiction Case

Quick Read

A landmark trial against Meta and Google exposes how social media algorithms are purposefully designed to addict children, leading to severe mental health crises and even death, as victims and whistleblowers demand accountability.
Meta CEO Mark Zuckerberg testified in a landmark trial over youth social media addiction, facing allegations of intentionally designing addictive platforms.
Victims' families shared harrowing accounts of children's deaths linked to social media, from fentanyl exposure to 'blackout challenges'.
Facebook whistleblower Frances Haugen revealed Meta's deliberate underinvestment in child safety and optimization for engagement despite knowing the harm to young users.

Summary

A landmark case in Los Angeles has put Meta and Google on trial over youth social media addiction. The lawsuit, brought by 20-year-old Kaye, alleges that platforms like YouTube and Instagram fueled her depression and suicidal thoughts through their addictive design. Meta CEO Mark Zuckerberg testified, facing scrutiny over Instagram's practices. Parents of children harmed or killed by social media, including children who died of fentanyl poisoning after contact made on Instagram or who died attempting social media challenges, gathered to demand accountability. Facebook whistleblower Frances Haugen detailed how Meta underinvests in child safety and knowingly optimizes for engagement over user well-being, even when internal research showed harm. Attorney Laura Marquez Garrett explained that algorithms are programmed for engagement first, often pushing harmful or extreme content to vulnerable children rather than what they seek, leading to "compulsive use" that alters brain chemistry and behavior.
This trial is a pivotal moment for corporate accountability in the tech industry, aiming to expose the deliberate design choices that prioritize profit over child safety. It highlights the urgent need for transparency, independent verification of safety features, and potential regulatory changes to protect young users from algorithm-driven harms, mental health deterioration, and dangerous online interactions.

Takeaways

  • The ongoing trial against Meta and Google is the first major legal challenge seeking accountability for social media's addictive design and its impact on youth mental health.
  • Plaintiff Kaye, who started using YouTube at six and Instagram at nine, experienced severe depression and suicidal thoughts, spending up to 16 hours a day on platforms.
  • Parents of deceased children testified, linking social media to fentanyl poisoning, 'blackout challenges,' and body image issues leading to suicide.
  • Meta's internal researchers knew that body dysmorphia filters harmed children, but the company took no action.
  • Facebook whistleblower Frances Haugen confirmed Meta's underinvestment in child safety teams and prioritization of engagement over user well-being, even when aware of negative impacts.
  • Social media algorithms are designed for 'engagement first,' often pushing extreme or harmful content to keep children hooked, rather than the content they initially seek.
  • TikTok and Snapchat settled out of court before the trial, suggesting an unwillingness to face public scrutiny alongside Meta.

Insights

1. Deliberate Design for Addiction and Harm

Attorneys argue that social media companies intentionally built 'machines designed to addict the brains of children,' likening platforms to 'digital casinos' that profit from addictive behavior. Internal Meta research showed 18 out of 18 researchers knew body dysmorphia filters harmed children, yet no action was taken.

Kaye's attorney stated, 'These companies built machines designed to addict the brains of children, and they did it on purpose.' Lori Shott said, '[That] 18 out of 18 researchers, internal researchers for Meta, knew that body dysmorphia filters were harming children and they did nothing is unacceptable.'

2. Underinvestment in Child Safety and Prioritizing Engagement

Facebook whistleblower Frances Haugen revealed that Meta's child safety teams were severely under-resourced. The company ran experiments showing that features like not sending alerts in the middle of the night made kids less stressed and helped them sleep better, but these features were never launched because they reduced Instagram usage by 1%.

Haugen stated, 'The team that was responsible for finding people who were distributing child abuse material... was so strapped for resources that if you'd given them a single engineer more, they probably would have accomplished 10 times as much.' She added, 'They knew that the kids said these changes, things like don't send me an alert in the middle of the night made kids less stressed, let them sleep better, and yet they didn't launch them because it also made them use, you know, Instagram 1% less.'

3. Misleading Language Around 'Addiction'

Meta executives, including Instagram head Adam Mosseri, deny that users can be 'clinically addicted' to social media, relying on a narrow medical definition of addiction. However, this framing downplays the reality of 'compulsive use,' the scientific term for behavioral dependence that changes brain chemistry and significantly impairs children's ability to focus and interact.

Haugen explained, 'Addiction is a medical term... from a medical standpoint, that behavioral dependence is not considered medically to be addiction. But when you come in there and downplay what happens with compulsive use, which is the scientific term of art around these things, it really downplays how having a generation of children who get cooked at 7, 8, 9... changes their ability to sit still in class, to interact meaningfully face to face with their family or friends.'

4. Algorithms Push Harmful Extremes, Not User Intent

Social media algorithms are programmed for maximum engagement, often overriding a child's stated intent. For example, a child looking for 'inspirational quotes' might be fed content about 'breakup and suicide,' or a search for 'gay pride' could yield 'Westboro Baptist Church, you're going to hell' content, because these extremes are deemed more 'unlookawayable'.

Laura Marquez Garrett said, 'Children will look for uplifting speeches, inspirational quotes and they will get breakup and suicide.' She added, 'When I would look up gay pride on Instagram, I would get half gay pride and half Westboro Baptist Church, you're going to hell.'

Bottom Line

The legal system is forcing transparency from tech giants, revealing internal documents and testimonies that expose deliberate harmful design choices.

So What?

This transparency could be a catalyst for public pressure and legislative action, potentially leading to stronger regulations and liability for tech companies.

Impact

Advocacy groups and legal centers can leverage these revelations to strengthen their cases and push for comprehensive digital safety reforms, including independent audits of platform algorithms and safety features.

There is a significant market gap for 'ethical tech' platforms that prioritize user well-being and community over engagement and profit.

So What?

Current tech leaders are seen as 'lazy' and lacking innovation in creating truly safe digital spaces, leaving an unmet need for alternative platforms.

Impact

Innovators can develop and market social media platforms designed with built-in safety features, age-appropriate content filters, and user controls (e.g., slowing down algorithms) that appeal to parents and young people seeking healthier online environments.

Key Concepts

Predatory Tech

This term, used by Lori Shott, describes how big tech companies, instead of being neutral platforms, actively design their products with features and algorithms that exploit user vulnerabilities, particularly children, to maximize engagement and profit, much like a predator targets its prey. It implies a deliberate, harmful intent behind platform design.

Engagement-First Algorithms

This concept describes how social media algorithms are primarily programmed to maximize user engagement (time spent on the platform, interactions) above all other considerations, including user well-being or safety. This design choice can lead algorithms to push extreme, controversial, or emotionally charged content, even when it is detrimental to the user's mental health, because such content is effective at keeping them 'hooked' (illustrated in the sketch below).
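The following is a minimal, purely hypothetical sketch of what an engagement-first ranking objective can look like. Nothing here is drawn from any platform's actual code; the Post class and its fields (predicted_watch_time, predicted_reactions) are invented for illustration. The structural point is that when the scoring function rewards only predicted engagement, content the model expects to hold attention outranks content matching what the user actually searched for.

```python
# Hypothetical illustration only -- not any platform's real ranking code.
from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    predicted_watch_time: float   # seconds the model expects the user to spend
    predicted_reactions: float    # expected likes, comments, shares


def engagement_score(post: Post) -> float:
    # The objective rewards only attention and interactions;
    # user well-being never enters the calculation.
    return post.predicted_watch_time + 5.0 * post.predicted_reactions


def rank_feed(candidates: list[Post]) -> list[Post]:
    # Whatever the user searched for, the feed is ordered purely by
    # predicted engagement, so attention-grabbing content rises to the top.
    return sorted(candidates, key=engagement_score, reverse=True)


# A search for inspirational quotes: the distressing post is predicted to
# hold attention longer, so it ranks first.
feed = rank_feed([
    Post("inspirational quotes", predicted_watch_time=40.0, predicted_reactions=2.0),
    Post("breakup and despair", predicted_watch_time=180.0, predicted_reactions=9.0),
])
print([p.topic for p in feed])  # ['breakup and despair', 'inspirational quotes']
```

Because no well-being term appears in the objective, a safety improvement that costs even a small amount of engagement never wins, which is the design choice the trial testimony describes.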

Lessons

  • Parents should be aware that social media platforms may actively hide their true usage or content from them (e.g., apps hidden under calculator icons).
  • Advocate for independent, third-party verification of social media safety features, as companies' self-reported measures are insufficient.
  • Demand better from tech companies and policymakers, pushing for regulations that prioritize child safety and well-being over engagement-driven profit models.

Notable Moments

Juliana Arnold, whose 17-year-old daughter died of fentanyl poisoning after meeting someone on Instagram, shared her story, noting the trial coincided with her daughter's birthday.

This moment humanized the devastating consequences of social media's lack of safety, connecting the abstract legal proceedings to profound personal tragedy.

Todd Miner recounted his 12-year-old son Matthew's death from the 'choking or blackout challenge' promoted on social media, emphasizing how the algorithms amplified the dangerous trend.

It highlighted how algorithms can amplify dangerous trends, directly leading to fatal outcomes for vulnerable children, and the role of law enforcement in identifying these links.

Lori Shott described her daughter Annalie's suicide, attributing it to body image issues exacerbated by social media; she later discovered that Annalie had watched a live suicide on TikTok, an app she had hidden on her phone.

This illustrated the hidden harms and the depth of content children are exposed to, as well as the lengths children go to conceal their usage from parents.

Attorney Laura Marquez Garrett displayed a tattoo on her forearm with her children's names and 296 rays of sun, each representing a child lost to social media or AI products.

This visual representation powerfully conveyed the scale of the tragedy and the personal commitment of those fighting for accountability, making the abstract number of complaints tangible.

Quotes

"These companies built machines designed to addict the brains of children, and they did it on purpose."

Kaye's Attorney

"They're putting profits over our kids' lives and that's why this trial is so important because it's the first time the public and legislators are going to get the truth."

Juliana Arnold

"It's not about a verdict. It's not about who's right. It's not about a win. It's about our system working the way it's supposed to. It's about transparency. It's about truth."

Laura Marquez Garrett

"It's really not big tech, it's predatory tech. You know, the predators always blame victims and we were the victims of this."

Lori Shott

"If they just gave people a choice and said, 'Hey, we notice you're looking at more and more depressing content. Do you want to keep doing this?' You do very simple things like this. But when we look at the oligarchs that run these tech companies, we've set a norm that we are supposed to just trust them."

Frances Haugen
