Two Juries Just Cracked Big Tech’s Legal Shield
The big picture: In the same week, a Los Angeles jury found Meta and YouTube liable for harms to children, awarding $6 million, and a New Mexico jury ordered Meta to pay $375 million. Both cases bypassed Section 230 by focusing on deliberate product design rather than content moderation, opening the door for thousands of similar lawsuits.
Why it matters: These are the first major verdicts holding social media platforms accountable for how their products are designed, not just what users post on them. Advocates are comparing it to the start of the Big Tobacco litigation era. Critics warn it could threaten free speech and unleash a wave of frivolous lawsuits. Either way, the legal shield that’s protected Big Tech for decades just cracked. Twice in one week.
The LA case: A jury found Meta and YouTube liable for harms to a young user identified as KGM, who argued the platforms were intentionally designed to be addictive, especially for children. TikTok and Snap settled before trial. The jury awarded $6 million, with Meta paying the largest share. This was a bellwether trial chosen from thousands of similar cases waiting to proceed.
The New Mexico case: A separate jury ordered Meta to pay $375 million for failing to protect young users from child predators and misleading consumers about platform safety.
How they got around Section 230: Previous lawsuits focused on content and hit the Section 230 wall. These cases focused on product design and product liability, arguing the platforms were built to be addictive to children by design. The argument: “You didn’t just host harmful content. You engineered a product you knew would harm kids because it was profitable.” That distinction is what made the difference.
The celebration: Advocacy groups are calling it the end of “Big Tech invincibility.” A University of Houston law professor: “For the first time, courts have held social media platforms accountable for how their product design can harm users. This is new legal territory that could reshape an industry long shielded by Section 230.” People are comparing it to Big Tobacco.
The pushback: The Foundation for Individual Rights and Expression warned that holding platforms liable for “harmful” outputs could reduce available content “to the safest, blandest stuff imaginable” and affect what users can post. An R Street analyst predicted a “trial lawyer bonanza.” The Wall Street Journal editorial board called it “a novel product liability theory” that “won’t help young people.” Critics also question whether social media is truly “addictive” in the way cigarettes or alcohol are.
The international angle: The EU issued a preliminary decision against TikTok last month for “addictive design.” Today, EU regulators launched an investigation into Snap, alleging ineffective age verification and algorithms that misclassify teens as adults, directing them toward explicit content and dangerous contacts. Multiple U.S. states have enacted their own child safety laws; federal action has been slow.
By the numbers:
$375 million — New Mexico verdict against Meta
$6 million — LA verdict against Meta and YouTube
2 — major verdicts against Big Tech in one week
1,000s — similar cases waiting behind these bellwether trials
1996 — year Section 230 was enacted
0 — federal social media child safety laws passed by Congress
The bottom line: A few hundred million dollars is a rounding error for these companies. They’ll appeal. This will take years. BUT what these cases proved is that the product liability argument works. A jury will hear “you designed this to addict children” and hold the company responsible. The crack in Section 230’s shield is real. Whether it becomes a Big Tobacco reckoning or a legal footnote depends on what happens next. Thousands of families are betting it’s the former.
The New York Times | The Wall Street Journal | Common Sense Media
Thanks for reading!


