Juries have ruled against major social media firms in two pivotal cases, part of a wave of lawsuits claiming the platforms are harming children's mental health.
The verdicts hint at a shift in how courts regard tech companies' responsibilities toward younger users, and they could mark a significant moment for the industry's approach to safeguarding kids online.
In New Mexico, a jury recently imposed a $375 million penalty on Meta, concluding that the company harmed children's mental health while concealing the extent of child sexual exploitation on its platforms. Jurors set the fine at $5,000 for each violation of state consumer protection law, a figure that multiplied quickly given the large number of children's accounts involved.
A Meta spokesperson said the company disagrees with the ruling and plans to appeal, stressing its commitment to user safety while acknowledging the difficulty of identifying harmful content and bad actors. The spokesperson expressed confidence in Meta's efforts to protect teens on its platforms.
In California, a jury found that Meta and Google's YouTube owe at least $3 million to a 20-year-old woman who said she became addicted to social media as a child, leading to mental health problems. The jury also recommended an additional $3 million in punitive damages, which awaits a judge's final assessment. TikTok and Snap reached settlements before the trial began.
A Meta representative again said the company intends to appeal, arguing that teen mental health is a complex issue that cannot be attributed to any single app and pointing to Meta's record of building protections for younger users.
The rulings do not yet require any redesign of the platforms, which serve billions of users worldwide. The upcoming second phase of the New Mexico trial, however, could produce a court order mandating changes for users in the state. A state judge will decide whether Meta's conduct amounts to a public nuisance and could impose restrictions on the company or require it to fund programs addressing risks to children.
New Mexico Attorney General Raúl Torrez, who brought the lawsuit against Meta, wants stronger enforcement of age restrictions and a crackdown on sex offenders' presence on the platforms. His proposals even extend to decrypting communications that would otherwise obstruct police investigations.
Meta says it continually works to improve safety, citing changes such as limiting access to explicit content, blocking unsolicited messages from adults to minors, and restricting screen time for younger users. Both trials highlighted the addictive qualities of the platforms' algorithms and their adverse effects on children's mental health.
Following the California ruling, Google defended YouTube, arguing that it operates differently from social media platforms and is designed as a responsible streaming service.
The California case carries significant legal weight: it is considered a bellwether that could shape the outcome of numerous similar lawsuits, thousands of which are pending, many of them in California. The New Mexico ruling, in turn, could foreshadow further lawsuits from public officials.
Attorneys general from more than 40 states have already sued Meta, alleging that it has contributed to the mental health crisis among young people. Many of those cases are pending in U.S. federal courts.
Despite these verdicts, technology companies still enjoy protection under Section 230 of the Communications Decency Act of 1996, which shields them from liability for user-generated content. That legal shield has long complicated efforts to hold platforms accountable for alleged harms caused by their services.