As juries shift their stance on social media’s impact on children, the strength of Big Tech begins to weaken.

Landmark Verdicts Against Social Media Companies Over Child Welfare

For years, a chorus of parents, teenagers, medical professionals, educators, and insiders has argued that social media negatively impacts the mental health of young users, potentially leading to addiction, eating disorders, sexual exploitation, and even suicide.

In a significant development, juries in two different states recently sided with those concerns.

A jury in Los Angeles found both Meta and YouTube responsible for the harm caused to children using their platforms. Meanwhile, a New Mexico jury concluded that Meta knowingly endangered children’s mental health and concealed knowledge about child sexual exploitation occurring on its services.

These verdicts were celebrated by advocacy groups, families, and child welfare organizations.

“The era of Big Tech invincibility is over,” declared Sacha Haworth, the executive director of The Tech Oversight Project. This newfound legal accountability, she added, reflects years of evidence and testimony that validate the experiences of young people and their families.

It remains to be seen whether these verdicts will foster real changes in the way social media companies engage with their younger users. Still, the outcomes indicate a shift in public sentiment against tech giants, potentially paving the way for more lawsuits and regulatory efforts. Historically, these companies have dismissed claims that their platforms cause psychological harm, attributing any issues to broader societal challenges or opportunistic bad actors.

In his testimony during the Los Angeles trial, Meta CEO Mark Zuckerberg appeared uncertain when questioned about the addictive nature of his company’s platforms, stating, “I’m not sure what to say to that. I don’t think that applies here.”

These verdicts reflect a growing public desire for accountability from tech companies. Whether Meta and Google will accept that shift is another matter: both have indicated they intend to challenge the verdicts in court.

Arturo Béjar, a former engineering director at Meta who raised concerns about the impacts of Instagram internally before testifying before Congress in 2023, pointed out that jury trials help level the playing field against these large corporations. However, he emphasized that real regulatory measures would be necessary to enforce change.

Béjar mentioned that effective change often occurs only when regulatory bodies step in and impose requirements on companies. He noted that the officials involved in these recent trials have a significant opportunity to push for important reforms.

While both cases focused on the risks posed to children, there are notable differences. The New Mexico lawsuit, initiated by Attorney General Raúl Torrez in 2023, involved state investigators posing as children on social platforms to collect evidence on sexual solicitations and assess Meta’s response. The jury was tasked with determining whether Meta violated state consumer protection laws.

In contrast, the Los Angeles trial featured a lone plaintiff referred to as KGM, who brought the case against Meta, YouTube, TikTok, and Snap. Notably, TikTok and Snap reached settlements before the trial. KGM’s case emphasized that the design of Meta and YouTube’s platforms intentionally encourages addiction among younger users. Many families have filed similar lawsuits, with KGM and a few others selected as test cases aimed at shaping broader legal strategies, reminiscent of past lawsuits involving tobacco and opioids.

By targeting product design and liability rather than content distribution, these lawsuits effectively navigated around Section 230, which typically protects internet companies from liability for user-generated content. Previous legal efforts focused on content distribution have often faltered due to this legal shield.

“For the first time, courts have held social media platforms accountable for the potential harm arising from their product design,” said Nikolas Guggenberger, an assistant professor at the University of Houston Law Center. This marks a notable shift in legal policy that could significantly alter an industry previously protected by Section 230.

The resolution of these cases may take years due to appeals and settlement negotiations. However, experts agree that public awareness regarding social media’s risks is already evolving. In a recent Pew Research Center survey, nearly half of teens expressed concern about the harms posed by social media, up from just over a third two years earlier.

As social media faces increased scrutiny, the emergence of artificial intelligence chatbots presents a new challenge in ensuring technology remains safe for young users.

“While we can address current harms, we can’t predict future risks,” said Sarah Kreps, a professor at Cornell University and director of its Tech Policy Institute. She emphasized that with each new technology, there is always the potential for new issues to arise. “People will be drawn to whatever new service emerges to meet that demand,” she added.
