Jury in L.A. holds Meta and Google responsible for harmful and addictive app designs aimed at children

Jury Rules Against Meta and Google in Social Media Addiction Case

After nine days of deliberation spanning more than 40 hours, a Los Angeles jury returned a significant verdict on Wednesday, finding that Instagram (owned by Meta) and YouTube (owned by Google) intentionally designed their platforms to “hook” young users.

The jury concluded that these platforms were built to promote addictive behaviors and awarded $3 million in compensation, holding Meta 70% liable for the damages and assigning the remaining 30% to Google.

The case involved a 20-year-old plaintiff from California, referred to as “Kaley” to maintain her privacy. She shared in her testimony that the addictive nature of these platforms contributed to her depression and suicidal thoughts. Both companies denied these allegations, claiming their platforms are safe and have robust parental controls. Still, this verdict represents a notable change in how social media design is scrutinized legally.

“I stopped engaging with family because I was spending all my time on social media,” Kaley said.

In response, José Castañeda, a spokesperson for Google, expressed disagreement with the verdict and said the company plans to appeal. “This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site,” he said.

Meta also voiced its disagreement, saying, “We respectfully disagree with the verdict and are evaluating our legal options.”

However, the jury found that both companies acted with “malice, oppression, or fraud,” leading to additional punitive damages beyond the $3 million in compensatory damages.

TikTok and Snap (the parent company of Snapchat) were initially named as defendants but settled before trial for undisclosed amounts, leaving only Meta and Google to face legal scrutiny.

Kaley recalled starting her social media journey with YouTube at age six and Instagram at nine. She explained that checking Instagram was her first action daily and the last before sleep. This constant engagement, she noted, significantly impacted her school performance, family life, and mental health.

Her legal team, headed by Mark Lanier, highlighted specific design elements, such as YouTube’s “autoplay” feature, which encourages continuous viewing. Kaley revealed that she has been diagnosed with anxiety, depression, and body dysmorphia. When asked whether she had experienced these issues before using social media, she answered, “No, I didn’t.”

In their defense, attorneys for Meta and Google contended that “social media addiction” isn’t recognized as an official medical condition and pointed out that Kaley had not sought treatment for such an issue.

This trial is now seen as a crucial moment in determining whether tech companies can be held legally accountable for their younger users’ mental well-being. Historically, large tech firms have been protected by Section 230 of the Communications Decency Act of 1996, which shields them from liability over user-generated content. However, this trial highlights that product design—specifically how users are engaged—does not necessarily fall under that protection.

This ruling comes on the heels of other significant legal setbacks for Meta, including a $375 million judgment in a separate case in New Mexico related to the platform’s failure to prevent child exploitation.

New Mexico Attorney General Raúl Torrez praised the jury’s decision as a crucial step toward justice, coinciding with a Delaware court ruling that released Meta’s insurers from financial responsibility for potential damages. This change imposes more financial burdens on Meta, requiring them to handle legal defenses and settlements without insurance support.

“Juries in New Mexico and California have recognized that Meta’s public deception and design features are putting children in harm’s way. My priority remains to change the company’s dangerous practice of prioritizing profits over children’s safety. We will seek court-mandated changes to Meta’s platforms to protect kids,” Torrez stated.

There are already 2,407 lawsuits against Meta, TikTok, Snapchat, and YouTube regarding social media-related harm, with legal actions initiated by over a thousand school districts and 43 states on behalf of affected children.
