Concerns Over Social Media’s Impact on Youth
As George Santayana warned, “Those who cannot remember the past are condemned to repeat it.” The line rings true as we look back at the 1990s, when tobacco executives swore under oath that nicotine wasn’t addictive. They were lying. Later revelations showed that cigarettes had been engineered to heighten their addictive properties, and that children were deliberately targeted to secure the next generation of smokers.
Now we’re seeing a similar playbook. This time, though, the product isn’t cigarettes. It’s algorithms.
A new generation of giants, Meta, TikTok, Snap, and Google among them, has built systems designed to enthrall children in ways we may not fully grasp yet. Instead of lung damage, the harm plays out inside their developing brains.
On February 9, a pivotal jury trial kicked off in California, one that could reshape how social media is regulated. Attorney Mark Lanier put the crux of the case plainly in his opening statement: “These companies are producing machines designed to poison children’s brains, and they did it intentionally.”
The plaintiff, referred to as KGM, is taking Meta (Instagram) and YouTube to court, alleging serious mental health harm from social media addiction. Notably, Snap and TikTok settled before trial, avoiding a public proceeding that could have surfaced internal documents and put executives on the stand.
This lawsuit is significant because it sidesteps Big Tech’s usual defenses, Section 230 immunity and the First Amendment, by arguing that the harm comes not from user content but from product design: algorithms engineered to keep users engaged for as long as possible.
Just as Big Tobacco once added ammonia to cigarettes to speed nicotine absorption, Big Tech has built dopamine loops that exploit weaknesses in our impulse control.
This isn’t merely speculation. It’s been documented.
The platforms rely on variable rewards, the same intermittent-reinforcement mechanic that makes slot machines addictive. When children refresh their feeds, they can’t predict whether anything exciting will appear. That uncertainty triggers dopamine release, and the anticipation itself builds a checking habit that’s hard to break.
Infinite scrolling and autoplay eliminate natural stopping points, and push notifications are timed to pull users back the moment their attention drifts. These aren’t just communication tools; they’re systems built to modify behavior.
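To make that mechanic concrete, here is a deliberately simplified sketch of a variable-ratio reward schedule, the pattern the two paragraphs above describe. Everything in it is invented for illustration, from the function name to the 30% payout odds; no platform publishes its real ranking logic, which weighs far more signals than a coin flip.

```python
import random

# Illustrative toy model only: a variable-ratio reward schedule,
# the slot-machine mechanic described above. This is NOT any
# platform's actual code; the payout probability is invented.

REWARD_PROBABILITY = 0.3  # hypothetical odds a refresh surfaces something exciting

def refresh_feed() -> str:
    """Simulate one pull-to-refresh.

    The user can't predict which refresh will pay off, and that
    uncertainty (intermittent reinforcement) is what drives
    compulsive re-checking.
    """
    if random.random() < REWARD_PROBABILITY:
        return "novel, high-engagement post"  # the intermittent "win"
    return "filler content"                   # the miss that prompts another pull

# Ten refreshes yield an unpredictable mix of hits and misses.
for pull in range(1, 11):
    print(f"refresh {pull:2d}: {refresh_feed()}")
```

The design point is the unpredictability itself: behavioral research has long shown that a fixed schedule (“a good post every tenth refresh”) is easy to walk away from, while a variable one produces the most persistent checking behavior.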
Internally, Meta employees referred to Instagram as a “drug” and acknowledged that the platform worsened body image issues for a significant share of teenage girls. Yet the metric that consistently won out was “time on device,” the number that drives ad revenue.
Some employees left, sounding alarms and bringing evidence with them. One internal study found that 32% of teenage girls who already felt bad about their bodies said Instagram made them feel worse, and roughly 40% of teenage boys reported harmful social comparison. The cycle fed itself: users who engaged with eating disorder content used the app more as their mental health declined.
Engagement drives revenue; that’s the model. And when these studies landed on leadership’s desks, the discussion wasn’t how to address the harms but how to keep the findings from getting out.
Former Meta engineering director Arturo Bejar conducted an internal review after his daughter experienced unwanted attention on the platform. The findings were striking: over half of users reported negative interactions, and nearly a quarter of teens had received unwanted sexual advances. Yet only about 2% of the harmful content was actually removed.
Just as troubling, the safety tools Instagram touted as solutions were often ineffective or missing entirely. Vulnerable kids could easily reach disturbing content despite age restrictions, while parents believed their children were protected.
In 2023, the US Surgeon General issued a formal advisory warning that social media carries a profound risk of harm to youth mental health.
But the harm extends beyond anxiety and self-harm. Social media platforms have also become digital drug markets. On apps like Snapchat and TikTok, dealers use emoji codes and disappearing messages to sell pills, and kids end up buying counterfeit drugs laced with fentanyl.
Hospitals are witnessing the fallout. So are morgues. In fact, over 40 states are currently pursuing legal action against these tech companies.
The arc mirrors the fall of Big Tobacco: denial, then whistleblowers, then lawsuits that forced major regulatory change.
As Congress debates and the cases move through the courts, parents are left to protect their children from products designed to exploit their vulnerabilities. Experts suggest several steps:
- Delay giving your child a smartphone for as long as possible.
- Disable autoplay and push notifications.
- Discuss how these platforms function.
- Support legal and legislative efforts to impose a duty of care on these platforms.
Big Tech wagered that it could lure children faster than regulations could catch up. For a time, that gamble reaped rewards. But now, we’re beginning to see the smoke dissipate, revealing the fire beneath.
It’s crucial to recognize that these platforms are not neutral spaces. They’re risky products that should be approached with caution. Transparency, age-appropriate designs, and accountability are essential moving forward.
Big Tobacco eventually faced its reckoning. Now it’s Big Tech’s turn.