Heartbroken parents who have lost children — including myself — urge Congress to take action against social media companies.

Concerns Over Child Safety on Meta Platforms

A former Meta employee raised alarms about the company's handling of child safety, warning that Facebook and Instagram are putting young users at serious risk.

Reports indicate that many children face challenges to their self-esteem, which can lead to anxiety, depression, and even suicidal thoughts, after engaging with content on these platforms.

It’s also clear that a significant portion of users in Meta’s virtual reality spaces are kids.

Whistleblowers Jason Sattizahn and Cayce Savage testified this month before the Senate Judiciary Subcommittee, highlighting how the company often prioritizes its interests over the safety of children.

They disclosed troubling practices, including the erasure of evidence related to sexual abuse and manipulation of data to downplay harm to young users.

They also said they were unable to raise topics with Meta's legal team that might elicit unfavorable responses.

This situation isn’t shocking, really. I’ve encountered numerous stories showing how Big Tech can adversely affect children and adolescents.

My family has had a close encounter with this reality. My daughter, Becca, experienced a significant social media-related trauma.

She seemed like any typical kid—hanging out with friends, chatting constantly, and sharing her life online. But at 15, Becca and her friends connected with a group of older boys on the internet, culminating in an incident where she was drugged and assaulted at a party.

The aftermath was devastating. When someone posted a private image of her on Snapchat, it opened the door to relentless cyberbullying, making her situation even worse.

Though Becca had a strong support system, including family and therapists, she eventually turned to drugs. We only fully realized the gravity of her struggles when she suffered an overdose.

In an effort to keep her safe from local dealers, we moved to my sister’s place in Maine.

Unfortunately, social media continued to erode her attempts at recovery.

In Maine, she connected with people on Facebook who facilitated her drug use, right before she was due to enter a rehabilitation program. Tragically, the substances she procured were mixed with fentanyl.

The next day, I found one of her friends alive, but my own daughter was gone.

Access to illegal drugs through social media can be alarmingly easy—like ordering pizza or hailing a ride.

Becca’s story is just one example of why it’s critical to reconsider allowing children on these platforms. If that isn’t feasible, we need robust safeguards to protect them online.

Tech companies must be held accountable for creating environments that facilitate illegal activities leading to child exploitation and, in some cases, death. They should also answer for the implications of their design choices, particularly regarding algorithms.

Both Republican and Democratic senators were visibly disturbed by the whistleblowers' revelations. They expressed a commitment to urgently advancing online safety legislation for children.

The proposed bill would require major tech companies to adopt reasonable measures to safeguard their young users, comparable to what is expected across various other industries.

The Kids Online Safety Act (KOSA) specifically outlines a duty of care for platforms like Facebook and Instagram, requiring that products be designed to prevent harms like addiction, exploitation, and bullying.

Additionally, it would empower parents with tools to opt out of personalized algorithms and enhance their children’s privacy.

Last year, the Senate overwhelmingly passed KOSA, but House leaders have yet to bring it to a vote. Big Tech is reportedly spending millions to lobby against it.

Meanwhile, Meta is pouring money into massive projects, like a $10 billion AI data center in Louisiana, a reminder of the deep pockets behind the industry's pressure on Congressional leaders.

It would be a major mistake to dilute KOSA or let it stall without enacting meaningful change.

At the recent hearing, more heart-wrenching accounts emerged from parents who have lost their children. One poignant story was shared by Maurine Molak, whose son David took his life after enduring bullying on Instagram.

Another tragic case involved Brian Montgomery, who lost his son Walker three years ago to an online predator. The predator solicited explicit photos and then extorted him, driving Walker to suicide.

On average, American teens spend about nine hours a day online, leaving many parents in the dark about what is happening in that digital realm.

Tech companies excel at capturing attention and fueling anxiety, yet they operate with legal safeguards that allow them to act irresponsibly.

Sattizahn, one of the whistleblowers, was no short-term hire; he spent six years at Meta and hoped to use his PhD in Integrated Neuroscience to improve safety. Yet throughout his tenure, he consistently saw a troubling pattern of prioritizing profit over user safety.

He explained that product teams felt pressured to avoid any decrease in engagement and even reported being guided to restrict research looking into the platform’s harm.

Sattizahn lamented that research was often done for appearances. "We need some research," he said, describing an attitude that treated it as a box to check rather than a genuine inquiry.

Similarly, Savage, with a background in experimental psychology, shared findings from a study on youth safety. It became clear to her that user engagement took precedence over protecting children.

She expressed frustration, revealing that while Meta is aware of the dangers facing young users, it hasn’t taken meaningful steps to address them. There are many reports of children facing bullying, sexual assault, and exposure to inappropriate content on their platforms.

Regrettably, she couldn’t quantify how many children experienced these issues, as Meta did not permit her to conduct that research.

By analyzing users' voices, she believed she could discern their ages, and she often found many under 13. Meta, however, benefits from keeping these users active and has opted not to enforce age verification, since doing so would cut into its user numbers.

If the proposed online safety law takes effect, Meta could no longer ignore virtual assaults or the related dangers that frequently occur on its platform.

They also cannot bury valuable research or researchers simply to sidestep uncomfortable truths. The bill mandates that platforms must disclose user experiences and outline their strategies for combating issues like predation and cyberbullying.

I was unaware of what my daughter could access on her phone.

Many parents need to recognize that current parental controls don't effectively safeguard their children. The whistleblowers said fewer than 10% of parents use such controls, and children often know how to bypass them.

Keeping up with these digital landscapes is simply out of reach for most parents. Ultimately, the solution lies in making tech companies legally responsible for the harm they cause, similar to other industries.

They might lose some profit, but is that really a calamity?

As a mother who lost her child, I miss her every day. Holding these companies accountable is not too much to ask.
