Parent anger at social media companies boils over ahead of tech CEO hearing 

As the Senate convenes the CEOs of social media companies on Wednesday to press them over online harms to children, parents and advocates say the time for discussion is over and Congress must act to protect teenagers.

Wednesday’s Judiciary Committee hearing will include parents who became advocates after losing children to harm allegedly caused by social media companies. The hearing will feature testimony from Meta CEO Mark Zuckerberg, TikTok CEO Shou Zi Chew, X CEO Linda Yaccarino, Snap CEO Evan Spiegel, and Discord CEO Jason Citron.

The hearing will focus on online child sexual exploitation, but advocates say the damage caused by social media companies extends to amplifying cyberbullying and spreading harmful content that promotes eating disorders and self-harm.

“We hope this is the last hearing to discuss the issue of unregulated social media for children and youth,” said Josh Golin, executive director of Fairplay, a nonprofit focused on children’s online safety.

“We’ve had a lot of discussions, and I think they’ve been enlightening and some very important points have been made. But in the end, saving children’s lives and building a safer, less addictive internet for kids isn’t about senators grilling CEOs, which they do in public hearings, but actually voting on bills,” he added.

A coalition of teens, parents and other advocates plans to appear at the hearing to promote the Kids Online Safety Act (KOSA), a bipartisan bill that would add regulations for social media companies like the five testifying on Wednesday.

While advocates have not been shy about accusing tech companies of doing too little to mitigate the risks posed by their services, they have also blamed lawmakers for failing to pass rules to hold companies accountable.

“There’s no question that this is a priority for everyone who cares about children,” Christine McComas told The Hill.

McComas’ daughter Grace committed suicide at age 15 in 2012 after a sexual assault and subsequent cyberbullying on Twitter, a platform now known as X and owned by Elon Musk.

McComas plans to attend Wednesday’s hearing holding up a poster showing her daughter’s face covered in hateful comments that were posted before her death.

McComas said parent advocates are “trying to share” their voices, but “the hardest thing is to continue to go out and tell your stories and nothing happens.”

“To think your voice was heard, only to find out again that for some reason nothing was done. It’s really a moral obligation to get this done,” she said.

Efforts to regulate how social media companies operate for minors online have gained momentum in recent years, particularly since Facebook whistleblower Frances Haugen came forward in October 2021.

Despite hearings in recent years with Haugen, a second Meta whistleblower, the head of Instagram, the CEO of TikTok, and executives from YouTube and Snapchat, lawmakers have yet to enact laws to protect children online.

Advocates say the lack of rules leaves the job of protecting children to parents, a role that is impossible for them to fulfill on their own.

McComas said watching harm happen to your child online and not being able to prevent it is like a “slow-motion car crash.” McComas said her daughter didn’t have a Twitter account or a smartphone at the time she was being cyberbullied.

Neveen Radwan said her then-15-year-old daughter developed an eating disorder in 2020 after being shown content encouraging risky behavior and challenges while searching for workout videos during the coronavirus pandemic.

Radwan had been proactive in trying to reduce risk, limiting screen time and using her background in IT to put strong settings on her children’s phones, but she said it wasn’t enough.

“I thought all the loopholes were closed. And yet, no, it slipped through the cracks,” Radwan told reporters during a virtual press conference hosted by the Tech Oversight Project.

“They can blame the parents all they want, but there’s nothing we can do to fight those algorithms. No matter what we do, we can’t fight them. They’re the big bad wolves,” Radwan said.

Social media companies are pushing back against widespread criticism that they are not doing enough to protect children and touting the policies they have put in place.

Meta, the parent company of Facebook and Instagram, has faced criticism over harm to young people online. Ahead of the hearing, the company announced a series of policy updates, including restricting content related to self-harm and eating disorders from teenage users, and released a framework supporting legislation that would add more parental controls.

Discord will aim to distance itself from the other companies by explaining how its platform’s business model differs, focusing on chat rooms rather than algorithms that push content, and by touting safety features aimed at teens, such as a “sensitive media” filter that blurs content. A Discord spokesperson said the company sometimes “nudges” teens who are socializing with strangers with whom they have no mutual friends.

Snap broke from the other companies set to testify when it announced its support for KOSA last week.

KOSA was introduced in the Committee on Commerce, Science, and Transportation, but it is likely to surface at the Judiciary Committee hearing as part of the discussion of how to regulate the companies.

The bill’s sponsors, Sen. Richard Blumenthal (D-Conn.) and Sen. Marsha Blackburn (R-Tenn.), are also members of the Judiciary Committee and plan to push for the bill’s passage at Wednesday’s hearing.

“Big tech executives have been making the same empty promises for years, even as children are being subjected to horrific harm online. Without real, enforceable reform, social media companies will pretend to care about the safety of young people publicly, only to continue to prioritize their own interests privately,” the senators said in a joint statement.

KOSA would create a duty of care requiring social media companies to prevent and reduce harm to minors. It would also require companies to conduct annual independent audits assessing risks to minors and compliance with the regulations.

KOSA left the Commerce Committee in July with bipartisan support, along with COPPA 2.0, a bill that would update data privacy rules for minors. Both bills were introduced in the last Congress, but were not called for a floor vote.

Golin said the bill needs to be passed this Congress.

“Children are dying every day because social media is unregulated,” he said.

“What we’ve seen so far is parents politely asking for online safety laws. If this drags on, we’re going to see some real outrage,” he added.

Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.