FTC says social networks' data privacy, safety policies are 'woefully inadequate'

Major social media and video streaming companies are facing renewed scrutiny from the Federal Trade Commission, which released a new report on Thursday accusing the platforms of significant violations of user privacy and failing to provide safeguards for children and young people.

The 129-page report, released Thursday morning, found that over the past four years, multiple social media and video streaming platforms have engaged in practices that “consistently” did not prioritize consumer privacy.

FTC Chairman Lina Khan said the report determined that these platforms “collect and monetize vast amounts of Americans' personal data,” raking in billions of dollars each year.

“While profitable for companies, this surveillance practice can endanger people's privacy, threaten their liberties and expose them to a range of harms, from identity theft to stalking,” she wrote in the release.

The companies covered by the report's investigation were Meta Platforms, YouTube, X, Snapchat, Reddit, Discord, WhatsApp, Amazon, which owns gaming platform Twitch, and ByteDance, which owns TikTok. The Hill has contacted the companies for further comment.

To conduct its investigations, the FTC in 2020 asked the nine companies for information about how they collect and track users' personal and demographic information, and whether they apply content algorithms or data vetting to this information. They were also asked how they decide which ads and other content to show users, and how their platforms may influence young people.

The FTC said the companies' data management and retention practices were “grossly inadequate” and that they collected vast amounts of data “in ways that consumers would not expect,” including through online advertising and purchasing information from data brokers, according to the report.

The FTC report said that some companies were increasingly using data for their artificial intelligence systems, but users often were not informed of how their data was connected to these products.

The FTC noted that not every finding applies to every company, describing the report instead as a general summary of approximately four years of research.

The agency's report also separately examined the impact of these practices on children and young people, concluding that they expose those users to “unique risks.” FTC officials noted that social media algorithms are particularly dangerous to young people because they can push harmful content, such as risky online challenges, that could negatively affect their health.

“The failure of some companies to adequately protect children and teens online is particularly disturbing, and the report's findings are timely as state and federal policymakers consider legislation to protect people from abusive data practices,” Khan wrote Thursday.

The report comes as user privacy, especially that of children and teens, has come under the spotlight of various members of Congress and child safety advocates. It comes one day after a House committee passed the Kids Online Safety Act (KOSA), advancing legislation aimed at strengthening children's online privacy and safety.

KOSA aims to regulate the types of features that tech and social media companies can offer to children online, reducing the addictive nature and mental health impact of these platforms.

KOSA previously passed the Senate with overwhelming support but could face challenges on the full House floor, where some Republicans worry the bill gives “broad powers” to the FTC and could lead to censorship of conservative voices, a House leadership source told The Hill this week.
