A former employee claims Meta had a 17-strike rule for sex traffickers.

Concerns on Social Media Platforms’ Handling of Human Trafficking

The former safety chief of one of Mark Zuckerberg’s social media platforms has raised issues regarding the company’s approach to human trafficking cases.

This claim emerged from a plaintiff’s brief tied to a lawsuit involving Instagram, Snapchat, TikTok, and YouTube. The lawsuit was initiated in the Northern District of California, asserting that these social media platforms relentlessly prioritized growth while carelessly overlooking the detrimental effects their products have on children’s mental well-being.

Vaishnavi Jayakumar, who previously led safety and welfare at Instagram, expressed her distress upon discovering that the platform had a “17x” strike policy for individuals accused of sex trafficking. She noted, “There are 16 violations related to prostitution and sexual solicitation, and a 17th violation could lead to account suspension.” Jayakumar also mentioned that this standard seemed quite high when compared to other companies in the sector and insisted that Meta’s internal records bolster her statements.

Reports indicate that Jayakumar raised this concern in 2020 but was told that addressing it would be too difficult. By contrast, the brief claims it was significantly easier to report users for issues such as spam, intellectual property violations, and firearm promotion.

In response to the allegations, Meta firmly denied any wrongdoing, stating, “We strongly disagree with these claims, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading image.” The spokesperson emphasized that for over ten years, the company has listened to parents’ concerns, researched pivotal issues, and made real changes aimed at protecting teens. Initiatives like Teen Accounts with built-in protections were highlighted as evidence of its commitment to safety.

Nonetheless, the lawsuit contends that Meta knew about the harm its platform inflicted and recognized that numerous adults tried to contact minors using its app. It also claims that Meta halted an internal study indicating that individuals who reduced their time on Facebook experienced lower levels of depression and anxiety. Dubbed “Project Mercury,” this study was initiated in 2019 to assess the effects of Meta’s apps on factors such as polarization and social interactions.

Moreover, the lawsuit has drawn parallels between social media platforms and cigarettes, suggesting that these companies market their products to children in a similar way that tobacco firms did.

A spokesperson for Google responded to the allegations, stating that the lawsuit misunderstands how YouTube operates and described the claims as entirely false. They portrayed YouTube as a streaming service, not a social network aimed at connecting friends, and noted that tools have been developed for young users, with input from child safety experts.
