Elon Musk’s social media platform X (formerly Twitter) plans to hire 100 content moderators to further its goal of cracking down on online child sexual abuse content, X executives said Saturday, according to a New York Post report.
A Friday blog post from X provided an update on the company’s efforts to eradicate child sexual abuse material (CSAM) from its platform.
“At X, we have zero tolerance for child sexual exploitation (CSE) and are determined to make X an inhospitable place for those who seek to exploit minors. We have made it clear that combating online CSE is a priority,” the blog post read.
X announced plans to build a “Trust and Safety center of excellence” in Austin, Texas, where it will hire more in-house content moderators to ensure CSAM is removed from the platform.
Joe Benarroch, X’s director of business operations, said the center’s “team is currently being built,” the Post reported.
CEO Linda Yaccarino wrote in a recent post on X: “We will always do everything we can to keep children and minors safe. @X is a brand new company, and over the past 14 months we have strengthened all our policies and enforcement against malicious, exploitative content.”
The company aims to complete the center’s opening and recruitment by the end of the year.
According to X’s latest report, the platform suspended 12.4 million accounts last year for “violating CSE policies.”
“This is an increase from 2.3 million accounts in 2022,” the company noted. “In addition to acting under our rules, we also report to the [National Center for Missing & Exploited Children]. In 2023, X submitted 850,000 reports to NCMEC, including our first fully automated reports, more than eight times the number Twitter sent in 2022.”
Prior to the introduction of “Fully Automated NCMEC CyberTipline Reporting,” employees had to manually create and review reports.
Blaze News previously reported that in December, X stepped up efforts to combat CSAM by scanning all uploaded videos and GIFs.
“Not only are we detecting more bad actors faster, we’re also building new defenses that proactively reduce the discoverability of posts containing this type of content,” the company’s press release stated. “One such action we recently took resulted in a more than 99% reduction in successful searches for known child sexual abuse material (CSAM) patterns since December 2022.”
The U.S. Senate Judiciary Committee is scheduled to hold a hearing on January 31, where lawmakers will discuss the failures of social media companies to combat online child sexual exploitation content. The senators will hear from Yaccarino, Meta CEO Mark Zuckerberg, TikTok CEO Shou Zi Chew, Snap CEO Evan Spiegel, and Discord CEO Jason Citron.
