New Mexico's unredacted complaint against Mark Zuckerberg's Meta reveals horrific claims of sexual solicitation of children on Facebook and Instagram. One internal estimate found that 100,000 children are targeted by predators every day on Zuckerberg's platforms, including receiving "pictures of adult genitalia."
Digital Content Next CEO Jason Kint said in an X/Twitter thread that most of the redactions in the New Mexico Attorney General's complaint against Instagram and Facebook have been removed, and that seeing the previously redacted content is even more "shocking, upsetting, and stomach-churning." The mostly unredacted complaint is available here.
Wow, most of the redactions in the New Mexico Attorney General's complaint against Instagram and Facebook have just been removed. Seeing what was previously redacted (shown in yellow) is even more shocking, upsetting, and stomach-churning.
“2021 presentation estimates that 100,000 children per day…” /1 pic.twitter.com/k8vJGfeZvQ
— Jason Kint (@jason_kint) January 21, 2024
One of the highlighted sections of the complaint alleges that “100,000 children per day are subjected to online sexual harassment, including photos of adults' genitals.”
The complaint further states:
According to an internal document, Meta scrambled to respond in 2020 after the 12-year-old child of an Apple executive was solicited on its platform, with one employee warning that this was the kind of thing that would infuriate Apple and asking when "adults should stop messaging minors on IG Direct." The document added that accounts soliciting "sugar daddies" could not simply be taken down, because "then they'll respond with 100 more accounts. You can't beat them."
The lawsuit continues, citing a May 2018 presentation Meta gave to its audit committee, which acknowledged that while "user-provided data shows a decline in usage among young users," a disproportionate number of young users did not register with their exact age, making accurate age data unavailable.
Two years later, the complaint reads, a January 2020 presentation entitled "Success in American Messaging" revealed the depth of Meta's knowledge about children's use of Messenger, and the company's ambition to leverage that usage to push its products on younger generations.
The complaint also states that Meta recognizes that its platform is popular with children as young as 6 years old.
One of Meta's "endgame" goals was to "become the leading children's messaging app in the U.S. by 2022." The document confirms that "In the US, Messenger is popular with children (13% advantage, US 510th)." That Meta knew its platform was used by, and "popular" with, children as young as six makes the company's failure to protect minors from CSAM and solicitation even more egregious.
Many of these claims are based on internal information from the time of the harm. And there are still redactions we can't see. I can't imagine how bad the material that remains sealed must be. /3 pic.twitter.com/nQM5MMSLnD
— Jason Kint (@jason_kint) January 21, 2024
Elsewhere in the thread, Kint points to an internal conversation highlighted in the complaint in which Meta employees acknowledged that the company had done little to nothing to prevent the grooming of children on its social media platforms.
When one employee asked what exactly the company was doing about child grooming, another employee replied, "Somewhere between zero and negligible. It's explicitly a non-goal. I would argue that interop is making things even worse, but that's a can of worms."
“Somewhere between zero and negligible.”
This internal quote was included in an Associated Press report on a previous version of this complaint that was unsealed this weekend.
On Instagram and Facebook…
“The prevalence of 'sex talk' to minors is 38 times greater on Instagram Direct vs. Facebook Messenger.”
/5 pic.twitter.com/kMMqdw57K8— Jason Kint (@jason_kint) January 21, 2024
Additionally, Meta's "People You May Know" (PYMK) feature is said to have contributed to 75 percent of all inappropriate contact between adults and minors. Employees questioned "why on earth" the company didn't "ban PYMK between adults and children," given its "direct connection to human trafficking," the complaint added.
In another revealing internal conversation, an employee said that "teen self-harm and suicide are very difficult to explain publicly, so the current response appears complex and evasive," and that the existence of unenforceable age restrictions, along with "important differences in policy stringency between IG and Blue App [Facebook]," makes "it difficult to claim that we are doing the best we can."
Someone else then asked whether Meta's policies could be improved or whether it was an enforcement issue, to which the employee replied: "I can definitely say we need to improve our enforcement and our policies."
This is rich. Facebook and Instagram appear to have leaned into interoperability of their brands and infrastructure in light of the risk of structural and behavioral remedies from Germany/the EU and the US, but as noted here, inconsistent enforcement policies create risk. /7 pic.twitter.com/wSoPrmfBWl
— Jason Kint (@jason_kint) January 21, 2024
Another graph highlighted in the complaint appears to show that Instagram's own research found users between the ages of 13 and 15 to be more likely than the average user to be exposed to adult nudity and sex acts.
This chart. Instagram's own internal research shows 13- to 15-year-olds are more likely to be exposed to adult nudity and sex acts than the average user? Then note the last column on how publicly available data can be misleading or understate the potential harm. /8 pic.twitter.com/WmEL1MUthh
— Jason Kint (@jason_kint) January 21, 2024
You can follow Alana Mastrangelo on Facebook, on X/Twitter at @ARmastrangelo, and on Instagram.