Apple executive complained to Meta that his 12-year-old was ‘solicited’ on Instagram: lawsuit

In 2020, Meta employees scrambled after an Apple executive complained that his 12-year-old child was being "solicited" on Instagram, according to newly revealed details of a lawsuit.

The uproar comes as New Mexico Attorney General Raúl Torrez claims that underage users of Facebook and Instagram are being exposed to sex offenders and bombarded with adult sexual content on the apps. The revelations are part of a broader civil lawsuit he filed in December.

Unredacted internal Meta documents cited in the complaint show Meta employees "scrambling" to respond to the unnamed Apple executive's concerns, reportedly out of fear that Instagram could be kicked out of Apple's App Store.

"This is the kind of thing that would infuriate Apple enough to threaten to remove us from the App Store," the document said, according to the complaint.

Employees at Mark Zuckerberg's social networking giant discussed when the company would "stop allowing adults to send messages to minors on IG Direct," according to the complaint. In 2021, Meta began restricting adults from sending private messages to users under 19 who don't follow them.

According to the unredacted details, Meta employees called for the immediate removal of accounts associated with the phrase "sugar daddy," warning that Apple would "reply with 100 more accounts if we are unable to remove them."

Meta faces increased legal scrutiny over the safety of its platform. Reuters

In one internal chat in July 2020, a Meta employee reportedly asked what exactly the company was doing about child grooming, "something I just heard is common practice on TikTok." Another employee said the effort was "zero to negligible," adding: "Child safety is definitely not on the agenda this half."

The New Mexico complaint also alleges, in unredacted sections, that Meta "knew that a large amount of inappropriate content was being shared between adults and unknowing minors."

The lawsuit cited a 2021 internal announcement that “100,000 children per day experience online sexual harassment, including photos of adults' genitals.”

According to the complaint, another document from 2021 details internal discussions of concerns that Facebook's "People You May Know" feature was "directly related to human trafficking," an issue the company was allegedly aware of.

One Facebook employee allegedly wrote, "In the past, PYMK has been involved in up to 75% of all inappropriate contact between adults and minors." Another asked, "Why on earth didn't you disable PYMK between adults and children?" and described the lack of action as "really, really frustrating."

New Mexico accused Meta of failing to protect underage users. Just Right – Stock.adobe.com

The complaint alleges that in another internal presentation on child safety issues, from March 2021, Meta acknowledged a "lack of investment" in addressing the sexualization of minors on IG, including "conspicuously sexual comments."

A Meta spokesperson said the company has “spent a decade working on these issues and hired people who have dedicated their careers to keeping young people safe and supported online.”

"The complaint mischaracterizes our work using selective quotes and cherry-picked documents," a Meta spokesperson said in a statement.

Apple did not immediately respond to a request for comment.

The New Mexico Attorney General's Office's investigation into Meta included creating test Facebook and Instagram accounts purporting to depict children under the age of 14.

An unnamed Apple executive reportedly told Meta that his 12-year-old child was being solicited on Instagram. Maria Witkowska – Stock.adobe.com

The lawsuit alleges that Meta and CEO Mark Zuckerberg engaged in "unacceptable" conduct by failing to adequately police malicious content.

"For years, Meta employees have worked to raise the alarm about how decisions made by Meta executives are exposing children to dangerous solicitation and sexual exploitation," Torrez said in a statement. "Meta executives, including Mr. Zuckerberg, consistently made decisions that prioritized growth over children's safety."

Redactions lifted earlier this month revealed that two major corporate advertisers, Walmart and Tinder parent Match Group, had confronted Meta over their ads running next to content that sexualized underage users.

Meta also faces a separate lawsuit from a coalition of 33 state attorneys general, who accuse the company of profiting from "millions" of underage users and fueling a youth mental health crisis by building addictive features into its apps.

In the face of increased legal scrutiny, Meta has publicly touted its efforts to improve online safety for teen users.

Earlier this month, Meta announced that it would automatically place teens on Instagram and Facebook into the most restrictive content moderation settings, and that it had begun limiting search results for sensitive topics such as suicide, eating disorders and self-harm.

With Post wires
