Former Meta executive claims Instagram put his teenage daughter in danger from ‘predators’ during shocking testimony in New Mexico.

Testimony Reveals Alarming Social Media Risks for Minors

During a pivotal trial in New Mexico, shocking evidence emerged about the dangers minors face on social media. A former Meta executive's underage daughter encountered unsolicited sexual content shortly after setting up her first Instagram account.

Arturo Bejar, who oversaw safety initiatives at Facebook from 2009 to 2015, shared with jurors that he returned as a consultant in 2019 due to worrying messages his then-14-year-old daughter received, including explicit images. “I was right there when she made the account,” Bejar mentioned, visibly affected. “I had no idea it would expose her to predators or people soliciting inappropriate photos.”

Since leaving Meta in 2021, Bejar has become a vocal critic of the company, acting as a key witness for state prosecutors who contend that the tech giant has neglected the safety of children for profit.

His testimony continued with further questioning from prosecutors and subsequent cross-examination. Bejar told jurors he observed significant changes during his second stint at Meta, as the platform grappled with competition from newer rivals such as Snap and TikTok. He said efforts to enhance child safety were frequently blocked by higher-ups, including Mark Zuckerberg and Instagram's head, Adam Mosseri.

Bejar pointed to specific design gaps: for instance, users have no way to report the reason for blocking someone. He recounted a troubling early encounter his daughter had on the platform, when she received a direct message asking, "Would you like to have sex tonight?" with no effective way to report it.

He claimed that Meta lacked sufficient staff to address many reported safety violations, attributing this to algorithms that make it easier for offenders to locate potential victims. “This product—while it connects people effectively—can also link those with harmful intentions to children,” Bejar stated.

In response, Meta's representatives strongly rebutted Bejar's claims, asserting that his notion of compromised safety during his time there was "simply untrue." They emphasized ongoing research into teenagers' experiences with Instagram, both positive and negative, which has led to new features aimed at enhancing safety.

Some of these improvements include creating private accounts by default for teens, implementing stricter messaging settings, and tools for parental monitoring, allowing teenagers to report or block unwanted interactions more easily.

However, Bejar alleged that Meta misrepresents the level of harm occurring on its platforms. According to an internal study called the “Bad Experiences and Encounters Framework,” 16.3% of users reported encountering sexual content on Instagram within a week. In contrast, Meta published data indicating that only a minuscule percentage, around 0.02% to 0.03%, of Instagram’s views involved nudity or sexual imagery.

Meta defended its Community Standards Enforcement Report, stating that it outlines prohibited content and draws on expert insights to ensure policies are accurately enforced.

This trial is one of several legal challenges Meta faces, including another in California over claims of fostering teenage social media addiction. In opening statements, Meta’s lawyers highlighted their commitment to safeguarding young users and transparency with parents regarding potential risks.

Just before the trial commenced, a Meta spokesperson criticized New Mexico Attorney General Raúl Torrez for conducting what he termed an "ethically compromised investigation" that allegedly endangered real children to evaluate the company's safety practices.
