Meta, led by Mark Zuckerberg, has limited internal research into the safety of its virtual reality apps, according to whistleblower statements released Monday — including troubling claims that users of the platform made inappropriate advances toward children under ten.
Two whistleblowers, one being Jason Sattizahn, a former safety researcher at Meta, highlighted a disconcerting research trip to Germany taken in April 2023.
A German mother conveyed her reluctance to let her children engage with Meta’s VR headsets for interactions with strangers during this trip.
“Seeing her reaction, there was this wave of sadness that hit me,” Sattizahn reflected. “Her expression displayed a shattered perception of what she thought Meta’s tech was about.”
After the interviews, the researchers were reportedly instructed to erase all recordings and written records of allegations made by a teenager. Meta’s final report instead characterized German parents’ concerns as limited to the potential for groomers to exploit children in VR.
These and other allegations come from four current and former employees, backed by thousands of pages of documents, notes, and presentations.
Separately, leaked information indicates that Meta was aware of problems with children using the Oculus VR headset as early as April 2017.
“We have a significant issue regarding children, and it might be time to really address it,” read a message from an employee at that time.
Internal estimates suggested that as many as 90% of users in the metaverse could be minors.
An unnamed employee recounted an incident involving “three young kids (about 6 or 7) chatting with a much older man who was inquiring about their homes.”
“It’s just going to end up making headlines, and not in a good way,” the employee lamented.
A November 2021 presentation revealed that Meta instructed researchers at Reality Labs — the division responsible for VR development — to consider conducting “very sensitive research under the privilege of legal protection” to shield it from public scrutiny.
Team members were also advised to be “careful” with their wording in internal research, steering clear of terms like “not compliant” or “illegal.”
At one point during 2023, Meta reportedly instructed researchers to refrain from compiling data regarding the number of minors using their VR devices because of regulatory issues.
“To be clear: Meta has directed its researchers to erase evidence that the company is violating the law and knowingly endangering minors,” said Sacha Haworth, executive director of the Tech Oversight Project.
“This isn’t merely a significant obstacle; it goes deeper into examining Mark Zuckerberg’s leadership style and the toxic culture within Meta that encourages senior executives to act unlawfully,” she added.
The Senate Judiciary Committee will convene a hearing on Tuesday to discuss these whistleblower allegations.
Last week, Senators Chuck Grassley, Marsha Blackburn, and Josh Hawley criticized Zuckerberg for not adequately responding to inquiries, demanding a follow-up by September 16th.
A joint statement submitted to Congress in May by the whistleblowers indicated that Meta’s legal team systematically tried to monitor, and in some cases block, the release of internal safety research.
This effort appears to be a reaction to a 2021 leak that came from a former employee, Frances Haugen, who revealed that the company recognized the harmful impacts of apps like Instagram on young girls.
The new whistleblower argues that Meta aimed to “create a plausible deniability” about its awareness of safety risks.
Meta spokesperson Dani Lever countered that the whistleblowers’ assertions were based on selected examples tailored to fit a misleading narrative.
Lever said that since 2022, Meta has approved nearly 180 studies on social issues, including youth safety. These studies have informed product updates such as parental supervision tools that let parents see whom their children connect with in VR and how long they spend there.
“We’ve implemented automatic safety measures for teens to minimize unwarranted interactions, like the default voice channel settings in Horizon Worlds. We support our research team’s efforts and are disheartened by mischaracterizations of those efforts,” Lever added.
Lever neither confirmed nor denied that details of the Germany trip were erased from the final report, but said such deletion could be necessary to comply with European data protection laws.
According to Lever, those regulations require that data collected from minors under 13 without verified parental consent be discarded. Sattizahn countered that the mother had provided consent through a signed contract, and that such data generally does not need to be deleted after interviews.
Sattizahn claimed he was terminated from Meta in April 2024 after disagreements with management over the handling of safety investigations. Other researchers who participated in the Germany trip resigned in 2023 due to ethical dilemmas.
The remaining two whistleblowers are still employed by Meta, and all four are supported by a nonprofit organization called Whistleblower Aid.
“From the outset, we implemented safety features aimed at users over 13, clearly stated in the Oculus Safety Center, packaging, and user guides,” Lever’s statement continued. “As more individuals use these devices and as Meta rolls out its games and apps, we’ve increased protections for younger users.”
Zuckerberg was initially very committed to the “Metaverse,” rebranding from Facebook to Meta in 2021 to emphasize this focus. Yet, the company is now shifting most of its resources toward artificial intelligence.





