Allegations Against Meta Regarding Child Safety in VR
Mark Zuckerberg’s Meta is facing allegations from current and former employees that the company may have suppressed research into the safety risks its virtual reality products pose to children and teens.
Documents shared with Congress reveal that after a 2021 study prompted a Congressional hearing, Meta’s legal team began screening and editing research concerning child safety in VR.
Meta has firmly denied any misconduct in its research processes.
The disclosed documents include directives from Meta’s legal teams, advising researchers on how to handle sensitive topics that might result in negative public perception, legal issues, or regulatory scrutiny. For instance, researchers were told to avoid gathering data showing that children were using the VR devices “due to regulatory concerns.”
A Meta spokesperson responded to these claims, stating that some of the examples are “cherry-picked” to create a misleading narrative. The spokesperson said that since launching its initiatives in 2022, Meta has approved nearly 180 studies on social issues, including the safety of young users, and that this research has informed product updates such as parental connections, time-management features, and new supervisory tools for app access. Default settings limiting unsolicited contact with teens have also been implemented on the Horizon Worlds platform.
Moreover, the documents indicate that Meta did not develop parental controls for younger VR users until the FTC began investigating its compliance with child-protection requirements. Employees also raised concerns that children under 13 were bypassing age restrictions to access the company’s VR services.
In response to these concerns, a Meta spokesperson reiterated their commitment to providing “managed experiences for users aged 10-12.”
The spokesperson emphasized that the safety features built into Meta’s devices are designed for users over 13, as detailed in the Oculus Safety Center and other guides, and that Meta has since added further protections for young users, including improved reporting features and tools to manage interactions for users under 13.
However, researchers who submitted evidence to Congress maintain their claims. For example, Jason Satisarne, a former Meta employee specializing in VR safety, alleged he was terminated following a dispute over limitations on his research. Another researcher, who focused on youth and technology, left the company in 2023, citing ethical concerns.
Experts have long warned about the risks that immersive internet technologies pose to children, particularly the potential for direct contact with adult predators, a danger underscored by the ongoing legal challenges facing platforms such as Roblox over similar issues.
