Los Angeles County has filed a lawsuit against Roblox, a popular gaming platform for children, claiming it has enabled predators to groom and exploit minors.
The lawsuit asserts that Roblox misrepresented itself as a safe space while actually functioning as a “predator breeding ground.”
The platform has more than 151 million daily users, over 40% of whom are reportedly under the age of 13.
The complaint alleges that Roblox has built an extensive online environment where adults and children interact with minimal oversight, creating what the county calls “a foreseeable conduit for adults to access and target minors.”
The county contends that it is not just individual bad actors but the platform’s very structure that helps predators find and groom children.
Filed in Los Angeles Superior Court, the lawsuit claims Roblox’s design effectively makes children vulnerable to pedophiles, accusing the company of neglecting to implement reasonable safety measures, such as age verification, limits on communication, strong parental controls, and a reliable reporting system.
Officials highlighted that key protective measures were either lacking or insufficient to thwart exploitation.
Moreover, prosecutors noted that the company seems unwilling to invest in essential safety features, allowing children to create accounts without age checks and permitting interactions that could lead to contact with high-risk users.
“For years, Roblox has intentionally maintained platform conditions that foresee the systematic sexual exploitation and abuse of children,” the filing reads.
Roblox has faced increasing legal troubles recently, including lawsuits from attorneys general in Texas, Louisiana, and Florida, alongside numerous private lawsuits from families across more than 30 states.
The company has continually denied any wrongdoing, asserting that it “strongly disputes” the allegations and plans to defend itself robustly.
One father, Jason Sokolowski, believes the suicide of his 16-year-old daughter Penelope last February stemmed from years of grooming that began on Roblox.
Sokolowski, who lives in Vancouver, British Columbia, said he thought he was adequately monitoring his daughter’s online activity through tracking apps but didn’t grasp the full extent of the platform’s risks.
He said his daughter was initially contacted through Roblox before her conversations moved to Discord, where someone he described as a predator encouraged her to engage in self-harm.
He discovered two years’ worth of messages on her phone after her passing, including a disturbing photo showing her chest marked with the predator’s username.
He suspects she was targeted by an individual linked to a violent online group, known as 764, that promotes self-harm and violence against minors.
“They groom girls to do anything they can, whether it’s nudity, cuts, blood or violence,” he conveyed.
Reflecting on his daughter’s death a year later, he criticized tech companies for their failure to safeguard children, arguing that stronger safeguards could easily be implemented.
The Post has contacted Roblox for a response.
In its defense, Roblox claims that safety is integral to the platform, insisting that users cannot send images through its chat and that its AI monitors communications around the clock, reporting suspected exploitation to the National Center for Missing and Exploited Children.
