Mother Sues Roblox and Discord After Son’s Tragic Death
The mother of a 15-year-old boy from California has filed a lawsuit against Roblox and Discord, alleging that the gaming platform and the messaging service facilitated her son’s sexual exploitation and subsequent suicide. The lawsuit, filed in San Francisco County Superior Court by Rebecca Dallas, claims that her son, Ethan, began using Roblox at the age of nine with parental approval and parental controls in place. When he turned 12, however, Ethan fell victim to an adult predator who posed as a child on the platform and befriended him.
The lawsuit details how the interactions between Ethan and the predator escalated, eventually leading to explicit exchanges about sexual topics. The predator convinced Ethan to disable the parental controls and move their conversations to Discord, where demands for explicit photos and videos grew more aggressive, including threats to share Ethan’s images if he did not comply. The emotional toll of these experiences reportedly led to Ethan’s suicide in April 2024, when he was just 15 years old.
Rebecca’s lawsuit claims both companies ignored warnings about user safety, alleging negligence, misrepresentation, and a lack of proper safety measures that could have prevented Ethan from interacting with such dangerous individuals. She asserts that had Roblox and Discord implemented better age verification and screening practices, her son might have avoided those harmful encounters.
Initially, Rebecca believed the platforms were secure for her son to connect with friends while gaming, supported by the parental controls she set up. However, she argues that Roblox permitted Ethan to disable these controls, opening the door for him to communicate with adults unsupervised.
The lawsuit points out that the design of the Roblox platform lacks adequate safeguards against predators, making it easier for them to target minors. Furthermore, it claims that the default settings allow adults to message children under 13, while kids can simply fabricate their birthdates, thus bypassing any age restrictions.
In a related case, Roblox is facing another lawsuit from Louisiana Attorney General Liz Murrill, who asserts that inadequate safety protocols on the platform have created an environment conducive to exploitation. Murrill has expressed concern that the platform prioritizes user growth and profits over the safety of children.
Both Roblox and Discord have faced similar lawsuits in the past over grooming incidents that, fortunately, did not end in tragedy. One case involved a plaintiff who joined the platforms believing they were safe, only to receive inappropriate messages from strangers, culminating in a disturbing exchange involving gift cards.
The ongoing concerns about child safety on these platforms have prompted authorities to take action, signaling a critical need for improved protective measures in online environments.