Earlier this month, the Supreme Court decided not to hear the case of Doe v. Grindr, which many considered a significant test of Big Tech’s legal immunity under Section 230 of the Communications Decency Act.
This decision effectively forecloses any hope of justice for my client, who at age 15 was sexually assaulted by four adult men she met on Grindr. In declining to hear the case, the Court also left countless others who have suffered similar injustices without recourse.
It’s crucial to understand that this wasn’t a random act of violence. It stemmed from Grindr’s dangerous business model, which appears to prioritize profit over the safety of children. The app has reportedly advertised to younger demographics on platforms like Instagram and TikTok, with content aimed at high school and middle school students.
Grindr has no meaningful verification of photos, names, or ages, making it easy for minors to conceal their age and for predators to find them. According to some research, roughly half of sexually active gay adolescents report that their first sexual encounter was with an adult they met on Grindr.
Our legal argument was that Grindr knew children were using its platform and knew predators were targeting them. The company chose to overlook these risks, likely to boost engagement from underage users and increase ad revenue.
Other institutions have faced a reckoning for ignoring child exploitation. What will it take for today’s victims to receive the same justice?
Three of the four men who assaulted my client are now serving federal prison sentences. Yet Grindr, the platform that connected them to her, remains unscathed. The predators are behind bars; the platform that enabled them faces no consequences.
Grindr relies on Section 230 as its defense. Originally designed to protect internet platforms from liability for user-generated content, the law has been interpreted so broadly that it now serves as a near-complete shield for massive corporations.
Essentially, what began as legislation to promote the early Internet has morphed into a protective barrier for billion-dollar companies.
In my experience, Section 230 has often been used to block survivors from pursuing justice. In some instances, judges have felt constrained by it, leaving companies immune despite troubling allegations.
We did not seek to abolish Section 230 or limit online speech; our goal was to enforce existing law. Section 230 shields platforms from liability for content-moderation decisions, but not for how they design their products. When those design choices create risks, product liability law should offer a route to justice.
Some justices have acknowledged that Section 230 has been stretched too far. Justice Clarence Thomas has suggested that courts have expanded the law beyond Congress’s original intent. But by declining to hear Doe v. Grindr, the Court denied relief to victims like my client and allowed tech companies to continue operating without accountability.
It’s difficult to rely on these companies to self-regulate. Unless the Supreme Court takes action, reform will need to come from Congress.
On a brighter note, there is bipartisan agreement that the time has come to reform Section 230. Last year, the House of Representatives drafted a proposal to revoke Section 230’s protections for platforms that facilitate child sexual exploitation. And recent initiatives led by senators from both parties seek to hold companies accountable for harms caused by AI products.
In 2020, the Department of Justice recommended a series of amendments to Section 230, including carve-outs for companies that facilitate child abuse, terrorism, and cyberstalking. These changes could preserve an open Internet while preventing companies from profiting off negligence or exploitation.
The Supreme Court should seize the next opportunity to clarify Section 230’s boundaries. In the meantime, Congress must enact meaningful reforms to ensure that companies like Grindr don’t put children in harm’s way.
Both branches of government possess the ability to safeguard families and restore justice for those affected by technology. They have a moral duty to act before more children are harmed.