US court revives TikTok lawsuit over girl’s ‘blackout challenge’ death

A U.S. appeals court has revived a lawsuit against TikTok brought by the mother of a 10-year-old girl who died after taking part in the viral “Blackout Challenge,” which dared users of the social media platform to choke themselves until they lost consciousness.

Social media companies are generally protected by federal law from lawsuits over their content because the content is generated by third-party users.

But the Philadelphia-based U.S. 3rd Circuit Court of Appeals ruled on Tuesday that the law does not bar Nira Anderson's mother from suing TikTok and its Chinese parent company, ByteDance, because TikTok's algorithm recommended the challenge to her daughter.


A US appeals court has reopened a lawsuit brought by the girl's mother against TikTok over the app's algorithm recommending the “blackout” trend. Reuters

U.S. Circuit Judge Patty Schwartz, writing for the three-judge panel, said Section 230 of the Communications Decency Act of 1996 only provides immunity for information provided by third parties, and not for recommendations made by TikTok through the algorithms that underpin its platform.

She acknowledged that the ruling marks a departure from past decisions by her court and others, which have held that Section 230 shields online platforms from liability for failing to prevent users from sending harmful messages to others.

But she pointed to a U.S. Supreme Court ruling from July that found such algorithms reflect “editorial judgments” through which companies curate “third-party speech in any way they wish.”

Schwartz said company-specific algorithms reflect speech, and that speech is not protected by Section 230.

“TikTok selects the content it recommends and promotes to particular users, and in so doing engages in first-party speech,” Schwartz wrote.

Nira was rushed to the hospital in December 2021 after imitating a blackout challenge video she saw online.

She had used the strap of a handbag taken from her mother's closet and sustained serious injuries.

Nira spent several days in intensive care before she died.

The lawsuit by Tawayna Anderson was initially dismissed in October 2022 by a lower court judge, who cited Section 230.

“Big tech has lost its immunity,” Jeffrey Goodman, the mother's lawyer, said in a statement.

The ruling could expose tech companies to more lawsuits over content recommended by their algorithms.

In an opinion concurring in part with Tuesday's ruling, U.S. Circuit Judge Paul Matey said TikTok “pursues profits above all other values” and may choose to host content aimed at children that highlights the “vilest tastes” and “the lowest virtues.”
