CASSIS, France (AP) — The moment her world shattered three years ago, Stephanie Mistre found her 15-year-old daughter, Marie, dead in her bedroom. Marie had died by suicide.
“I went from light to darkness in a few seconds,” said Mistre, describing the day in September 2021 that marked the beginning of her battle.
EDITOR'S NOTE: This story includes discussion of suicide. If you or someone you know needs help, the U.S. suicide and crisis lifeline is available by calling or texting 988, or by online chat at 988lifeline.org. For helplines outside the United States, visit www.iasp.info/suicidalthoughts.
Digging into her daughter's phone after her death, Mistre discovered videos promoting suicide, along with tutorials and comments. She said TikTok's algorithm had repeatedly pushed such content to her daughter.
“It was brainwashing,” said Mistre, who lives in Cassis, near Marseille in southern France. “They normalized depression and self-harm, turning it into a twisted sense of belonging.”
Mistre and six other families are now suing TikTok France, accusing the platform of failing to moderate harmful content and of exposing children to life-threatening material. Two of the seven families have lost a child.
Asked about the lawsuit, TikTok said its guidelines prohibit the promotion of suicide and that it employs 40,000 trust and safety professionals worldwide, including hundreds of French-speaking moderators, to remove dangerous posts. The company also said it refers users who search for suicide-related videos to mental health services.
Before taking her own life, Marie recorded several videos explaining her decision, citing various hardships in her life and quoting a song about suicide by an emo rap group popular on TikTok.
Her mother also alleges that Marie was repeatedly bullied and harassed, both at school and online. In addition to the lawsuit, the 51-year-old and her husband have filed complaints against five of Marie's classmates and her former high school.
Mistre singled out TikTok, saying it “puts something like a bomb in the hands of empathetic, sensitive teenagers who cannot tell reality from fiction.”
Scientists have not established a clear causal link between social media and mental health problems or psychological harm, said Grégoire Borst, a professor of psychology and cognitive neuroscience at Université Paris Cité.
“It is very difficult to show clear cause and effect in this field,” Borst said, citing peer-reviewed research suggesting that only about 0.4% of the variation in teenagers' well-being can be attributed to social media use.
Borst also noted that no current study suggests TikTok is more harmful than rival apps such as Snapchat, X, Facebook or Instagram.
Most teenagers use social media without harm, he said; the real risks lie with those already facing problems such as bullying or family instability.

“When teenagers already feel bad about themselves and spend their time online exposed to distorted images and harmful social comparison, it can worsen their mental state,” Borst said.
Laure Boutron-Marmion, the lawyer representing the seven families suing TikTok, said their case rests on “extensive evidence.” “The company can no longer hide behind the claim that it is not responsible because it does not create the content,” Boutron-Marmion said.
The lawsuit alleges that TikTok's algorithm is designed to trap vulnerable users in a cycle of despair for profit, and it seeks damages for the families.
“Their strategy is insidious,” Mistre said. “They hook children on depressive content to keep them on the platform, turning them into a profitable audience.”
Boutron-Marmion noted that Douyin, the Chinese version of TikTok, has far stricter content controls for young users, including a mandatory “youth mode” for users under 14 that limits screen time to 40 minutes a day and offers only approved content.
“It proves they can moderate content when they choose to,” Boutron-Marmion said. “They cannot claim those protections are impossible here.”
In France, a report titled “Children and Screens,” commissioned by President Emmanuel Macron and delivered in April, concluded that certain algorithmic features should be considered addictive and banned from any app available in France. The report also called for restricting social media access for minors under 15. Neither measure has been adopted.
TikTok, which faced a potential shutdown in the United States until President Donald Trump halted the ban, has come under scrutiny worldwide.
The United States has seen similar legal efforts by parents. One lawsuit in Los Angeles accuses Meta and its platforms Instagram and Facebook, along with Snapchat and TikTok, of being defectively designed products; it lists among the plaintiffs three teenagers who died by suicide. Another complaint, brought by two tribal nations, accuses major social media companies, including YouTube's owner, of contributing to high suicide rates among Indigenous youth.
Meta CEO Mark Zuckerberg apologized to parents who had lost children while testifying before the U.S. Senate last year.
In December, Australia enacted a groundbreaking law banning social media accounts for children under 16.
In France, Boutron-Marmion hopes that TikTok Limited Technologies, the subsidiary of TikTok's Chinese owner, will answer the claims in court in the first quarter of 2025.
Contacted by The Associated Press, TikTok said it had not been notified of the French lawsuit, which was filed in November. Boutron-Marmion said that once the French judicial system processes the complaint, authorities in Ireland, where TikTok has its European headquarters, can formally notify the company.
Instead, a TikTok spokesperson pointed to the company's guidelines prohibiting content that promotes suicide and self-harm.
Critics argue that TikTok's claims of robust moderation fall short.
Imran Ahmed, CEO of the Center for Countering Digital Hate, rejected TikTok's claim that it flagged and removed more than 98.8% of harmful videos between April and June.
Asked about blind spots in the platform's moderation, Ahmed said users can evade detection by using vague language or allusions that algorithms struggle to flag.
The term “algospeak” was coined to describe such techniques, such as using zebra or armadillo emojis to talk about cutting oneself.
Ahmed said such code words are “not particularly sophisticated.” “When independent researchers and journalists can find them, the only reason TikTok cannot is that it isn't looking hard enough,” he said.
In 2022, Ahmed's organization conducted a study simulating the experience of a 13-year-old girl on TikTok.
“Within 2.5 minutes, the accounts were served self-harm content,” Ahmed said. “Within 8 minutes, they saw content about eating disorders. On average, the algorithm pushed harmful material every 39 seconds.”
The algorithm “knows that eating disorder and self-harm content is particularly addictive for young girls,” he said.
For Mistre, the battle is deeply personal. Speaking from her daughter's bedroom, its decorations left untouched for the past three years, she said parents need to understand the dangers of social media.
She said she would never have allowed Marie on TikTok had she known what the app was sending her daughter. Her voice broke as she described Marie as a “sunny, funny” teenager who dreamed of becoming a lawyer.
“In Marie's memory, I will fight as long as I have the strength,” she said. “Parents need to know the truth. We must confront these platforms and demand accountability.”





