CASSIS, France (AP) — In the moment when her world shattered three years ago, Stephanie Mistre found her 15-year-old daughter, Marie, lifeless in the bedroom where she died by suicide.

"I went from light to darkness in a fraction of a second," Mistre said, describing the day in September 2021 that marked the start of her fight against TikTok, the Chinese-owned video app she blames for pushing her daughter toward despair.

___

EDITOR'S NOTE — This story includes discussion of suicide. If you or someone you know needs help, the national suicide and crisis lifeline in the U.S. is available by calling or texting 988. An online chat is also offered, and helplines outside the U.S. are available as well.

___

Delving into her daughter's phone after her death, Mistre discovered videos promoting suicide methods, tutorials and comments encouraging users to go beyond "mere suicide attempts." She said TikTok's algorithm had repeatedly pushed such content to her daughter.

"It was brainwashing," said Mistre, who lives in Cassis, near Marseille, in the south of France. "They normalized depression and self-harm, turning it into a twisted sense of belonging."

Now Mistre and six other families are suing TikTok France, accusing the platform of failing to moderate harmful content and exposing children to life-threatening material. Two of the seven families have lost a child.

Asked about the lawsuit, TikTok said its guidelines forbid any promotion of suicide and that it employs 40,000 trust and safety professionals worldwide — hundreds of whom are French-speaking moderators — to remove dangerous posts. The company also said it refers users who search for suicide-related videos to mental health services.

Before killing herself, Marie Le Tiec made several videos to explain her decision, citing various difficulties in her life, and quoted a song by the Louisiana-based emo rap group Suicideboys, who are popular on TikTok.

Her mother also claims that her daughter was repeatedly bullied and harassed at school and online. In addition to the lawsuit, the 51-year-old mother and her husband have filed a complaint against five of Marie's classmates and her previous high school.

Above all, Mistre blames TikTok, saying that putting the app "in the hands of an empathetic and sensitive teenager who cannot tell what is real from what is not is like a ticking bomb."

Scientists have not established a clear link between social media use and mental health problems or psychological harm, said Grégoire Borst, a professor of psychology and cognitive neuroscience at Paris-Cité University.

"It's very difficult to show clear cause and effect in this area," Borst said, citing a leading peer-reviewed study that found only 0.4% of the differences in teenagers' well-being could be attributed to social media use.

Additionally, Borst pointed out that no current studies suggest TikTok is any more harmful than rival apps such as Snapchat, X, Facebook or Instagram.

While most teens use social media without significant harm, the real risks, Borst said, lie with those already facing challenges such as bullying or family instability.

"When teenagers already feel bad about themselves and spend time exposed to distorted images or harmful social comparisons," it can worsen their mental state, Borst said.

Lawyer Laure Boutron-Marmion, who represents the seven families suing TikTok, said their case is based on "extensive evidence." The company "can no longer hide behind the claim that it's not their responsibility because they don't create the content," Boutron-Marmion said.

The lawsuit alleges that TikTok's algorithm is designed to trap vulnerable users in cycles of despair for profit and seeks reparations for the families.

"Their strategy is insidious," Mistre said. "They hook children into depressive content to keep them on the platform, turning them into lucrative re-engagement products."

Boutron-Marmion noted that TikTok's Chinese version, Douyin, features much stricter content controls for young users. It includes a "youth mode" mandatory for users under 14 that restricts screen time to 40 minutes a day and offers only approved content.

"It proves they can moderate content when they choose to," Boutron-Marmion said. "The absence of these safeguards here is telling."

A report titled "Children and Screens," commissioned by French President Emmanuel Macron in April and to which Borst contributed, concluded that certain algorithmic features should be considered addictive and banned from any app in France. The report also called for restricting social media access for minors under 15 in France. Neither measure has been adopted.

TikTok, which faced being shut down in the U.S. until President Donald Trump delayed enforcement of a federal ban, has also come under legal scrutiny there.

The U.S. has seen similar legal efforts by parents. One lawsuit in Los Angeles County accuses Meta and its platforms Instagram and Facebook, as well as Snapchat and TikTok, of designing defective products that cause serious injuries. The lawsuit lists three teens who died by suicide. Another complaint accuses major social media companies, including YouTube owner Alphabet, of contributing to high rates of suicide among Native youths.

Meta CEO Mark Zuckerberg apologized to families who had lost children while testifying last year in the U.S. Senate.

In December, Australia enacted a groundbreaking law banning social media accounts for children under 16.

In France, Boutron-Marmion expects TikTok Limited Technologies, the European Union subsidiary for ByteDance — the Chinese company that owns TikTok — to answer the allegations in the first quarter of 2025. Authorities will later decide whether and when a trial would take place.

When contacted by The Associated Press, TikTok said it had not been notified about the French lawsuit, which was filed in November. It could take months for the French justice system to process the complaint and for authorities in Ireland — home to TikTok's European headquarters — to formally notify the company, Boutron-Marmion said.

Instead, a TikTok spokesperson highlighted company guidelines that prohibit content promoting suicide or self-harm.

Critics argue that TikTok's claims of robust moderation do not hold up.

Imran Ahmed, the CEO of the Center for Countering Digital Hate, dismissed TikTok's assertion that over 98.8% of harmful videos had been flagged and removed between April and June.

When asked about blind spots in their moderation efforts, social media platforms claim that users bypass detection by using ambiguous language or allusions that algorithms struggle to flag, Ahmed said.

The term "algospeak" has been coined to describe techniques such as using zebra or armadillo emojis to talk about cutting yourself, or the Swiss flag emoji as an allusion to suicide.

Such code words "aren't particularly sophisticated," Ahmed said. "The only reason TikTok can't find them when independent researchers, journalists and others can is because they're not looking hard enough."

Ahmed's organization conducted a study in 2022 simulating the experience of a 13-year-old girl on TikTok.

"Within 2.5 minutes, the accounts were served self-harm content," Ahmed said. "By eight minutes, they saw eating disorder content. On average, every 39 seconds, the algorithm pushed harmful material."

The algorithm "knows that eating disorder and self-harm content is especially addictive" for young girls, he said.

For Mistre, the fight is deeply personal. Sitting in her daughter鈥檚 room, where she has kept the decor untouched for the last three years, she said parents must know about the dangers of social media.

Had she known about the content being sent to her daughter, she never would have allowed her on TikTok, she said. Her voice breaks as she describes Marie as a "sunny, funny" teenager who dreamed of becoming a lawyer.

"In memory of Marie, I will fight as long as I have the strength," she said. "Parents need to know the truth. We must confront these platforms and demand accountability."

___

Associated Press writers Haleluya Hadero and Zen Soo contributed to this story.

The Associated Press. All rights reserved.
