TikTok sued by French families over harmful content
In a landmark case, seven families in France are suing social media giant TikTok, accusing it of exposing their children to harmful content that allegedly led two teenagers to take their own lives. The case, filed in the Créteil judicial court, argues that TikTok’s algorithm pushed content promoting self-harm, eating disorders, and suicide, putting vulnerable young users at risk. According to Laure Boutron-Marmion, the lawyer representing the families, this lawsuit is the first of its kind in Europe and aims to hold TikTok legally responsible for its impact on minors.
TikTok, a widely used platform among young audiences globally, said it had not received any formal notification of legal proceedings concerning these claims. In response to the allegations, the company stated that its community guidelines strictly prohibit content that promotes or glorifies self-harm or suicide. TikTok said it employs both advanced technology and human moderation to enforce these standards and maintain user safety.
The group lawsuit follows an earlier criminal complaint filed last year by the parents of Marie, one of the two teenagers who died by suicide after reportedly viewing harmful content on TikTok. Marie, whose last name has not been disclosed, was 15 when she took her life in 2021. Her mother claims that TikTok’s content contributed to her daughter’s mental health struggles and eventual death.
The case involves two teenagers who died by suicide and several others who faced serious mental health challenges. Four of the other five young women represented in the lawsuit reportedly attempted suicide, and one developed an eating disorder linked to content she encountered on the platform. Boutron-Marmion emphasized the need for TikTok to be held accountable for the safety of its young users, pointing out that the platform is a commercial service offering content to consumers, including minors, and must address its product's potential dangers.
TikTok, like other major social media networks, has faced rising scrutiny over its safeguarding practices and the potential mental health impacts on young users. In the U.S., more than a dozen states have recently filed lawsuits against TikTok, accusing it of contributing to a mental health crisis among teenagers. The European Union also launched an investigation last year to determine if TikTok had violated new safety regulations aimed at protecting minors.
Boutron-Marmion compared the case to other high-profile incidents involving minors and social media, such as the death of Molly Russell, a British schoolgirl who died by suicide in 2017 after being exposed to self-harm and suicide content on Instagram and Pinterest. Russell's case raised awareness of the need for greater accountability from social media platforms, which critics argue can expose vulnerable users to graphic and dangerous material.
In a previous interview, Boutron-Marmion noted that more parents are becoming aware of the types of content circulating on social media platforms and are pushing for change. Although awareness is growing, the issue of social media addiction remains widespread, affecting both minors and adults. This lawsuit signals a broader call for social media platforms to take more rigorous action in safeguarding their young users and ensuring harmful content does not reach vulnerable audiences.