Big trouble for the social media giant TikTok in France! A total of seven French families have launched a lawsuit against TikTok, accusing the platform of exposing their adolescent children to harmful content that they claim contributed to severe consequences, including two tragic deaths by suicide.
The lawsuit, filed in the Créteil judicial court near Paris, represents the first grouped case of its kind in Europe, according to Laure Boutron-Marmion, the lawyer representing the families.
What Are the Allegations Against TikTok?
The families assert that TikTok’s algorithm exposed their children, all teenagers, to an array of distressing videos that promoted suicide, self-harm, and eating disorders. These claims echo similar accusations that major social media platforms have faced in recent years regarding their content moderation practices and the potential mental health impacts of their algorithms.
“The parents want TikTok’s legal liability to be recognised in court,” Boutron-Marmion said in a statement to Franceinfo. “This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product’s shortcomings.” The case highlights an ongoing debate over the responsibilities of tech companies to protect vulnerable users, particularly minors, from harmful content.
Impact on Affected Families
The legal action has been deeply personal for the families involved, including the parents of Marie, a 15-year-old girl who died by suicide in 2021. Marie’s mother has previously shared that her daughter was able to access distressing content on TikTok without sufficient moderation, which she believes contributed to her tragic death.
Another family in the group also lost their daughter to suicide, while four of the other five teenagers involved in the lawsuit attempted suicide and struggled with severe mental health issues, including eating disorders.
These stories mirror those of other high-profile cases, such as that of Molly Russell, a British schoolgirl who died by suicide in 2017 after viewing content related to self-harm and suicide on Instagram and Pinterest.
Lawyer Boutron-Marmion noted that such cases are leading to increased awareness among parents about the potential risks associated with social media use.
“Parents are starting to wake up. Many of them were unaware of the horrors that were circulating on the platforms. While I have noticed a change in mentalities, the problem remains: addiction persists, including among adults,” Boutron-Marmion stated in an interview earlier this year.
How Has TikTok Responded?
TikTok, which is owned by the Chinese company ByteDance, is one of the most popular social media platforms in the world, boasting millions of young users. The company has faced mounting criticism and legal scrutiny over its content moderation policies and potential risks to children’s mental health.
In response to these concerns, TikTok has asserted that its community guidelines do not allow content promoting suicide, self-harm, or eating disorders and that it uses a combination of advanced technology and human moderation to enforce these standards.
In a statement addressing the current legal proceedings in France, TikTok said it had not yet received any formal notification of the lawsuit. The company reiterated that it takes issues related to children’s mental health seriously and emphasized its investments in safety measures. Earlier this year, TikTok’s CEO, Shou Zi Chew, told U.S. lawmakers that the company had committed significant resources to safeguarding young users.
Lawsuit After Lawsuit for TikTok
TikTok’s challenges are not confined to France. In the United States, the company is currently facing hundreds of lawsuits claiming that it and other major platforms like Meta’s Facebook and Instagram have enticed children into excessive use that damages their mental health. Last month, more than a dozen U.S. states and the District of Columbia filed lawsuits against TikTok, accusing it of fostering compulsive and excessive use among teenagers.