French Prosecutor Launches Investigation Into TikTok Algorithm’s Alleged Role in Encouraging Suicide

French judicial authorities announced on Tuesday, November 4, that a criminal investigation had been opened into TikTok, the international short-video platform owned by China's ByteDance, focusing on whether its algorithm may foster suicidal tendencies among teenagers. The action stems from a French parliamentary report accusing TikTok's recommendation mechanism of posing serious psychological risks to minors.

Paris prosecutor Laure Beccuau stated that the investigation was opened on the recommendation of a special parliamentary committee and is aimed at determining whether TikTok is "endangering the lives of adolescents" through its algorithmic content recommendations.

François Jolivet, chairman of the French Parliament's special committee on social media and mental health, said in the committee's final report on September 11 that TikTok's design "intentionally harms the health and lives of users": its highly personalized recommendations can keep teenagers immersed for prolonged periods in negative content related to self-harm and suicide. The committee unanimously decided to refer the report to the judicial authorities and requested a criminal investigation.

The committee's inquiry into TikTok's effect on teenagers' mental health was prompted by a 2024 lawsuit in which seven families accused the platform of promoting self-harm and suicide content that led their children down a destructive path.

The report found that TikTok, owned by China's ByteDance, lacks adequate content moderation, is easily accessible to minors, and runs a highly sophisticated algorithm that can trap psychologically vulnerable users in a cycle of "self-harm and suicide content."

According to the Paris prosecutor, the case has been handed to the police cybercrime unit for investigation. The charges include "providing a platform for promoting suicide methods," which carries a maximum sentence of three years in prison upon conviction.

A TikTok spokesman said via email that the company "strongly denies the allegations" and will vigorously defend itself, emphasizing that the platform offers more than 50 youth-protection features and that 90% of rule-violating videos are removed before they are ever viewed.

ByteDance, TikTok's parent company, faces similar legal challenges in multiple countries. Numerous lawsuits in the United States accuse the platform's algorithm of exacerbating adolescent anxiety and mental-health problems, drawing widespread scrutiny of TikTok.

When the committee referred the case to the judiciary in September, TikTok responded that the report was "misleading" and an attempt to make the company "a scapegoat for industry-wide issues."

Beyond the parliamentary report, the prosecution will also draw on a 2023 Senate investigation that warned of risks posed by TikTok, including threats to freedom of expression, data collection, and algorithmic bias. A 2023 report by Amnesty International likewise found that TikTok's algorithm is addictive and could push teenagers toward self-harm.

In addition, Viginum, the French national agency tasked with countering foreign digital interference, issued a report in February of this year warning that TikTok's content recommendation system could be used to manipulate public opinion, particularly during elections. The agency monitors foreign digital interference and is a key part of the French government's efforts to strengthen cybersecurity and safeguard public debate.

The investigation is seen as an important step in France's tightening of social media regulation. In recent years, the European Union has likewise pressed forward with the Digital Services Act (DSA), which requires large platforms to increase algorithmic transparency in order to curb psychological harm and information manipulation.

(Adapted from Reuters)