Can an algorithm cause death? That is the question the parents of 15-year-old Marie, who took her own life in September 2021 in Cassis, are putting to the French justice system. In a complaint filed in early September, the parents accuse TikTok, the Chinese-owned social media platform beloved by teenagers worldwide, of « incitement to suicide, » « failure to assist a person in danger, » and « promoting or advertising methods of self-harm. » The complaint is currently being reviewed by the Toulon prosecutor’s office.
An avalanche of suicide videos
Like six out of ten teenagers, Marie regularly used TikTok. When her parents examined her phone after her death, they discovered that their daughter had posted a video on the social network in which she talked about her unhappiness, the bullying she suffered because of her weight, and her desire to end her life. That post automatically led the platform to surface other videos on the same topic in her feed.
The social network, owned by the Chinese company ByteDance, has achieved rapid success in just five years, with 1.7 billion monthly active users, including 15 million in France. Among them, 67% are under 24. Its secret? The format of its videos – less than 8 seconds on average, perfect for scrolling through content without getting bored – and a « chemically perfect » algorithm of unmatched precision and power, according to Julie Albright, an American researcher and sociologist at the University of Southern California.
In her study, the researcher argues that TikTok has found the ideal format: each new video triggers a rush of dopamine in the brain, which helps make the service addictive and ultimately erodes users’ attention spans. Two-thirds of American teenagers spend 80 minutes a day on TikTok, more than twice the time they spend on any other app.
« TikTok identifies vulnerabilities and capitalizes on them »
In the case of a teenager with suicidal thoughts, for example, simply showing an interest in the subject – through a video like Marie’s, or through interactions such as a like, a share, a pause on an image, a comment, or a click on associated keywords – is enough for the algorithm to deduce that this is an area of interest and to suggest more similar content. Its accuracy is such that the algorithm sometimes infers things users are not yet fully aware of themselves, like the woman who wrote an article titled « TikTok’s algorithm knows my sexuality better than I do. »
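To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of engagement-weighted topic scoring. The signal names, weights, and topics are invented for illustration; TikTok’s actual ranking system is proprietary and vastly more sophisticated.

```python
from collections import defaultdict

# Assumed relative weights for different interaction signals (illustrative only).
SIGNAL_WEIGHTS = {
    "view_complete": 1.0,
    "pause": 1.5,
    "like": 2.0,
    "comment": 3.0,
    "share": 4.0,
    "post": 5.0,  # posting about a topic is treated as the strongest signal
}

def update_interest_profile(profile, interactions):
    """Accumulate a per-topic interest score from raw interaction events."""
    for event in interactions:
        weight = SIGNAL_WEIGHTS.get(event["signal"], 0.0)
        for topic in event["topics"]:
            profile[topic] += weight
    return profile

def rank_candidates(profile, candidate_videos, top_k=10):
    """Rank candidate videos by how strongly their topics match the profile."""
    scored = [(sum(profile[t] for t in video["topics"]), video) for video in candidate_videos]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [video for _, video in scored[:top_k]]

# A single strong signal on a sensitive topic is enough to reorder the feed.
profile = defaultdict(float)
update_interest_profile(profile, [{"signal": "post", "topics": ["sadness", "self-harm"]}])
feed = rank_candidates(profile, [
    {"id": 1, "topics": ["cooking"]},
    {"id": 2, "topics": ["self-harm"]},
    {"id": 3, "topics": ["sadness", "self-harm"]},
])
print([video["id"] for video in feed])  # -> [3, 2, 1]: sensitive content ranks first
```

Even in this toy version, one strong interaction with a sensitive topic pushes related videos to the top of the feed, which is precisely the dynamic researchers describe.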
According to a study published at the end of 2022 by the Center for Countering Digital Hate, an Anglo-American NGO, it took TikTok just 2.6 minutes to recommend suicide-related content to a « vulnerable » teenage profile, and 8 minutes for videos about eating disorders to appear on another fake profile created by the researchers. As the researchers behind the fake profiles explain: « I briefly stopped to watch videos about body image and mental health, and I liked them. »
The study concludes:
TikTok identifies the user’s vulnerability and capitalizes on it. The vulnerable accounts in our study received 12 times more recommendations of self-harm and suicide videos than the standard accounts.
What responsibility does TikTok have?
Contrary to the « wild west » image of the Internet and social media often painted by the political class, everything that is prohibited in real life is also prohibited online. Platforms are required to strictly moderate content that clearly violates the law (such as child sexual abuse material and online hate) and to combat content that falls into the gray areas of freedom of expression (racism, sexism, harassment, promotion of eating disorders and suicide…).
However, platforms such as Meta (Facebook, Instagram), X (formerly Twitter), and TikTok rely on their legal status as hosts, which shields them from liability for the content they carry, to moderate as little as possible. The reason is economic: the more users interact with content and the more time they spend in the app, the more personal data they generate, which the platforms use to sell advertisers highly personalized ad placements. Numerous international studies have shown that the most problematic content is also the content that generates the strongest engagement: this is the famous attention economy.
The Digital Services Act (DSA), which came into effect on August 25, 2023, has nevertheless created a new liability regime for very large platforms. This has led TikTok, for example, to restrict access to problematic keywords such as « suicide » or « pro-ana » (a term referring to content promoting anorexia). Given the vast number of videos circulating on TikTok and the constant flow of new content, however, these measures look more like easily bypassed roadblocks than effective barriers, as the sketch below suggests.
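To illustrate why keyword restrictions are so porous, here is a deliberately simplified sketch assuming an exact-match blocklist; the terms and variant spellings are examples, not TikTok’s actual implementation.

```python
# Illustrative exact-match blocklist for search queries (assumed, not TikTok's real list).
BLOCKED_SEARCH_TERMS = {"suicide", "pro-ana"}

def search_allowed(query):
    """Block a search only when the query exactly matches a restricted term."""
    return query.strip().lower() not in BLOCKED_SEARCH_TERMS

print(search_allowed("suicide"))   # False: the exact keyword is blocked
print(search_allowed("su1cide"))   # True: a trivial misspelling slips through
print(search_allowed("proana"))    # True: dropping the hyphen also works
```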
In the United Kingdom, Instagram has been held responsible in a similar case
TikTok claims to respond firmly to content that violates its « community guidelines, » which cover videos promoting racism, hatred, harassment, and eating disorders. « When a video is published on TikTok, its artificial intelligence analyzes every part of the video: the audio, captions, and hashtags. The goal is to help the application better understand the context and content of the videos. The social network evaluates whether the video is likely to appeal, and to whom. After this analysis, TikTok checks whether the content complies with community rules and safety regulations, » writes Victoire Gué, author of a study for HubSpot on how TikTok’s algorithm works, aimed at marketing professionals.
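The passage above describes a screening pipeline: extract signals from the video, estimate its audience, then check it against policy. Here is a hypothetical, highly simplified sketch of such a pipeline; the topic extraction, policy list, and function names are assumptions for illustration, not TikTok’s actual system.

```python
# Hypothetical sketch of the publication screening described above; the topic
# extraction, policy list, and function names are invented for illustration.
BANNED_TOPICS = {"self-harm", "hate", "eating-disorder"}  # assumed policy list

def analyze_video(video):
    """Derive coarse topics from the transcript, caption, and hashtags."""
    text = " ".join([video["transcript"], video["caption"], *video["hashtags"]])
    return {word.strip("#").lower() for word in text.split()}

def screen_for_publication(video):
    """Decide whether the video can be distributed, and flag the reason if not."""
    topics = analyze_video(video)
    if topics & BANNED_TOPICS:
        return {"allowed": False, "reason": "violates community guidelines"}
    return {"allowed": True, "audience": "general"}

print(screen_for_publication({
    "transcript": "quick pasta recipe",
    "caption": "dinner in five minutes",
    "hashtags": ["#cooking", "#food"],
}))
# -> {'allowed': True, 'audience': 'general'}
```

A pipeline of this kind can only flag what its classifiers recognize, which is why the scale and churn of uploads keep moderation lagging behind the content itself.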
However, this moderation still appears clearly insufficient, and courts seem to be starting to acknowledge it. In July 2022, for the first time, a social network, Instagram, was found by the British justice system to have contributed to the suicide of a 14-year-old girl, Molly Russell. The teenager had liked, shared, or saved more than 2,100 posts related to suicide, self-harm, or depression on Instagram during the last six months of her life.