  • We take no responsibility for the news posted on this site. Legal news is collected automatically from sources and posted in this forum by feed readers. The source of each item is indicated, and a link to the original is always included.
    (Any news older than 21 days from its post time will be deleted automatically!)

Jurist TikTok ‘rabbit holes’ still drive youth towards suicidal content, Amnesty International France finds

Status
Not open for further replies.
  • Thread starter
  • Staff
  • #1

Dadparvar

Staff member
Nov 11, 2016
Amnesty International accused TikTok of insufficiently protecting vulnerable children and teenagers from psychologically harmful content on Monday after reconfirming 2023 findings that young users interested in mental health-related content are often led to a “rabbit hole” of videos featuring depressive themes and suicidal ideation.

The study focused on TikTok use in France. Using three test accounts and appropriate research controls, the group found that after seeking mental health content on the platform, accounts increasingly displayed depressive messaging and videos that romanticized suicide on the app’s “For You” page. For instance, two videos of the reportedly self-harm-inducing “lip balm challenge,” which attracted major media attention in France and whose existence has been denied by TikTok, surfaced on the feed of one test account.

Experts claim this algorithmic tendency has led to concerning outcomes. In 2021, 15-year-old Marie Le Tiec took her own life after viewing harmful TikTok content. Her mother, together with other aggrieved parents and family members, sued the company for failing to moderate such content. TikTok has denied any wrongdoing, stating that it has 40,000 moderators and directs users to mental health support if they search for alarming topics.

The organization alleged that TikTok’s disregard for systemic harm arises from its engagement-driven model, its “addictive design,” and hyper-personalized algorithm. Consequently, TikTok allegedly fails to abide by Business and Human Rights standards, as well as the 2023 European Digital Services Act (DSA), which requires platforms to identify and mitigate systemic risks to children. The European Commission opened formal proceedings against TikTok in 2024 under the DSA, which Amnesty aims to supplement with its report.

Psychology experts are unsure whether TikTok content creates depressive and suicidal ideation in all children or whether it primarily affects psychologically vulnerable ones. Marion Haza, clinical psychologist and lecturer at the University of Poitiers, has claimed that harmful content is only truly dangerous for the minority of users who already engage in self-harm. The researcher has argued that strictly prohibitive measures, especially in adolescence, do not address the root factors that lead to self-harm, stating that group relationships and community support are more effective ways to mitigate mental health risks. Haza noted that this social connection is something platforms like TikTok can provide.

Grégoire Borst, professor of psychology and cognitive neuroscience at Université Paris Cité, has said it is very difficult to establish a clear connection between social media use and self-harm, citing a leading peer-reviewed study which found that only 0.4 percent of the differences in teen mental health could be attributed to the apps.

In contrast, an Amnesty study carried out at the end of 2024 shows that 58 percent of young people surveyed are negatively affected by disturbing content they watch, and only one in five has successfully avoided depressive content on their “For You” page. The report further highlighted community sections and bubbles that encourage and romanticize suicide.

Global leaders have begun proposing legislation to address the potential threat social media poses to children. In September, a French parliamentary report recommended that the government implement a “digital curfew” for children under 16. In Australia, the government has passed a controversial social media ban for children under 16, which will come into force in December.

The post TikTok ‘rabbit holes’ still drive youth towards suicidal content, Amnesty International France finds appeared first on JURIST - News.


Top