TikTok is changing its algorithm to avoid serving streams of negative content, presumably in an attempt to counter criticism of social media's harmful effects on the mental health of young users.
According to TikTok, the For You feed, which offers an endless stream of recommended content, was already designed to avoid repetitive patterns that can tire users, such as showing multiple videos in a row from the same author. TikTok is now going further, teaching the algorithm to recognize content on negative topics such as “extreme diet and fitness” or “sadness,” states MMR. In doing so, the company hopes to protect users from viewing too much of any single category of content that may be harmless as one video but problematic when watched in large numbers.
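TikTok has not published how this works internally, but the behavior described above, limiting same-author runs and capping clusters of a sensitive topic, resembles a greedy re-ranking pass over a ranked candidate list. The sketch below is purely illustrative; the function name, data shapes, and limits are assumptions, not TikTok's actual implementation.

```python
from collections import deque

def diversify_feed(candidates, max_per_author=1, max_per_topic=2, window=5):
    """Illustrative greedy re-ranker (hypothetical data shapes).

    Walks a ranked list of candidate videos and skips any video whose
    author or topic already appears too often among the last `window`
    selections, so clusters of one topic are broken up.
    """
    selected = []
    recent = deque(maxlen=window)  # (author, topic) of recent picks
    for video in candidates:
        author, topic = video["author"], video["topic"]
        author_count = sum(1 for a, _ in recent if a == author)
        topic_count = sum(1 for _, t in recent if t == topic)
        if author_count >= max_per_author or topic_count >= max_per_topic:
            continue  # defer clustered content instead of serving it now
        selected.append(video)
        recent.append((author, topic))
    return selected
```

With `max_per_topic=2`, a run of four “sadness” candidates would yield at most two in any five-item window; the rest are simply not served in that pass.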
These changes are based on consultations with experts in medicine, clinical psychology and the ethics of artificial intelligence.
TikTok is taking this step amid a public reckoning over the toxicity of social media for children and adolescents, who are at a particularly vulnerable age.
Social media representatives, including TikTok's, are now being called before Congress to answer questions about the dangers of their products, though independent research on TikTok remains scarce, notes NIXSolutions. However, the New York Times article “How TikTok Reads Your Mind” examined a leaked document called “TikTok Algo 101,” produced by the company's engineering team in Beijing. It suggests that the algorithm optimizes feeds to keep users in the app for as long as possible, even if that means promoting “sad” content that could harm the viewer.
TikTok also announced that it will allow users to remove certain words or hashtags from their feeds, which will help, say, a vegetarian who wants to avoid meat recipes or someone with low self-esteem who wants to avoid beauty tutorials.