TikTok will start telling users why it has recommended videos in their feeds amid demands from campaigners for more transparency about its algorithm.
In a blog post on Tuesday, TikTok announced that it will roll out a new feature to explain to users why the platform’s algorithm has recommended a particular video to them.
The feature can tell users whether a video has landed in their “For You” feed because of an interaction, such as another video they watched, liked, shared, or commented on.
It may also be because the video is popular in the user’s region.
To use the new feature, TikTok users can tap the “Share” button on a video in their “For You” feed and select the question-mark icon labeled “Why this video.”
From there, TikTok says the user can get more visibility into some of the reasons why a particular video was recommended to them.
These reasons include “user interactions”: the content users watch, like, or share on the platform, the comments they post, and the subjects they have searched for in the TikTok app.
A video could also be recommended because of “accounts you follow or suggested accounts for you,” “content posted recently in your region,” or “popular content in your region.”
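As a rough illustration of how explanations like these could be surfaced, a recommender might tag each recommendation with the signals that triggered it and map them to user-facing strings. This is a hypothetical sketch based on the reason categories described above, not TikTok's actual implementation; all names and messages here are illustrative assumptions.

```python
# Hypothetical sketch: mapping recommendation signals to user-facing
# explanations, loosely mirroring the reason categories TikTok describes.
# Signal names and message strings are assumptions, not TikTok's API.

REASON_MESSAGES = {
    "user_interaction": "You watched, liked, or shared similar videos.",
    "comment_or_search": "Based on comments you posted or subjects you searched for.",
    "followed_account": "Posted by an account you follow or one suggested for you.",
    "recent_in_region": "Content posted recently in your region.",
    "popular_in_region": "Popular content in your region.",
}

def explain_recommendation(signals):
    """Return user-facing explanations for the signals behind a recommendation."""
    return [REASON_MESSAGES[s] for s in signals if s in REASON_MESSAGES]

# Example: a video surfaced because of past interactions and regional popularity.
for message in explain_recommendation(["user_interaction", "popular_in_region"]):
    print(message)
```

In a real system the signal list would come from the ranking pipeline itself; the point of the sketch is only that each recommendation carries enough metadata to be explained after the fact.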
TikTok will roll out the feature in the coming weeks.
A Powerful Algorithm
Like other social media platforms, TikTok relies on algorithms to serve personalized content to users in the hope of keeping them engaged on the platform for as long as possible.
The new feature is aimed at demystifying TikTok’s algorithm by explaining how a user’s activity and the accounts they follow influence the platform’s recommendations.
In the future, TikTok says it plans to improve this feature with more detailed information for users.
“Looking ahead, we’ll continue to expand this feature to bring more granularity and transparency to content recommendations,” the company explains in the blog post.
Campaigners have argued that the power and accuracy of social media algorithms cause addiction and harm.
In September, a coroner’s office ruled that social media was a factor in the death of a 14-year-old British schoolgirl, Molly Russell.
Russell took her own life after viewing dark material on platforms including Pinterest and Instagram, material that “shouldn’t have been available for a child to see.”
Her father, Ian Russell, has described how social media algorithms have the power to inflict harm.
“It’s a world I don’t recognize,” Russell told his daughter’s inquest. “It’s a ghetto of the online world that once you fall into it, the algorithm means you can’t escape it and keeps recommending more content.”