TikTok Finally Explains How the ‘For You’ Algorithm Works

For the first time, the social media company is opening up about its most mysterious feature.

The first thing you see when you open TikTok is the platform’s For You page: an endless stream of videos uniquely tailored to each user. No two feeds are exactly the same, but as TikTok has exploded in popularity—the app has now been downloaded over 2 billion times—the For You page has become some of the most valuable digital real estate in the world. Getting featured can make or break a would-be TikTok influencer’s career. But for years, no one outside the company knew for sure how the For You page worked, leaving users to concoct conspiracy theories and run their own experiments.

Now, TikTok is pulling back the curtain for the first time. In a blog post published on Thursday, the company outlined the basic mechanics of its For You page, revealing a recipe its users have long tried to reverse-engineer on their own. The social media company says it relies on a complex set of weighted signals to recommend videos, including everything from hashtags and songs to the kind of device a person is using.

The blog post is part of a wider push for transparency at TikTok, after US lawmakers and users raised security concerns last year about the company’s potential connections to the Chinese government. TikTok is owned by ByteDance, one of China's largest tech giants, but the company has repeatedly denied that the Chinese Communist Party wields any influence over its policies, and has worked to distance itself from Beijing. But the concerns have endured: Earlier this week, for example, The Wall Street Journal reported that some TikTok users were praising China and its leader Xi Jinping, wondering if the flattery might help their videos go viral.

Here is how TikTok videos get featured on the For You page, according to the platform. When a video is uploaded to TikTok, the For You algorithm shows it first to a small subset of users. These people may or may not follow the creator already, but TikTok has determined they are more likely to engage with the video, based on their past behavior. If they respond favorably—say, by sharing the video or watching it in full—TikTok then shows it to more people who it thinks share similar interests. That same process then repeats itself, and if this positive feedback loop happens enough times, the video can go viral. But if the initial guinea pigs don’t signal that they enjoyed the content, the video is shown to fewer users, limiting its potential reach.
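To make that loop concrete, here is a minimal Python sketch of a staged rollout of the kind the post describes. Every number in it, from the seed batch size to the engagement threshold and growth factor, is an invented illustration, not a parameter TikTok has disclosed.

```python
import random

def staged_rollout(engagement_rate, seed_size=100, threshold=0.1,
                   growth=5, max_rounds=8):
    """Simulate a feedback-loop rollout: show a video to a small batch,
    then expand the audience only if enough viewers engage.

    All numbers here are illustrative guesses, not TikTok's real parameters.
    """
    audience = seed_size
    total_views = 0
    for _ in range(max_rounds):
        # Each viewer engages (shares, finishes the video) with some probability.
        engaged = sum(random.random() < engagement_rate for _ in range(audience))
        total_views += audience
        if engaged / audience >= threshold:
            audience *= growth                      # positive signal: widen the audience
        else:
            audience = max(audience // growth, 1)   # weak signal: throttle reach
    return total_views

random.seed(0)
print(staged_rollout(engagement_rate=0.25))  # resonant video: likely to snowball
print(staged_rollout(engagement_rate=0.02))  # weak video: likely to stall
```

Run side by side, the high-engagement video reaches orders of magnitude more viewers than the low-engagement one, which is the feedback loop in miniature.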

This strategy is why your For You page may contain videos with lots of likes and views alongside videos seen by only a handful of people. A new user with few followers can still make it to the For You page, in theory, though creators with large followings may have an advantage. “While a video is likely to receive more views if posted by an account that has more followers, by virtue of that account having built up a larger follower base, neither follower count nor whether the account has had previous high-performing videos are direct factors in the recommendation system,” the TikTok blog post reads.

TikTok relies on a number of signals to identify what kinds of videos users want to see, some weighted more heavily than others. Strong signals include whether you watched a video to the end, whether you shared it, and whether you followed the creator who uploaded it afterward. Weak signals include things like the type of device you have, your language preference, and whether you’re in the same country as the person who posted a video. TikTok also considers negative feedback on a video, like whether a user tapped “Not Interested” or chose to hide content from a certain creator or featuring a specific sound.
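As a rough picture of how weighted signals might combine, here is a toy scoring function. The signal names mirror the ones above, but the weights are made up; TikTok has not published its actual values.

```python
# Hypothetical signal weights: strong signals dominate, weak signals nudge,
# and negative feedback subtracts. None of these values come from TikTok.
WEIGHTS = {
    "watched_to_end":   5.0,   # strong signal
    "shared":           4.0,   # strong signal
    "followed_creator": 4.5,   # strong signal
    "same_country":     0.5,   # weak signal
    "language_match":   0.5,   # weak signal
    "device_match":     0.2,   # weak signal
    "not_interested":  -6.0,   # negative feedback
    "hid_creator":     -8.0,   # negative feedback
}

def score(signals):
    """Sum the weights of every signal that fired for this user/video pair."""
    return sum(WEIGHTS.get(name, 0.0) for name, fired in signals.items() if fired)

print(score({"watched_to_end": True, "same_country": True}))  # 5.5
print(score({"shared": True, "not_interested": True}))        # -2.0
```

The second example shows why negative feedback matters: a single “Not Interested” tap can outweigh a strong positive signal in this toy model.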

When videos are published is also a weak signal. TikTok says recommendations on the For You page may be up to roughly three months old, though videos usually peak in virality soon after they are posted. Since time stamps aren’t visible on the For You page, users may not know the memes they’re watching are from long ago.
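One plausible way to treat recency as a weak signal is a decaying multiplier with a hard cutoff, sketched below. Only the roughly-three-month ceiling comes from TikTok’s post; the seven-day half-life is an invented assumption.

```python
import math

def recency_factor(age_days, half_life_days=7.0, max_age_days=90.0):
    """Hypothetical recency multiplier: a video loses half its freshness
    boost every `half_life_days`, and anything older than roughly three
    months (TikTok's stated rough ceiling) is excluded entirely.
    The half-life value is an invented illustration."""
    if age_days > max_age_days:
        return 0.0
    return math.exp(-math.log(2) * age_days / half_life_days)

print(round(recency_factor(0), 2))    # 1.0 -- brand new
print(round(recency_factor(7), 2))    # 0.5 -- one half-life old
print(round(recency_factor(100), 2))  # 0.0 -- past the cutoff
```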

The For You page algorithm also looks at elements like song clips, hashtags, and captions to categorize videos and then recommend similar ones. That’s why you may have noticed that your For You page often includes videos with the same sounds, helping to incubate new auditory memes, launch the careers of new musical artists, and worm catchy lyrics into your brain. The fact that TikTok considers hashtags lends credence to one of the most persistent theories for gaming the For You page: simply add the hashtag #foryou. If users previously engaged with videos tagged #foryou, it’s indeed possible they could be recommended more of them, but there’s nothing particularly special about the hashtag itself.
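A simple way to picture that categorization is to treat each video’s sound, hashtags, and caption words as a feature set and favor candidates that overlap with videos a user liked. The Jaccard-similarity sketch below is a generic technique, not TikTok’s disclosed method.

```python
def features(video):
    """Flatten a video's sound, hashtags, and caption words into one feature set."""
    return ({f"sound:{video['sound']}"}
            | {f"tag:{t}" for t in video["hashtags"]}
            | {f"word:{w.lower()}" for w in video["caption"].split()})

def similarity(a, b):
    """Jaccard similarity: shared features divided by total distinct features."""
    fa, fb = features(a), features(b)
    return len(fa & fb) / len(fa | fb)

liked = {"sound": "clip_123", "hashtags": ["dance", "foryou"], "caption": "new dance"}
candidate = {"sound": "clip_123", "hashtags": ["dance"], "caption": "dance challenge"}
print(round(similarity(liked, candidate), 2))  # 0.5 -- a shared sound and tag boost the match
```

Under a scheme like this, #foryou is just one feature among many, which matches TikTok’s point that the hashtag carries no special power on its own.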

TikTok says it visually scans content to look for things that violate its rules, such as nudity, but the company insists the For You algorithm doesn’t take into account what videos look like or the way they were filmed. In March, The Intercept published internal training documents from TikTok that directed moderators to suppress videos from people deemed too ugly, poor, or disabled; the company said the rules were outdated, or in some cases never put in place.

Recommendation engines like TikTok’s also power its Silicon Valley competitors, and over the years those algorithms have been faulted for amplifying hateful ideologies and creating echo chambers. YouTube, for example, was criticized for tuning its recommendation algorithm to maximize the time people spend watching, a decision that may have helped content like conspiracy theories thrive. TikTok says the For You page algorithm isn’t optimized for any single metric, but is instead designed to weigh many factors, like whether people enjoy using the app and whether they choose to post content themselves. User safety is also considered: TikTok says it may block videos depicting things like “graphic medical procedures or legal consumption of regulated goods” from being recommended, because some people may find the content shocking.
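In practice, a rule like that is often implemented as an eligibility gate that runs before ranking: a video can remain on the platform yet never surface in recommendations. The sketch below is generic; only the two category examples are drawn from TikTok’s wording.

```python
# Categories TikTok says it may keep out of recommendations even when the
# videos themselves remain on the platform. The gate logic is a generic sketch.
NOT_RECOMMENDABLE = {"graphic_medical_procedure", "regulated_goods_consumption"}

def eligible_for_feed(video_labels):
    """A video is eligible for the For You feed only if none of its
    content labels fall in a blocked category."""
    return not (set(video_labels) & NOT_RECOMMENDABLE)

print(eligible_for_feed({"dance", "comedy"}))            # True: can be recommended
print(eligible_for_feed({"graphic_medical_procedure"}))  # False: stays up, never recommended
```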

TikTok acknowledged in its blog post some of the challenges that come with designing recommendation algorithms, like the risk of creating filter bubbles, where users are shown the same one-sided ideas over and over again. To mitigate that possibility, TikTok says it purposely shows users different types of videos, even if they don’t match what they may have engaged with in the past. “Our goal is to find balance between suggesting content that's relevant to you while also helping you find content and creators that encourage you to explore experiences you might not otherwise see,” the company wrote.
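A common way to engineer that kind of deliberate variety is to interleave exploration picks into an otherwise relevance-ranked feed. The sketch below does exactly that; the one-exploration-video-per-five cadence is an invented parameter, not a figure TikTok has published.

```python
import random

def build_feed(ranked, exploration_pool, explore_every=5):
    """Interleave random 'exploration' videos into a relevance-ranked list so
    the feed isn't purely more-of-the-same. With explore_every=5, roughly
    every fifth feed item is an exploration pick -- an invented cadence."""
    feed, pool = [], list(exploration_pool)
    for i, video in enumerate(ranked, start=1):
        feed.append(video)
        if i % (explore_every - 1) == 0 and pool:
            feed.append(pool.pop(random.randrange(len(pool))))
    return feed

random.seed(1)
ranked = [f"relevant_{n}" for n in range(8)]
novel = ["cooking_clip", "language_lesson", "woodworking"]
print(build_feed(ranked, novel))
```

The design tradeoff is the classic exploration-versus-exploitation balance: every exploration slot sacrifices a little short-term relevance for a chance to surface content the user would never have sought out.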

“TikTok, and all social media platforms right now, are realizing they have tremendous responsibility in how they affect individuals and the information they consume,” says David Polgar, a technology ethicist and a member of TikTok’s Content Advisory Council, a group the company formed in March to help advise on content moderation policies. Around the same time, the company also announced plans to open the TikTok Transparency Center, a facility in Los Angeles where outside experts will have the opportunity to view source code and observe how the platform is moderated.

