How the algorithms of the Chinese social network TikTok work

An endless TikTok feed is curated by an AI trained to give people what they want to see. Trends looked into how the first global social network from China works and what its algorithms hide

Around midday on February 21, 2019, a 19-year-old from the Brazilian city of Curitiba took his own life, broadcasting it live on TikTok. Forty minutes in, the service's algorithms automatically cut the broadcast (the trigger was the lack of movement in the frame); it then resumed, and was finally blocked an hour and a half after it began, in response to user reports.

Content like this is very much the exception. TikTok is a platform for short entertainment videos: most clips show people dancing, singing, chatting, or acting out scenes from everyday life. Each video lasts no longer than 60 seconds, and users upload them themselves – tens and hundreds of thousands every day. And yet the tragic case of the Brazilian teenager exposes the essence of how TikTok works: what you see on the screen is determined by algorithms.

TikTok is the only one of the five most popular social networks in the world that is not owned by Facebook, and the first successful global technology product to come from China. In 2016, the Chinese company ByteDance launched the Douyin service; a year later, its international version, TikTok, appeared. Around the same time, ByteDance bought the American lip-sync app Musical.ly and merged it with TikTok. The result: by August of this year the application had been downloaded more than 2 billion times, and TikTok's monthly audience exceeded 700 million people.

How Algorithms Work: The TikToker Side

TikTok’s explosive popularity has been made possible, at least in part, by the advanced artificial intelligence algorithms developed at ByteDance. It is they that sift through a volume of video content far too large for any human to follow and decide what to show each user.

Everything is evaluated, from the content of the video to the software it was edited in. What the algorithms pay attention to:

  • What is happening in the frame: what people do and say, and what objects surround them.
  • How the video was made: shot on a phone or laptop camera, captured from a screen, or rendered as animation.
  • How the video was edited: in the TikTok editor or elsewhere, and whether filters, masks and other special effects are used.
  • What music is in the background: how popular it is and where it comes from – the TikTok library or the user's own upload.
  • The video's description: which words and hashtags are used.
  • Copyright compliance: whether the video contains fragments of other people's videos.

Before a video can please the algorithms, it must pass a multi-stage review. The review runs automatically, and contentious cases are checked by human moderators.
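The staged review described here can be pictured as a simple decision pipeline. The sketch below is a toy illustration, not TikTok's actual system: every signal name and threshold (the 30% borrowed-content figure comes from the article; the confidence cutoff is invented) is a hypothetical stand-in.

```python
# Toy sketch of a multi-stage video review pipeline.
# All signal names and thresholds are illustrative assumptions;
# TikTok's real moderation system is not public.

from dataclasses import dataclass

@dataclass
class Video:
    has_banned_content: bool      # drugs, violence, etc. (community rules)
    borrowed_fraction: float      # share of other people's footage
    uses_tiktok_editor: bool      # edited with TikTok's own tools?
    classifier_confidence: float  # how sure the automatic checks are

def review(video: Video) -> str:
    # Stage 1: direct community-rule violations lead to a ban.
    if video.has_banned_content:
        return "ban"
    # Stage 2: "unwritten rules" lead to a shadow ban – the account
    # survives, but the video is no longer shown to viewers.
    if video.borrowed_fraction > 0.30 or not video.uses_tiktok_editor:
        return "shadow_ban"
    # Stage 3: when automatic checks are unsure, a human moderator decides.
    if video.classifier_confidence < 0.8:
        return "human_review"
    return "approved"

print(review(Video(False, 0.1, True, 0.95)))  # approved
```

The point of the staging is economy: cheap automatic checks filter the bulk of uploads, and only the contentious remainder reaches human moderators.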

If a video directly violates the community rules – drugs, pornography, incitement to hatred, bullying, violence and, notably, suicide – the user is banned. But you can also break the unwritten rules and end up in a so-called "shadow ban": the account is left alone, but its videos are either no longer shown to viewers or deleted. This covers, for example, all of the above in comic form – a trick with a severed finger, ketchup for blood, offensive humor or a toy knife in the frame. Also frowned upon are swearing, spam, blatant advertising, other people's content (no more than about 30% of a video should be borrowed) and, curiously, a reluctance to use TikTok's editor and special effects.

This whole complex system of rules is meant to encourage authors to create original content. For beginners, something like a trial period applies: videos must engage the viewer immediately – through views, likes, comments, reposts and subscriptions. If a newcomer keeps up the pace over the first 10-20 videos, TikTok's algorithms begin actively promoting the account.

The same rules work in reverse. If a popular tiktoker loses form – uploading less often or failing to come up with new moves – their views quickly dry up. The service thus kills two birds with one stone: it ensures both a constant influx of original content and its rotation.

A key feature of TikTok's algorithms – unpredictability – helps keep content creators on their toes. For a long time, no one knew how the algorithms worked.

At first, tiktokers tried to figure out the algorithms themselves, empirically. The goal was to land on the main page every user sees, called 'For You'. Some thought it was enough to put the #ForYou tag everywhere; others believed the algorithms divided users into groups and showed videos to each group in turn; still others assumed videos made it into recommendations entirely at random. Tiktokers tailored their videos to whichever version of the algorithms they believed in.

These desperate attempts to win the favor of AI algorithms and get into the recommendations are somewhat reminiscent of the ritual dances ancient peoples addressed to their gods.

Only in June 2020 did TikTok reveal the secret of its recommendation algorithms. In a press release on its official website, the company explained how videos reach users' feeds. It named the criteria listed above – the content of the video, user reactions and the method of creation – in general terms, of course, without details.

Now tiktokers had something to build on. TikTok CEO Kevin Mayer raised hopes further when he said in late July that "algorithm transparency" could make the world a better place. He resigned a month later.

How Algorithms Work: The Viewer’s Side

For videos to rack up millions of views, there must be viewers who cannot tear themselves from their screens. The algorithms take care of that, too.

When you first log into TikTok, you immediately see an endless stream of short videos – the same 'For You' feed. It would seem logical to first ask the user about their interests or suggest subscribing to popular tiktokers. But one of the service's main innovations lies precisely in the absence of such a step.

TikTok doesn’t show what you think you’ll like, but what you really like.

At first, the videos in your feed are fairly random. All you need to do is scroll. The videos that hold your attention are marked as relevant and form the basis for personalized recommendations. Within a few days, or even a few hours, your 'For You' feed adapts to your characteristics and interests.
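The mechanism – watch time as an implicit vote – can be sketched in a few lines. This is a minimal toy model under invented assumptions (the topic labels, the learning rate, and the update rule are all illustrative, not TikTok's real logic):

```python
# Toy sketch of watch-time-based personalization: the longer you watch
# a video, the more its topic is weighted in your interest profile.
# Topic labels and the update rule are illustrative assumptions.

from collections import defaultdict

def update_interests(profile, topic, watched_sec, length_sec, rate=0.3):
    """Nudge the interest score for a topic toward the watch fraction."""
    signal = min(watched_sec / length_sec, 1.0)   # 1.0 = watched to the end
    profile[topic] += rate * (signal - profile[topic])
    return profile

profile = defaultdict(float)
# A viewer watches dance videos to the end and skips cooking almost instantly.
for _ in range(5):
    update_interests(profile, "dance", 58, 60)
    update_interests(profile, "cooking", 2, 60)

best = max(profile, key=profile.get)
print(best)  # dance
```

No explicit question about interests is ever asked: the profile is inferred entirely from scrolling behavior, which is why the feed adapts within hours.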

This approach is called the interest graph. Traditionally, social networks were built around the idea of a social graph: a person's online profile is the set of people they communicate with and who follow their life – their social circle. An interest graph is a snapshot of a particular person's interests; social connections play no part in it.

Other social networks, especially video-focused ones, are now actively adopting this idea. What sets TikTok apart as a social network is that it is not all that social: it was built from the start around the interest graph. From the viewer's point of view, it resembles an entertainment-content recommendation service with social-network elements.

Research on TikTok is still scarce. The studies beginning to emerge suggest that video platforms like TikTok are characterized by weak social connections: tiktokers do not feel attached to their fans, and fan communities are unstable. This is not surprising, since the algorithms are tuned to frequent turnover of trends and stars.

The content is always fresh, which keeps the viewer from getting bored and leaving the app. And if such thoughts do arise, the same principle that keeps content creators on the platform comes into play – unpredictability.

TikTok videos are attractive for two reasons: the unpredictability of the plot and the minimum cognitive effort.

Casinos actively exploit the first principle: the player must win from time to time, receiving positive reinforcement. Because the wins follow no pattern, it always feels as if you are about to get lucky again. The endless feed works in much the same way, except that here the sources of pleasure are captivating videos you don't want to miss. TikTok's algorithms make sure they come along regularly.
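The casino analogy is, at bottom, a variable-ratio reinforcement schedule: rewards arrive at unpredictable intervals. A toy model makes the idea concrete – the "hit" probability and the whole setup are illustrative assumptions, not TikTok's mechanism:

```python
# Toy model of a variable-reward feed: highly engaging videos ("hits")
# are scattered among filler at unpredictable intervals, like
# intermittent casino wins. The hit probability is an invented parameter.

import random

def build_feed(n, hit_probability=0.2, rng=None):
    """Return a feed of n videos with randomly interleaved 'hits'."""
    rng = rng or random.Random(42)  # seeded for reproducibility
    return ["hit" if rng.random() < hit_probability else "filler"
            for _ in range(n)]

feed = build_feed(20)
# The hits follow no fixed pattern, so the next one always feels
# just one swipe away.
print(feed)
```

The absence of a fixed pattern is the point: a reward on every swipe would quickly become boring, while a reward on no swipe would drive the viewer away.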

It is not only the plot that makes the videos gripping, but also the format itself. Moving pictures are easier to process than, say, text. People respond even more readily to faces – it is no coincidence that TikTok most actively promotes videos with people in the frame. Between two tasks requiring different mental effort, people choose the easier one. And once the choice is made, the automatic feed takes over.

Dark side

If TikTok’s algorithms are like gods, then they must be gods of entertainment. But, as ancient mythology teaches us, endless entertainment often has consequences.

  • Psychological addiction.

Psychologists agree that gadgets and social networks can be addictive; they argue only about the scale of the problem. Social psychologist Adam Alter notes in Irresistible:

Two important social media “ingredients” that our brains are hooked on are variable positive reinforcements and a craving for recognition.

The question is whether TikTok stands out from this background. Its core audience is children and teenagers. Psychologists consider them the most vulnerable group: their brains are not yet fully formed, making impulses hard to control, and their need for recognition is especially strong. Add attention-holding technology to the mix, and by this logic TikTok should be more addictive than its competitors.

On the other hand, there is almost no research on the subject. In one paper, the authors compared how using Facebook, Twitter, Instagram and TikTok during the pandemic affected people's well-being. Their conclusion: TikTok does not make things worse. But that finding concerns subjective well-being, not addictiveness.

  • Echo chambers and filter bubbles

Both phenomena arise when clever algorithms amplify human cognitive biases – among them, the tendency to seek confirmation of one's own views and the desire to be surrounded by people like oneself.

A filter bubble is a personal recommendation system taken to its logical conclusion: algorithms show what a person wants to see, not what is actually happening. As a result, everyone comes to live in their own information bubble. TikTok, for instance, has been accused of creating such bubbles along lines of gender, age and race: subscribe to a young brunette or like a video of an Asian-looking guy, and the feed fills with people of similar appearance.

The same happens with political views, except that here bubbles with similar contents merge into echo chambers: people of the same persuasion talk only to each other, reinforce their own rightness, and gradually radicalize. University of Utah philosophy professor C. Thi Nguyen draws an important distinction:

An information bubble is when you do not hear people with a different point of view. An echo chamber is when you do not trust them.

As a result, society polarizes, and TikTok contributes: some tiktokers promote US President Donald Trump and spread conspiracy theories, while others try to disrupt his campaign rallies.

  • Manipulation and censorship

One of the most common responses to digital surveillance is "I have nothing to hide". But persecution is not necessarily the point. The main problem is that manipulation becomes easier. When algorithms hold a detailed interest graph for each of us and sometimes know a person better than he knows himself, they can sell him anything – from plainly unnecessary goods to "correct" behaviors and political views.

What cannot be finely tuned can simply be hidden. The service, after all, comes from China, a country known for its special approach to freedom of information. One TikTok scandal involved the removal of a video by Feroza Aziz from the US. In a tutorial on eyelash care, her final tip was to google information about the life of the Uyghurs in China, for whom the local government has built a digital concentration camp. Last fall the video racked up 9 million views before being removed – due, as TikTok later commented, to a moderator's error.

The "mistake" is hard to believe, given the internal instructions for TikTok moderators that The Intercept published later, in March of this year, and that the service did not dispute. TikTok ordered moderators to ban users for political statements about China. The wording will be familiar to the Russian reader: "disrespect for the authorities", "falsification of the country's history", "threat to national interests".

TikTok's censorship is not only about politics. Moderators were told to limit the promotion of videos featuring people deemed too "ugly" or "poor", or with developmental disabilities.

Moderators did not have to guess what these terms meant – the company made sure the instructions were detailed. An "obvious beer belly", "abnormal body shape", "ugly appearance", dwarfism, "too many wrinkles", "eye diseases" and other "low-quality" traits in people, along with dilapidated housing with cracked walls, slums and rural scenery – any of this would render a video invisible to users.

TikTok’s algorithms determine what we see on screen and what we don’t. Algorithms are not a mystical entity or an invisible hand that simply selects the best content for you. This is a very tangible hand of a very specific corporation, one of the most expensive private companies on the planet.

TikTok does have a lot of interesting content – funny, amazing, educational, silly, socially significant, embarrassing, and at times even shocking. All of it is generated by users.

The company pursues its own interests, which may coincide with those of its users – but not always. When the Brazilian teenager took his own life on a live stream, the first thing TikTok did was work out a public relations strategy. The police were called only three hours later.

