When TikTok's Algorithm Stops Following Beijing's Orders (Part 1)


[People News] Introduction: A 17-Year-Old American Boy's Encounter with 'Tiananmen'

Picture this scenario —

A 17-year-old American high school student hears his teacher mention 'Tiananmen' for the first time during history class. After class, he does what any teenager would do: he pulls out his phone and searches for 'Tiananmen' on TikTok.

What does he find?

Golden glazed tiles glisten in the sunset. Tourists in the square take selfies with selfie sticks. An aerial video set to soothing music showcases the grand architecture. Occasionally, a few posts appear in Chinese that he cannot read.

He scrolls through over twenty videos. Not one mentions 1989. No tanks. No students. No massacre.

He locks his phone and thinks: So, Tiananmen is just a beautiful square.

This is not a fictional account. This is a real situation documented by researchers at Rutgers University's Network Contagion Research Institute (NCRI) during an experiment. They created new accounts simulating 16-year-old American teenagers, searched for the keywords 'Tiananmen, Tibet, Uyghur, Xinjiang' on TikTok, and meticulously recorded what the algorithm presented to them.

Currently, the technological foundation that powers this algorithm is undergoing a significant reorganisation. However, very few people are seriously discussing what this reorganisation means for us.

1. First, Let's Understand What Actually Happened

On January 22, 2026, TikTok announced the official formation of the 'TikTok USDS Joint Venture LLC' to meet the requirements of a U.S. presidential executive order, which mandates the sale of its U.S. operations to a consortium primarily made up of American investors.

ByteDance will retain nearly 20% of the shares in this new entity, while non-Chinese investors will hold the remaining approximately 80%. The managing investors include Oracle, private equity firm Silver Lake, and investment company MGX, each holding 15%. Other investors, including ByteDance's original backers, will collectively hold about 35%.

This joint venture will oversee data protection, algorithm security, content moderation, and software assurance in the U.S. market. It will receive algorithm licensing from ByteDance and will retrain the algorithm using data from U.S. users.

Once the transaction is finalised, Oracle will act as a security partner, responsible for auditing and ensuring compliance with the agreed-upon national security terms.

Beneath these seemingly mundane business arrangements lies a critical fact for the information warfare landscape: the recommendation algorithm—the unseen force that dictates what 170 million American users see and do not see—is being detached from Beijing's control.

However, it is important to highlight a key detail that directly impacts our assessment of the window period: a TikTok spokesperson indicated that since the new investors took over the U.S. operations, there have been no changes to the algorithm. This implies that, as of now, the old algorithm remains in effect. The 'retraining' of the new algorithm is an ongoing process rather than a sudden overhaul.

This clarification enhances the strategic significance of the window period—we find ourselves in a transitional phase where the old system is weakening, and the new system has yet to be established. This represents the optimal moment for intervention.

2. What exactly did the old algorithm accomplish? The data reveals the truth.

Before we explore potential opportunities, it is essential to understand what we have faced in the past.

A study conducted by NCRI in 2024 (which was later peer-reviewed and published in the academic journal 'Frontiers in Social Psychology') carried out a systematic comparative experiment. Researchers created 24 'simulated accounts' to mimic the experience of 16-year-old American teenagers using the platform for the first time. They searched for the keywords 'Uyghur', 'Xinjiang', 'Tibet', and 'Tiananmen' on TikTok, YouTube, and Instagram, clicked on the first result, and then browsed the content subsequently recommended by the algorithm.

Here are some key data points:

When searching for 'Tiananmen', more than 26% of the results on TikTok were categorised as 'pro-China' content, compared with only 7.7% on YouTube and 16.3% on Instagram.

When searching for 'Xinjiang', only 2.3% of the results on TikTok were classified as 'anti-China' (i.e., content critical of Chinese Communist Party policies), compared with 21.7% on YouTube and 17.3% on Instagram.

In the case of 'Tibet', TikTok featured the least anti-China content (5%) and the highest proportion of pro-China content (30.1%) among the three platforms.

However, the most telling figure is not the percentage of pro-China content, but another statistic: 60.3% of the content recommended by TikTok's search algorithm was labelled as completely unrelated to the search topic, whereas this figure was below 5% on Instagram and YouTube.

What does this imply? A search for 'Tiananmen' yields videos showcasing scenic views of the square, a search for 'Uyghur' brings up idyllic scenes of Uyghur girls singing and dancing, and a search for 'Xinjiang' results in food and landscape content. Researchers have identified this phenomenon as 'keyword hijacking'—the practice of attaching popular tags of highly sensitive topics to irrelevant content in an effort to dilute and obscure genuine information.

Further insights emerge from the second phase of the study: TikTok users engage with anti-CCP content through likes and comments nearly four times more than with pro-CCP content, yet the pro-CCP content promoted by the search algorithm is nearly three times that of anti-CCP content—an imbalance not observed on Instagram and YouTube.
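The scale of this imbalance can be made concrete with a rough back-of-the-envelope calculation. The ratios below are the approximate figures reported above, not exact study values:

```python
# Approximate ratios reported by the NCRI study (as cited above).
engagement_ratio = 4.0  # users engage ~4x more with anti-CCP than pro-CCP content
promotion_ratio = 3.0   # yet the search algorithm surfaces ~3x more pro-CCP content

# If promotion simply tracked user engagement, anti-CCP content would be
# favoured roughly 4:1. Instead it is disfavoured roughly 1:3, so the gap
# between expected and observed promotion is the product of the two ratios.
mismatch = engagement_ratio * promotion_ratio
print(mismatch)  # 12.0
```

In other words, under these reported figures the algorithm's output diverges from user demand by roughly an order of magnitude.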

In summary, this is not a natural content ecosystem—users are evidently more interested in the truth, yet the algorithm systematically suppresses it.

The third phase of the research has unveiled some alarming findings: users who spend more than three hours a day on TikTok are more inclined to have a positive perception of China's human rights situation than those who do not use the platform. Heavy TikTok users were 48% more likely to describe Tiananmen Square as 'primarily known as a tourist attraction'.

This phenomenon exemplifies the 'cognitive cleansing' executed by the Chinese Communist Party (CCP) through algorithms—not through the crude deletion of content, but by inundating users with a flood of irrelevant information that dilutes the truth to the point of being overlooked, while subtly reshaping the younger generation's understanding of history.

The CCP's most feared truths—such as organ harvesting from Falun Gong practitioners, genocide in Xinjiang, cultural erasure in Tibet, the Hong Kong protests, and the true origins of COVID-19—are all included in this algorithm's suppression list.

3. Beyond TikTok: The CCP's information warfare is comprehensive.

Assuming that decoupling TikTok will resolve all issues is a dangerously naive perspective. The CCP's information manipulation goes far beyond just one platform.

The Australian Strategic Policy Institute (ASPI) has identified a coordinated influence operation on YouTube, dubbed 'Shadow Play'. This operation encompasses at least 30 YouTube channels that have produced over 4,500 videos. Since mid-2022, these channels have garnered nearly 120 million views and 730,000 subscribers.

One of the defining characteristics of these channels is the extensive use of AI-generated content. According to ASPI, this marks the first significant initiative to combine video essays with AI-generated voiceovers as a tactical approach. This indicates a trend where threat actors increasingly use readily available video editing and generative AI tools to produce persuasive content at scale.

ASPI's evaluation suggests that the operators behind this initiative may be commercial entities that are, to some extent, guided, funded, or encouraged by the state. This implies that some 'patriotic enterprises' are increasingly collaborating with government actors in influence operations related to China.

This reveals two key points:

First, the information warfare conducted by the Chinese Communist Party has entered an AI-driven industrial phase. It no longer relies on human labour for content flooding; instead, it employs generative AI to mass-produce seemingly professional English content, utilising platform recommendation algorithms to facilitate automatic dissemination of this content.

Second, even if TikTok's algorithm is no longer directly controlled by Beijing, the Chinese Communist Party will continue to assert narrative dominance across all mainstream platforms through proxy accounts, MCN organisations, united front networks, and AI content factories.

We are not merely facing an issue with a single platform; rather, we are confronted with a comprehensive system of information suppression supported by cross-platform, AI-driven, state-level resources.

4. Why is short video the decisive battlefield?

Having understood the scale of the adversary, the next question is: where should we focus our efforts?

The answer is clear: short videos—particularly TikTok and YouTube Shorts.

The reason is not simply that they are 'trendy', but three structural factors:

First, the scale of reach is irreplaceable. TikTok boasts around 170 million users in the United States, while YouTube Shorts has exceeded 2 billion monthly active users. Together, these platforms encompass the largest young audience worldwide, and young people are the primary targets of the Chinese Communist Party's algorithm manipulation.

Second, information consumption habits have irreversibly shifted towards short videos. For those under 35, 60-second vertical videos have become the main way to acquire information, replacing traditional text and images. If the truth is only found in lengthy articles and conventional websites, it will never reach those who most need to know it.

Third, the dissemination mechanism of short videos naturally favours 'small bets for big returns'. Unlike traditional social media, which requires a follower base to gain visibility, the algorithms of TikTok and YouTube Shorts prioritise 'content quality' over 'account weight'. A brand-new account with no followers can achieve hundreds of thousands or even millions of views if its content captures the audience's attention within the first few seconds. This creates an unprecedented opportunity for low barriers and high-leverage dissemination.

(First published by People News)