TikTok, a widely used social media platform with over a billion active users worldwide, has become a key source of news, particularly for younger audiences. This growing influence has raised concerns about potential political biases in its recommendation algorithm, especially during election cycles. A recent preprint study examined this issue by analyzing how TikTok's algorithm recommended political content ahead of the 2024 U.S. presidential election. Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.

TikTok has become a major force among social media platforms, boasting over a billion monthly active users worldwide and 170 million in the United States. It has also emerged as a significant source of news, particularly for younger demographics. This reach has raised concerns about the platform's potential to shape political narratives and influence elections. Despite these concerns, there has been limited research investigating TikTok's recommendation algorithm for political biases, especially compared with the extensive research on other social media platforms such as Facebook, Instagram, YouTube, X (formerly Twitter), and Reddit.
"We previously conducted experiments auditing YouTube's recommendation algorithms. This study, published in PNAS Nexus, demonstrated that the algorithm exhibited a left-leaning bias in the United States," said Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi. "Given TikTok's widespread popularity, particularly among younger demographics, we sought to replicate this study on TikTok during the 2024 U.S. presidential elections." Another motivation was concern over TikTok's Chinese ownership, which led many U.S. politicians to advocate for banning the platform, citing fears that its recommendation algorithm could be used to promote a political agenda.

To examine how TikTok's algorithm recommends political content, the researchers designed an extensive audit experiment. They created 323 sock puppet accounts (fake accounts programmed to simulate user behavior) across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group).
The experiment consisted of two stages: a conditioning stage and a recommendation stage. In the conditioning stage, Democratic accounts watched up to 400 Democratic-aligned videos, and Republican accounts watched up to 400 Republican-aligned videos; neutral accounts skipped this stage. This was done to teach TikTok's algorithm the political preferences of each account. In the recommendation stage, all accounts watched videos on TikTok's For You page, the platform's main feed of recommended content. Each account watched 10 videos, paused for one hour, and repeated this process for six days, so each experimental run lasted one week. The researchers collected data on approximately 394,000 videos viewed by these accounts between April 30 and November 11, 2024.
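The two-stage design described above can be sketched in code. This is only an illustrative reconstruction of the protocol as summarized here, not the study's actual automation: the `SockPuppet` class, the `watch` method, and the video lists are hypothetical stand-ins, and the one-hour pause between batches is elided.

```python
import itertools

STATES = ["Texas", "New York", "Georgia"]
LEANINGS = ["Democratic", "Republican", "neutral"]

class SockPuppet:
    """Hypothetical stand-in for one automated account in the audit."""
    def __init__(self, state, leaning):
        self.state = state
        self.leaning = leaning
        self.watched = []  # log of (stage, video_id) pairs

    def watch(self, stage, video_id):
        self.watched.append((stage, video_id))

def conditioning_stage(account, aligned_videos, cap=400):
    """Partisan accounts watch up to 400 ideologically aligned videos;
    neutral (control) accounts skip conditioning entirely."""
    if account.leaning == "neutral":
        return
    for vid in aligned_videos[:cap]:
        account.watch("conditioning", vid)

def recommendation_stage(account, for_you_feed, days=6, batch_size=10):
    """All accounts watch the For You feed in batches of 10 videos,
    repeated daily for six days (the one-hour pause is omitted here)."""
    feed = iter(for_you_feed)
    for _ in range(days):
        for vid in itertools.islice(feed, batch_size):
            account.watch("recommendation", vid)
        # in a live run: time.sleep(3600) between batches

# One account per state-by-leaning cell; the study ran 323 accounts in total.
panel = [SockPuppet(s, l) for s, l in itertools.product(STATES, LEANINGS)]
```

Running all accounts through both stages and logging every recommended video yields the kind of per-account viewing record (roughly 394,000 videos in the study) that the researchers then analyzed for ideological alignment.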