Social media can create a dangerous echo chamber that reinforces our existing views. Things become even more dangerous, however, when politically-linked individuals end up as content moderators at social media companies. That’s precisely why TikTok is being called out on its own platform by Yuwana TV.
Malaysia’s relationship with TikTok and social media has become somewhat strained since its last General Election. As in the U.S., social media became a battleground for votes, particularly among younger voters. In Malaysia, TikTok became ground zero for partisan and racist propaganda.
With that as our backdrop, it comes as no surprise that things are being stirred up once again after it emerged that an individual linked to Malaysia’s opposition coalition, Perikatan Nasional (PN), was working as a content moderator at TikTok’s parent company, ByteDance, close to the elections. The fact was highlighted in a TikTok video uploaded by Yuwana TV to promote its Facebook Live session. The video cites a Forbes report alleging that employees engage in a practice called “heating”, in which they manually push content into users’ FYPs to help it go viral. The report was written after seven sources verified the practice.
[Embedded TikTok from @yuwanatv: “What is the connection between TikTok Malaysia and Perikatan Nasional?”]
In the video, one of the hosts highlights that TikTok’s “For You” page (FYP) is curated by an algorithm. He also claims that content moderators are able to override the algorithm and push content of their choosing into other users’ FYPs. This assertion was supported by @cikguseo on Twitter. Some denizens of Twitterjaya (Malaysian slang for the local Twitter sphere) have since attributed PN’s social media success during the elections to this.
However, that may not be the case. TikTok has issued a statement addressing the issue. In the statement, Hafizin Tajudin, TikTok’s Head of Public Policy, asserts that the accusations are false. In fact, the company refers to the individual implicated, Suffi Kamari, as a former employee. This is supported by a tweet from Kamari himself, who is now the Head of Communications of the Gombak Youth Division at the Malaysian Pan-Islamic Party (PAS). In his tweet, Kamari clarifies that he left ByteDance in June 2022 – about five months before the elections.
Tajudin also clarified that “moderators do not have any authority or access to any forms of promotional tools for content” and that “[TikTok has] a robust quality assurance system in place to ensure that the political or personal opinions of our employees do not affect their work quality and ethics, including when performing content moderation tasks. TikTok’s content moderation decisions are based on a set of clearly defined Community Guidelines and have layers of checks and balances including quality assurance and third-party fact-checkers, to uphold safety and ensure fairness in moderation.”
TikTok has tried to proactively address concerns like these, which arose on platforms like Facebook and Twitter amidst the 2020 American elections. The company has adopted policies that try to limit the proliferation of misinformation and political rhetoric on the platform. In its statement, TikTok asserts that political ads are not allowed on the platform – a prohibition enshrined in its advertising policy. It has since introduced a policy for government, politician and political party accounts which prohibits monetisation and promotion by these accounts.
In fact, the statement even addresses Forbes’s assertion of content “heating”. TikTok does admit that it “do[es] promote a small fraction of videos to help diversify the content experience and introduce celebrities and emerging creators to the TikTok community.” This raises the question of whether “heating” is a policy TikTok needs to revisit. While it does help create a platform for emerging content creators, it makes us ask: how are these creators being vetted? That said, it can’t fall entirely to the company to ensure that these creators are completely clean when it promotes their content. We’ve all seen creators go off the deep end at some point in their careers.
Be that as it may, the content that made PN relevant didn’t come from a political party or politician account; instead, it was rhetoric spread by regular users and influencers who were incentivised to do so. This issue highlights a growing concern among individuals, governments and watchdogs about the role social media plays in political rhetoric.
Is it even possible for a social media company to nip movements like this in the bud? Should such companies be allowed to police what is spread on their platforms? These are questions that continue to be asked in situations like these. Where does freedom of expression end, and where does moral policing begin?