
Commentary: TikTok scores points for its handling of Russia’s invasion of Ukraine

TikTok’s actions against disinformation on the Ukraine war suggest it has some independence from Chinese parent company ByteDance and has learnt from Facebook’s missteps, says NTU’s Dr Mark Cenite.

Valeria Shashenok is among those who have not given up on the playful nature of videos considered a trademark of TikTok. (Images: TikTok/@Valerisssh)

SINGAPORE: A month ago, I was among those who dismissed TikTok as just an app for a younger generation to share lip-sync and dance videos. Since Russia invaded Ukraine, it has become a first-hand source of war news, as users flood the video-sharing platform with harrowing scenes and impassioned commentaries from bunkers and bombarded buildings. 

And with content tagged #Ukraine amassing more than 30 billion views so far, the platform has drawn scrutiny of its moderation policies. As the war began, misleading pro-Russia propaganda abounded, but TikTok appears to have stayed a step ahead of critics with measures to curb it. 

Meanwhile, Meta (formerly Facebook) continues to bewilder with its moderation choices. Though Meta has disabled accounts conducting disinformation campaigns, eyebrows were raised when it said it would temporarily allow exceptions to its hate speech policy in Ukraine. 

Meta will allow calls for violence against Russian soldiers and the deaths of Russian President Vladimir Putin or Belarusian President Alexander Lukashenko - as long as they do not appear to be true threats with details such as where and how to carry them out.

For a company that only recently stepped up policing of calls to violence, the move seems like a dizzying throwback to its much-maligned previous approach: free speech absolutism about political expression. 

ASSERTIVE MODERATION AMID CONCERNS OF CHINA CENSORSHIP

Despite initial doubts about how it would handle its newly prominent role, TikTok has acted against the most egregious disinformation by removing users and videos - such as false claims that Ukrainian President Volodymyr Zelenskyy had fled Kyiv. 

And it has done more. Days into the conflict, TikTok restricted access to Russian state-controlled media outlets RT and Sputnik in the 27 countries of the European Union. It labels videos that remain from such sources as “state-controlled media”. Earlier this month, it also took the drastic step of suspending new uploads from Russia. 

Western observers have been sceptical of TikTok’s approach to moderating content on the Ukraine war because its parent company, ByteDance, is Chinese. Chinese media have followed Putin in calling the conflict a “special military operation”. 

ByteDance operates two platforms - Douyin for the Chinese market and TikTok for the rest of the world - and early on, both platforms had similar restrictions on themes including the 1989 Tiananmen Square protests, Falun Gong, Tibet, Taiwan and LGBT+ content.

But by 2020, as TikTok’s popularity grew and Western officials raised concerns about its Chinese owner, restrictions were lifted and content on such themes circulated. TikTok acknowledged and apologised for its previous censorship of LGBT+ content, which it said was a “terrible idea”. 

Lingering fears of Chinese censorship might be allayed now that TikTok is playing host to condemnation of Russia from much of the world.

A TikTok company logo mounted on a wall. (Photo: AFP/Tolga Akmen)

Just last week, the Wall Street Journal reported that the platform is also restricting access to most foreign content in Russia. The company said this was to protect Russian employees and users from prosecution under Putin’s new fake news law, which imposes up to 15 years’ jail for referring to the conflict as an invasion or war. 

But there are also unintended consequences for Russians: Restricting Russian uploads cuts off circulation of all Russian views - dissent included - and restricting access to foreign sources cuts off Russia from all perspectives except the Kremlin’s narrative. Protests in Russia reveal that the war is not as popular there as some might imagine.

Though TikTok has millions of Russian users, ByteDance appears to have weighed the odds and concluded that the risks of freezing the app are lower than those it would incur - for employees, users and its reputation - by continuing to provide services. 

AVOIDING META'S MISSTEPS

Signs are that TikTok has learnt from the public relations debacles of Meta. Users demand action against disinformation. Inaction invites backlash.  

TikTok’s quick action is a contrast to Meta’s previous strictly hands-off approach to political posts, which CEO Mark Zuckerberg attributed to the company’s value of unrestricted free expression. It’s a position that many, even Facebook employees, criticised. 

On the basis that whatever a political leader said was of public importance, Facebook left up former US president Donald Trump’s posts – even when his response to looting in the wake of George Floyd’s murder by police officers was read by many as advocating violence and when he was seen to encourage followers to march on Capitol Hill in January 2021 after his electoral loss.

Facebook then made a clumsy about-face by suspending Trump’s account after Joe Biden’s win was certified.

Its early decisions not to moderate may have been defensible to free speech purists. But the timing of Facebook’s change of heart raised doubts. It seems a bit too convenient that a company that continued to profit from hosting Trump’s incendiary posts for years suddenly found that he crossed the line only when his presidency ended. 

Liberals criticised Zuckerberg for not acting sooner against Trump’s divisive, demonstrably false rhetoric. Conservatives criticised him for the decision to censor in the end. 

Facebook’s missteps may still have consequences beyond its platforms. In the US, there’s bipartisan support for reform of Section 230, the 1996 law that shields social media companies from liability for users’ content and allows them to moderate - or not - however they choose. 

That said, TikTok might just have the good fortune of facing an easier choice: taking blunt action against false content in an unpopular war. 

The platform is barely five years old, and its content format and large Gen Z user base mean it has been used less as a news source or political soapbox. 

Until now, TikTok simply had not been taken as seriously, so its content moderation faced only muted scrutiny at the time of the Black Lives Matter protests, the US Capitol riot and the height of the COVID-19 pandemic.

CONTENT MODERATION WILL ONLY GET MORE COMPLEX

By acting against disinformation, TikTok has avoided alienating users and stoking the same fervour for regulation that Facebook did.  

For a platform that makes it easy for users to manipulate videos with its built-in editing tools, detecting deception will present technical challenges. Algorithmic analysis of images is less developed than analysis of text and audio, which other platforms have relied upon heavily, the Wall Street Journal reports. 

And as disinformation takes on more technically advanced forms such as deepfakes - like a recent video manipulated to show Zelenskyy calling on Ukrainians to lay down arms - TikTok will have its work cut out for it. 

And soon, this platform’s popularity may mean that it no longer has Meta to clear the path ahead. Mistakes may become inevitable as TikTok grows to Meta’s scale. 

Dr Mark Cenite is Associate Dean (Undergraduate Education) at Nanyang Technological University’s College of Humanities, Arts, & Social Sciences. He teaches communication law at the Wee Kim Wee School of Communication & Information.

Source: CNA/geh
