KUALA LUMPUR – TikTok’s artificial intelligence (AI) may sometimes go too far, failing to differentiate between legitimate news reporting and regular user-generated content, warned Communications Minister Fahmi Fadzil.
He said the Malaysian Communications and Multimedia Commission (MCMC) had reached out to TikTok for clarification on the platform's recent blocking of 18 local media organisations.
According to initial information from the platform, the affected accounts were restricted due to their coverage of a molestation case involving a young girl at a mosque in Batang Kali.
“The problem is TikTok’s artificial intelligence (AI) itself. Here’s a little warning: AI can sometimes go too far and not understand that the media organisation reporting is different from the content produced by ordinary people,” Fahmi said as reported by Bernama.
To prevent similar incidents in the future, he has called for discussions with TikTok to refine how the platform manages media-run accounts.
Fahmi made these comments while launching the “AI in the Newsroom” course, an initiative by Bernama as part of a broader series of AI-focused training programmes for media professionals.
Some TikTok accounts categorised as “official” were also reportedly affected by the restrictions.
Expanding on the issue, Fahmi stressed that the report on the molestation case was a standard news report of the kind routinely produced by media organisations, and should not have been flagged as problematic.
“The report should not be a problem. So here I see the AI problem that TikTok needs to explain to us and then also to media companies,” he said.
Fahmi noted that TikTok has increasingly relied on AI for content moderation, which at times leads to misinterpretation of uploaded content.
“This is where I see an opportunity for us to engage with TikTok—to allow more flexibility or perhaps assign a different status to media organisations. These are news reports, and we already have our own guidelines and code of ethics, so TikTok needs to recognise that,” he added. – February 24, 2025