By Harris Zainul and Samantha Khoo

In a recent letter to the Prime Minister of Malaysia, the Asia Internet Coalition (AIC) raised concerns about the country’s proposed licensing framework for social media and private messaging platforms. In short, the AIC argues that imposing a licensing regime would significantly burden its members, and it advocates self-regulation as a more flexible and responsive alternative to government intervention.

There are deep flaws in this argument.

First, the idea that these platforms are “neutral conduits for information” is woefully inaccurate at best and a blatant untruth at worst. The reality is that most social media platforms, particularly the ones the AIC claims to represent, use algorithmic curation for various purposes. The most obvious is to maximise user engagement and retention by prioritising certain types of content over others.

In such situations, platforms are no longer merely facilitating free expression; they are complicit in algorithmic manipulation. The consequences of such curation, when it goes wrong, have been well documented globally – from the spread of conspiracy theories to real-world violence incited by disinformation.

Second, there is the claim that platforms are able to self-regulate well enough to address online harms. As it stands, scams, child sexual abuse material, cyberbullying, anti-public-health content and hate speech continue to proliferate despite the platforms’ moderation policies. It is worth remembering that for long stretches of the Covid-19 pandemic, many platforms were hesitant or unwilling to expand their moderation policies to cover medical mis- and disinformation, further contributing to negative public health outcomes.

Relatedly, the AIC’s statement alludes to the platforms’ apparent willingness to collaborate with the Malaysian Communications and Multimedia Content Forum – the industry forum designated to oversee and promote self-regulation of online content. Yet none of the platforms the AIC represents has subscribed to the Content Code that underpins this self-regulation.

Third, it remains unclear how many resources these platforms dedicate to individual markets like Malaysia. This includes the number of human content moderators, as well as their proficiency in local languages, their familiarity with local context, and the training they receive to perform their role. Similarly, algorithmic and automated moderation remains a black box, with equal uncertainty over these systems’ training data and over their effectiveness, which no objective third party has verified.

Fourth, platforms are eager to point to their transparency reports as proof that they diligently enforce their internal standards. Yet these reports contain only what the platforms choose to include, omitting the details necessary for effective scrutiny – how different is this practice from transparency washing? If students were allowed to mark their own exams, it would be no surprise that everyone tied for first place.

These considerations must be taken into account for a balanced view of the MCMC’s licensing framework. While people are expected – and we personally encourage them – to be healthily sceptical of government overreach, especially in matters of free speech, it is worth remembering that these platforms have neither “clean hands” nor a democratic mandate to determine the red lines for free speech.

Despite this, platforms continue to resist meaningful regulation by hiding behind a narrative of neutrality. With the AIC arguing that regulation will harm innovation, we must ask whether the innovation it claims to promote will genuinely benefit society or merely a company’s stock price.

Moving forward from this fiasco requires humility, especially since the AIC has since backtracked on its most damning statements. Platforms must understand that governments and people have wised up to the consequences of the tech industry’s “move fast and break things” mentality, and that governments can, indeed, regulate a technical sector.

Here, we posit that there is room for deliberate and delicate policymaking that supports free expression and innovation while holding platforms accountable for their impact on society. With the right policy interventions, these three ostensibly competing objectives can be mutually reinforcing.