by Harris Zainul and Ryan Chua
MARCH 15 — Much has been and will be said about the introduction of the Emergency (Essential Powers) (No. 2) Ordinance 2021 (hereinafter referred to as “EO2”). To be clear, there are many potential problems involving EO2, especially concerning its probable impacts on civil liberties.
That said, this article is not about those concerns. Rather, it seeks to highlight why and how EO2 is too blunt a tool to meet its desired policy objective of dealing with Covid-19 fake news.
Firstly, the logic behind EO2’s objective appears overly simplistic: that harsher punishments would deter would-be creators or sharers of “fake news”, thus helping to address the infodemic. This is reflected in how, under EO2, creators and sharers of fake news could be punished with a fine of up to RM100,000, imprisonment not exceeding three years, or both, if found guilty.
This is a marked increase over the punishments provided for under Section 505(b) of the Penal Code (up to two years imprisonment and/or fine) and Section 233 of the Communications and Multimedia Act 1998 (up to one year imprisonment and/or RM50,000 fine) which had been the legal provisions of choice to deal with fake news during this pandemic.
While deterrence theory makes sense at the theoretical level, the real world is not as clear cut. Not everyone contributing to the infodemic is doing so with awareness, intent and/or malice.
The fact of the matter is that some people might share information, including information that eventually turns out to be false, because they think it could benefit their family, friends and loved ones. These contributors to the infodemic would have no reason to suspect that they are creating and/or sharing fake news in the first place, which raises the question of how deterrence is supposed to work for this category of people.
Second, the problem with EO2 is compounded by the fact that it creates an offence not only for those who intend to create and/or share fake news, but also for those who do so inadvertently, so long as the content causes fear or alarm to the public. EO2 does not sufficiently distinguish between misinformation and disinformation (the key difference being the absence of intent in the former and its presence in the latter), focusing instead on the fake news’ impact on the masses.
Here, it ought to be appreciated that information that could cause fear or alarm to the public might also be more “shareable” due to its perceived value in being able to warn others. Take for example an assertion that a Covid-19 positive case was detected in a locality — it only makes sense for people to want to share this information to warn others, despite its potential to cause fear or alarm.
In practice, this could mean that while EO2 is ostensibly intended to deter and sanction creators and sharers of Covid-19 fake news, it could inadvertently catch the innocently mistaken as well. Adding to this risk is the fact that digital literacy skills remain lacking, and it is arguably unfair to place the burden of determining the veracity of information on people who do not know any better.
Third, this sharing of information is taking place against the backdrop of the infodemic and an imperfect information environment, where scientific consensus on some matters relating to Covid-19 is still being debated. Moreover, with authoritative and inaccurate information intermingling online without visual distinction, which can make the latter seem of equal quality to the former, it is not surprising that people could inadvertently consume and share wholly or partly false information.
Fourth, adopting such a harsh measure to restrict potentially problematic conversations on Covid-19 could backfire, driving problematic and even harmful conversations about Covid-19 and/or the vaccines into private, closed groups. In these “echo chambers”, the likelihood of fact-checks and counter-narratives being presented falls drastically, leaving group members unexposed to contrary information.
Even being perceived to be clamping down on conversations around Covid-19 could risk hardening the worldview of some conspiracy believers, feeding their paranoia that the government has “something to hide”. With Covid-19 vaccines already facing some hesitancy, this could jeopardise the government’s own efforts to encourage voluntary vaccination.
To close, it must be emphasised that we are under no illusion that the marketplace of ideas, where good ones trump the bad, is working as intended. In fact, contrary to what we once thought, research has suggested that facts and figures might be insufficient to convince people of an argument.
To be sure, there is a legitimate argument to be made for regulating speech when it can cause harm. However, the bluntness of EO2 as a policy tool for regulating a complex, amorphous issue could exacerbate the problem further while risking the unintended prosecution of ordinary people.
Extending executive powers to nullify fake news without understanding the nature of the problem would go against what the facts and science are telling us: that there are better ways to fight fake news.
That is the higher standard we must appeal to and work towards.
*Harris Zainul is Analyst at the Institute of Strategic and International Studies (ISIS) Malaysia. Ryan Chua is a Master in Public Policy student at the Lee Kuan Yew School of Public Policy in Singapore. They both work on misinformation policy.
This article first appeared in the Malay Mail on 15 March 2021.