


Public hearings throw up wide-ranging views and suggestions

Over five days of public hearings held in the past two weeks, wide-ranging views and suggestions were put forward — in the form of written and oral representations — to the Select Committee on deliberate online falsehoods. To date, about half of the 79 individuals and groups scheduled to give evidence at the hearings have done so. The remainder will appear before the Select Committee during the last batch of hearings, to be conducted from Tuesday to Thursday. Some key themes have emerged:

Editors of Singapore's media companies at the public hearings of the Select Committee on Deliberate Online Falsehoods. Photo: Ministry of Communications and Information




Dr Carol Soon, Senior Research Fellow, Institute of Policy Studies, and Mr Shawn Goh, Research Assistant, Institute of Policy Studies:

“The government should not intervene in every incident but focus its resources on ‘high breach’ deliberate online falsehoods, specifically those that threaten public order and national security. Legislation plays a complementary role to non-governmental measures and non-legislative governmental measures. We proposed a ‘5Cs’ framework that can be used by the government to evaluate the necessity for legislative action. Each of the ‘5Cs’ — Content, Context, Communicator’s Identity, Communicator’s Intent and Consequence — should be used when evaluating if legal action (and what type) is needed to counter a specific deliberate online falsehood (as demonstrated in Paragraph 39). When legislative action is absolutely necessary, we advocate that the government leverage existing laws and regulations as they have stood the test of time, contain clear provisions for safeguards against content that can harm our society, are platform and context neutral, as well as allow for various intervention points in a calibrated fashion.”

Dr Mathew Mathews, Senior Research Fellow, National University of Singapore:

“There should be a mechanism in place to allow the Government to stop the access of the local population to media sites which feature deliberate online falsehoods that threaten Singapore’s social harmony. This mechanism should kick in quickly when such clear falsehoods (as defined in the Introduction) are discovered. Since some sites might be based overseas, there must be adequate provision to block these sites if deemed necessary, especially if website owners cannot be reached to take down the online distortions or place the needed clarification of facts related to the story.”

Dr Shashi Jayakumar, Head, Centre of Excellence for National Security and Executive Coordinator, Future Issues and Technology, S Rajaratnam School of International Studies:

“There has from time to time been discussion on some sort of international set of norms or basic understandings on controlling various issues in the cyber and disinformation spheres — akin to a Geneva Convention on these subjects. At present, these attempts appear to have failed, and it appears extremely unlikely that nations will attempt at any point in the near future to come together to talk over red lines and rules of the road when it comes to disinformation campaigns.”

Mr Morteza Shahrezaye and Professor Simon Hegelich, Technical University of Munich:

“Regulations and laws dealing with this new phenomenon seem to be necessary, not to stop this development — which is probably already impossible — but to steer it in favour of society. Policymakers face at least two very difficult challenges: the digital revolution is rapidly and continuously transforming the public and private sphere, so any governmental action has to deal with the situation that the half-life of any solution might be very short. In addition, the separation/integration of the public and private sphere is a core element of democracies. There will always be a trade-off between personal freedom and public interest.”

Mr Zubin Jain, a Grade 10 student at United World College of South East Asia:

“Legislation should be targeted towards institutions that encourage (the spread of online falsehoods) rather than those that create (them) due to a simple scarcity of resources. Overly harsh punishment created in response to this problem has room to be misused and will achieve only petty retribution. Harsh legislation against websites that encourage and abet the spread of misinformation is, therefore, the best defence.”



“We do not believe that legislation is the best approach to addressing the issue. Singapore already has a variety of existing laws and regulations which address hate speech, defamation and the spreading of false news, including the Telecommunications Act, the Protection from Harassment Act, the Penal Code, the Maintenance of Religious Harmony Act, and others. Instead, we believe in the need to adopt an innovative and iterative approach, as prescriptive legislation and requirements would make it harder for us and other online platforms to find the right technical solutions, consumer messaging, and policies to address this shared challenge.”

Mr Gaurav Keerthi:

“I believe that the challenge of fake news should not be addressed by the use of harsh laws as a primary course of action. The strategy should rely on (a) education of students to be more discerning readers, (b) an easily accessible and reliable source of facts on all policy fronts, (c) cleverly designed online tools to help users sift fact from fiction, and (d) a forum for individuals to robustly debate dissenting opinions.”


Dr Soon and Mr Goh:

“While government-led fact checking initiatives are important, they are insufficient on their own to combat deliberate online falsehoods. Research shows that people with low institutional trust are more likely to believe in rumours, conspiracy theories and alternative narratives. The implication is that during a crisis where the subject embroiled in a deliberate online falsehood is the government or organs of the state themselves, people are likely to turn to non-government platforms to seek information or verification, especially if government-led fact checkers demonstrate a pro-government bias. As such, we should encourage and allow room for the establishment of non-government fact checking. Non-government fact checking should be seen as a complement and not a substitute for government-led efforts. Besides countering cynicism (towards the authorities and institutions), industry-led and ground-up fact checking initiatives also help build social trust and increase the number of avenues and platforms that people can go to in times of doubt and fear.”

Mr Walter Fernandez, Mediacorp Editor-in-Chief; Mr Jaime Ho, Chief Editor, Digital News; Ms Yeung Shuk Lin, Chief Editor, News; Ms Quah Ley Hoon, Chief Editor, Current Affairs, Mediacorp:

“It will be useful to establish a “fact checking” council, committee or body made up of diverse representatives to assess and thereafter designate deliberate online falsehoods as specifically defined. This council should be independent, transparent and be able to react to emergent deliberate online falsehoods quickly. It should include Singaporean representatives from academia, NGOs, civil society, including from the legal community, and other social groups that are representative of Singapore society. Its mandate must include identifying a (falsehood) and thereafter recommending appropriate remedial actions. As a crucial tool of public trust, the work, findings and recommendations of the council must also be open to public scrutiny.”

Singapore Press Holdings:

“On the question of which authoritative body should define and identify falsehoods, SPH recommends the establishment of a full-time coalition of media players, industry practitioners and other interested parties to carry out fact-checking of user-submitted information. This group should sit independently from government bodies and commercial entities, although representatives from these organisations may participate in the committee.”

Mr Keerthi:

“(There should be) a reliable source of information that is not government-curated. I propose for the creation of the Office of Ombudsman to be a neutral arbiter to receive public requests for information and assess if government-classified data should be revealed for the wider public good. A website should be created for the Ombudsman or whichever agency is assigned to disseminate such facts on sensitive policy issues.”

Mr Ruslan Deynychenko, co-founder of StopFake:

“Within four years, StopFake has collected thousands of examples of Russia’s purposeful dissemination of fakes and manipulations. It is extremely important that every article includes detailed facts that clearly show why this information is false. We also pay special attention to media organisations who participate in the creation and spread of disinformation. This allows us to firmly accuse specific television channels, radio stations, and newspapers of actively participating in circulating propaganda.”

Dr Jayakumar:

“Singapore could consider establishing a body — not necessarily a government one — that uses grassroots participation to counter fake news and disinformation operations. This institution could (1) carry out research and fact checking initiatives, and congregate various experts under its umbrella to wage targeted campaigns against fake news (particularly when organised fake news campaigns are brought to bear against the people); (2) produce content for television, newspapers and social media to debunk fake news and inform audiences, and (3) offer training to media professionals and other relevant parties.”


Mr Keerthi:

“I propose that the Ministry of Education revamp the secondary school syllabus to place a much stronger emphasis on critical reading and debating (which teaches individuals the art of disagreeing without disrespect, and also how to be critical of falsified facts). While debate has recently been made part of the syllabus, more can be done to ensure that young minds are taught the importance of being discerning about their online news consumption behaviour.”

Mr André Ahchak, Director of Communications, Roman Catholic Archdiocese:

“The Church believes that the best way to handle misunderstandings or minor falsehoods which do not impact Singapore’s safety and religious harmony is through public education. It is no longer possible to stop ‘fake news’ simply by blocking websites or publications because social media utilises person-to-person sharing.”

Mr Ben Nimmo, Senior Fellow, Information Defence, Digital Forensic Research Lab:

“It is therefore important to educate Internet users in the basic principles of digital awareness and hygiene, and to work with the platforms on solutions, rather than against them. Essential skills such as how to identify a bot or a troll can be taught without recourse to sophisticated software or analytical techniques. Such skills are vital for normal users, and particularly for media outlets, which can otherwise amplify fraudulent accounts. As the government is responsible for education policy, it is best placed to lead such educational efforts.”


Dr Michael Raska, Assistant Professor, S Rajaratnam School of International Studies:

“Singapore needs to explore the nature of the evolving strategic competition in East Asia. In this context, Singapore may become vulnerable to other non-traditional emerging threats, particularly political and hybrid warfare. As conflicts evolve in parallel in the cyber and information domains, the centres of gravity are also going to shift. The value, and more importantly, the accuracy and reliability of strategic information relevant for the situational awareness and function of the nation state as a system will become even more important with the increased dependence on cyberspace.”

Dr Jayakumar:

“Modern information technologies empower and incentivise subversion at scale. In cyberspace, there is no requirement of course for messages to have a direct connection to the truth, and, either way, the perpetrator of falsehoods can mask its tracks and have some degree of plausible deniability. Why not then employ these techniques to undermine resilience in targeted countries, when these methods can be far cheaper (and less bloody) than warfare, and which may be more precisely tailored to achieve state aims compared to diplomacy? It was after all Sun Tzu who observed that to subdue the enemy without fighting is the acme of skill.”

Dr Janis Berzins, Director, Center for Security and Strategic Studies, The National Defense Academy of Latvia:

“One of the main aspects of modern warfare is the idea that the main battlespace is the mind. As a result, new generation wars are dominated by information and psychological warfare, aiming to achieve superiority in troops and weapons control, and morally and psychologically depressing the enemy’s armed forces personnel and civil population. The main objective is to reduce the necessity of deploying hard military power to the minimum necessary, making the opponent’s military and civil population support the attacker to the detriment of their own country, its values, culture, political system, and ideology.”


Dr Mathews:

“In the Singaporean context, online falsehoods that can threaten social harmony can come in various forms and become an everyday experience. These can include reports that intentionally feature misinformation about particular ethnic, religious or immigrant groups and their loyalty to Singapore, their potential to commit anti-social acts or crimes, their lack of contribution to society, their overuse of state resources, or highlight and speculate about aspects of their culture which may not be well understood but deemed as at odds with majority culture.”

Mr Nimmo:

“For Singapore, election campaigns and tensions between different social, religious, political, economic and ethnic groups are likely to be the main targets of any such attempts, as the campaigns tend to gradually inflame tensions and hollow out the political centre at the expense of the fringes. Relations with neighbouring countries can also be the focus of targeted disinformation or influence campaigns.”

Dr Elmie Nekmat, Assistant Professor, Communications and New Media, National University of Singapore:

“In reality, the effects of deliberate falsehoods and misinformation in social media can occur rapidly and impact broad segments of society within a short period of time. Such effects tend to be the outcomes of finely calibrated disinformation campaigns carried out on social media, leveraging unverifiable information. Among other things, ‘cyber armies’ and ‘web brigades’ comprising fake accounts, bots, and trolls in social media — 1) induce virality of online falsehoods by ‘sharing’ disinformation within and across different social media channels, 2) produce faulty perceptions of majority opinion surrounding issues affecting society, and 3) create the illusion of majority support that can spur actual individual support through a bandwagon effect.”

Mr Septiaji Eko Nugroho, Founder and Chairman of Masyarakat Anti Fitnah Indonesia, Mafindo/Indonesian Anti Hoax Community:

“The disinformation ecosystem exploits the state of the Internet and social media, which are friendly to anonymous accounts. Many people also still have the false impression that they are free to say anything on social media without consequences. We are also entering the post-truth era, in which people with different political affiliations, different ethnicities and different religions tend to have more distrust for others. People (find it) easy to believe any information that suits their initial position, regardless (of whether it is) true or false, but on the other hand, (find it) very difficult to accept even facts if (they are) different (from their) personal position — we call this confirmation bias.”
