Govt proposes disabling social media access to harmful content, as part of new codes of practice on online safety
SINGAPORE — In a bid to protect users from harmful online content, the Government is proposing two codes of practice for social media services, including being able to direct such companies to disable access to specific content.

- MCI plans to require social media services with a large reach or high risk to play a part in enhancing safety for all users
- It also plans to direct social media services to disable access to specific content for Singapore users
- This may cover content relating to sexual harms, self-harm, public health, public security, and racial or religious disharmony or intolerance
- Industry consultations on the proposals have started and public consultations will follow in July
The Ministry of Communications and Information (MCI) said at a press conference on Monday (June 20) that the first proposal is for social media services with a large reach or high risk to have “system-wide processes” to enhance safety for all users.
This would include having in place community standards and content moderation mechanisms to mitigate users’ exposure to sexual, violent and self-harm content.
The second proposal is for the Infocomm Media Development Authority (IMDA) to be able to direct social media services to disable access to “specified content” for Singapore users, or disallow specific online accounts on social media services to interact with or communicate content to Singapore users.
This may cover content relating to sexual harms, self-harm, public health, public security, and racial or religious disharmony or intolerance.
MCI said that these proposals will cover only social media services — platforms that allow the posting of online content with the primary purpose of online interaction and linking — and will exclude messaging applications.
Industry consultations on the proposals started this month and public consultations will follow next month.
Communications and Information Minister Josephine Teo first unveiled plans for these new codes of practice against online harm in March, during the budget debates for her ministry.
MCI said on Monday that the prevalence of online harms both globally and in Singapore is a major concern despite many online services working to address this issue.
"Such content that could propagate harm includes that which endorses acts of terrorism, extreme violence or hateful acts against certain communities, encourages suicide or self-harm, or destabilises one’s physical or mental well-being through harassment, bullying or the non-consensual sharing of sexual images.
“These online harms are exacerbated when they are amplified on social media services,” MCI said.
For instance, platform algorithms that recommend content based on user interest can rapidly propel videos of dangerous challenges to virality, which can lead to injuries and deaths. Acts of terrorism and their aftermath can also be spread through live-streamed videos and the re-sharing of content, it added.
MCI also pointed out that religiously or racially offensive content can incite religious intolerance and undermine racial harmony.
For example, it said, a Singaporean man last year impersonated a Chinese woman and made several racially offensive and insensitive public posts on a social media service, denigrating minority communities in Singapore. The case has since been reported to the authorities.
In 2020, a person behind the profile "NUS Atheist Society" published a religiously offensive post that depicted the Bible and Quran as alternatives to be used in the event of a toilet paper shortage.
Abusive online behaviour such as harassment and sexual violence has also been prevalent, MCI added.
Last year, a poll asking people to rank female asatizah (religious teachers) according to their sexual attractiveness was posted on social media.
“The post caused immense distress to the individuals involved and was found to have promoted sexual violence,” said MCI.
A survey in January conducted by Sunlight Alliance for Action, a cross-sector alliance that tackles online dangers, found that 61 per cent of Singaporeans who experienced gender-based online harms mainly experienced it on popular social media services.
The proposed codes call for greater accountability on the part of social media platforms, by requiring them to produce an annual report, to be published on IMDA’s website, and by allowing the authority to direct such platforms to disable access to harmful content.
However, MCI said that the consequences of failing to comply with such directions have not been discussed at this early stage of consultations.
“MCI takes a collaborative approach towards governing the online space against harms,” it added.
“We recognise that the industry has taken active steps in recent years towards combating harmful online content on social media, and their contributions will be critical in shaping a safer and more responsible online space for users in Singapore.”
Meta, which is the parent company of social media platform Facebook, said in response to TODAY's queries that combating harmful content is a shared goal between governments and the industry.
"We welcome dialogue among youth, parents and caregivers, educators and experts to ensure teen safety and well-being, while respecting their expectations of privacy and promoting their autonomy," it added.
Ms Teresa Tan, TikTok's director of public policy for Southeast Asia and Singapore, also said that the social media company "shares the same commitment" as the Singapore Government to combat online harms.
"User safety is our top priority and over the years, we have made every effort to create a safe online space that prioritises age-appropriate experiences. We look forward to furthering our work to enhance online safety for our communities."
YouTube and Twitter declined to comment.
As part of its consultative approach, the authorities will also look at legislation in other jurisdictions that regulates online services.
These include Germany’s Network Enforcement Act, which came into effect in January 2018, and the United Kingdom’s Online Safety Bill, which was introduced in March.
MCI said that such legislation is broad in nature, while the proposed codes are intended to be more targeted.
The codes are expected to fall under the Broadcasting Act, which also covers the Internet Code of Practice.
The Internet Code of Practice sets out standards for internet content and makes it compulsory for internet service providers and internet content providers to deny access to any content prohibited by the authorities.