


Facebook detects more ‘covert influence operations’ globally, but no foreign-directed ones found in Singapore

SINGAPORE — There is an increasing number of “covert influence operations” on Facebook, including those conducted by organisations paid to do so, two top executives of the social media giant said on Tuesday (Sept 28).

Singapore is one of the first countries in the world to take a legislative approach to tackle the challenge of influence operations.



  • There is a rising trend of covert influence operations around the world, two top executives from Facebook said
  • The social media giant has scrubbed more than 150 such operations from its platforms since 2017
  • However, there have been no cases of foreign influence ops targeting Singapore, and most influence ops in the region are domestic
  • Facebook shares Singapore’s goal of combating influence operations, but says a proposed law to do so is worded too broadly



“Influence operations” refer to any organised campaign on Facebook that aims to sway public opinion on a certain issue.

The social media firm has in the past few years worked to root out such operations that are done in a covert manner — that is, run by a party that is hiding its true identity or pretending to be someone else.

Since 2017, Facebook has publicly removed more than 150 covert influence operations, which sought to manipulate and corrupt public debate in a coordinated fashion to achieve strategic goals.

This was what Mr David Agranovich, Facebook's director of global threat disruption, said in a briefing with reporters based in Singapore.

The rate of takedowns by Facebook has also accelerated over time, he added. Mr Agranovich was previously director for intelligence at the United States’ National Security Council in the White House.

“In 2020 and 2021, we were routinely taking down a large number of these influence operations networks in more places all over the world, earlier in their life cycle, capturing them before they can build meaningful audiences or reach large numbers of people,” he said.

Facebook’s media briefing came after the Ministry of Home Affairs tabled a Bill in Parliament earlier this month for a new law — the Foreign Interference (Countermeasures) Act, or Fica — which aims to safeguard Singapore’s political sovereignty from foreign influence by empowering the Government to order the takedown of foreign interference campaigns.

The Bill will be debated at the next parliamentary sitting.

On Tuesday, Facebook executives said that the company, which also owns Instagram and messaging platform WhatsApp, has detected zero cases of foreign influence operations in Singapore.

“That is not for want of looking — we’re constantly looking for these types of operations, constantly monitoring threat actors in the region. That doesn’t mean that they don’t exist, but that we just haven’t seen them,” Mr Agranovich said.

Instead, he warned about a growing trend of what Facebook calls “perception hacking” — whereby people are misled into believing that influence campaigns are under way when in fact there are none.

For example, more than 84 per cent of all influence operations occurring in the Asia Pacific are domestic — conducted by people within the same country — rather than originating from foreign sources, based on Facebook’s analysis of the influence operations it removed in the last four years.

Mr Agranovich noted that many governments around the world have “a very big focus” on foreign interference as a threat, even though Facebook’s analysis shows otherwise.

“We need to recognise that it's easy to raise the spectre of foreign interference as a threat and go beyond the reality of what we see. This type of perception hack can be a real challenge.”


Mr Nathaniel Gleicher, the firm’s global head of cybersecurity policy, said that Facebook does not target influence operations indiscriminately.

“If an advocacy group, a non-governmental organisation (NGO), or a government runs a campaign to convince people of something, such as to convince someone to vote for a law or to care about a particular issue, you might call that an influence operation,” he said.

“And that might not be bad. Influence and convincing people is the core of our public debate.”

Facebook focuses on inauthentic behaviours, rather than on content alone, to determine what is an adversarial threat, the executives said.

When someone conceals his or her true identity, however, for example to make it look like a campaign is being run independently when it is not, that is a covert influence operation that Facebook would consider a threat.

Facebook has also acted to remove influence operations that had been paid for by entities. These campaigns are usually conducted by private firms, such as public relations or marketing firms, which had been hired to engage in coordinated and inauthentic behaviour on its platforms, Mr Agranovich said.

For example, in the Philippines, Facebook banned marketing group Twinmark in 2019 for using fake accounts and violating its misrepresentation and spam policies to artificially increase distribution and generate profit.

That same year, Facebook also took down accounts belonging to Indonesian media firm InsightID for sharing posts about the independence movement of West Papua using fake Facebook and Instagram accounts to conceal its true identity.

When asked why Facebook focuses on inauthentic behaviour by its users, rather than on whether the content causes harm or originates from foreign sources, Mr Gleicher said that Facebook prioritises inauthentic behaviour because it is “particularly pernicious”.

“It is a tactic we've seen determined governments and non-government actors use to mislead and deceive people,” he said, adding that Facebook also has other tools to take down misinformation that causes physical harm, whether it is part of an influence operation or not.


Responding to TODAY’s question about Facebook's position on laws such as Fica, in which governments act as the arbiter to determine what action to take against influence operations, Mr Gleicher said that Facebook shares the same goals as the Singapore Government.

Singapore is one of the first countries in the world to take a legislative approach to tackle the challenge of influence operations, he noted. The Bill, however, is worded “very broadly”.

“Foreign interference, as a concept, is very broad. You can imagine it covering both a covert operation that misleads people about what's happening and who's behind it, and an open public effort... being run by an authentic NGO or community of users,” Mr Gleicher said.

“Lumping these two things together is tricky, and it can lead to some real challenges.”

Noting that Facebook has done significant work to combat influence operations on its own, Mr Gleicher said that the social media giant employs more than 40,000 people around the world who work on safety and security, and has invested more than US$13 billion (S$18 billion) in this effort, which has helped it protect hundreds of elections and other civic moments around the world.

He added: “The question that I always ask when we're looking at legislation is what does that add to the work that's already being done, how does it improve the community that has already been built to tackle these threats.

“As with any new legislative approach, we want to be careful to understand how it's going to work in practice, what its implications will be, what it means for user privacy, for freedom of expression, and for security.”
