Ways to make social media less viral


In 1964, the journal Nature published a paper, “Epidemics and Rumours”, in which two mathematicians, DJ Daley and DG Kendall, employed models used to study the spread of disease to examine how rumours propagate. People were divided into groups: Those “susceptible” to a rumour, those already “infected” and spreading it, and those no longer passing the rumour on, described as “dead, isolated or immune”.
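
To make the mechanics concrete, the sketch below simulates those three groups with simple, assumed contact rates (the figures are illustrative, not Daley and Kendall’s): a “susceptible” person who meets a spreader starts spreading, while a spreader who meets someone already in the know eventually goes quiet.

```python
# A minimal sketch of Daley-Kendall-style rumour dynamics (illustrative only):
# the population is split into "susceptible" (have not heard the rumour),
# "spreaders" (actively passing it on) and "stiflers" (no longer spreading).
# The contact rates beta and gamma are assumed values, not from the 1964 paper.

def simulate_rumour(beta=0.5, gamma=0.2, steps=200, dt=0.5):
    x, y, z = 0.99, 0.01, 0.0          # susceptible, spreader, stifler fractions
    history = []
    for _ in range(steps):
        new_spreaders = beta * x * y   # spreader meets a susceptible person
        stifled = gamma * y * (y + z)  # spreader meets someone who already knows
        x -= new_spreaders * dt
        y += (new_spreaders - stifled) * dt
        z += stifled * dt
        history.append((x, y, z))
    return history

if __name__ == "__main__":
    final_x, final_y, final_z = simulate_rumour()[-1]
    print(f"never heard it: {final_x:.2f}, still spreading: {final_y:.2f}, stopped: {final_z:.2f}")
```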

The image of minds infected by falsehood is a powerful one. More than half a century later, we’re contending with online rumours that spread like the plague.

The question is: Can we interrupt these waves of hyper-transmission, just as we do the spread of viruses?

In 2020, the two phenomena collided when the World Health Organisation warned of an “infodemic” of misinformation about Covid.

Fake news, it said, “spreads faster and more easily than this virus”.

Soon after, Julian Kauk of the University of Jena in Germany revealed he had put this to the test, applying Daley and Kendall’s model to a current conspiracy theory.

Mr Kauk’s models simulated how a false rumour about the spread of coronavirus was propagated on Twitter during the first six months of 2020 and found the same “wave‑like patterns” as the real‑world spread of coronavirus.

He also studied the effects of two countermeasures used to stem the flow of false information.

He found that, in the early stages of diffusion, fact-checking was highly effective at choking off false content, but it rapidly lost that effect if applied too late.

The second method, tweet deletion, showed a moderate effect on the spread of the rumour, but was less time-sensitive.
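
The timing effect can be illustrated with the same kind of toy model: switch on a fact-checking rate that converts spreaders into non-spreaders at different points in the outbreak and compare how large the wave gets. The parameters and the fact-check rate below are assumptions for illustration, not Mr Kauk’s actual model.

```python
# An illustrative sketch (not Mr Kauk's model or parameters) of why the timing
# of fact-checking matters: a fact-check rate "phi" converts spreaders into
# stiflers, but only after it is switched on at step "start".

def rumour_with_factcheck(phi=0.3, start=0, beta=0.5, gamma=0.2, steps=400, dt=0.5):
    x, y, z = 0.99, 0.01, 0.0          # susceptible, spreader, stifler fractions
    peak = y
    for t in range(steps):
        check = phi * y if t >= start else 0.0   # fact-checking only once it begins
        new_spreaders = beta * x * y
        stifled = gamma * y * (y + z) + check
        x -= new_spreaders * dt
        y += (new_spreaders - stifled) * dt
        z += stifled * dt
        peak = max(peak, y)
    return peak

if __name__ == "__main__":
    print("peak share of spreaders, fact-checking from the start:", round(rumour_with_factcheck(start=0), 3))
    print("peak share of spreaders, fact-checking applied late:  ", round(rumour_with_factcheck(start=120), 3))
```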

In 2018, Jieun Shin of the University of Florida also found that false rumours tend to “mutate” and resurge in zombie‑like forms, just like a virus.

Studying a 13-month period spanning the 2012 US presidential election, she traced the lifecycle of 17 popular political rumours that had circulated on Twitter.

Dr Shin told me that social media companies have since tried to mitigate the problem by slowing the movement of content.

For instance, Twitter has introduced tools such as prompting users to “quote tweet” with a comment of their own, or asking them to read an article before retweeting it.

“Even just requiring them to pause and think of something to write makes them less likely to share misinformation.”

In early testing, Twitter said it found users clicked on articles they were considering sharing 40 per cent more often if they were asked to read the link.

In other words, simple design changes can help stymie “superspreaders” online.

This is what Frances Haugen, the data scientist and Facebook whistleblower, proposes. She accuses Facebook of fanning hate and misinformation via its algorithms, claiming it systemically chooses profits over people, particularly those outside the United States.

In an interview with the Financial Times, Ms Haugen said the platform could be made safer without content moderators or Facebook employees touching user-generated content, or making politicised decisions about what stayed online.

“Facebook’s own research details lots of what I call ‘content-neutral’ solutions,” she said. “[It’s] not about picking good and bad ideas, it’s about making the distribution of ideas safer.”

Solutions to slow content transmission to what Haugen calls “human scale” would include design tweaks such as limiting the size of groups (as WhatsApp has), or requiring users to copy and paste links (rather than just clicking “share”) if the chain of transmission goes beyond friends of friends.
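
As a hypothetical illustration of that last idea (the depth threshold and the logic below are assumptions, not Facebook’s implementation), a reshare check might work like this: within two hops of the original poster a one-click share goes through, while anything further down the chain forces the user to copy and paste the link themselves.

```python
# A hypothetical sketch of the "friction beyond friends of friends" idea:
# the threshold and field names are illustrative assumptions, not a real API.

from dataclasses import dataclass

MAX_ONE_CLICK_DEPTH = 2   # within friends of friends

@dataclass
class Post:
    url: str
    share_depth: int = 0   # how many reshares away from the original poster

def handle_share_request(post: Post) -> str:
    if post.share_depth < MAX_ONE_CLICK_DEPTH:
        return "shared"    # one-click reshare allowed close to the source
    # beyond friends of friends: no share button, user must copy the link manually
    return f"copy-and-paste required: {post.url}"

if __name__ == "__main__":
    print(handle_share_request(Post(url="https://example.org/story", share_depth=1)))
    print(handle_share_request(Post(url="https://example.org/story", share_depth=4)))
```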

In response, Facebook said it had used this method sparingly, but described it as “blunt” because it suppresses all content, whether potentially offensive or benign, with equal force.

The changes sound small but, according to Haugen, internal research has shown they can radically slow down a piece of content.

“That may carve out one or half a per cent of profit, but that has the same impact on disinformation as the entire third-party fact-checking programme . . . and it works in every language,” she said, referring to the results of internal research done while she was at the company.

Such circuit-breaking techniques are in stark conflict with the fundamental business model of all social media platforms, which have aggressively prioritised engagement and growth over all other metrics.

But they may be essential for social cohesion and public health. Epidemiologists haven’t stopped the coronavirus yet, but the tools to do so have been tested and are being implemented.

Facebook would do well to take notes. FINANCIAL TIMES

ABOUT THE AUTHOR:

Madhumita Murgia is the FT’s European technology correspondent.

