Lesson from War of the Worlds, 80 years on
October 30 marks the 80th anniversary of Orson Welles’ radio adaptation of H.G. Wells’ The War of the Worlds – a novel on Martians invading the Earth.
Welles’ infamous radio broadcast reportedly caused widespread hysteria and panic as listeners thought it depicted a real event.
Newspapers the next day described roads jammed with fleeing people, panic-stricken listeners hiding in cellars, and even a report of one listener dying of a heart attack.
Why is this incident relevant today? Because, 80 years on, most still believe those news reports to be true even though there was no actual widespread panic. And this has relevance to the ongoing debate about fake news and how we should tackle it.
There are two main reasons for the initial false reports.
First, research on the reporting has revealed that the evidence gathered by newspapers was anecdotal and spun into exaggerated narratives. Most papers relied on wire-service dispatches that incorrectly extrapolated widespread fear from a small number of accounts.
Second, data was misunderstood. An increase in the number of calls to the police was incorrectly accepted as evidence of panic and hysteria.
In such a situation, a spike in calls was to be expected, as it is rational to fact-check when unsure. Some called to find out where to donate blood, while others called to complain that the broadcast was too realistic.
Building on the early bad reporting, a cognitive bias known as an availability cascade entrenched the story in the public’s mind.
An availability cascade is a self-reinforcing process where information attains more and more credibility as it is repeated in public discourse. Two drivers led to this cascade.
First, there was strategic exaggeration by individuals and groups who benefited from the false story. Welles himself talked it up, as it enhanced his personal myth.
Newspapers had little interest in investigating the veracity of the story because radio at that time was establishing itself as a challenger to newspapers as a news provider and a competitor for advertising revenue.
Showing how competition can lead to blinkered coverage, newspapers ran thousands of articles on the event in the following month.
Unfortunately, little was said about radio’s side of the story.
Second, initial poor academic research done on the incident fed the cascade. Hadley Cantril, a Princeton University psychologist, published the first study on this event in 1940. The study was questionable in terms of rigour.
Cantril incorrectly categorised interviewees who said they were “frightened”, “disturbed” or “excited” as being panicked. These terms are not synonyms.
Moreover, more recent and rigorous research, analysing letters sent to Welles and the Federal Communications Commission by listeners soon after the show, suggests only a few were panicked to the degree claimed by news reports and Cantril’s study.
Coupled with the availability cascade, confirmation bias came into play. Confirmation bias is the predisposition to search for information that confirms one’s pre-existing beliefs.
This was a good story to confirm American naiveté and credulousness.
In today’s language, the myth of the War of the Worlds panic is an early example of how a fake news story or deception can find its reach amplified by inaccurate reporting.
One modern parallel is the case of Cambridge Analytica (CA), the consulting firm that employed Facebook user data to target voters with personalised and possibly misleading political advertisements based on their psychological profile.
CA’s actions were blamed for helping sway voters and affecting the outcomes of the 2016 United States presidential election and the 2016 Brexit referendum. While subsequent academic research shows little concrete evidence of CA succeeding in these efforts, this story, just like the War of the Worlds myth, persists.
Why has this modern myth persisted?
First, the initial story spread quickly because it captured the zeitgeist of fears that social media was being used to deceive populations and disrupt society, especially societies with high levels of social media penetration like Singapore.
Second, there are numerous parties who benefit from this story. News publishers can paint Facebook as culpable or at least grossly negligent in enabling wrongdoing.
Facebook is a major source of news for this generation and poses an existential threat to traditional news publishers, just as radio threatened newspapers 80 years ago, and the story is a useful rallying cry for more regulation.
CA itself benefits from this infamy, as it is still in the business of selling its subversive services, this time to Caribbean states.
Third, the story appeals to anyone unable to fathom why large groups of people support, even in some cases to their own detriment, Mr Donald Trump or Brexit. Surely these people must have been deceived and/or manipulated, for they would have made the ‘correct’ choice if they knew the ‘Truth’.
The assumption people will make the correct decisions when presented with the facts is known as the Enlightenment Fallacy. However, this assumption is false as humans do not share a common decision-making code the same way computer systems do.
Differing experiences lead to different decisions when presented with the facts. One cannot deny that genuine socio-economic and socio-cultural issues and pain points created the fertile ground for the election of Mr Trump and Brexit.
From the tale of two myths, two lessons are derived for dealing with disinformation.
First, people sometimes act because of genuine socio-economic or socio-cultural issues, not just because of fake news that they have consumed. This means we cannot focus solely on solving the latter without tackling the former.
Second, while we should not underestimate the harm hoaxes and disinformation may cause, we should not overestimate them either, as this can give them more power than they deserve. CA and Facebook should be accountable for what they have done or failed to do.
Nevertheless, over-reacting to the fear of disinformation can play into the hands of those spreading it.
Today, more people have heard about the War of the Worlds panic than ever heard the broadcast itself. Eighty years from now, if the legend of elections and referendums swayed by commercial firms persists, we will have learned nothing.
ABOUT THE AUTHORS:
Norman Vasu and Benjamin Ang are Senior Fellows at the Centre of Excellence for National Security, S. Rajaratnam School of International Studies, Nanyang Technological University.