
Imagine the unthinkable (before it happens to you)

In December 2006, an earthquake off Taiwan damaged undersea cables carrying voice and data, interrupting communications across Asia, including in Singapore. Internet traffic slowed to a snail’s pace. Some telephone subscribers had difficulty connecting to Europe — hardly a problem one would have expected in the 21st century.

A fire at SingTel’s Bukit Panjang facility disrupted many services dependent on broadband, but it could have had even wider repercussions. TODAY file photo


Following many years when flooding was not such an acute concern, June 2010 saw the first in a series of flash floods in Singapore. And more recently, a fire at SingTel’s Bukit Panjang facility disrupted many services dependent on broadband, including polyclinic e-medical records and competing telcos’ operations.

What do these situations have in common? Each could have been even worse, with wider repercussions. For example, what if the undersea cable damage had been more extensive, leaving Singapore without high-speed Internet access for weeks?

Such scenarios require imaginative planning to optimise and coordinate a response spanning multiple agencies and stakeholders.

STUDYING MANY PASTS, CONSIDERING MANY FUTURES

Scenario planning requires both study and imagination.

Past failures around the world must be researched to squeeze as much learning as possible from the lessons of history. Every imperfection discovered — at home or elsewhere — is an opportunity to improve and learn.

But while studying the past helps us avoid repeating old mistakes, what came before can differ from what will come next. Historical norms and trends guide planning, but trends can change and norms can be breached. We must beware failures of imagination.

A robust imagination is key to anticipating future threats. Some of tomorrow’s challenges will be similar in character, but different in scale: A larger fire, a bigger oil spill. Assumptions may have to be revised to match changing realities, such as the effect of global warming on rainfall and ocean levels, both of which have implications for flood prevention.

Other problems are of a completely different nature from what has come before, and may seem absurd at first but chillingly possible on later reflection. Whatever one thinks of the late Tom Clancy’s geopolitical fiction, he wrote about commercial airliners being used as suicide weapons years before the 9/11 attacks on the United States.

Red Team analysis can help: A trusted group is assigned to study system vulnerabilities and design scenarios as if they were a hostile force. Successful implementation requires a willingness to give the Red Team leeway to think the unthinkable, however unpleasant. Leaders must be open-spirited enough not to feel embarrassed should the Red Team discern an important, previously unseen angle of attack.

Training exercises should encompass more than the routine and expected. Once competency has been established, there is value in the occasional “testing to destruction” with an overwhelming crisis scenario. Better to discover the limits of systems during a simulated challenge — the real world has no rewind button.

TECHNOLOGY: OF NETWORKS AND VULNERABILITIES

Modern technology allows data collection and analysis in great breadth and depth. Analytics can be used to scan for sentinel events and harbingers of future breakdowns, such as increasingly frequent minor mishaps and near-misses; or individual sub-systems persistently running near full capacity and thus having no functional reserve.

Technology has also brought new vulnerabilities. With so much being transacted on digital platforms, reliance on the network has become greater. For example, polyclinic electronic medical record systems, bank transactions and digital voice calls have become more dependent on Internet infrastructure.

The care and nurturing of essential services require attention to small detail, but just as important is a holistic appreciation of how the entire system holds together, especially when under stress. Individual operators and agents will seek improvement within their fields of responsibility, but local over-optimisation can result in broader brittleness.

For example, if voice telephony must be routed via a fibre broadband modem, then voice communications inherit the systemic risk of the broadband network — whether from a fire at a telco’s Internet facilities or a power blackout that disables modems.

Likewise, the cessation of the TV teletext service means that many older Singaporeans will need an Internet connection to access the same information. Besides the cost and inconvenience, this further concentrates systemic risk: There is no teletext information service to help fill the gap when Internet services suffer downtime.

Indeed, systems brittleness was one lesson drawn from last year’s inquiry into MRT breakdowns: Emergency plans needed to be truly systems-wide, integrated across all parties and stakeholders.

HUMAN NATURE AND THE ROLE OF ORGANISATIONAL CULTURE

Human beings have frailties even when trying their very best; by understanding human beings better, we can build more robust systems.

The writer Margaret Heffernan has studied how well-meaning folk can overlook problems which seem obvious in hindsight.

“Wilful blindness” started out as a legal concept, but it has since been applied to the analysis of organisations and management.

We minimise our blind spots by being unafraid to question ourselves constructively. The first step in learning from a problem is to acknowledge its existence.

Organisational culture can shape attitudes and behaviour. Does the error-handling process encourage people to highlight problems and learn from them — or to cover them up?

Do junior staff feel able to discuss systems issues with seniors — or does the management give an impression that they do not really want to know?

Imaginative scenario planning, whole-systems thinking and a culture of open-eyed learning all play a role in preparing for challenges both expected and unforeseen.

These attributes may not guarantee success in an uncertain world. But the converse — a paucity of imagination, a proliferation of fiefdom silo mentalities and a prevalence of wilful blindness — would be a recipe for certain failure.

ABOUT THE AUTHOR:

Tan Wu Meng, a Singaporean, is a medical doctor working in a public sector hospital.
