Online Exclusive: How to get ready for the next disaster

by Jim Lynch, FCAS, Insurance Information Institute

It feels like an age of megadisasters. Cases in point:

  • 1992: Hurricane Andrew causes insured losses far beyond what experts predicted.
  • 2008: Global markets melt down.
  • 2020: Pandemic ravages the world.

Each of these events was devastating, but they had something else in common: Few saw them coming, but more people should have.

Why that is and how to be better prepared for the next “unexpected” event was the theme of the closing General Session at the November 2020 CAS Annual Meeting, “Grey Swans and Black Elephants: Why We Keep Getting Surprised and What We Can Learn from Them.”  

Rade Musulin, ACAS, a principal at Finity Consulting Pty Ltd., addressed the animals in the session title. Leigh Wolfrom, a policy analyst at the Organization for Economic Co-operation and Development, better known by its acronym OECD, highlighted his organization’s research into the insurability of various types of catastrophe risks.

About those animals: The term grey swan comes from a bestselling book from a decade ago, which classified disasters this way:

  • Black swans are highly unlikely events that are unpredictable and massive. After the fact, we concoct explanations for them.
  • Grey swans are unlikely but possible events that can blow up into big disasters. After the fact, we often explain them away by blaming some human failing, like errors in judgment.
  • Add to those black elephants: a cross between a black swan (an unpredictable, massive event) and the proverbial elephant in the room — everyone knows about it but pretends it is not there.

Musulin also talked about the famous butterfly effect, where an insignificant event (a butterfly flapping its wings in Africa) compounds into a disaster like a hurricane.

He rounded up the monochrome menagerie to present a window into the psychological and systemic structures behind disastrous events. He said he wanted to “adjust mindsets to help us manage risk in a world filled with extreme events.”

Black swans

For thoughts about systems in a crisis, Musulin turned to The Black Swan author Nassim Taleb, who explored a theory of fragility that at first sounds counterintuitive. In his essay, “The Calm Before the Storm,” Taleb suggests that a healthy dose of volatility is better than decades of stability.

The reasoning works something like this:  

  • Systems that occasionally face moderate disorder are more resilient, and ones that suppress disorder are more vulnerable.
  • Most of what happens is normal, so looking at the past is a good way to see how a system withstands normal stresses, but it might not reveal how that system will do in an extreme event.

Musulin’s example here is Egypt, a country that through suppression had been stable for decades, until a heat wave in Ukraine raised the price of bread, an important food staple in many Arab countries. The ensuing food riots led to the Arab Spring, which, when it spread to Egypt, brought down the government in a few weeks. The placid surface presented by an authoritarian government couldn’t withstand a violent upheaval.

The lesson: The best way to see how a system will withstand a black swan event is to see how the system handles occasional bouts of disorder. The absence of disorder could be considered a sign of weakness in the system.  

Black elephants

The term comes from Adam Sweidan, an investor and environmentalist, who pointed out that our world is filled with black elephants: climate change, mass extinction, deforestation. Everyone knows these extreme events exist, but they are ignored.

Musulin adds global pandemic to the list, noting that philanthropist Bill Gates and Johns Hopkins researchers had in recent years issued eerily prescient warnings on how a global pandemic could grip the world.  

The risks of a pandemic, Musulin said, were “clearly known, clearly dangerous and totally ignored.”

Butterflies

Chaos theory essentially relies on the butterfly effect — a small difference in the initial condition of a process leads to a vastly different outcome. A small change in Florida votes in 2000 elects a different president; a driver in 1914 Sarajevo makes a wrong turn, leading Archduke Franz Ferdinand to his assassin and the world to war; a Soviet general in 1983 ignores a (false) report of incoming missiles and prevents an accidental nuclear holocaust.
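That sensitivity to initial conditions can be made concrete with the logistic map, a standard toy model from chaos theory. (This is an illustrative aside, not an example from the session.) Two trajectories that start a hair apart end up nowhere near each other:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map for a given starting point."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.20000000)
b = logistic_trajectory(0.20000001)  # initial gap: one part in 100 million

# The gap between the two paths grows by many orders of magnitude.
gaps = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {gaps[0]:.1e}, largest later gap: {max(gaps):.3f}")
```

A difference in the eighth decimal place — the butterfly — dominates the outcome within a few dozen steps.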

“Sometimes massive dislocations are triggered by seemingly minor and inconsequential events that are difficult to predict,” Musulin said. But knowing how these swans, elephants and butterflies behave, he said, “We can better understand extreme events.” He then gave examples of threats that were mismanaged and those managed successfully.

Several big mistakes came from ignoring the collision of history and modern technology.

  • Iceland has had volcanoes longer than humans have flown airplanes. Why were people caught unprepared when an eruption disrupted European air travel in 2010?
  • In Japan, centuries-old obelisks warn against building below them because the risk of tsunamis is too great. Why was a nuclear plant put on the wrong side of one, without simple anti-tsunami measures like waterproof rooms for backup generators?

Musulin said these situations teach us that:

  • It is important to consider how past events will play out in the more complicated world of today.
  • We can become too comfortable with models, ignoring risk that falls outside of them.
  • Human factors are important, but hard to model.

However, he said, there are examples of our ability to adjust to risk and mitigate it.

For example, when the United States and other countries abandoned the gold standard in the 1970s, companies had to adjust. One, Laker Airways, struggled. As a British company, it earned most of its revenue in pounds while paying most of its costs in U.S. dollars. A weakening of the British currency in the early 1980s left it with heavy losses, and back then it had few tools to handle its predicament. Today there are tools to hedge against currency fluctuation that did not exist then; before exchange rates began to float in the 1970s, there had been no fluctuation to hedge against.

Another example came after the terrorist attacks of 2001, when insurers stopped covering losses from terror attacks and the insurance market froze. But governments made terrorist attacks harder to accomplish (airport screening, security investments) and created financial backstops for terrorism losses. Insurers also revised policy terms and learned to model the risk better. The resulting system has rekindled the market for terrorism cover.

The lessons, some of which Musulin suggests could be applied to the current pandemic, include the following:

  • Risk is best understood through a combination of models and mindsets.
  • Complex systems are rife with interconnected risks.
  • Human responses are an important factor, but these are difficult to model.
  • Risk management can address problems that seem intractable.
  • Many risks are global and easily cross international borders.

Proper techniques can manage risk, Musulin said. Among his ideas: design systems that are resilient and can adapt; look to history to see how new technology could affect old perils; and understand why volatility, not stability, might show the strength of a system.

OECD researcher Wolfrom discussed research into trends in the insurability of major risks. The goal is to understand how “problematic they may be for the insurance market.”

His address focused on three risks affecting property-casualty insurers: climate change, cyberattacks and pandemics.

He suggested that a risk is insurable if the cost of underwriting coverage that generates a reasonable profit is less than what people are willing to pay for that coverage. The cost of coverage is a function of the size of the expected loss, the uncertainty of results and the requirements of the suppliers of capital and regulators. (Capital requirements consider the ability of insurers to take advantage of the law of large numbers in building a portfolio of diversified risk.)
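The condition can be sketched as a toy calculation. This is my own illustration with invented loads and numbers, not OECD methodology: the required premium is the expected loss grossed up for uncertainty, a charge for the capital the risk ties up, and a profit margin; the risk is marketable when that premium stays below willingness to pay.

```python
# Toy insurability check (illustrative numbers, not a real pricing model).
def required_premium(expected_loss, uncertainty_load, capital_required,
                     cost_of_capital, profit_margin=0.05):
    """Premium an insurer needs to write the risk at a reasonable profit."""
    base = expected_loss * (1 + uncertainty_load)   # loss plus uncertainty load
    capital_charge = capital_required * cost_of_capital
    return (base + capital_charge) * (1 + profit_margin)

def is_insurable(expected_loss, uncertainty_load, capital_required,
                 cost_of_capital, willingness_to_pay):
    premium = required_premium(expected_loss, uncertainty_load,
                               capital_required, cost_of_capital)
    return premium <= willingness_to_pay

# A well-diversified risk: modest loads, buyers will pay enough.
print(is_insurable(expected_loss=1_000, uncertainty_load=0.10,
                   capital_required=2_000, cost_of_capital=0.08,
                   willingness_to_pay=1_500))   # True

# A pandemic-like risk: same willingness to pay, but huge capital need.
print(is_insurable(expected_loss=1_000, uncertainty_load=0.50,
                   capital_required=50_000, cost_of_capital=0.08,
                   willingness_to_pay=1_500))   # False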

Wolfrom provided the following insights about each type of risk.

Climate change  

The expected loss is quantifiable but is growing rapidly. Under climate change, hurricanes seem likely to become more powerful, wildfires seem likely to become more prevalent, and an increase in intense rainfall seems likely to bring more floods.  

Natural catastrophes are well modeled now, but as the climate changes, uncertainty grows.

It should still be possible to diversify risks, though there is a chance that correlation across territories and risks could increase as the climate evolves.
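Why correlation matters so much can be shown with a small simulation (my own illustration, with made-up numbers): when losses are independent, averaging over many policies makes the per-policy result nearly certain; when one event hits every policy at once, the pool is no safer than a single policy.

```python
# Illustrative sketch: the law of large numbers versus a shared catastrophe.
import random
import statistics

random.seed(0)
N_POLICIES, N_SIMS, P_LOSS = 1_000, 2_000, 0.05  # each policy loses 0 or 100

def avg_loss_independent():
    """Each policy suffers its own independent 5% chance of a 100 loss."""
    hits = sum(1 for _ in range(N_POLICIES) if random.random() < P_LOSS)
    return 100 * hits / N_POLICIES

def avg_loss_correlated():
    """One catastrophe hits every policy at once, or none at all."""
    return 100.0 if random.random() < P_LOSS else 0.0

ind = [avg_loss_independent() for _ in range(N_SIMS)]
cor = [avg_loss_correlated() for _ in range(N_SIMS)]

print(f"independent: mean {statistics.mean(ind):.1f}, stdev {statistics.stdev(ind):.2f}")
print(f"correlated:  mean {statistics.mean(cor):.1f}, stdev {statistics.stdev(cor):.2f}")
```

The expected loss per policy is the same in both cases; only the volatility — and hence the capital required — differs, by a factor of roughly the square root of the number of policies.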

Cyberattacks

The expected loss will grow as our digital systems grow and become more integrated.

Uncertainty is relatively high because the risk is fairly young and modeling techniques are still new.

Capital needs are significant, and it may be hard to diversify because a single attack can affect systems around the world.

Pandemic

The size of loss from an event is large. There is also uncertainty about frequency; it’s easy to call the current pandemic a one-in-100-year event, but climate change, globalization, and evolving viruses could make it more frequent than that. On the other hand, societies may be able to manage future pandemics better than this one.

Modeling of the key property-casualty pandemic exposure — business interruption coverage — is in early stages.

Capital requirements might be prohibitively high in the case of a truly global outbreak as the global risk cannot be diversified away easily.

Actuaries should have a role in calling attention to these enormous risks, both known and unknown.

“Part of the reason that actuaries are vital in the global financial system,” Musulin said, “is that we are the ones that should be pulling the fire alarm when there are risks that aren’t being mitigated. That’s what we do. We measure risk. We look into the future and we price risk, and we send an economic signal to discourage bad behavior.”