
How to Outfox Our Brains About Risk

Ostriches are often characterized as hapless birds that bury their heads in the sand whenever danger approaches. In fact, they are highly astute escape artists, using their great speed to compensate for their inability to fly.

Just as the ostrich's defenses are constrained by its inability to fly, our decisions are constrained by biases that are part of our cognitive DNA. And just as the ostrich has adapted to risk by working within its physical limitations, we humans, when thinking about risk, need to develop policies that work within our inherent cognitive limitations. We need to learn to be more, not less, like ostriches — hence the paradox — if we are to be better prepared for disasters.

We read about disasters all the time and often see post-disaster coverage about what people should have done in the situation: They should have put up their storm shutters, they should have evacuated, they should have purchased earthquake insurance — and so on. But people tend to disregard these warnings because of six major decision-making biases.

People have a hard time foreseeing future consequences (myopia); are too quick to forget losses from the past (amnesia); are inclined to think losses will happen to others rather than themselves (optimism); prefer inaction to action when faced with risk, maintaining the status quo (inertia); fail to base decisions on all of the information made available about a risk (simplification); and are overly prone to imitate the behaviors of others who exhibit the same biases (herding).

See also: Need for Lifelong Learning in Insurance  

Their relative importance varies from situation to situation, but if there is one that is most fundamental, it is excessive optimism. We have a hard time fully anticipating the physical and emotional toll that disasters can take, and we are too prone to believe that disasters happen to other people, in other places, at other times.

A second bias that can create serious problems is myopia. Individuals tend to focus on short time horizons, so they forgo protective measures that have long-term benefits, such as loss-reduction investments, because of their high upfront costs.

Most modern approaches to risk management start by analyzing the objective likelihood and consequences of risks faced by individuals or communities and then design measures that could mitigate these risks — and hope that people will choose to implement them. For example, people in areas prone to earthquakes might be provided with checklists for how to prepare for such events and urged to buy earthquake insurance. But because people often do not adopt these measures, effective risk management has to proceed in the reverse order, starting with an understanding of why people may not choose to adopt risk-reduction measures and then designing approaches that work with, rather than against, our natural biases.

A behavioral risk audit can — and should — be used as a source of guidance, not just for communities, but also for individuals and households. It should foster a discussion between family members as to the biases that we are most prone to have and suggest measures for overcoming them that the household agrees can be implemented.

When unsure how best to prepare for a disaster, we often choose the option that requires the least active mental effort — such as accepting the basic deductible in an insurance policy when one is unsure what is best or deciding to stay at home when uncertain whether to evacuate. Unfortunately, in many cases, accepting these “defaults” can have tragic consequences, such as staying when evacuation is essential.

This propensity to look for easy ways out in decision-making, however, can sometimes be flipped on its head by making safety something one needs to actively opt out of rather than opt into. As an example, one might overcome the hesitancy of people in flood-prone areas to buy flood insurance by providing it automatically with the payment of property taxes each year and allowing people who would actively prefer not to have it to apply for a refund of the premium.

See also: Next Generation of Insurance Services  

The greatest challenge we face is how, as a society, to embrace a culture of protective action over the long run. The behavioral risk audit offers a tool that can help individuals overcome the psychological biases that often impede preparedness, such as failing to see the future benefits of protective investments and believing that disasters are things that happen to others. Many of the truly long-run risks we face, however — such as those posed by climate change — are even more difficult to deal with, as they require collective rather than just individual action.

Achieving effective collective action requires us not only to address individual biases but also to embrace a series of guiding principles of societal-level safety, such as demanding that safety and long-run preparedness be a top priority in government planning and insisting that social equity be a consideration in the formation of policies.

6 Reasons We Aren’t Prepared for Disasters

When dawn broke on the morning of Sept. 8, 1900, the people of Galveston, Texas, had no inkling of the disaster that was about to befall them. The thickening clouds and rising surf hinted that a storm was on the way, but few were worried. The local Weather Bureau office, for its part, gave no reason to worry; no urgent warnings were issued, and no calls were made to evacuate. But by late afternoon it became clear that this was no ordinary storm. Hurricane-force winds of more than 100 mph were soon raking the city, driving a massive storm surge that devoured almost everything in its path. Many tried to flee, but it was too late. By the next day, more than 8,000 people were dead, the greatest loss of life from a natural disaster in U.S. history.

Fast-forward to September 2008, when Hurricane Ike threatened the same part of the Texas coast — but this time it was greeted by a well-informed populace. Ike had been under constant surveillance by satellites, aircraft reconnaissance and land-based radar for more than a week, with the media blasting a nonstop cacophony of reports and warnings, urging those in coastal areas to leave. The city of Galveston was also well-prepared: A 17-foot-high seawall constructed after the 1900 storm stood ready to protect the city, and government flood insurance policies were available to residents at risk of property loss. Unlike in 1900, Texas residents should have had little reason to fear. On their side was a century of advances in meteorology, engineering and economics designed to ensure that Ike would, indeed, pass as a forgettable summer storm.

See also: 5 Techniques for Managing a Disaster  

It didn’t quite work out that way. Warnings were issued, but many in low-lying coastal communities ignored them — even when told that failing to heed the warnings meant they faced death. Galveston’s aging seawall turned out to be vulnerable; it was breached in multiple places, damaging roughly 80% of the homes and businesses in the city. The resort communities to the north on the Bolivar Peninsula, which never saw the need for a seawall, fared even worse, witnessing almost complete destruction. And among the thousands of homeowners who suffered flood losses, only 39% had seen fit to purchase flood insurance. In the end, Ike caused more than $14 billion in property damage and 100 deaths — almost all of it needless.

Why are we underprepared for disasters?

The gap between protective technology and protective action illustrated by the losses in Hurricane Ike is, of course, hardly limited to Galveston or to hurricanes. While our ability to foresee and protect against natural catastrophes has increased dramatically over the course of the past century, it has done little to reduce material losses from such events.

Rather than decreases in damage and fatalities with the aid of science, we have instead seen the worldwide economic cost and impact on people's lives of such hazards increase exponentially through the early 21st century, with five of the 10 costliest natural disasters in history with respect to property damage occurring since 2005. While scientific and technological advances have allowed deaths to decrease on average, horrific calamities still occur, as in the case of the 230,000 people estimated to have lost their lives in the 2004 Indian Ocean earthquake and tsunami; the 87,000 who died in the 2008 Sichuan earthquake in China; the 160,000 who lost their lives in the 2010 Haiti earthquake; and the 8,000 fatalities in the 2015 Nepal earthquake. Even in the U.S., Hurricane Katrina in 2005 caused more than 1,800 fatalities, making it the third-deadliest hurricane in U.S. history.

In our book “The Ostrich Paradox,” we explore six reasons that individuals, communities and institutions often under-invest in protection against low-probability, high-consequence events. They are:

  1. Myopia: a tendency to focus on overly short future time horizons when appraising immediate costs and the potential benefits of protective investments;
  2. Amnesia: a tendency to forget too quickly the lessons of past disasters;
  3. Optimism: a tendency to underestimate the likelihood that losses will occur from future hazards;
  4. Inertia: a tendency to maintain the status quo or adopt a default option when there is uncertainty about the potential benefits of investing in alternative protective measures;
  5. Simplification: a tendency to selectively attend to only a subset of the relevant factors to consider when making choices involving risk; and
  6. Herding: a tendency to base choices on the observed actions of others.

See also: Are You Ready for the Next Disaster?

We need to recognize that, when making decisions, our biases are part of our cognitive DNA. While we may not be able to alter our cognitive wiring, we may be able to improve preparedness by recognizing these specific biases and designing strategies that anticipate them.

Adapted from The Ostrich Paradox: Why We Underprepare for Disasters, by Robert Meyer and Howard Kunreuther, copyright 2017. Reprinted by permission of Wharton Digital Press.