How to Outfox Our Brains About Risk

To overcome our natural biases, the usual approach to risk management must be reversed.

Ostriches are often characterized as hapless birds that bury their heads in the sand whenever danger approaches. In fact, they are highly astute escape artists: unable to fly, they have adapted by using their great speed to outrun threats. We face an analogous constraint when thinking about risk. Our decision-making biases are part of our cognitive DNA, and just as the ostrich works within its physical limitations, we need to develop policies that take our inherent cognitive limitations into account. We need to learn to be more, not less, like ostriches (hence the paradox) if we are to be better prepared for disasters.

We read about disasters all the time and often see post-disaster coverage about what people should have done: put up their storm shutters, evacuated, purchased earthquake insurance and so on. But people tend to disregard these warnings because of six major decision-making biases:

- Myopia: a hard time foreseeing future consequences.
- Amnesia: being too quick to forget losses from the past.
- Optimism: believing losses will happen to others rather than to ourselves.
- Inertia: preferring inaction over action when faced with risk, thereby maintaining the status quo.
- Simplification: failing to base decisions on all of the information available about a risk.
- Herding: imitating the behavior of others who exhibit the same biases.

The relative importance of these biases varies from situation to situation, but if one is the most fundamental, it is excessive optimism. We have a hard time fully anticipating the physical and emotional toll that disasters can take, and we are too prone to believe that disasters happen to other people, in other places, at other times. A second bias that can create serious problems is myopia. Individuals tend to focus on short time horizons, so they forgo protective measures with long-term benefits, such as loss-reduction investments, because of their high upfront costs.

Most modern approaches to risk management start by analyzing the objective likelihood and consequences of the risks that individuals or communities face, then design measures that could mitigate those risks and hope that people will choose to implement them. For example, people in earthquake-prone areas might be given checklists for how to prepare for such events and urged to buy earthquake insurance. But because people often do not adopt these measures, effective risk management has to proceed in the reverse order: start by understanding why people may not choose to adopt risk-reduction measures, then design approaches that work with, rather than against, our natural biases.

A behavioral risk audit can, and should, be used as a source of guidance, not just for communities but also for individuals and households. It should foster a discussion among family members about the biases they are most prone to and suggest countermeasures the household agrees it can implement.
When unsure how best to prepare for a disaster, we often choose the option that requires the least active mental effort, such as accepting the basic deductible in an insurance policy when we are unsure what is best, or staying at home when uncertain whether to evacuate. Unfortunately, accepting these "defaults" can have tragic consequences, as when someone stays put when evacuation is essential.

This propensity to look for the easy way out in decision-making can, however, sometimes be flipped on its head by making safety something one must actively opt out of rather than opt into. For example, the hesitancy of people in flood-prone areas to buy flood insurance might be overcome by providing coverage automatically with the payment of property taxes each year and allowing those who actively prefer not to have it to apply for a refund of the premium (a simple simulation of this default logic appears below).

The greatest challenge we face is how, as a society, to embrace a culture of protective action over the long run. The behavioral risk audit offers a tool that can help individuals overcome the psychological biases that often impede preparedness, such as failing to see the future benefits of protective investments and believing that disasters are things that happen to others. Many of the truly long-run risks we face, however, such as those posed by climate change, are even more difficult to deal with because they require collective rather than merely individual action. Achieving effective collective action requires us not only to address individual biases but also to embrace guiding principles of societal-level safety, such as demanding that safety and long-run preparedness be top priorities in government planning and insisting that social equity be a consideration in the formation of policies.
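For readers who like to see the arithmetic, here is a minimal sketch of why flipping the default matters. It is not from the article; the function name and the assumed 20% "override" probability are illustrative stand-ins for inertia, not empirical figures.

```python
# Minimal sketch (hypothetical parameters): under inertia, most households
# keep whatever the default is, so flipping the default flips enrollment.
import random

def simulate_enrollment(n_households, default_enrolled, p_act=0.2, seed=42):
    """Return the fraction of households that end up insured.

    default_enrolled: True  -> coverage bundled with property taxes (opt-out)
                      False -> coverage must be purchased (opt-in)
    p_act: assumed probability a household actively overrides the default.
    """
    rng = random.Random(seed)
    insured = 0
    for _ in range(n_households):
        acts = rng.random() < p_act  # does this household take any action?
        enrolled = (not default_enrolled) if acts else default_enrolled
        insured += enrolled
    return insured / n_households

if __name__ == "__main__":
    print("opt-in  take-up:", simulate_enrollment(100_000, default_enrolled=False))
    print("opt-out take-up:", simulate_enrollment(100_000, default_enrolled=True))
```

Under these assumptions, opt-in yields roughly 20% take-up while opt-out yields roughly 80%: the same inertia that undermines preparedness under an opt-in design works in safety's favor once coverage is the default.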

Howard Kunreuther

Howard C. Kunreuther is professor of decision sciences and business and public policy at the Wharton School, and co-director of the Wharton Risk Management and Decision Processes Center.
