Overcoming Human Biases via Data

Communicating risk with data will start shifting your work culture to predictive risk management, but don't forget the human element.

Managing business risk is a tricky thing. If your risk appetite is too small, opportunity can be lost; take on too much risk, and profit and performance can suffer.

Companies that are not thinking about risk are at risk!

Making the move to proactive risk management requires a culture shift, but 65% of organizations say they’re still operating with a “reactive” or “basic” risk management response. Mature companies often take a strategic, calculated approach to risk management. Considering that risk = probability of occurrence x severity (or consequence), mathematical analysis can help organizations avoid preventable pitfalls. Risk modeling using advanced statistical techniques has evolved to align theoretical risk with real-world events, giving C-suite decision makers the quantifiable support they need to make data-informed decisions.
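To make the formula concrete, here is a minimal sketch in Python that scores a handful of hypothetical risks (the events, probabilities and dollar figures are invented purely for illustration) by multiplying each event’s annual probability of occurrence by the estimated severity of its consequence, then ranking the results by expected annual loss:

```python
# Minimal illustration of risk = probability of occurrence x severity of consequence.
# All events, probabilities and dollar amounts below are hypothetical placeholders.

risks = [
    {"event": "Key supplier outage", "probability": 0.10, "severity_usd": 2_000_000},
    {"event": "Data breach",         "probability": 0.05, "severity_usd": 5_000_000},
    {"event": "Warehouse fire",      "probability": 0.01, "severity_usd": 8_000_000},
]

# Score each risk as probability x severity (expected annual loss).
for r in risks:
    r["expected_loss_usd"] = r["probability"] * r["severity_usd"]

# Rank risks so the largest exposures surface first.
for r in sorted(risks, key=lambda r: r["expected_loss_usd"], reverse=True):
    print(f'{r["event"]}: expected annual loss ${r["expected_loss_usd"]:,.0f}')
```

Even this simple scoring turns a vague list of worries into a ranked set of exposures expressed in a metric decision makers recognize.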

A Five-Step Approach to Data-Driven Risk Management

Where do smart companies start when they want to begin addressing risk? Data. 

To understand risk beyond “gut feelings” and anecdotal evidence, companies need to leverage the information that is available to them – especially in today’s data-saturated environment. These five steps can outline your path to data-informed risk management.

  • Step 1: Collect your data. Identifying the right data to inform your analysis is often the most difficult step, and it is critical. We all know that “data is out there,” but not all data is created equal. For best results, explore different dataset options, take the time to understand how the data was collected and then clean it to ensure any risk analysis is both relevant and actionable.
  • Step 2: Develop a risk model. Risk modeling allows teams to include contextually relevant predictors and relationships. If historical data exists for current risks, create an empirical model to articulate key predictors. If analysis focuses on emerging risks where no data exists, craft a theoretical risk model based on the relationships you do know.
  • Step 3: Explore differing scenarios. There are probably a few risk scenarios that keep you up at night. Use your model to understand the likelihood and loss of these potential events. Estimate losses for each scenario in a metric that’s meaningful to your audience. Money? Time? Human capital? (A simple sketch of this kind of scenario analysis follows this list.)
  • Step 4: Share your findings. Now it’s time to tell your story. This is where data geeks sometimes “lose their audience.” Your analysis is ineffective if decision makers do not understand the implications. Share your findings in a way that is meaningful using relevant metrics, data visualization and scenario storytelling. In practice, this means avoiding abstract metrics in favor of direct impacts — such as potential revenue loss or downtime — and possibly using infographics to support cause and effect narratives. Connect the dots between risk and results with a relevant story that ends with actionable advice.
  • Step 5: Enable action. As Theodore Roosevelt once implied, sharing a problem without proposing a solution is called whining. Once you’ve presented your model and your findings, you will likely have an understanding of the leading risk factors. Let these factors inform your recommendations for risk mitigation. This will help decision makers prioritize their resources for maximum impact. 
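To ground Steps 2 and 3, the sketch below shows one simple way to combine a risk model with scenario exploration: a small Monte Carlo simulation in Python. The scenarios, probabilities and loss ranges are hypothetical placeholders; in practice, they would come from your own historical data or your theoretical model.

```python
# A minimal sketch of Steps 2 and 3: a simple scenario model explored with Monte Carlo.
# The events, probabilities and loss ranges are hypothetical; a real model would be
# fitted to your organization's data or theoretical relationships.
import random

random.seed(42)

# Each scenario: annual probability of occurrence and a (low, high) loss range in dollars.
scenarios = {
    "Key supplier outage": {"probability": 0.10, "loss_range": (500_000, 3_000_000)},
    "Data breach":         {"probability": 0.05, "loss_range": (1_000_000, 8_000_000)},
}

def simulate_annual_loss(scenarios, n_trials=100_000):
    """Simulate total annual loss across all scenarios for n_trials simulated years."""
    totals = []
    for _ in range(n_trials):
        total = 0.0
        for s in scenarios.values():
            if random.random() < s["probability"]:         # did the event occur this year?
                total += random.uniform(*s["loss_range"])  # draw a loss within the range
        totals.append(total)
    return totals

losses = simulate_annual_loss(scenarios)
losses.sort()
expected = sum(losses) / len(losses)
p95 = losses[int(0.95 * len(losses))]

# Report results in a metric the audience cares about: dollars.
print(f"Expected annual loss: ${expected:,.0f}")
print(f"95th-percentile annual loss: ${p95:,.0f}")
```

Even a crude simulation like this reframes the conversation from “we think it’s bad” to “we expect roughly this much per year, and in a bad year it could approach this,” which is exactly the kind of framing Step 4 calls for.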

Sometimes, Data Isn’t Enough

Not surprisingly, however, data isn’t always enough to instigate change. As anyone who’s listened to the news lately knows, data can be manipulated and interpreted in different ways. Sometimes we see what we want to see; it’s in our psychology, and the C-suite is not immune. To be human is to be biased.

Therefore, communicating risk with data is a strong technique for neutralizing the effects of human biases, but one should be aware of common predispositions that often arise when people assess risk.  


To Be Human Is to Be Biased

Psychologist Daniel Kahneman highlighted the fallibility of human cognition in his work uncovering inherent human biases. These biases evolved over millennia as coping mechanisms for the complex world around us, but today they sometimes impede our ability to reason. The challenge is that many of us are not aware of these biases and therefore unknowingly fall victim to their influence.

"We can be blind to the obvious, and we are also blind to our blindness." – Daniel Kahneman 

There are a few important biases to be aware of when presenting your risk analysis and recommendations. 

  • Conservatism bias: People are comfortable with what they know and show a preference for existing information over new data. As a result, if new data emerges suggesting increased risk, an audience may resist it simply because it’s new.
  • The ostrich effect: No one likes bad news. When it comes to risk, people tend to ignore dangerous or negative information by “burying” their heads in the sand like an ostrich. But just ignoring the data doesn’t make the risk go away. A strong culture of risk management will help negate this effect. 
  • Survivorship bias: Biases can also work toward unsupported risk tolerance. With survivorship bias, people focus only on “surviving” events and ignore non-surviving events (those that did not actually occur). For instance, a company’s safety data may show a lack of head injuries (a surviving event), and decision makers may believe there is no need for hard hats.

Communicating risk with data is an excellent start toward shifting your work culture to one of predictive risk management, but we cannot forget the human element. As you share your models, data and findings, remember to address potential biases of your audience… even if your audience is unaware of their own human susceptibility!


Paris Stringfellow

Paris Stringfellow is assistant research professor in Clemson University's department of industrial engineering. Her research focuses on understanding human behavior in cyber-physical-social systems. She is director of the Risk Engineering and System Analytics Center at Clemson.
