August 3, 2015
Modeling Flood — the Peril of Inches
by Nick Lamparelli and Ivan Maddox
Until a few years ago, insuring flood was essentially impossible. Now, there are major opportunities.
“Baseball is a game of inches” – Branch Rickey
Property damage from flooding is quite different from that of any other catastrophic peril such as hurricane, tornado or earthquake. Unlike those perils, estimating losses from flood requires a higher level of geospatial exactness. Not only do we need to know precisely where the property is located and the distance to the nearest potential flooding source, but we also need to know the elevation of the property relative to its surroundings and to that flooding source. Underwriting flood insurance is a game of inches, not ZIP codes.
With flood, a couple of feet can make the difference between being in a flood zone or not, and a few inches of elevation can increase or decrease loss estimates by orders of magnitude. This realization helps explain the current financial mess of the National Flood Insurance Program (NFIP). In hindsight, even if the NFIP had perfect actuarial knowledge about the risk of flood, its destiny was preordained simply because it lacked the tools to apply that knowledge at the level of the individual property.
This might lead the reader to believe that insuring flood is essentially impossible. Until just a few years ago, that belief would have been correct. But, since then, interesting stuff has happened.
In the past decade, technologies like data storage, processing, modeling and remote sensing (i.e., mapping) have improved incredibly. It is now possible to measure and store all topographical features of the U.S.; in fact, it has been done. Throw in analytical servers able to process trillions of calculations in seconds, and crunching massive amounts of data becomes relatively easy. Meanwhile, the science around flood modeling, including meteorology, hydrology and topography, has developed to the point where the new geospatial information and processing power can produce models with real predictive capabilities. These are not your grandfather's flood maps. There are now models and analytics that provide estimates for frequency AND severity of flood loss for a specific location, an incredible leap forward from zone or ZIP code averaging. Like baseball, flood insurance is a game of inches. And now it's also a game that astute insurance professionals can play and profit from.
For the underwriting of insurance, having dependable frequency and severity loss estimates at a location level is gold. There is no single flood model that will provide all the answers, but there is definitely enough data, models and information available to determine frequency and severity metrics for flood to enable underwriters to segment exposure effectively. Low-, moderate- and high-risk exposures can be discerned and segregated, which means risk-based, actuarial pricing can be confidently implemented. The available data and risk models can also drive the design of flood mitigation actions (with accurate credit incentives attached to them) and marketing campaigns.
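To make the idea concrete, here is a minimal sketch in Python of how location-level frequency and severity estimates might be combined into an average annual loss, the starting point for an actuarial rate. The return periods, depths and depth-damage curve below are invented purely for illustration; they are not drawn from any actual flood model or filed rating plan.

```python
# Sketch: average annual loss (AAL) from hypothetical location-level
# frequency/severity output. All numbers and curve shapes are illustrative
# assumptions, not real model output.

# Modeled flood depths (feet above the first floor) at several return periods.
return_periods = [10, 25, 50, 100, 250, 500]      # years
depths_ft = [0.0, 0.2, 0.8, 1.5, 2.6, 3.5]        # hypothetical depths

def damage_ratio(depth_ft):
    """Toy depth-damage curve: share of building value lost at a given depth."""
    if depth_ft <= 0:
        return 0.0
    return min(0.05 + 0.15 * depth_ft, 0.8)       # capped at 80% of value

def average_annual_loss(building_value):
    """Integrate loss over annual exceedance probabilities (trapezoidal rule)."""
    probs = [1.0 / rp for rp in return_periods]   # annual exceedance probabilities
    losses = [damage_ratio(d) * building_value for d in depths_ft]
    aal = 0.0
    # Pair consecutive points from the most frequent event to the rarest.
    for i in range(len(probs) - 1):
        aal += (probs[i] - probs[i + 1]) * (losses[i] + losses[i + 1]) / 2.0
    return aal

print(round(average_annual_loss(300_000), 2))
```

Note how sensitive the result is to the depth column: shift every depth up by a few inches and the pure premium moves materially, which is exactly the "game of inches" point.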
With the new generation of models, all three types of flooding (coastal, riverine and surface water) can be evaluated, either individually or as a composite, and have their risk segmented appropriately. The available geospatial datasets and analytics support estimates of flood levels, flood depths and the likelihood of water entering a property, based on the elevation of the structure, the floors of occupancy and the relationship between the two.
In the old days, if your home was in a FEMA A or V zone but was possibly safe from the "base flood" (a hypothetical flood with a 1% annual probability), you had to spend hundreds of dollars on an elevation certificate and then petition the NFIP, at further cost, in hopes of a re-designation. Today, it is not complicated to place the structure in a geospatial model and estimate flood likelihood and depths in a way that can be integrated with actuarial information to calculate rates: each building gets rated based on where it is, where the water is and the likelihood of the water inundating the building.
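The elevation comparison at the heart of this is simple arithmetic. The sketch below, with hypothetical elevations and made-up tier thresholds (no actual model or rate filing is being quoted), shows how two homes on the same street, three feet of elevation apart, can land in entirely different risk tiers:

```python
# Sketch: segmenting a building by comparing its first-floor elevation to a
# modeled flood elevation. Elevations and tier thresholds are hypothetical.

def depth_in_structure(flood_elevation_ft, first_floor_elevation_ft):
    """Water depth inside the structure for a given flood event (negative = dry)."""
    return flood_elevation_ft - first_floor_elevation_ft

def risk_tier(depth_100yr_ft):
    """Segment the location using the modeled 100-year (1% annual) depth."""
    if depth_100yr_ft <= -2.0:
        return "low"        # first floor well above the 1% flood level
    if depth_100yr_ft <= 0.0:
        return "moderate"   # dry at the 1% event, but only by inches or feet
    return "high"           # water expected inside at the 1% event

# Two neighboring homes: same street, three feet of elevation apart.
flood_elev = 12.0                                # modeled 100-year flood elevation (ft)
home_a = depth_in_structure(flood_elev, 14.5)    # first floor at 14.5 ft
home_b = depth_in_structure(flood_elev, 11.5)    # first floor at 11.5 ft

print(risk_tier(home_a), risk_tier(home_b))      # prints: low high
```

A zone-based scheme would rate these two homes identically; a location-based scheme separates them immediately.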
In fact, the new models have essentially made the FEMA flood maps irrelevant in flood loss analysis. We don't need to evaluate what flood zone the property is in. We just need an address. Homeowners don't need to spend hundreds of dollars for elevation certificates; the models already have that data stored. Indeed, much of the underwriting required to price flood risk can be handled with two to three additional questions on a standard homeowners insurance application, saving the homeowner, agent and carrier time and frustration. The process we envision would give the enterprising carrier a distinctive competitive advantage, one that would create and capture real value throughout the distribution chain if done correctly. This is what disruption looks like before it happens.
In summary, the tools are now available to measure and price flood risk. Capital is flooding (sorry, we couldn't help ourselves) into the insurance sector, seeking opportunities to be put to work. While we understand the industry's skepticism about taking on flood, the risk can be understood well enough to create products that people desperately need. Insuring flood would be a shot in the arm for an industry that has become stale at offering anything new. Billions of dollars of premium are waiting for the industry to capitalize on. One thing the current data and analytics make clear is this: there are high-, medium- and low-risk locations waiting to be insured based on actuarial methods. As long as flood insurance is being rated by zone (whether FEMA zone or ZIP code), there is cherry-picking to be done.
Who wants to get their ladder up the cherry tree first? And who will be last?