
How CAT Models Lead to Soft Prices

In our first article in this series, we looked back at an insurance industry reeling from several consecutive natural catastrophes that generated combined insured losses exceeding $30 billion. In the second article, we looked at how, beginning in the mid-1980s, modelers began developing tools that could quantify the potential for such staggering losses before they occurred. In this article, we look at how modeling results are being used in the industry.


Insurance is a unique business. In most other businesses, operating costs are either known or can be fairly estimated. The insurance industry, however, must estimate expenses for events that are extremely rare or have never happened before: damage to a bridge in New York City from a flood, the theft of a precious heirloom from your home, a fire at a factory, or even Jennifer Lopez injuring her backside. No other industry has to make so many critical business decisions as blindly. Even when an insurer can accurately estimate the loss to a single policyholder, it is still operating blindly if it cannot estimate multiple losses occurring simultaneously, which is exactly what happens during natural catastrophes. Fortunately, the introduction of CAT models greatly enhances the insurer’s ability to estimate both the losses associated with a single policyholder and concurrent claims from a single occurrence.

When deciding which risks to insure, how much to insure them for and how much premium is required to accept the risk profitably, there are essentially two metrics that provide the needed clarity. Whether you are a portfolio manager overseeing the cumulative risk of a large line of business, an underwriter reviewing a broker’s submission to insure a factory or an actuary responsible for pricing exposure, you minimally need to know:

  1. On average, what will potential future losses look like?
  2. On average, what are the reasonable worst-case loss scenarios, or the probable maximum loss (PML)?
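
Both metrics fall directly out of a model’s simulated loss output: the average annual loss is the mean across simulated years, and the PML is a high percentile of the same distribution. A minimal sketch, using randomly generated stand-in losses rather than real model output:

```python
import random

random.seed(42)

# Hypothetical simulated annual losses (in $ millions) for a portfolio,
# standing in for, say, 10,000 simulated years from a CAT model's event set.
annual_losses = [random.paretovariate(1.5) for _ in range(10_000)]

# 1. Average annual loss (AAL): the mean over all simulated years.
aal = sum(annual_losses) / len(annual_losses)

# 2. Probable maximum loss (PML): a high percentile of the annual loss
#    distribution -- here the 1-in-250-year loss (99.6th percentile).
losses_sorted = sorted(annual_losses)
pml_250 = losses_sorted[int(0.996 * len(losses_sorted))]

print(f"AAL: {aal:.2f}M, 250-year PML: {pml_250:.2f}M")
```

The same two numbers, computed per risk or per portfolio, feed all four decision areas listed below.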

Those two metrics alone supply enough information for an insurer to make critical business decisions in these key areas:

  • Risk selection
  • Risk-based pricing
  • Capacity allocation
  • Reinsurance program design

Risk Selection

Risk selection includes an underwriter’s determination of the class (such as preferred, standard or substandard) to which a particular risk is deemed to belong, its acceptance or rejection and (if accepted) the premium.

Consider two homes: a $1 million wood frame home and a $1 million brick home, both located in Los Angeles. Which home is riskier to the insurer? Before the advent of catastrophe models, the determination was based on historical data and, essentially, opinion. Insurers could have hired engineers to tell them that brick homes are much more susceptible than wood frame homes to earthquake stresses, but it was not until the introduction of the models that insurers could finally quantify how much financial risk they were exposed to. The models revealed that, on average, brick homes are roughly four times riskier than wood frame homes and twice as likely to sustain a complete loss (full collapse), findings that few insurers had fully appreciated.

Knowing how two or more risks (or groups of risks) behave, both in absolute terms and relative to one another, gives insurers a foundation for intelligently setting underwriting guidelines that play to their strengths and exclude risks they cannot or do not wish to absorb, based on their risk appetite.

Risk-Based Pricing

Insurance is rapidly becoming more of a commodity, with customers often choosing their insurer purely on the basis of price. As a result, accurate ratemaking has become more important than ever. In fact, a Towers Perrin survey found that 96% of insurers consider sophisticated rating and pricing to be either essential or very important.

Multiple factors go into determining premium rates, and, as competition increases, insurers are introducing innovative rate structures. The critical question in ratemaking is: What risk factors or variables are important for predicting the likelihood, frequency and severity of a loss? Although there are many obvious risk factors that affect rates, subtle and non-intuitive relationships can exist among variables that are difficult, if not impossible, to identify without applying more sophisticated analyses.

Returning to our example of the two homes in Los Angeles, catastrophe models tell us two very important things: roughly what the premium to cover earthquake loss should be, and that the premium for masonry homes should be approximately four times that for wood frame homes.
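
To make the relativity concrete, here is a minimal pricing sketch; the AAL figures and loading percentages are illustrative assumptions, not actual model output:

```python
# Hypothetical modeled average annual losses (AAL) for two $1 million homes
# in Los Angeles -- illustrative figures, not real model output.
modeled_aal = {"wood_frame": 1_200.0, "masonry": 4_800.0}  # dollars/year

# A simple risk-based premium: modeled AAL grossed up for expenses and the
# cost of capital (the loading percentages here are assumptions).
EXPENSE_LOAD = 0.25   # expenses as a share of premium
CAPITAL_LOAD = 0.15   # risk/capital charge as a share of premium

def risk_based_premium(aal: float) -> float:
    # Premium P satisfies P = AAL + (EXPENSE_LOAD + CAPITAL_LOAD) * P
    return aal / (1.0 - EXPENSE_LOAD - CAPITAL_LOAD)

for construction, aal in modeled_aal.items():
    print(f"{construction}: ${risk_based_premium(aal):,.0f}/year")

# The 4:1 AAL relativity carries straight through to the premium relativity.
relativity = (risk_based_premium(modeled_aal["masonry"])
              / risk_based_premium(modeled_aal["wood_frame"]))
```

Because the loadings are proportional, the modeled 4:1 loss relativity is preserved in the final rates, which is the point of relational pricing.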

The concept of absolute and relational pricing using catastrophe models is revolutionary. Many in the industry may balk at our term “revolutionary,” but insurers using the models to establish appropriate price levels for property exposures have a massive advantage over public entities such as the California Earthquake Authority (CEA) and the National Flood Insurance Program (NFIP) that do not adhere to risk-based pricing.

The NFIP and CEA, like most quasi-government insurance entities, differ in their pricing from private insurers along multiple dimensions, mostly because of constraints imposed by law. Innovative insurers recognize that there are literally billions of valuable premium dollars at stake for risks for which the CEA, the NFIP and similar programs significantly overcharge – again, because of constraints that forbid them from being competitive.

Thus, using average and extreme modeled loss estimates not only ensures that insurers are managing their portfolios effectively but also enables insurers, especially those with more robust risk appetites, to identify underserved markets and seize valuable market share. Catastrophe models also make it possible to calculate a return on investment from a risk perspective.

It is incumbent upon insurers to identify the risks they do not wish to underwrite, as well as to answer questions such as: Are wood frame houses less expensive to insure than homes made of joisted masonry? What is the relationship between claims severity and a particular home’s loss history? Traditional univariate pricing analyses are outdated; insurers have turned to multivariate statistical pricing techniques to better understand the relationships among risk variables. Insurers must also weigh other factors, such as marketing costs, conversion rates and customer buying behavior, to price risks accurately. Gone are the days of unsophisticated pricing and risk selection. Innovative insurers today cross industry lines, paying ever more attention to how others manage data and assign value to risk.

Capacity Allocation

In the (re)insurance industry, (re)insurers accept risks only if those risks fall within the capacity limits they have established based on their risk appetites. “Capacity” means the maximum limit of liability offered by an insurer during a defined period. Especially when it comes to natural catastrophes, some risks have much greater accumulation potential, typically because of dependencies between individual risks.

Take houses and automobiles. A high concentration of those exposure types may well be affected by the same catastrophic event, whether a hurricane, severe thunderstorm or earthquake. That concentration can put a reinsurer (or insurer) in the unenviable position of being overly exposed to a single catastrophic occurrence. Having a means to control accumulations of exposure adequately is critical in the risk management process. Capacity allocation enables companies to allocate valuable risk capacity to specific perils within specific markets and accumulation zones to minimize their exposure, and CAT models allow insurers to measure how capacity is being used and how efficiently it is being deployed.
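
As a sketch of the accumulation-control idea, the fragment below sums policy limits by accumulation zone and flags zones that exceed a capacity budget; all figures, zone names and the flat capacity rule are hypothetical:

```python
from collections import defaultdict

# Hypothetical in-force policies, each tagged with a peril/accumulation zone.
policies = [
    {"zone": "LA-quake",   "limit": 1_000_000},
    {"zone": "LA-quake",   "limit": 2_500_000},
    {"zone": "Miami-wind", "limit": 1_500_000},
    {"zone": "Miami-wind", "limit": 3_000_000},
]

# Capacity budget per zone, set by risk appetite (assumed figures).
capacity = {"LA-quake": 5_000_000, "Miami-wind": 4_000_000}

# Accumulate total limit written in each zone.
accumulated = defaultdict(float)
for policy in policies:
    accumulated[policy["zone"]] += policy["limit"]

# Flag any zone where written limits exceed the capacity budget.
for zone, total in accumulated.items():
    flag = "OVER" if total > capacity[zone] else "ok"
    print(f"{zone}: {total:,.0f} / {capacity[zone]:,.0f} ({flag})")
```

A real implementation would replace the raw sum of limits with modeled PML per zone, but the control-valve logic is the same.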

Reinsurance Program Design

With the advent of CAT models, insurers can simulate different combinations of treaties and programs to find the right fit and optimize their risk and return. Before CAT models, estimating the probability of attachment of one layer versus another, or the average annual losses for a per-risk treaty covering millions of exposures, came down to gut instinct. The models estimate the risk and can calculate the millions of potential claims transactions, which would be nearly impossible without computers and simulation.
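
For example, the probability that an excess-of-loss layer attaches, and its expected annual loss, can be read straight off simulated annual losses. A rough sketch, with synthetic exponential losses standing in for CAT model output:

```python
import random

random.seed(7)

# Hypothetical simulated annual gross losses (in $ millions) for the
# cedant's portfolio -- stand-ins for real CAT model output.
annual_losses = [random.expovariate(1 / 20) for _ in range(10_000)]

def layer_stats(losses, attachment, limit):
    """Probability of attachment and expected annual loss for a
    reinsurance layer of `limit` excess of `attachment`."""
    cessions = [min(max(loss - attachment, 0.0), limit) for loss in losses]
    prob_attach = sum(c > 0 for c in cessions) / len(cessions)
    expected_loss = sum(cessions) / len(cessions)
    return prob_attach, expected_loss

# Compare two candidate layers: 50 xs 50 versus 50 xs 100.
for attach in (50, 100):
    p, el = layer_stats(annual_losses, attachment=attach, limit=50)
    print(f"Layer 50 xs {attach}: P(attach)={p:.1%}, E[loss]={el:.2f}M")
```

The higher-attaching layer triggers less often and carries a smaller expected loss, which is exactly the trade-off a program designer prices against its reinsurance cost.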

It is now well-known how soft the current reinsurance market is. Alternative capital has been a major driving force, but we consider the maturation of CAT models as having an equally important role in this trend.

First, insurers using CAT models to underwrite, price and manage risk can now intelligently present their exposure and effectively defend their position on terms and conditions. Gone are the days when reinsurers would have the upper hand in negotiations; CAT models have leveled the playing field for insurers.

Second, alternative capital could not have the impact it is currently having without the language of finance, and CAT models speak that language. The models provide the statistics that financial firms need to allocate capital in this area. Risk transfer becomes far more fungible once transferor and transferee share a common view of the probability of loss. No CAT models, no loss estimates. No loss estimates, no alternative capital. No alternative capital, no soft market.

A Needed Balance

By now, and for good reason, the industry has placed much of its trust in CAT models to selectively manage portfolios to minimize PML potential. Insurers and reinsurers alike need the ability to quantify and identify peak exposure areas, and the models stand ready to help understand and manage portfolios as part of a carrier’s risk management process. However, a balance between the need to bear risk and the need to preserve a carrier’s financial integrity in the face of potential catastrophic loss is essential. The idea is to pursue a blend of internal and external solutions to ensure two key factors:

  1. The ability to identify, quantify and estimate the chances of an event occurring and the extent of likely losses, and
  2. The ability to set adequate rates.

Once companies understand their catastrophe potential, they can formulate underwriting guidelines that act as control valves on catastrophe loss potential and, most importantly, identify exposures that can still meet underwriting criteria for a given risk appetite, even in high-risk regions. Underwriting criteria for catastrophe-prone exposure must be used as a set of benchmarks, not simply as a blind gatekeeper.

In our next article, we examine two factors that could derail the progress made by CAT models in the insurance industry. Model uncertainty and poor data quality threaten to raise skepticism about the accuracy of the models, and that skepticism could inhibit further progress in model development.

The Next Jolt That Will Hit California

Sunday’s 6.0-magnitude earthquake near Napa, the largest in California in 25 years, seriously damaged more than 170 structures and injured more than 200 people. Overall earthquake-related losses are expected to exceed $1 billion.

Many unreinforced masonry buildings risk being declared a total loss. But even retrofitting doesn’t always ensure earthquake immunity. The charming 1901 stone Goodman Library in downtown Napa was seismically retrofitted at a cost of $1.7 million a few years ago, yet the top of the building toppled over in the earthquake. A nearby historic brick building was retrofitted for $1.2 million after a 2000 Napa earthquake, and it was red-tagged Monday with serious damage, as well.

Unfortunately, the repair and rebuilding costs will be the next jolt to rock the budgets of businesses and homeowners. It’s estimated that less than 12% of homeowners in California have earthquake coverage, a figure that was once as high as 22%, according to the California Earthquake Authority. The CEA underwrites more than 800,000 policies, representing 70% of the homeowner earthquake insurance in the state.

California has two-thirds of the nation’s earthquake risk, with 2,000 known faults producing 37,000 measurable earthquakes a year. Beyond California, U.S. Geological Survey maps show major earthquake risk across nearly half the country.

In 1994, after a 6.7-magnitude earthquake hit the Northridge area of Southern California, 93% of homeowner insurance companies restricted or refused to write earthquake insurance policies. In response, the California legislature established the California Earthquake Authority (CEA) in 1995 to provide a reduced-coverage (“no-frills”) earthquake policy for homeowners in the state; items such as swimming pools, decks and detached structures are not included. Insurance carriers in California can offer their own earthquake coverage or participate in the CEA, an arrangement that has made the CEA one of the largest providers of residential earthquake coverage in the world.

Currently, 21 major insurance carriers participate in the CEA, and its assets total nearly $10 billion. Its A.M. Best rating is A- (excellent). CEA policies are available to homeowners and renters, including owners of mobile homes and condominiums, provided their primary homeowners’ coverage is with one of the CEA insurers. Keep in mind that many condominium communities have common ownership, which means condo owners could face joint and several liability for repairs after an earthquake. The CEA reports that it uses 83% of the premiums it collects for claims or reinsurance, 14% for broker commissions and 3% for operations and overhead.

Relying on state or federal disaster relief is a risky proposition for home and business owners. The president must declare a disaster before the Federal Emergency Management Agency (FEMA) can grant even limited assistance, and state surplus funds for relief are simply non-existent.

So why are people buying less earthquake coverage when the hazard of a potentially devastating earthquake is growing? Unlike other natural disasters, such as hurricanes, tornadoes and wildfires, that are usually covered under homeowners’ insurance policies, earthquake coverage is a separate policy, with a deductible of 10% or 15% of the structure’s estimated replacement cost.

The average earthquake policy in California in 2013 cost $676 a year, according to the California Department of Insurance. The current average cost of a home in California, according to Zillow, is $429,000. Even with a minimum 10% deductible, a homeowner would be out of pocket $42,900 before earthquake insurance coverage kicks in.
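
The arithmetic is worth spelling out. A quick check using the figures above:

```python
# Out-of-pocket arithmetic using the figures cited above.
home_value = 429_000       # average California home value (Zillow)
deductible_rate = 0.10     # minimum earthquake-policy deductible
annual_premium = 676       # average California earthquake premium (2013)

# The owner absorbs the deductible before any coverage applies.
deductible = home_value * deductible_rate

print(f"Deductible: ${deductible:,.0f}")
print(f"Annual premium: ${annual_premium:,.0f}")
```

In other words, the policyholder pays $676 a year for coverage that only responds after the first $42,900 of damage.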

Business properties face even greater risk, given the additional exposures of damaged inventory, a red-tagged (unusable) building and loss of use and income.

In contrast, flood insurance is available in most of the country with a $1,000 building and $1,000 contents deductible as part of the property coverage. The Insurance Information Institute reports that the average flood damage claim in 2013 was $26,165 for the 13% of U.S. homeowners who buy the additional flood coverage – sometimes as a condition to their mortgage, if they are located in a flood zone.

Low-frequency, high-severity risks like earthquakes represent a bet that few home or business owners can afford to lose. Unfortunately, Californians, who own the nation’s highest-valued properties, also have the most money on the table when the next big shake comes.