
When NOT to Have Earthquake Insurance

Southern Californians found themselves earlier this summer dealing with some of the largest earthquakes and aftershocks to hit the area in 20 years. Although the earthquakes weren’t devastating, they did serve as a reminder about earthquake insurance and whether its cost, high deductibles and exclusions make it the best route to go to protect your home or business.

Following the quakes, which included nerve-shattering aftershocks, questions and confusion abounded, particularly on the subject of insurance. Many believed, inaccurately, as state Insurance Commissioner Ricardo Lara would later emphasize, that a moratorium had been placed on all new earthquake policies.

“While we have Californians’ attention, insurers should not create barriers to homeowners or renters who want to protect their assets from earthquakes,” Lara said, promising to send notices to insurers reiterating that the standard 15-day waiting period for coverage after a seismic event in no way means they should decline to write new policies.

And there is plenty of interest. Glenn Pomeroy, CEO of the California Earthquake Authority, reported traffic to the not-for-profit’s website had increased tenfold since the earthquakes.

See also: Preparing for the Next Big Earthquake  

But the vast majority of California residents — 87% statewide, according to the California Department of Insurance — don’t opt for earthquake coverage. Even in Los Angeles and Orange counties, the share with insurance rises to only 20%.

What gives?

The answer most often lies in the deductibles, says Consumer Action, a San Francisco-based education and advocacy nonprofit founded in 1971. Simply put, combine modern building codes with the fact that you would have to suffer catastrophic home damage for a policy to kick in, and the odds are probably in your favor without insurance.

“Even on the West Coast, earthquake insurance is not for everyone,” Consumer Action spokeswoman Linda Sherry says. “It can be prohibitively expensive and come with large deductibles.”

High cost, large deductibles put coverage out of reach

For example: If a $500,000 home with a replacement value of $300,000 were struck by a quake, you’d have to cough up $45,000 to meet a 15% deductible. That’s a lot of earthquake damage before the policy would help. Throw in the fact that many California residents have little if any equity in their homes after the housing bust, and it would be tempting to walk away from a mortgage if such a calamity occurred.
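The deductible arithmetic is simple enough to sketch; the function name below is our own, and only the article’s $300,000/15% example is taken from the text:

```python
def out_of_pocket(replacement_value: float, deductible_rate: float) -> int:
    """Damage a homeowner must absorb before an earthquake policy pays out."""
    return round(replacement_value * deductible_rate)

# The article's example: $300,000 replacement value, 15% deductible.
print(out_of_pocket(300_000, 0.15))  # 45000
```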

Ultimately, whether to opt for earthquake coverage is a case-by-case and individual budget decision, Sherry says.

Rather than spend money on extra insurance, some homeowners invest it in a preventative approach — reinforcement bracing and securing homes to foundations.

A “brace-and-bolt” program administered by the California Earthquake Authority has thus far distributed grants of up to $3,000 to retrofit more than 5,000 high-risk houses built before 1979.

“After the Napa earthquake, I saw quite a few houses that slid off their foundation,” California Earthquake Authority Chief Mitigation Officer Janiele Maffei says. “A brace-and-bolt retrofit beforehand could have made the difference.”

See also: A Troubling Gap in Earthquake Coverage  

A retrofit typically costs $3,000 to $7,000, according to the CEA.

Hazards to people come mostly from man-made structures, and the Federal Emergency Management Agency says many injuries can be prevented by securing tall or heavy items such as appliances with nylon straps or closed hooks, moving them away from beds and seating, and making sure gas appliances have flexible connectors to prevent fires.

This article was originally published on insurancequotes.com.

How CAT Models Lead to Soft Prices

In our first article in this series, we looked back at an insurance industry reeling from several consecutive natural catastrophes that generated combined insured losses exceeding $30 billion. In the second article, we looked at how, beginning in the mid-1980s, people began developing models that could prevent recurrences of those staggering losses. In this article, we look at how modeling results are being used in the industry.

 

Insurance is a unique business. In most other businesses, the costs of operation are either known or can be fairly estimated. The insurance industry, however, must estimate expenses for events that are extremely rare or have never happened before: flood damage to a bridge in New York City, the theft of a precious heirloom from your home, a factory fire, or even Jennifer Lopez injuring her hind side. No other industry has to make so many critical business decisions as blindly as the insurance industry. Even when an insurer can accurately estimate the loss to a single policyholder, it is still operating blindly unless it can also estimate multiple losses occurring simultaneously, which is exactly what happens during natural catastrophes. Fortunately, the introduction of CAT models greatly enhances insurers’ ability to estimate both the losses associated with a single policyholder and concurrent claims from a single occurrence.

When making decisions about which risks to insure, how much to insure them for and how much premium is required to accept the risk profitably, there are essentially two metrics that can provide the clarity needed to do the job. Whether you are a portfolio manager managing the cumulative risk for a large line of business, an underwriter receiving a submission from a broker to insure a factory or an actuary responsible for pricing exposure, what you minimally need to know is:

  1. On average, what will potential future losses look like?
  2. On average, what are the reasonable worst-case loss scenarios, or the probable maximum loss (PML)?

Those two metrics alone supply enough information for an insurer to make critical business decisions in these key areas:

  • Risk selection
  • Risk-based pricing
  • Capacity allocation
  • Reinsurance program design
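Both metrics fall out of a CAT model’s simulated loss distribution: the average annual loss (AAL) is its mean, and the PML is a high quantile. A minimal sketch, using a synthetic loss distribution purely for illustration — the frequency and severity parameters are assumptions, not real model output:

```python
import random

random.seed(7)

# Synthetic annual losses ($M): roughly 10% of simulated years see a
# catastrophic loss drawn from an exponential severity; the rest see none.
annual_losses = [random.expovariate(1 / 5) if random.random() < 0.1 else 0.0
                 for _ in range(100_000)]

# 1. Average annual loss (AAL): the mean of the simulated distribution.
aal = sum(annual_losses) / len(annual_losses)

# 2. Probable maximum loss (PML): a high quantile, here the 1-in-250-year loss.
pml_250 = sorted(annual_losses)[int(len(annual_losses) * (1 - 1 / 250))]

print(f"AAL: ${aal:.2f}M   250-year PML: ${pml_250:.2f}M")
```

The PML vastly exceeds the AAL in any heavy-tailed book, which is precisely why an insurer pricing on averages alone can still be ruined by a single event.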

Risk Selection

Risk selection includes an underwriter’s determination of the class (such as preferred, standard or substandard) to which a particular risk is deemed to belong, its acceptance or rejection and (if accepted) the premium.

Consider two homes: a $1 million wood frame home and a $1 million brick home, both located in Los Angeles. Which home is riskier to the insurer? Before the advent of catastrophe models, the determination was based on historical data and, essentially, opinion. Insurers could have hired engineers who would have informed them that brick homes are much more susceptible to damage than wood frame homes under earthquake stresses. But it was not until the introduction of the models that insurers could finally quantify how much financial risk they were exposed to. To their surprise, they discovered that brick homes are on average four times riskier than wood frame homes and twice as likely to sustain a complete loss (full collapse).

Knowing how two or more different risks (or groups of risks) behave, both in absolute terms and relative to one another, gives insurers a foundation to intelligently set underwriting guidelines that play to their strengths and exclude risks they cannot or do not wish to absorb, based on their risk appetite.

Risk-Based Pricing

Insurance is rapidly becoming more of a commodity, with customers often choosing their insurer purely on the basis of price. As a result, accurate ratemaking has become more important than ever. In fact, a Towers Perrin survey found that 96% of insurers consider sophisticated rating and pricing to be either essential or very important.

Multiple factors go into determining premium rates, and, as competition increases, insurers are introducing innovative rate structures. The critical question in ratemaking is: What risk factors or variables are important for predicting the likelihood, frequency and severity of a loss? Although there are many obvious risk factors that affect rates, subtle and non-intuitive relationships can exist among variables that are difficult, if not impossible, to identify without applying more sophisticated analyses.

Returning to our example of the two homes in Los Angeles, catastrophe models tell us two very important things: roughly what the premium to cover earthquake loss should be, and that the premium for masonry homes should be approximately four times that for wood frame homes.
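Under risk-based pricing, that 4:1 relationship flows directly from the modeled average annual losses. A hedged sketch with illustrative numbers — the AAL figures and the 30% expense load are our assumptions, not the article’s:

```python
def indicated_premium(aal: float, expense_load: float = 0.30) -> float:
    """Risk-based premium: modeled average annual loss plus a loading for
    expenses and profit. The 30% load is an assumption for illustration."""
    return aal * (1 + expense_load)

# Illustrative modeled AALs consistent with the article's 4:1 relationship.
wood_frame_aal = 500.0   # $ per year, assumed
masonry_aal = 2_000.0    # $ per year, assumed (four times riskier)

ratio = indicated_premium(masonry_aal) / indicated_premium(wood_frame_aal)
print(ratio)  # 4.0
```

Because the same load applies to both risks, the premium ratio simply inherits the modeled AAL ratio: relational pricing falls out of absolute pricing.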

The concept of absolute and relational pricing using catastrophe models is revolutionary. Many in the industry may balk at our term “revolutionary,” but insurers using the models to establish appropriate price levels for property exposures have a massive advantage over public entities such as the California Earthquake Authority (CEA) and the National Flood Insurance Program (NFIP) that do not adhere to risk-based pricing.

The NFIP and CEA, like most quasi-government insurance entities, differ in their pricing from private insurers along multiple dimensions, mostly because of constraints imposed by law. Innovative insurers recognize that there are literally billions of valuable premium dollars at stake for risks for which the CEA, the NFIP and similar programs significantly overcharge – again, because of constraints that forbid them from being competitive.

Thus, using average and extreme modeled loss estimates not only ensures that insurers are managing their portfolios effectively, but enables insurers, especially those that tend to have more robust risk appetites, to identify underserved markets and seize valuable market share. From a risk perspective, a return on investment can be calculated via catastrophe models.

It is incumbent upon insurers to identify the risks they don’t wish to underwrite and to answer questions such as: Are wood frame houses less expensive to insure than homes made of joisted masonry? What is the relationship between claims severity and a particular home’s loss history? Traditional univariate pricing analyses are outdated; insurers have turned to multivariate statistical pricing techniques to understand the relationships among multiple risk variables. Insurers also need to consider other factors, such as marketing costs, conversion rates and customer buying behavior, to price risks accurately. Gone are the days when unsophisticated pricing and risk selection methodologies could be employed. Innovative insurers today cross industry lines, paying ever more attention to how others manage data and assign value to risk.

Capacity Allocation

In the (re)insurance industry, (re)insurers only accept risks if those risks are within the capacity limits they have established based on their risk appetites. “Capacity” means the maximum limit of liability offered by an insurer during a defined period. Oftentimes, especially when it comes to natural catastrophes, some risks have a much greater accumulation potential, typically as a result of dependencies between individual risks.

Take houses and automobiles. A high concentration of those exposure types may very well be affected by the same catastrophic event, whether a hurricane, severe thunderstorm, earthquake, etc. That risk concentration could put a reinsurer (or insurer) in the unenviable position of being overly exposed to a single catastrophic occurrence. Having a means to adequately control the accumulation of exposure is critical in the risk management process. Capacity allocation enables companies to allocate valuable risk capacity to specific perils within specific markets and accumulation zones to minimize their exposure, and CAT models allow insurers to measure how capacity is being used and how efficiently it is being deployed.
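One simple way to picture this control: accumulate total insured value by peril zone and flag any zone that breaches its allocated capacity. A toy sketch — the zone names, insured values and limits are all hypothetical:

```python
from collections import defaultdict

# Hypothetical book of business: (accumulation zone, total insured value in $M).
policies = [("LA-quake", 120.0), ("LA-quake", 90.0),
            ("Miami-wind", 60.0), ("Miami-wind", 200.0)]

# Illustrative capacity limits per zone, set by the carrier's risk appetite.
zone_capacity = {"LA-quake": 250.0, "Miami-wind": 200.0}

# Sum exposure by accumulation zone.
accumulation = defaultdict(float)
for zone, tiv in policies:
    accumulation[zone] += tiv

# Flag any zone where accumulated exposure exceeds its allocated capacity.
breaches = {z: acc for z, acc in accumulation.items() if acc > zone_capacity[z]}
print(breaches)  # {'Miami-wind': 260.0}
```

In practice a CAT model refines this by replacing raw insured value with modeled loss potential, but the control-valve logic is the same.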

Reinsurance Program Design

With the advent of CAT models, insurers now have the ability to simulate different combinations of treaties and programs to find the right fit, optimizing their balance of risk and return. Before CAT models, estimating the probability of attachment of one layer versus another, or the average annual losses for a per-risk treaty covering millions of exposures, required gut instinct. The models estimate the risk and can calculate the millions of potential claims transactions, which would be nearly impossible without computers and simulation.
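The probability of attachment can be read straight off a simulated event-loss distribution: it is simply the share of simulated events whose loss exceeds a layer’s attachment point. A sketch using a synthetic heavy-tailed distribution — the Pareto parameters and attachment points are assumptions for illustration:

```python
import random

random.seed(42)

# Synthetic per-event losses ($M) to a cedent's portfolio, deliberately heavy-tailed.
event_losses = [random.paretovariate(1.5) for _ in range(50_000)]

def attachment_probability(losses, attachment):
    """Share of simulated events whose loss pierces the layer's attachment point."""
    return sum(1 for x in losses if x > attachment) / len(losses)

# Compare two candidate excess-of-loss layers: deeper layers attach less often.
for att in (10.0, 50.0):
    p = attachment_probability(event_losses, att)
    print(f"P(event loss > ${att:.0f}M) = {p:.4f}")
```

The same simulated event set can price every layer of a program consistently, which is what makes side-by-side comparison of treaty structures possible.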

It is now well-known how soft the current reinsurance market is. Alternative capital has been a major driving force, but we consider the maturation of CAT models as having an equally important role in this trend.

First, insurers using CAT models to underwrite, price and manage risk can now intelligently present their exposure and effectively defend their position on terms and conditions. Gone are the days when reinsurers would have the upper hand in negotiations; CAT models have leveled the playing field for insurers.

Second, alternative capital could not have the impact it is currently having without the language of finance. CAT models speak that language. The models provide necessary statistics for financial firms looking to allocate capital in this area. Risk transfer becomes much more fungible once there is common recognition of the probability of loss between transferor and transferee. No CAT models, no loss estimates. No loss estimates, no alternative capital. No alternative capital, no soft market.

A Needed Balance

By now, and for good reason, the industry has placed much of its trust in CAT models to selectively manage portfolios to minimize PML potential. Insurers and reinsurers alike need the ability to quantify and identify peak exposure areas, and the models stand ready to help understand and manage portfolios as part of a carrier’s risk management process. However, a balance between the need to bear risk and the need to preserve a carrier’s financial integrity in the face of potential catastrophic loss is essential. The idea is to pursue a blend of internal and external solutions to ensure two key factors:

  1. The ability to identify, quantify and estimate the chances of an event occurring and the extent of likely losses, and
  2. The ability to set adequate rates.

Once companies have an understanding of their catastrophe potential, they can effectively formulate underwriting guidelines to act as control valves on their catastrophe loss potential but, most importantly, even in high-risk regions, identify those exposures that still can meet underwriting criteria based on any given risk appetite. Underwriting criteria relative to writing catastrophe-prone exposure must be used as a set of benchmarks, not simply as a blind gatekeeper.

In our next article, we examine two factors that could derail the progress made by CAT models in the insurance industry. Model uncertainty and poor data quality threaten to raise skepticism about the accuracy of the models, and that skepticism could inhibit further progress in model development.

When Nature Calls: the Need for New Models

The Earth is a living, breathing planet, rife with hazards that often hit without warning. Tropical cyclones, extra-tropical cyclones, earthquakes, tsunamis, tornadoes and ice storms: Severe elements are part of the planet’s progression. Fortunately, the vast majority of these events are not what we would categorize as “catastrophic.” However, when nature does call, these events can be incredibly destructive.

To help put things into perspective: Nearly 70% (and growing) of the entire world’s population currently lives within 100 miles of a coastline. When a tropical cyclone makes landfall, it’s likely to affect millions of people at one time and cause billions of dollars of damage. Though the physical impact of windstorms or earthquakes is regional, the risk associated with those types of events, including the economic aftermath, is not. Often, the economic repercussions are felt globally, in both the public and private sectors. We need only look back to Hurricane Katrina, Superstorm Sandy and the recent tsunamis in Japan and Indonesia to see the toll a single catastrophe can take on populations, economies and politics.

However, because actual catastrophes are so rare, property insurers are left incredibly under-informed when attempting to underwrite coverage and are vulnerable to catastrophic loss.

Without catastrophe models, insurers’ standard actuarial practices are unhelpful and often dangerous because, with so little historical data, the likelihood of underpricing dramatically increases. If underwriting teams do not have the tools to know where large events will occur, how often they will occur or how severe they will be when they do occur, then risk management teams must blindly cap their exposure. Insurers lacking the proper tools can’t possibly fully understand the implications of thousands of claims from a single event. Risk management must place arbitrary capacity limits on geographic exposures, resulting in unavoidable misallocation of capital.

However, insurers’ perceived success from these arbitrary risk management practices, combined with a fortunate pause in catastrophes lasting multiple decades, created a perfect storm of profit that lulled insurers into a false sense of security. It allowed them to grow to a point where they felt invulnerable to any large event that might come their way. They had been “successful” for decades. They were obviously doing something right, they thought. What could possibly go wrong?

Fast forward to late August 1992. The first of two pivotal events that forced a change in insurers’ attitude toward catastrophes was brewing in the Atlantic. Hurricane Andrew, a Category 5 storm with top wind speeds of 175 mph, would slam into southern Florida and cause, by far, the largest loss to date in the insurance industry’s history: $15 billion in insured losses. As a result, 11 previously stable insurers became insolvent. Those still standing either quickly left the state or drastically reduced their exposures.

The second influential event was the 1994 earthquake in Northridge, CA. That event occurred on a previously unknown fault system, and, even though it measured only magnitude 6.7, it generated incredibly powerful ground motion, collapsing highways and leveling buildings. Northridge, like Andrew, created approximately $15 billion in insured losses and caused insurers that feared additional losses to flee the California market altogether.

Andrew and Northridge were game changers. Across the country, insurers’ capacity became severely reduced for both wind and earthquake perils as a result of those events. Where capacity was in particularly short supply, substantial rate increases were sought. Insurers rethought their strategies and, in all aspects, looked to reduce their catastrophic exposure. In both California and Florida, quasi-state entities were formed to replace the capacity from which the private market was withdrawing. To this day, Citizens Property Insurance in Florida and the California Earthquake Authority, so-called insurers of last resort, both control substantial market shares in their respective states. For many property owners exposed to severe winds or earthquakes, obtaining adequate coverage simply isn’t within financial reach, even 20 years removed from those two seminal events.

How was it possible that insurers could be so exposed? Didn’t they see the obvious possibility that southern Florida could have a large hurricane or that the Los Angeles area was prone to earthquakes?

What seems so obvious now was not so obvious then, because of a lack of data and understanding of the risks. Insurers were writing coverage for wind and earthquake hazards before they even understood the physics of those types of events. In hindsight, we recognize that the strategy was as imprudent as picking numbers from a hat.

What insurers need is data: data about where catastrophic events are likely to occur, how often they are likely to occur and what the impact will be when they do. The industry at that time simply didn’t have the ability to leverage the data or experience it so desperately needed to reasonably quantify exposures and manage catastrophic risk.

Ironically, well before Andrew and Northridge, right under property insurers’ noses, two innovative people on opposite sides of the U.S. had come to the same conclusion and had already begun answering the following questions:

  • Could we use computers to simulate millions of scientifically plausible catastrophic events against a portfolio of properties?
  • Would the output of that kind of simulation be adequate for property insurers to manage their businesses more accurately?
  • Could this data be incorporated into all their key insurance operations – underwriting, claims, marketing, finance and actuarial – to make better decisions?

What emerged from that series of questions would come to revolutionize the insurance industry.

The Next Jolt That Will Hit California

Sunday’s 6.0-magnitude earthquake near Napa, the largest in California in 25 years, seriously damaged more than 170 structures and injured more than 200 people. Overall earthquake-related losses are expected to exceed $1 billion.

Many unreinforced masonry buildings risk being declared a total loss. But even retrofitting doesn’t always ensure earthquake immunity. The charming 1901 stone Goodman Library in downtown Napa was seismically retrofitted at a cost of $1.7 million a few years ago, yet the top of the building toppled over in the earthquake. A nearby historic brick building was retrofitted for $1.2 million after a 2000 Napa earthquake, and it was red-tagged Monday with serious damage, as well.

Unfortunately, the repair and rebuilding costs will be the next jolt to rock the budgets of businesses and homeowners. It’s estimated that fewer than 12% of homeowners in California have earthquake coverage, a figure that was as high as 22% last year, according to the California Earthquake Authority. The CEA underwrites more than 800,000 policies, representing 70% of the homeowner earthquake insurance in the state.

California has two-thirds of the nation’s earthquake risk, with 2,000 known faults producing 37,000 measurable earthquakes a year. Besides California, the U.S. Geological Survey maps show major earthquake risks in nearly half the U.S.

In 1994, after a 6.7-magnitude earthquake hit the Northridge area of Southern California, 93% of homeowner insurance companies restricted or refused to write earthquake insurance policies. In response, the California legislature established the California Earthquake Authority (CEA) in 1995 to provide a reduced-coverage (“no-frills”) earthquake policy for homeowners in the state; items such as swimming pools, decks and detached structures are not included. Insurance carriers in California can offer their own earthquake coverage or participate in the CEA, which has made the CEA one of the largest providers of residential earthquake coverage in the world.

Currently, 21 major insurance carriers participate in CEA, and its assets total nearly $10 billion. Its A.M. Best rating is A- (excellent). CEA policies are available to homeowners and renters, including for mobile homes and condominiums, if their primary homeowners’ coverage is with one of the CEA insurers. Keep in mind that many condominium communities have common ownership, which means that the condo owners could have joint and several liability for repairs after an earthquake. CEA reports that it uses 83% of the premiums it collects for claims or reinsurance, 14% for broker commissions and 3% for operations/overhead.

Counting on state or federal disaster relief is a risky proposition for home or business owners. The president must declare a disaster before the Federal Emergency Management Agency (FEMA) can grant even limited assistance. State surplus funds for relief, meanwhile, are simply nonexistent.

So why are people buying less earthquake coverage when the hazard of a potentially devastating earthquake is growing? Unlike other natural disasters such as hurricanes, tornadoes and wildfires, which are usually covered under homeowners’ insurance policies, earthquakes require a separate insurance policy, with a deductible of 10% or 15% of the structure’s estimated replacement cost.

The average earthquake policy in California in 2013 cost $676 a year, according to the California Department of Insurance. The current average cost of a home in California, according to Zillow, is $429,000. Even with a minimum 10% deductible, a homeowner would be out of pocket $42,900 before earthquake insurance coverage kicks in.
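Using only the figures cited in the article, the out-of-pocket math works out as follows; the premium-years comparison in the last line is our own framing, not the article’s:

```python
home_value = 429_000       # average California home value cited (Zillow)
annual_premium = 676       # average 2013 earthquake premium cited (CA DOI)
deductible_rate = 0.10     # the minimum earthquake deductible

# Damage the homeowner absorbs before coverage kicks in.
deductible = round(home_value * deductible_rate)
print(deductible)  # 42900

# Years of premium payments equal in cost to a single deductible.
print(round(deductible / annual_premium))  # 63
```

That gap between a modest annual premium and an enormous deductible is exactly the trade-off that leads many homeowners to skip the coverage.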

Business properties face a much larger risk, considering the additional exposure of damaged inventory, a red-tagged (unusable) building and the loss of use and income.

In contrast, flood insurance is available in most of the country with a $1,000 building and $1,000 contents deductible as part of the property coverage. The Insurance Information Institute reports that the average flood damage claim in 2013 was $26,165 for the 13% of U.S. homeowners who buy the additional flood coverage – sometimes as a condition to their mortgage, if they are located in a flood zone.

Low-frequency, high-severity risks like earthquakes represent a bet that few home or business owners can afford to lose. Unfortunately, Californians, who own the nation’s highest-valued properties, also have the most money on the table when the next big shake comes.