
Parametric Insurance: 12 Firms to Know

If you’ve read Part One of this series, you’ll have got your crash course in parametric insurance and can now call yourself an expert (relatively speaking). In that article, I promised a short overview of the 12 companies I think are worth looking at as examples of how parametric insurance works and what the future might look like. But first, a quick summary of Part One:

  • Parametric insurance is emerging as a way to provide financial protection against losses that are often hard, or even impossible, to get insured for.
  • Parametric insurance has been around for over 20 years. Today, it makes up around 15% of issued catastrophe bonds in a $100 billion market.
  • Access to more data, and more reliable ways of transmitting it, is opening up an opportunity to take parametric insurance to companies small, medium and large. It is also offering new ways to address the global insurance gap.

Now, here’s my list, ordered approximately by theme, rather than any specific ranking. Leaderboards have their place, but outside the established market for catastrophe bonds the area of parametric insurance is still too diverse, and broadly unproven, to attempt to rank companies. At least not yet.

AIR Worldwide (Owned by Verisk)

AIR’s U.S. hurricane catastrophe model was used for the first major catastrophe bond, issued by USAA, in 1997, providing $400 million of protection. AIR continues to provide one of the two most commonly used suites of catastrophe models. Reinsurers, brokers and insurers run cat models to price and manage natural catastrophe risk in most major countries. AIR has probably been the most frequently used modeling agent for U.S. hurricane catastrophe bonds in the last decade, and the company provided the analysis for the World Bank pandemic catastrophe bond.

RMS

RMS and AIR have been jostling for position as the leading providers of modeling for insurance-linked securities (ILS) and catastrophe bonds in the global insurance industry. The RMS capital markets team has been behind some of the most complex and innovative catastrophe bonds, and RMS has been particularly strong in creating well-designed parametric triggers. Examples of bonds that RMS has worked on include Golden Goal, which provided $262 million of terrorism cover for FIFA for the 2006 World Cup. RMS was also behind the New York Metropolitan Transportation Authority (MTA) $200 million storm surge bond in 2013, issued following the MTA’s unexpected and largely uninsured losses of $4.7 billion from Hurricane Sandy.

As I mentioned in Part One, Artemis has the most comprehensive directory of catastrophe bonds issued, if you want to learn more about what RMS, AIR and EQECAT (since acquired by CoreLogic) have worked on.

See also: Growing Case for Parametric Coverage  

New Paradigm Underwriters

This is one version of what the future may look like as parametric insurance moves upstream to the corporate market. Founded in 2013, New Paradigm pre-dates the term “insurtech,” but as an MGA using new technology in a smart way it is one of the pioneers in this space. The company offers supplemental U.S. hurricane insurance for businesses that want added coverage for exclusions in the conventional policies on offer from insurers. The company’s first product used an index derived from recorded wind speed as the payout trigger, and the company is now diversifying into terrorism cover.

A quick side note here. In the early history of parametric hurricane bonds, it was discovered that the conventional windspeed recorders relied on to measure windspeed (and hence determine whether a bond had triggered) had a tendency to blow away when conditions got unusually gusty. New Paradigm, and others structuring windstorm parametric triggers, now use data from the WeatherFlow network, which has installed over 100 windspeed recorders designed to survive winds of 140 mph and to require no external power.

FloodFlash

Adam Rimmer and Ian Bartholomew, the founders of FloodFlash, started their careers at RMS and got their passion for parametric insurance when working on the New York MTA bond. The company’s seed funding came from U.K. investor and incubator Insurtech Gateway three years ago. Still relatively modestly funded (£2 million, according to Crunchbase), FloodFlash is one of the best examples of parametric insurance being used today to provide a solution where traditional insurers have declined cover.

In the U.K., most homeowners get flood protection thanks to the government’s Flood Re initiative, but commercial businesses are excluded. FloodFlash models the flood risk at a high resolution and sells building-specific parametric insurance. The company operates as an MGA and sells via brokers. A FloodFlash sensor is attached to a building and triggers a payment almost instantly when the water rises to a pre-agreed depth. With hundreds of clients already signed up in the U.K., FloodFlash proved its worth after Storm Ciara hit the country earlier this year — “the fastest payout by a parametric insurance product that I’ve ever seen,” according to Steve Evans of Artemis.
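
To make the mechanics concrete, here is a minimal sketch, in Python, of how a depth-based parametric trigger might look in code. The data structure, threshold and payout amount are illustrative assumptions, not FloodFlash’s actual implementation.

```python
# Minimal sketch of a depth-based parametric flood trigger.
# Threshold and payout are hypothetical, for illustration only.

from dataclasses import dataclass

@dataclass
class FloodPolicy:
    trigger_depth_m: float  # pre-agreed water depth at the building's sensor
    payout: float           # fixed payment released when the trigger is met

def evaluate(policy: FloodPolicy, measured_depth_m: float) -> float:
    """Return the payout due for a sensor reading (zero below the trigger)."""
    return policy.payout if measured_depth_m >= policy.trigger_depth_m else 0.0

# A policy paying 50,000 once water at the sensor reaches 0.5 meters:
policy = FloodPolicy(trigger_depth_m=0.5, payout=50_000.0)
print(evaluate(policy, 0.62))  # 50000.0 -- trigger met, pay out immediately
print(evaluate(policy, 0.30))  # 0.0 -- water never reached the agreed depth
```

The simplicity is the point: there is no loss adjustment, just a pre-agreed measurement and a pre-agreed payment, which is what makes near-instant claims payment possible.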

Global Parametrics

Global Parametrics was launched in 2016, the brainchild of Professor Jerry Skees, and is run today by Hector Ibarra, formerly of the World Bank and PartnerRe. With funding that includes support from the U.K. government’s DFID and Germany’s KfW, the company is building parametric products to support organizations and people in the developing world who lack insurance coverage or can’t afford it. Global Parametrics has commissioned its own models for climate-related losses around the world and is building out partnerships with other leading providers. Its customers include microfinance lending organizations and NGOs such as VisionFund. The company provides disaster recovery payments, which can be used to help get vulnerable communities back on their feet after a flood, drought or other natural disaster. The team is well connected and has strong technical chops; definitely one to watch. Catch Hector live or listen to the recording of our chat on our BrightTALK channel.

Descartes Underwriting

It’s one thing to build the technology for parametric insurance, but someone needs to have the confidence to underwrite it. Descartes is a Paris-based specialty underwriter and is open-minded in what it covers as long as it gets “proper data.” Coverage so far has included property damage, business interruption from natural catastrophes, losses from droughts and losses from excessively high or low temperatures. Descartes has covered industries such as agriculture, mining, construction and renewable energy, and it supports banks in protecting their loans and assets. Sebastien Piguet, co-founder and head of underwriting at Descartes, spoke to us on stage in April last year, and you can hear him on Episode 23 of the InsTech London podcast.

Jumpstart Recovery

Getting claims paid from traditional insurance cover can take weeks, or even longer after a major catastrophe, but the costs kick in immediately. California earthquake insurance is expensive, and there are few affordable options beyond the rather limited state-backed California Earthquake Authority. Kate Stillwell, an engineer and earthquake modeler, started Jumpstart in 2015 with the aim of providing much-needed funds to increase the financial resilience of communities and provide economic stimulus immediately after an earthquake. Jumpstart accesses the peak ground velocity of the earthquake recorded by the USGS (U.S. Geological Survey) and aims to pay claims within 24 hours. The cover is currently limited to $10,000 per person, for residents of California only, and users need to certify, by text, that there has been damage and loss. Jumpstart has been supported by SCOR’s Channel Syndicate.

Exante

One of the best ways to reduce loss from natural disasters is to provide funds to help people act before an event even happens. There is a lot of work going on to improve resilience against natural disasters through improvements in construction, often at a city or state level, but actions taken by individuals before disasters hit can also make a big difference. No one has yet figured out how to forecast earthquakes, but hurricanes can be seen coming. Chris Lee, the Dublin-based founder of Exante, launched his company in 2019 with backing from Shipyard Technology Ventures, aiming to increase hurricane resilience for companies and their staff with a new approach to parametric cover. Exante has designed a payout approach that is developed and calibrated using near-term forecasts of U.S. hurricane severity and landfall. If a hurricane looks likely to make landfall, funds are released in the hours before it strikes. Payments are made directly to Exante’s clients’ employees to help cover the costs of protecting their homes or evacuation expenses. It’s early days yet for the company, but contingency finance for risk prevention is a smart way to reduce losses.

African Risk Capacity

The African Risk Capacity (ARC) is a specialized agency of the African Union established to help governments improve their abilities to plan for, prepare for and respond to extreme weather events and natural disasters. ARC uses parametric triggers to provide contingency funding, and ARC Insurance creates pools of risk across Africa, which are then insured in the global markets. One of ARC’s parametric covers had a wobble in 2016, when a major drought in Malawi caused a large loss for farmers but, because of a problem in how the modeled index was set up, didn’t trigger a payment as intended. ARC ended up agreeing to pay a contribution toward the costs, but the wobble is a reminder that parametric insurance is sensitive to modeling assumptions and data, and that payouts may not always match the financial losses suffered (a problem termed “basis risk”).

See also: Travel Insurance: An Exemplary Experience  

Blink

Paul Prendergast set up Blink in 2016 to provide flight cancellation insurance and earlier this year announced the launch of “Blink Parametric.” Back in the normal world we knew a few weeks ago, Blink Travel offered a cash payout or vouchers for hotel stays to customers who missed a flight, all fully automated. A recent development is Blink Energy & IoT, aimed at domestic appliance insurance and industrial IoT, offering protection for problems such as unexpected increase or decrease in energy usage. Blink’s partners include Generali, Munich Re and Manulife. Paul reckons he’ll have three million customers by the end of this year.

Arbol

Arbol was set up in 2018 by former banker and commodities trader Siddhartha Jha to provide weather-related crop cover for farmers and others. The team is using highly localized data sets accessed from IoT sensors and satellites to create bespoke cover down to individual field level and is selling it through an established insurer and broker network. The market in the U.S. for agriculture insurance is limited because of government subsidies, but demand globally is significant, and a lack of crop insurance, particularly in developing countries, is one of the biggest contributors to the global insurance protection gap. I’ll be recording an interview with Siddhartha later this year.

Qomplx

Formerly known as Fractal, Qomplx has the experience, beefy technology and access to data for rapidly analyzing risk across many industries. The insurance business is headed by President Alastair Speare-Cole, previously chief underwriting officer of Qatar RE, CEO of broker JLT Re and chairman of Aon Benfield Securities. Qomplx has a number of initiatives in the pipeline. It recently launched its first parametric product, WonderCover, backed by Chaucer and offering cyber and terrorism cover for small to medium-sized enterprises (SMEs). Alastair and his team supported our live chat event on April 30 on our BrightTALK channel.

In conclusion…

It’s not possible to get every company offering parametric insurance onto a list of 12, and this is certainly not intended to be the definitive top 12. (Although unlike some lists of “top insurtech companies” I’ve come across, at least all these companies are still in business at the time of writing.) None of the main brokers are mentioned, but the big three (or should that be two?) are key in working with insurers and insureds to help communicate and structure all but the smallest risks. As a supporter of InsTech London, Aon gets a shout-out here as one of the longest-standing experts in this field.

There are other companies we’re watching closely and have had on stage at InsTech London. Please let me know of other (decent) companies you are aware of with parametric solutions.

And look out for more live events on this topic soon. I’ll also be hosting chats on post-pandemic coverages. Registration on BrightTALK.

Finally, if you are a company that would like to be considered for a future article, it helps to be a member of InsTech London or to have a great photo of your equipment or your tech…

If you enjoyed this, found it useful or maybe both, then you may find something of interest in my other articles. You can also hear me talking to the industry’s leaders and innovators each week on the InsTech London podcast (available on Apple Podcasts, Spotify, etc.). And for a weekly check-in on what’s going on and what we think about it, you can get our two-minute, handcrafted newsletter delivered to you each Wednesday morning – sign up here.

Is There Risk in Embracing Insurtech?

As insurers rush headlong into the digital scramble, they should keep in mind the proverbial iceberg. Not all the risks involved are strictly tied to the innovation itself. Certain ones are below the water level.

Insurers actively participating in the digital revolution have done so in a variety of ways: 1) innovation labs, 2) insurtech accelerators with external partners, 3) investments in insurtech companies, 4) purchases of insurtech companies. These are reasonable approaches for staying current and competitive. However, there are some caveats that should be heeded.

Focus Risk

Insurance is not a simple business. Machines cannot be set to produce thousands of identical items, a sale is not final and competition is never at a low ebb. It is a complex business that relies on actuarial forecasting, capital models, complicated and often multi-layered contracts and astute claims handling. Thus, companies must remain focused on the functions and metrics fundamental to the business if they are to achieve good results.

Over the years, the insurance industry has adapted to paradigm shifts of all types, for example: 1) automation of internal operations, 2) agent/broker electronic interface, 3) paperless environments, 4) increased transparency with regulators and 5) product development responsive to new risks such as cyber or supply chain disruption. Now, creating new ways to interact with stakeholders digitally and developing products that are fit for purpose in a digital world should be within the capability bounds of these same insurers.

The caution is that insurers should not get so focused on their digital initiatives they lose proper sight of the basics of the business: underwriting, claims, actuarial, finance, customer service. Equally, insurers cannot lose sight of other disruptive forces in the environment such as climate change, terrorism and cyber threats.

See also: Insurtech: Unstoppable Momentum  

A piece appearing on AIR Worldwide’s website written by Bill Churney asks, “Have You Lost Focus On Managing Catastrophe Risk?” He notes that catastrophes have been light these past 10 years, which may cause inattention, and that many new insurance staffers were not working when Katrina, Andrew or Hugo hit and thus have no personal experience to tap for handling sizable events. A lack of focus on managing catastrophe risk could be critically detrimental for companies. And although there is nothing concrete to suggest that insurers have lost such focus, the question underscores the possibility of attention deficits. The need for continuous and careful attention to the rudimentary aspects of the business cannot be dismissed, even if they may not seem as exciting or timely as digital inventions.

Within memory, there have been companies that allowed themselves to lose necessary focus. Some got so focused on mergers and acquisitions that core functions were not managed properly while the emphasis was on cross sales and economies of scale. Some got so intent on improving customer relations that business imperatives were ignored in favor of appeasing the customer, and some got so diversified that senior management did not have the bandwidth to manage the whole enterprise.

How can the 2016 results at AIG be explained? Could the more recent focus on divestitures, staff changes and cuts, a drive to return dividends to shareholders and the CEO’s reported concentration on technology have caused it to lose its once unparalleled focus on profitable underwriting, rigorous claims handling and product innovation?

Investment Risk

With investments pouring into insurtech, a question arises: What is left for anything else? Despite fintech investment starting to slow, KPMG reports, “There was a dramatic increase in interest in insurtech in Q3’16, and the trend is expected to continue. The U.S. was the top country in Q3’16 with 10 insurtech deals, valued at $104.7 million in total.”

These numbers do not capture, of course, the many millions of dollars that insurers are investing in insurtech activities internally. As mentioned above, they are spending money to create dedicated innovation labs and accelerator programs and to launch other types of speculative insurtech projects. Many older projects have become operational, including new business unit or company startups, the introduction of bots on company websites, telematics in vehicles, digitized claims handling…and the list goes on.

How does an insurer know when an investment in insurtech is enough or too much, thereby negating other necessary investments required by functions such as underwriting, claims or actuarial?

The caution is not about doing an ROI (return on investment) analysis for a specific project. It is about doing an ROI analysis for the portfolio of projects that are vying for funding vis-a-vis the need to keep the company solvent while maintaining progress with the digital strategy. The larger the insurer, the more used it is to managing multiple priorities and projects. For mid-size to small insurers, this skill may be less developed, and they may face even greater risk of getting out of balance.

Growth Risk

Insurance is one of the few industries for which growth can be just as risky as no growth. Industry pundits have long claimed that new business performs about three points worse than policies already on the books. The difference between a company at a combined ratio of 99 and one at 102 can be quite significant: Below 100, an insurer makes an underwriting profit; above 100, it pays out more in losses and expenses than it earns in premium. The causes of this phenomenon include: 1) the potential for adverse selection, 2) the reasons customers choose to change carriers and 3) the costs associated with putting new business on the books. It is also harder for actuaries to predict the loss patterns of groups of customers for whom there is no history in the company’s database.

See also: Infrastructure: Risks and Opportunities  

If the reason for investing in insurtech is to increase new business written, insurers should be cautious about how much and what kind of new business they will write because of their insurtech enhancements. To the extent that insurtech enables insurers to hold on to existing business, the outcome is less risky.

For example, it remains to be seen whether drivers who want to buy insurance by the mile are a better or worse risk pool than other drivers, or whether those involved in the sharing economy, such as renting rooms in their homes, are more or less prone to loss than homeowners who do not rent rooms. Are small businesses that are willing to buy their coverage online likely to file more claims or fewer compared with small businesses that use an agent? Do insurance buyers who are attracted to peer-to-peer providers have loss experiences at a different rate than those who are not attracted to such a model?

Conclusion

The march toward more digitization in the insurance industry will and must go forward. At the same time, insurers should be wise enough to realize and address underlying risks inherent in this type of aggressive campaign to modernize.


Competing in an Age of Data Symmetry

For centuries, people have lived in a world where data was largely proprietary, creating asymmetry. Some had it. Others did not. Information was a currency. Some organizations held it, and profited from it. We are now entering an era of tremendous data balance — a period of data symmetry that will rewrite how companies differentiate themselves.

The factors that move the world toward data symmetry are time, markets, investment and disruption.

Consider maps and the data they contained. Not long ago, paper maps, travel books and documentaries offered the very best views of geographic locations. Today, Google allows us to cruise nearly any street in America and get a 360° view of homes, businesses and scenery. Electronic devices guide us along the roadways and calculate our ETA. A long-established map company such as Rand McNally now has to compete with GPS up-and-comers, selling “simple apps” with the same information. They all have access to the same data. When it comes to the symmetry of geographic data, the Earth is once again flat.

Data symmetry is rewriting business rules across industries and markets every day. Insurance is just one industry where it is on the rise. For insurers to overcome the new equality of data access, they will need to understand both how data is becoming symmetrical and how they can re-envision their uniqueness in the market.

It will be helpful to first understand how data is moving from asymmetrical to symmetrical.

Let’s use claims as an example. Until now, the insurer’s best claims data was found in its own stockpile of claims history and demographics. An insurer that was adept at managing this data and applied actuarial science would find itself in a better position to assess risk. Competitively, it could rise to the top of the pack by pricing appropriately and acquiring appropriately.

Today, all of that information is still very relevant. However, in the absence of that information, an insurer could also rely upon a flood of data streams coming from other sources. Risk assessment is no longer confined to historical data, nor is it confined to answers to questions and personal reports. Risk data can be found in areas as simple as cell phone location data — an example of digital exhaust.

Digital exhaust as a source of symmetry

Digital exhaust is the data trail that all of us leave on the digital landscape. Recently, the New York City Housing Authority wished to determine if the “named” renter was the one actually living in a rent-controlled apartment. A search of cell phone tower location records, cross-referenced to a renter’s information, was able to establish the validity of renter occupation. That is just one example of digital exhaust data being used as a verification tool.

Another example can be found in Google’s Waze app. Because I use Waze, Google now holds my complete driving history — a telematics treasure trove of locations, habits, schedules and preferences. The permissions language allows Waze to access my calendars and contacts. With all of this, in conjunction with other Google data sets, Google can create a fairly complete picture of me. This, too, is digital exhaust. As auto insurers are proving each day, cell phone data may be more informative to proper pricing than previous claims history. How long is it until auto insurers begin to look at location risk, such as too much time spent in a bar or frequent driving through high-crime ZIP codes? If ZIP codes matter for where a car is parked each night, why wouldn’t they matter for where it spends the day?

Data aggregators as a source of symmetry

In addition to digital exhaust, data aggregators and scoring are also flattening the market and bringing data symmetry to markets. Mortgage lenders are a good example from outside the industry. Most mortgage lenders pay far more attention to comprehensive credit scores than an individual’s performance within their own lending operation. The outside data matters more than the inside data, because the outside data gives a more complete picture of the risk, compiled from a greater number of sources.

Within insurance, we can find a dozen or more ways that data acquisition, consolidation and scoring are bringing data symmetry to the industry. Quest Diagnostics supplies scored medical histories and pharmaceutical data to any life insurer willing to pay for it. RMS, AIR Worldwide, EQECAT and others turn meteorological and geographical data into shared risk models for P&C insurers.

That kind of data transformation can happen in nearly any stream of data. Motor vehicle records are scored by several agencies. Health data streams could also be scored for life and health insurers. Individual scores could be automatically combined into overall scores. Insurers could simply dial up or dial down their acceptance based on their risk tolerance and pricing. Data doesn’t seem to stay hidden. It has value. It wants to be collected, sold and used.
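
As a sketch of what dialing acceptance up or down might look like in practice, the toy example below combines hypothetical per-stream scores into an overall score and applies a risk-tolerance threshold. The streams, weights, scales and threshold are all invented for illustration.

```python
# Toy example: combine scored data streams into an overall risk score,
# then "dial" acceptance up or down with a tolerance threshold.
# Streams, weights and scales are illustrative assumptions.

def overall_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-stream scores, each on a 0-100 scale."""
    total_weight = sum(weights[name] for name in scores)
    return sum(scores[name] * weights[name] for name in scores) / total_weight

weights = {"driving": 0.5, "credit": 0.3, "claims_history": 0.2}
applicant = {"driving": 82.0, "credit": 70.0, "claims_history": 90.0}

score = overall_score(applicant, weights)
risk_tolerance = 75.0  # the dial: raise to accept less risk, lower to accept more
print(f"score={score:.1f}, accept={score >= risk_tolerance}")  # score=80.0, accept=True
```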

Consider all the data sources I will soon be able to tap into without asking any questions. (This assumes I have permissions, and barring changes in regulation.)

  • Real-time driving behavior.
  • Travel information.
  • Retail purchases and preferences.
  • Mobile statistics.
  • Exercise or motion metrics.
  • Household or company (internal) data coming from connected devices.
  • Household or company (external) data coming from geographic databases.

These data doors, once opened, will be opened for all. They are opening on personal lines first, but they will open on commercial lines, as well.

Now that we have established that data symmetry is real, and we see how it will place pressure upon insurers, it makes sense to look at how insurers will use data and other devices to differentiate themselves. In Part 2 of this blog, we’ll look at how this shift in data symmetry is forcing insurers to ask new questions. Are there ways they can expand their use of current data? Are there additional data streams that may be untapped? What does the organization have or do that is unique? The goal is for insurers to innovate around areas of differentiation. This will help them rise above the symmetry, embracing data’s availability to re-envision their uniqueness.

TRIA Non-Renewal: Effect on P&C?

Losses stemming from the destruction of the World Trade Center and other buildings by terrorists on Sept. 11, 2001, totaled about $31.6 billion, including commercial liability and group life insurance claims — not adjusted for inflation — or $42.1 billion in 2012 dollars. About two-thirds of these losses were paid for by reinsurers, companies that provide insurance for insurers.

Concerned about the limited availability of terrorism coverage in high-risk areas and its impact on the economy, Congress passed the Terrorism Risk Insurance Act (TRIA). The act provides a temporary program that, in the event of a major terrorist attack, allows the insurance industry and federal government to share losses according to a specific formula. TRIA was signed into law on Nov. 26, 2002, and renewed for two years in December 2005. Passage of TRIA enabled a market for terrorism insurance to begin to develop because the federal backstop effectively limits insurers’ losses, greatly simplifying the underwriting process. TRIA was extended for seven years to 2014 in December 2007. The new law is known as the Terrorism Risk Insurance Program Reauthorization Act (TRIPRA) of 2007.
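
To sketch that loss-sharing formula: Under the 2007 reauthorization, an insurer’s deductible is commonly described as 20% of its prior-year direct earned premium, with the federal government covering 85% of insured losses above the deductible (subject to the program’s overall cap). The toy calculation below uses those figures; treat both parameters as assumptions to verify against the statute.

```python
# Sketch of TRIA-style loss sharing for a single insurer. The 20% deductible
# and 85% federal share reflect common descriptions of the 2007 program;
# both are assumptions here, and the program's aggregate cap is ignored.

def tria_split(insured_loss: float, prior_year_premium: float,
               deductible_pct: float = 0.20, federal_share: float = 0.85):
    """Return (insurer_paid, federal_paid) for one insurer's certified loss."""
    deductible = deductible_pct * prior_year_premium
    excess = max(insured_loss - deductible, 0.0)
    federal_paid = federal_share * excess
    return insured_loss - federal_paid, federal_paid

# An insurer with $2 billion of prior-year premium facing a $1 billion loss:
insurer, federal = tria_split(1_000_000_000, 2_000_000_000)
print(f"insurer pays ${insurer:,.0f}, federal program pays ${federal:,.0f}")
# insurer pays $490,000,000, federal program pays $510,000,000
```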

This week, Congress failed to reauthorize TRIA before members adjourned for the holiday recess. Now, with the expiration of the law on Dec. 31, some businesses may be left without insurance coverage in the event of a terrorist attack on the U.S. Both houses of Congress have been discussing legislation that would set out the federal government’s involvement in funding potential terrorism losses, but bills proposed by the two houses earlier this year differed, and no extension was passed.

A report from the Wharton Risk Management and Decision Processes Center found that, under the current TRIA program, some insurers have already reached a level of exposure to losses from a terrorist attack that could jeopardize their ability to pay claims, based on a critical measure of solvency: the ratio of an insurer’s TRIA deductible amount in relation to its surplus. The report, “TRIA After 2014: Examining Risk Sharing Under Current and Alternative Designs,” found that as the deductible percentage rises, as it does under the Senate bill and proposals put forward in the House, more insurers have a deductible-to-surplus ratio that is above an acceptable level. The report also sets out in detail the amount the American taxpayer and federal government would have to pay under differing scenarios.

A RAND Corp. study published in April 2014 found that in a terrorist attack with losses of as much as $50 billion, the federal government would spend more dealing with the losses than if it had continued to support a national terrorism risk insurance program, because it would likely pay out more in disaster assistance.

A report by the President’s Working Group on Financial Markets made public in April 2014 generally supports the insurance industry’s view that the expiration of TRIA would make terrorism coverage more expensive and difficult to obtain.

The insurance broker Marsh released its annual study of the market, “2014 Terrorism Risk Insurance Report,” in April. Among its many findings is that uncertainty surrounding the potential expiration of TRIA significantly affected the property/casualty insurance market. Some employers with large concentrations of workers and companies with property exposures in major U.S. cities found that terrorism insurance capacity was limited and prices higher, and some could not obtain coverage at all. If the law is allowed to expire or is significantly changed, the market is likely to become more volatile with higher prices and limited coverage, the study concludes.

Before Sept. 11, 2001, insurers provided terrorism coverage to their commercial insurance customers essentially free of charge because the chance of property damage from terrorist acts was considered remote. After Sept. 11, insurers began to reassess the risk. For a while, terrorism coverage was scarce. Reinsurers were unwilling to reinsure policies in urban areas perceived to be vulnerable to attack. Primary insurers filed requests with their state insurance departments for permission to exclude terrorism coverage from their commercial policies.

From an insurance viewpoint, terrorism risk is very different from the kind of risks typically insured. To be readily insurable, risks have to have certain characteristics.

The risk must be measurable. Insurers must be able to determine the possible or probable number of events (frequency) likely to result in claims and the maximum size or cost (severity) of these events. For example, insurers know from experience about how many car crashes to expect per 100,000 miles driven for any geographic area and what these crashes are likely to cost. As a result, they can charge a premium equal to the risk they are assuming in issuing an auto insurance policy.
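
That frequency/severity arithmetic is simple enough to show directly; the figures below are invented purely for illustration.

```python
# Back-of-envelope pure premium for a measurable risk such as auto.
# All figures are invented for illustration.

crashes_per_100k_miles = 0.4   # expected claim frequency per 100,000 miles
avg_cost_per_crash = 9_000.0   # expected claim severity
miles_per_policy_year = 12_000

expected_claims = crashes_per_100k_miles * (miles_per_policy_year / 100_000)
pure_premium = expected_claims * avg_cost_per_crash
print(f"expected annual loss per policy: ${pure_premium:,.2f}")  # $432.00
# Loadings for expenses, reinsurance and profit would be added on top.
```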

A large number of people or businesses must be exposed to the risk of loss, but only a few must actually experience one, so that the premiums of those that do not file claims can fund the losses of those who do.

Losses must be random as regards time, location and magnitude.

Insofar as acts of terrorism are intentional, terrorism risk doesn’t have these characteristics. In addition, no one knows what the worst-case scenario might be. There have been few terrorist attacks, so there is little data on which to base estimates of future losses, either in terms of frequency or severity. Terrorism losses are also likely to be concentrated geographically, since terrorism is usually targeted to produce a significant economic or psychological impact. This leads to a situation known in the insurance industry as adverse selection, where only the people most at risk purchase coverage, the same people who are likely to file claims. Moreover, terrorism losses are never random. They are carefully planned and often coordinated.

To underwrite terrorism insurance — to decide whether to offer coverage and what price to charge — insurers must be able to quantify the risk: the likelihood of an event and the amount of damage it would cause. Increasingly, they are using sophisticated modeling tools to assess this risk. According to the modeling firm AIR Worldwide, the way terrorism risk is measured is not much different from assessments of natural disaster risk, except that the data used for terrorism are more subject to uncertainty. It is easier to project the risk of damage in a particular location from an earthquake of a given intensity or a Category 5 hurricane than a terrorist attack because insurers have had so much more experience with natural disasters than with terrorist attacks, and therefore the data to incorporate into models are readily available.

One problem insurers face is the accumulation of risk. They need to know not only the likelihood and extent of damage to a particular building but also the company’s accumulated risk from insuring multiple buildings within a given geographical area, including the implications of fire following a terrorist attack. In addition, in the U.S., workers’ compensation insurers face concentrations of risk from injuries to workers caused by terrorism attacks. Workers’ compensation policies provide coverage for loss of income and medical and rehabilitation treatment from “first dollar,” that is, without deductibles.

Extending the Terrorism Risk Insurance Act (TRIA)

There is general agreement that TRIA has helped insurance companies provide terrorism coverage because the federal government’s involvement offers a measure of certainty as to the maximum size of losses insurers would have to pay and allows them to plan for the future. However, when the act came up for renewal in 2005 and in 2007, there were some who believed that market forces should be allowed to deal with the problem. Both the U.S. Government Accountability Office and the President’s Working Group on Financial Markets published reports on terrorism insurance in September 2006. The two reports essentially supported the insurance industry in its evaluation of nuclear, biological, chemical and radiological (NBCR) risk — that it is uninsurable — but the President’s Working Group said that the existence of TRIA had inhibited the development of a more robust market for terrorism insurance, a point on which the industry disagrees. TRIA is the reason that coverage is available, insurers say. The structure of the program has encouraged the development of reinsurance for the layers of risk that insurers must bear themselves — deductible amounts and coinsurance — which in turn allows primary insurers to provide coverage. Without TRIA, there would be no private market for terrorism insurance.

Studies by various organizations have supported a temporary continuation of the program in some form, including the University of Pennsylvania’s Wharton School, the RAND Corp. and the Organisation for Economic Co-operation and Development (OECD), an organization of 30 member countries, many of which have addressed the risk of terrorism through a public/private partnership. The OECD said in an analysis that financial markets have shown very little appetite for terrorism risk because of the enormousness and unpredictability of the exposure. RAND argued not only that TRIA should be extended but also that Congress should act to increase the business community’s purchase of terrorism insurance and lower its price. RAND also advocated mandatory coverage for some “vital systems,” establishing an oversight board and increasing efforts to mitigate the risks.

For the full report from which this is excerpted, click here.

Riding Out the Storm: the New Models

In our last article, When Nature Calls, we looked back at an insurance industry reeling from several consecutive natural catastrophes that generated combined insured losses exceeding $30 billion. Those massive losses were a direct result of an industry overconfident in its ability to gauge the frequency and severity of catastrophic events. Insurers were using only history and their limited experience as their guide, resulting in a tragic loss of years’ worth of policyholder surplus.

The turmoil of this period cannot be overstated. Many insurers went insolvent, and those that survived needed substantial capital infusions to continue functioning. Property owners in many states were left with no affordable options for adequate coverage and, in many cases, were forced to go without any coverage at all. The property markets seized up. Without the ability to properly estimate how catastrophic events would affect insured properties, it looked as though the market would remain broken indefinitely.

Luckily, in the mid-1980s, two people on different sides of the country were already working on solutions to this daunting problem. Both had asked themselves: If the problem is lack of data because of the rarity of recorded historical catastrophic events, then could we plug the historical data available now, along with mechanisms for how catastrophic events behave, into a computer and then extrapolate the full picture of the historical data needed? Could we then take that data and create a catalog of millions of simulated events occurring over thousands of years and use it to tell us where and how often we can expect events to occur, as well as how severe they could be? The answer was unequivocally yes, but with caveats.

In 1987, Karen Clark, a former insurance executive out of Boston, formed Applied Insurance Research (now AIR Worldwide). She spent much of the 1980s with a team of researchers and programmers designing a system that could estimate where hurricanes would strike the coastal U.S., how often they would strike and ultimately, based on input insurance policy terms and conditions, how much loss an insurer could expect from those events. Simultaneously, on the West Coast at Stanford University, Hemant Shah was completing his graduate degree in engineering and attempting to answer those same questions, only he was focusing on the effects of earthquakes occurring around Los Angeles and San Francisco.

In 1988, Clark released the first commercially available catastrophe model for U.S. hurricanes. Shah released his earthquake model a year later through his company, Risk Management Solutions (RMS). Their models were incredibly slow, limited and, according to many insurers, unnecessary. However, for the first time, loss estimates were being calculated based on actual scientific data of the day along with extrapolated probability and statistics in place of the extremely limited historical data previously used. These new “modeled” loss estimates were not in line with what insurers were used to seeing and certainly could not be justified based on historical record.

Clark’s model generated hurricane storm losses in the tens of billions of dollars while, up until that point, the largest insured loss ever recorded did not even reach $1 billion! Insurers scoffed at the comparison. But all of that quickly changed in August 1992, when Hurricane Andrew struck southern Florida.

Using her hurricane model, Clark estimated that insured losses from Andrew might exceed $13 billion. Even in the face of heavy industry doubt, Clark published her prediction. She was immediately derided and questioned by her peers, the press and virtually everyone around. They said her estimates were unprecedented and far too high. In the end, though, when it turned out that actual losses, as recorded by Property Claims Services, exceeded $15 billion, a virtual catastrophe model feeding frenzy began. Insurers quickly changed their tune and began asking AIR and RMS for model demonstrations. The property insurance market would never be the same.

So what exactly are these revolutionary models, which are now affectionately referred to as “cat models”?

Regardless of the model vendor, every cat model uses the same three components:

  1. Event Catalog – A catalog of hypothetical stochastic (randomized) events, which informs the modeler about the frequency and severity of catastrophic events. The events contained in the catalog are based on millions of years of computerized simulations using recorded historical data, scientific estimation and the physics of how these types of events are formed and behave. Additionally, for each of these events, associated hazard and local intensity data is available, which answers the questions: Where? How big? And how often?
  2. Damage Estimation – The models employ damage functions, which describe the mathematical relationship between event intensity and the damage to buildings, covering their structural and nonstructural components as well as their contents, at the local intensity to which they are exposed. The damage functions have been developed by experts in wind and structural engineering and are based on published engineering research and engineering analyses. They have also been validated against the results of extensive damage surveys undertaken in the aftermath of catastrophic events and billions of dollars of actual industry claims data.
  3. Financial Loss – The financial module calculates the final losses after applying all limits and deductibles on a damaged structure. These losses can be linked back to events with specific probabilities of occurrence. Now an insurer not only knows what it is exposed to, but also what its worst-case scenarios are and how frequently those may occur.
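
Here is a toy version of that three-component pipeline in Python. Real models use millions of stochastic events and engineering-derived damage functions; every number below is an illustrative assumption.

```python
# Toy cat model: event catalog -> damage function -> financial module.
# Rates, intensities and the damage curve are illustrative assumptions.

# 1. Event catalog: each event carries an annual occurrence rate and a
#    local intensity (say, peak gust in mph) at the insured location.
catalog = [
    {"event": "CAT_A", "annual_rate": 0.020, "intensity": 100.0},
    {"event": "CAT_B", "annual_rate": 0.005, "intensity": 140.0},
]

def damage_ratio(intensity: float) -> float:
    """2. Damage function: fraction of building value destroyed (toy curve)."""
    return min(max((intensity - 70.0) / 100.0, 0.0), 1.0)

def net_loss(ground_up: float, deductible: float, limit: float) -> float:
    """3. Financial module: apply the policy deductible and limit."""
    return min(max(ground_up - deductible, 0.0), limit)

building_value, deductible, limit = 1_000_000.0, 50_000.0, 800_000.0
for ev in catalog:
    gross = damage_ratio(ev["intensity"]) * building_value
    net = net_loss(gross, deductible, limit)
    # rate * loss is this event's contribution to the average annual loss (AAL)
    print(ev["event"], f"net=${net:,.0f}",
          f"AAL contribution=${ev['annual_rate'] * net:,.2f}")
```

Summing each event’s rate-weighted loss across the full catalog gives an insurer its average annual loss, and ranking event losses by their probabilities gives the worst-case view described in the third component above.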


When cat models first became commercially available, industry adoption was slow. It took Hurricane Andrew in 1992 followed by the Northridge earthquake in 1994 to literally and figuratively shake the industry out of its overconfidence. Reinsurers and large insurers were the first to use the models, mostly due to their vast exposure to loss and their ability to afford the high license fees. Over time, however, much of the industry followed suit. Insurers that were unable to afford the models (or who were skeptical of them) could get access to all the available major models via reinsurance brokers that, at that time, also began rolling out suites of analytic solutions around catastrophe model results.

Today, the models are ubiquitous in the industry. Rating agencies require model output based on prescribed model parameters in their supplementary rating questionnaires to understand whether insurers can economically withstand certain levels of catastrophic loss. Reinsurers expect insurers to provide modeled loss output on their submissions when applying for reinsurance. The state of Florida has even set up the Florida Commission on Hurricane Loss Projection Methodology, “an independent body of experts created by the Florida Legislature in 1995 for the purpose of developing standards and reviewing hurricane loss models used in the development of residential property insurance rates and the calculation of probable maximum loss levels.”

Models are available for tropical cyclones, extratropical cyclones, earthquakes, tornadoes, hail, coastal and inland flooding, tsunamis and even for pandemics and certain types of terrorist attacks. The first models simulated catastrophes for U.S.-based perils, but models now exist globally for countries in Europe, Australia, Japan, China and South America.

In an effort to get ahead of the potential impact of climate change, all leading model vendors even provide U.S. hurricane event catalogs, which simulate potential catastrophic scenarios under the assumption that the Atlantic Ocean sea-surface temperatures will be warmer on average. And with advancing technologies, open-source platforms are being developed, which will help scores of researchers working globally on catastrophes to become entrepreneurs by allowing “plug and play” use of their models. This is the virtual equivalent of a cat modeling app store.

Catastrophe models have provided the insurance industry with an innovative solution to a major problem. Ironically, the solution itself is now an industry in its own right, as estimated revenues from model licenses now annually exceed $500 million (based on conversations with industry experts).

But how have the models performed over time? Have they made a difference in the industry’s ability to help manage catastrophic loss? Those are not easy questions to answer, but we believe they have. All the chaos from Hurricane Andrew and the Northridge earthquake taught the industry some invaluable lessons. After the horrific 2004 and 2005 hurricane seasons, which ravaged Florida with four major hurricanes in a single year, followed by a year that saw two major hurricanes striking the Gulf Coast – one of them being Hurricane Katrina, the single most costly natural disaster in history – there were no ensuing major insurance company insolvencies. This was a profound success.

The industry withstood a two-year period of major catastrophic losses. Clearly, something had changed. Cat models played a significant role in this transformation. The hurricane losses from 2004 and 2005 were large and painful, but did not come as a surprise. Using model results, the industry now had a framework to place those losses in proper context. In fact, each model vendor has many simulated hurricane events in their catalogs, which resemble Hurricane Katrina. Insurers knew, from the models, that Katrina could happen and were therefore prepared for that possible, albeit unlikely, outcome.

However, with the universal use of cat models in property insurance comes other issues. Are we misusing these tools? Are we becoming overly dependent on them? Are models being treated as a panacea to vexing business and scientific questions instead of as the simple framework for understanding potential loss?

Next in this series, we will illustrate how modeling results are being used in the industry and how overconfidence in the models could, once again, lead to crisis.