How to Avoid Failed Catastrophe Models

Since commercial catastrophe models were introduced in the 1980s, they have become an integral part of the global (re)insurance industry. Underwriters depend on them to price risk, management uses them to set business strategies, and rating agencies and regulators consider them in their analyses. Yet new scientific discoveries and claims insights regularly reshape our view of risk, and a customized model that is fit for purpose one day can quickly become obsolete if it is not updated promptly for changing business practices and for advances in our understanding of natural and man-made events.

Despite the sophisticated nature of each new generation of models, new events sometimes expose previously hidden attributes of a particular peril or region. In 2005, Hurricane Katrina caused economic and insured losses in New Orleans far greater than expected because models did not consider the possibility of the city’s levees failing. In 2011, the New Zealand earthquake produced unexpected liquefaction in Christchurch because of a previously unknown fault beneath the city and the fact that it sits on an alluvial plain of water-saturated soil. And in 2012, Superstorm Sandy exposed the vulnerability of New York City’s underground garages and electrical infrastructure to storm surge – a secondary peril in wind models whose pre-Sandy event sets did not consider the placement of these risks.

Such surprises affect the bottom lines of (re)insurers, who price risk largely on the losses and volatility suggested by the thousands of simulated events analyzed by a model. There is a silver lining, however: these events advance modeling capabilities by improving our understanding of the peril’s physics and damage potential. Users can often incorporate such advances themselves, along with new technologies and best practices for model management, to keep their company’s view of risk current – even if the vendor has not yet released its own updated version – and to justify enterprise risk management decisions to important stakeholders.

See also: Catastrophe Models Allow Breakthroughs  

When creating a resilient internal modeling strategy, (re)insurers must weigh cost, data security, ease of use and dependability. Complementing a core commercial model with in-house data and analytics and with regulators’ standard formulas, and reconciling any material differences in hazard assumptions or modeled losses, can help companies of all sizes manage resources. This approach also protects sensitive information, provides access to the latest technology and support networks and mitigates the impact of a crisis on vital assets – all while developing a unique risk profile.

To the extent resources allow, (re)insurers should analyze several macro- and micro-level considerations when evaluating the merits of a given platform. On the macro level, customization is almost always desirable – especially at the bottom of the loss curve, where there is more claims data – unless the company’s own underwriting and claims data dominated the vendor’s development methodology. If a large insurer with robust exposure and claims data is heavily involved in the vendor’s product development, the model’s vulnerability assumptions and loss payout and development patterns will likely mirror those of the company itself, so less customization is necessary. Either way, users should validate modeled losses against historical claims from both their own company’s and the industry’s perspectives, taking care to adjust for inflation, exposure changes and non-modeled perils, to confirm the reasonability of return periods in portfolio and industry occurrence and aggregate exceedance-probability curves. Without this important step, insurers may find their modeled loss curves differ materially from observed historical results.
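As an illustration of this validation step, the Python sketch below compares a model’s occurrence exceedance-probability (OEP) curve with an insurer’s trended historical annual losses at short return periods, where a few decades of experience carry some credibility. Everything here is hypothetical – the synthetic loss distributions, the return periods chosen and the 25% investigation threshold are stand-ins, not industry standards.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a vendor model's simulated annual-maximum losses
# (10,000 simulated years); a real year loss table would come from the model.
modeled_annual_max = rng.lognormal(mean=15.0, sigma=1.2, size=10_000)

# Stand-in for 30 years of the insurer's own annual-maximum losses, already
# trended for inflation and exposure growth and stripped of non-modeled
# perils -- the adjustments the article calls for before any comparison.
historical_annual_max = rng.lognormal(mean=15.1, sigma=1.1, size=30)

def oep_curve(losses, return_periods):
    """Empirical OEP curve: the loss exceeded once every T years on average."""
    non_exceedance = 1.0 - 1.0 / np.asarray(return_periods)
    return np.quantile(losses, non_exceedance)

# Stay at the bottom of the curve, where 30 years of data is most credible.
return_periods = [2, 5, 10, 25]
modeled = oep_curve(modeled_annual_max, return_periods)
observed = oep_curve(historical_annual_max, return_periods)

for rp, m, o in zip(return_periods, modeled, observed):
    flag = "INVESTIGATE" if abs(m - o) / o > 0.25 else "ok"
    print(f"{rp:>3}-yr return period: modeled {m:,.0f} vs observed {o:,.0f} [{flag}]")
```

Where the curves diverge beyond tolerance, the user would dig into the hazard and vulnerability assumptions rather than simply accepting the vendor’s view.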

A micro-level review of model assumptions and shortcomings can further reduce the odds of a “shock” loss. In particular, it is critical to precisely identify risks’ physical locations and characteristics, as loss estimates may vary widely within a short distance – especially for flood, where elevation is an important factor. When a model’s geocoding engine or a national address database cannot assign a location, several disaggregation methodologies are available, but each produces different loss estimates. European companies will need to be particularly careful about data quality and integrity as the new General Data Protection Regulation takes effect, since it may mean less specific location data is collected.
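As a sketch of what one such disaggregation methodology might look like, the fragment below spreads a risk known only at the region level across postal codes in proportion to population. The postal codes, populations and insured value are hypothetical; a different weighting scheme (building counts, claims density, flood-zone area) would allocate the same value differently and, as noted above, produce different modeled losses.

```python
# Population-weighted disaggregation of a risk that could not be geocoded.
total_insured_value = 1_000_000.0  # insured value known only at region level

postal_code_population = {  # hypothetical region with three postal codes
    "1000": 50_000,
    "1050": 30_000,
    "1100": 20_000,
}

total_population = sum(postal_code_population.values())
allocated_value = {
    code: total_insured_value * population / total_population
    for code, population in postal_code_population.items()
}

print(allocated_value)
# {'1000': 500000.0, '1050': 300000.0, '1100': 200000.0}
```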

Just as important as location are a risk’s physical characteristics, because a model will estimate a range of possibilities when this information is missing. If the assumption regarding year of construction, for example, differs materially from the insurer’s actual distribution, modeled losses for risks with unknown construction years may be under- or overestimated. Consider an insurer in Portugal whose actual data was compared with a model’s assumed year-of-construction distribution based on regional census data: the model assumed an older building stock than the actual data showed, so losses on risks with unknown construction years may be overstated.
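A minimal sketch of that comparison, with hypothetical shares standing in for both the insurer’s actual book and the model’s census-based assumption:

```python
# Hypothetical year-of-construction distributions: the insurer's actual book
# versus what the model assumes when the construction year is unknown.
actual_share = {"pre-1960": 0.15, "1960-1990": 0.45, "post-1990": 0.40}
model_assumed_share = {"pre-1960": 0.35, "1960-1990": 0.45, "post-1990": 0.20}

for band, actual in actual_share.items():
    assumed = model_assumed_share[band]
    gap = assumed - actual
    note = "overweighted" if gap > 0 else "underweighted" if gap < 0 else "matches"
    print(f"{band:>10}: actual {actual:.0%}, assumed {assumed:.0%} ({note}, {gap:+.0%})")

# If the model assumes an older (typically more vulnerable) stock than the
# book actually holds, losses on unknown-year risks will be overstated.
```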

There is also no database of agreed property, contents or business interruption valuations, so if a model’s assumed valuations are under- or overstated, the damage function may be inflated or diminished to balance to historical industry losses.

See also: How to Vastly Improve Catastrophe Modeling  

Finally, companies must adjust “off-the-shelf” models for missing components. Examples include overlooked exposures, like a detached garage; new underwriting guidelines, policy wordings or regulations; and the treatment of sub-perils, such as a tsunami resulting from an earthquake. Loss adjustment difficulties are also not always adequately addressed in models. Loss leakage – such as when adjusters cannot separate covered wind loss from excluded storm surge loss – can inflate results, and complex events can drive higher labor and material costs or unusual delays. Users must also consider the cascading impact of failed risk mitigation measures, such as the failure of the backup generators powering the cooling systems at the Fukushima nuclear power plant after the Tohoku earthquake and tsunami.

If an insurer performs regular, macro-level analyses of its model, validating estimated losses against historical experience and new views of risk, while also supplementing missing or inadequate micro-level components appropriately, it can construct a more resilient modeling strategy that minimizes the possibility of model failure and maximizes opportunities for profitable growth.

The views expressed herein are solely those of the author and do not reflect the views of Guy Carpenter & Company, LLC, its officers, managers, or employees.

This article was originally published on Brink.

What to Do When Catastrophes Go Viral

The power of social media is undeniable. Whether it’s political movements, disasters, or breaking news, social media delivers unfiltered information instantaneously to people around the world. When a catastrophe occurs today, comments, pictures and video are likely to appear on the Internet as it happens. For instance, a deadly explosion at a Texas fertilizer plant was caught live on video and posted to social media, as was an enormous explosion that rocked the Chinese port of Tianjin. But when social media posts about a catastrophe go viral, the company involved can be in for a struggle.

To avoid getting left behind, companies need to prepare for how they will communicate using social media when a catastrophe strikes. A company that plans ahead and is able to mount a robust response may not only salvage its reputation, but may actually enhance its public image if it is seen as managing a difficult situation well. Because many companies lack this kind of communications expertise, they may want to work with consultants that can help them prepare for a disaster and respond appropriately. In addition, they should consider insurance that provides coverage for experienced public relations catastrophe management services to protect their corporate reputation.

Social Media Plays a Crucial Role in a Crisis

When it comes to disasters, mobile apps and social media are seen by the public as crucial ways to get information, according to a Red Cross survey. During Superstorm Sandy in 2012, social media played a significant role in providing official information and combating rumors. When Cyclone Tasha struck Australia in 2010, the Queensland Police Service made extensive use of Twitter to provide information to people spread over a vast area.

Social media, however, is pervasive and public, which means that if there is an explosion, fire or other disaster, chances are someone is streaming it live to the Internet, tweeting about it, posting it to Facebook or uploading pictures to Instagram even before the affected company is aware of it. In essence, that means public opinion about the incident, as well as about the company involved, is already being shaped, possibly without any direction from corporate communications.

Because information travels so quickly through social media, the public no longer has to wait for the evening news to receive the most up-to-date information. Companies, therefore, are not afforded the luxury of time to gather all available facts before addressing the public. Traditional media and news organizations are also feeling increasing pressure. Because social media has enabled news to travel more quickly, stories may not receive the same level of scrutiny as they once did. That leaves plenty of opportunity for the spread of misinformation, which can be very difficult to counteract. On the Internet, inaccurate information may persist long after it has been thoroughly discredited elsewhere.

Embrace Social Media in Crisis Communications

To handle the social media aspect of a crisis, companies need to be able to act immediately or risk allowing reporters and “citizen journalists” to tell the story they want to tell, which may not provide a complete and accurate picture. Being unprepared can lead to inconsistent messaging, or even misstatements that may create confusion and ultimately damage a corporation’s reputation. A company that is seen as clumsy in its media response to a crisis risks losing credibility.

See also: Should Social Media Have a Place?

When a disaster is handled well – by providing the public with timely and accurate information as well as proper reassurances about its products and services – an organization can actually bolster its reputation. While social media accelerates the media cycle, it can also enable a company to take control of its image by acting as a primary and reliable source of information when a catastrophe occurs. This requires planning and preparation.

An initial step is to review the corporate crisis communication plan to understand its limits in social media. A traditional crisis plan provides for one-way, controlled communication through prepared statements, press conferences, marketing tools and commercials. Such an approach is likely to be viewed as unresponsive by a public seeking immediate information. Incorporating social media into the traditional plan provides for two-way communication that allows for debate, insight and opposing viewpoints that can guide the company’s responses.

The social media plan, however, should remain consistent with the company’s traditional media efforts. The company should provide consistent messaging in both traditional and social media about its culture and philosophy, the actions it is taking and the expected results, and its concern for those who have been affected.


Develop a Detailed Social Media Plan

The plan should delineate the policies and procedures to be followed in the event of a catastrophe, and – most importantly – assign roles and responsibilities to specific staff. This ensures that someone who understands the company’s message will maintain control, which can help lessen potential mistakes. Both external and internal policies should be covered so that the information communicated to and among employees and the public is timely, accurate and consistent.

The written policy should detail the information to be provided – for instance, pre-vetted information about the company and its corporate philosophy. It should establish guidelines pertaining to the types of social media posts that necessitate a response. Not every post merits a reply; anyone with a computer or smartphone can post information to the Internet. Identifying legitimate posts and inquiries and providing the necessary information can help preserve a company’s reputation.

Because the social media landscape is dynamic, companies shouldn’t limit themselves to just one outlet, but rather should use those most appropriate for the business, the audience and the geographic region. If an incident occurs abroad, companies should use the social media outlets most appropriate for that region. With their massive user bases, Facebook, Twitter and YouTube are obvious choices for domestic and international audiences. Others, such as Instagram, Snapchat and Tumblr, should also be considered. Companies active in Europe and Russia should consider the social networking site VK.

Prepare the Response

While it may not be possible to prepare material for every potential catastrophe, companies can still organize information ahead of time that can be released as soon as something happens. Information can be prepared for a “dark page” on the corporate website, built in advance but published only when a crisis actually occurs.

The site can include background information about the company and its specific businesses as well as the corporate philosophy during times of crisis. Other information might be media contacts and toll-free phone numbers for claims intake. Preparing the information ahead of time makes it possible to have it reviewed by a company’s legal department, public relations, and senior management. Once the page is live, it should be monitored and updated so that it always provides the most current information.

Whether information is prepared ahead of time or developed in response to a particular incident, it should be presented in a way that is accessible for the audience. Written material should be understandable by a wide range of people. Companies should avoid industry jargon and acronyms, which may be unclear or even misunderstood by the general public.


Monitor and Test

When not in crisis mode, it is helpful for companies to monitor social media. Viewing the social media environment in the normal course of business can help companies ascertain how their brand, products and services are viewed by the public. Companies can purchase monitoring services or build these capabilities in-house.

While monitoring social media is an important part of regular business, it becomes essential after a catastrophe to identify issues that need immediate attention. This helps to ensure that the traditional and social media messages the company is sending are having the desired impact. If the same questions continue to be asked on social media, it’s a clear sign that the message is not getting across.

As part of their overall catastrophe preparation, companies should test their communication response plan to assess their procedures as well as their staff. Testing can help ensure that everyone understands their roles and responsibilities and is able to react quickly. Drills help identify bottlenecks and resolve uncertainties in the process. After the test, or following an actual event, the company should conduct a thorough reevaluation and debriefing to identify the areas that worked well and those that need improvement.

Preserve the Corporate Reputation

Today, a story about a disaster can be trending on social media even before the company involved is aware of the loss. Organizations that wait too long to respond can cause lasting damage to their reputation. A company that is perceived as avoiding or failing to address a story may soon realize that its lack of response becomes the subject of that story. Undoing the damage caused by a tardy or ill-conceived response can be very difficult.

Many people accept that companies make mistakes, but how a company reacts and the decisions it makes when faced with a disaster can either strengthen or erode confidence among customers and the wider public. Knowing how and when to respond helps project an image of competence and concern. Social media is the fastest way to reach people, project the company’s message and protect its reputation.

To become better prepared, companies must identify their most likely risks – whether health, safety or environmental – and develop plans to mitigate those exposures. They need to know how best to respond on social media if a disaster were to affect their business. To do so, companies may want to work with consultants that can provide risk analysis and mitigation services and help prepare a crisis response. They should also consider insurance that can defray the costs of hiring expert help when a disaster strikes. No one knows when a catastrophe may occur, but being prepared can help lessen the damage. Customers will look to these companies for information – and companies that can provide it are more likely to weather a crisis with their reputation unscathed.

Flood Insurance at the Crossroads

News outlets around the country are broadcasting horrific scenes of the devastating floods in northern Mexico, Texas and Oklahoma that have killed many. Once the tallies are completed, property damage will likely run into the billions of dollars. Once again, a disaster raises interest not only in the insidious nature of catastrophic flooding, but in how the insurance industry, in concert with the federal government – more specifically, the National Flood Insurance Program (NFIP) – tackles, or sidesteps, the vexing problems associated with this peril.

Stories abound of the heart-breaking losses as a result of flooding; homes are whisked away downstream, people’s prized possessions are destroyed and, most importantly, lives are lost. Amid the recent rampant devastation brought on by the Texas floods, what struck us was one simple statement by a local news correspondent on the scene, who described the victims’ plight: “Some residents are lucky; they have flood insurance.” “Lucky” hardly describes the harsh reality these flood victims are experiencing.

Having flood insurance with the NFIP is akin to “jumbo shrimp” – comedian George Carlin’s famous example of an oxymoron. To understand why, consider that property damage to a house comes in three varieties: (1) damage to the actual structure, (2) damage to the contents within the structure and (3) the expenses incurred when a flood makes the structure uninhabitable and the household must live elsewhere. The standard HO3 policy form adequately covers all three of those potential loss sources. That raises the question: What does the NFIP flood policy cover?

Your Building

The maximum the NFIP will pay for the dwelling structure, referred to as Coverage A, is $250,000, even if the dwelling is worth more. No amount of additional premium will buy more coverage under this policy. If the dwelling is worth more, the homeowner is forced to purchase a separate excess flood insurance policy to cover any amount over and above $250,000.
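To make the cap concrete, here is a toy calculation; the home value is hypothetical, while the $250,000 limit is the NFIP figure cited above:

```python
# Hypothetical example of the NFIP Coverage A cap on a $400,000 dwelling.
dwelling_value = 400_000
nfip_coverage_a_limit = 250_000  # statutory maximum, regardless of premium

nfip_recovery_cap = min(dwelling_value, nfip_coverage_a_limit)
gap_needing_excess_policy = dwelling_value - nfip_recovery_cap

print(f"NFIP pays at most ${nfip_recovery_cap:,}; the remaining "
      f"${gap_needing_excess_policy:,} requires a separate excess flood policy.")
# NFIP pays at most $250,000; the remaining $150,000 requires a separate
# excess flood policy.
```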

Your Contents

The maximum the NFIP will pay for losses to contents, referred to as Coverage C, is $100,000 – again, even if the homeowner owns contents worth more than that amount. The homeowner is still out of luck even if he acquires a second flood policy to cover excess losses to the dwelling, as those policies do not generally cover contents. To make matters worse, if the homeowner is “lucky” enough to have a flood insurance policy through the NFIP and suffers a flood loss to contents, the reimbursement will be depreciated. The homeowner will NOT be reimbursed for a new carpet when forced to rip up a damaged 20-year-old carpet; the claim will pay just enough to buy another 20-year-old carpet. In other words, the NFIP values a claim at the actual cash value (ACV) of the damaged item, not its current replacement cost value (RCV), after applying the policy deductible.
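To see what ACV settlement means in practice, here is a toy calculation in the spirit of the carpet example. The replacement cost, depreciation schedule, salvage floor and deductible below are all hypothetical illustrations, not actual NFIP terms:

```python
# Hypothetical ACV settlement for a fully depreciated 20-year-old carpet.
replacement_cost = 2_000.00   # RCV: what a new carpet costs today
useful_life_years = 10        # assumed depreciation schedule
age_years = 20
salvage_floor = 0.10          # assume value never falls below 10% of RCV

depreciation = min(age_years / useful_life_years, 1 - salvage_floor)
actual_cash_value = replacement_cost * (1 - depreciation)  # ACV basis

deductible = 1_000.00
claim_payout = max(actual_cash_value - deductible, 0)

print(f"RCV ${replacement_cost:,.2f} -> ACV ${actual_cash_value:,.2f} "
      f"-> payout ${claim_payout:,.2f}")
# RCV $2,000.00 -> ACV $200.00 -> payout $0.00: nowhere near enough
# to replace the carpet with a new one.
```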

Worse, the homeowner is forced to fill out mountains of paperwork detailing what was damaged, when each item was purchased and what it cost. Then there are the contents in basements, which represent a whole separate problem. Try filling out that paperwork a few hundred times over for all of a household’s valuables, knowing that, no matter how meticulously those items are itemized, the homeowner STILL will not be paid the cost to replace them.

Loss of Use

Should a homeowner suffer a flood loss and need to live elsewhere while the damage is being repaired, the expenses for Loss of Use, Coverage D, are borne entirely by the homeowner. It doesn’t matter whether it’s minor damage requiring a one-day hotel stay or extensive damage requiring a new home; the homeowner must pay all living expenses out of pocket.

If the NFIP policyholder doesn’t already feel lucky enough, there are also the lingering questions surrounding the NFIP’s solvency. Both Hurricane Katrina and Superstorm Sandy left the NFIP with few funds to pay claims, and a homeowner “lucky” enough to have flood insurance through the NFIP may have to wait for payment – oftentimes for months.

By now, you get the point. Flood insurance through the NFIP really is not insurance; it’s something else altogether. For starters:

  1. The NFIP is not risk-based. Two homes with very dissimilar flood exposure could pay the exact same rate.
  2. The NFIP has done little to discourage risk-taking; it subsidizes low rates even for homes that have generated multiple claims payments.
  3. The policies do not meet homeowners’ needs. The coverage gaps are large, and the headaches involved in getting paid are quasi-medieval – certainly not consumer-friendly.

The industry can and must do better. All the tools and resources needed to adequately price and manage this risk are present. New models and maps stand ready to evaluate risk, estimate loss costs and aggregate exposure. Abundant excess capital is available, much of it standing on the sidelines looking to jump into the game. What better source of risk-based premium is there than the inland flood exposures now monopolized by the NFIP and, ultimately, the taxpayers? This is the opportunity for growth, innovation and commonsense risk management thinking that the industry is not only starving for, but has been praying for over the past 30-plus years.

The industry must now ask itself: Does it want to sustain its legacy groupthink by maintaining the status quo, or does it want to remain relevant, now and in the future, and be a part of the solution?