Why to ‘Open-Source’ CAT Models

Some readers thought that my previous article, “How to Vastly Improve Catastrophe Modeling,” which advocated an “open-source” approach to cat models that would aggregate all human knowledge at any point in time, was anarchist and pacifist in the tradition of beauty-contest participant Gracie Lou Freebush of the movie “Miss Congeniality,” with her pursuit of “world peace” and sundry other altruistic ends.

Some felt I was preaching from the pulpit, asking sinners to atone. Others praised the article. Still others challenged it, saying cat model platforms from private providers were important for insurers in building entry barriers and maximizing profits. Surely, maximizing profits is a fair goal.

But let us examine the issues here. Just because a particular private provider has better models than others today for specific regions and perils, does that mean it will remain better than models from newer providers over time? At the least, just as with the “iPhone, Apple app and App Store” ecosystem and the “Android phone, Android app and app store” ecosystem, there need to be multiple options to ensure that we stay on the cutting edge of innovation.

The insurance industry needs to see competition among innovation ecosystems, so that newer and better model providers have multiple options to fairly make more money from their innovations, thereby improving the fidelity of the models insurers use at all times. Essentially, allying with one platform vendor for all time is not a good choice for any insurer; insurers need to “follow the models” to make their money. So encouraging platform providers that might be more attractive to the newer and better model providers is in the best interest of all insurers. Other choices are not as enlightened as they might seem.

Some insurers are paying and supporting a particular private provider to construct a platform owned and operated by that provider, in exchange for joint competitive advantage. The OASIS approach I recommend, on the other hand, is an attempt by a group of insurers (with some overlap with the other group) to construct a platform owned mutually by these insurers (and available to new joiners). Perhaps there is space for even more types of ecosystems, so that there is competition to draw in the best models as well as to fit the “empire-building” objectives of some insurers.

The competitive advantage to insurers could come from being part of a closed or semi-open group owning the platform, or from gaining advantage through it in some other way. But competitive advantage could also come through innovative relationships between insurers and model providers (disintermediating the platforms, in a sense), just as the Internet-applications industry pays little attention to the telecommunications industry that provides the channel to market. That is the most likely outcome when the dust eventually settles. After all, while the laws of the majority do not apply in matters of truth (the fidelity of models to underlying reality), no one person or provider has a monopoly on the truth of the models, and in the end it is that truth that matters in conferring lasting competitive advantage.

Insurers tend to spend upward of 20% of their IT budgets on catastrophe modeling, and this expenditure is expected to rise rather than fall. OASIS is a strategic catastrophe modeling option for insurers looking to reduce it. One strategy is to use OASIS for the development, testing and use of their own models (i.e., in their own infrastructure) rather than risk placing these in the cloud provided by a private platform provider. Used this way, OASIS can reduce the cost of developing, testing and using internal models. Insurers will also need to create and maintain their own applications for multi-model comparison and blending if they want to use OASIS alongside private platforms. Such strategies will drive down the average cost of catastrophe modeling in the enterprise and the industry, thereby improving the transformative business case for uses that currently are not cost-effective. Insurers that start on the OASIS journey will reach these transformative business cases before others do.

How to Measure Data Breach Costs?

Businesses typically have a hard time quantifying potential losses from a data breach because of the myriad factors that need to be considered.

A recent disagreement between Verizon and the Ponemon Institute about the best approach to take for estimating breach losses could make that job a little harder.

For some time, Ponemon has used a cost-per-record measure to help companies and insurers get an idea of how much a breach could cost them. Its estimates are widely used.

The institute recently released its latest numbers showing that the average cost of a data breach has risen from $3.5 million in 2014 to $3.8 million this year, with the average cost per lost or stolen record going from $145 to $154.

The report, sponsored by IBM, showed that per-record costs have jumped dramatically in the retail industry, from $105 last year to $165 this year. The cost was highest in the healthcare industry, at $363 per compromised record. Ponemon has released similar estimates for the past 10 years.

But, according to Verizon, organizations trying to estimate the potential cost of a data breach should avoid using a pure cost-per-record measure.

ThirdCertainty spoke with representatives of both Verizon and Ponemon to hear why they think their methods are best.

Verizon’s Jay Jacobs

Ponemon’s measure does not work very well with data breaches involving tens of millions of records, said Jay Jacobs, Verizon data scientist and an author of the company’s latest Data Breach Investigations Report (DBIR).

Jacobs said that, when Verizon applied the cost-per-record model to breach-loss data obtained from 191 insurance claims, the numbers it got were very different from those released by Ponemon. Instead of hundreds of dollars per compromised record, Jacobs said, his math turned up an average of 58 cents per record.

Why the difference? With a cost-per-record measure, the method is to divide the sum of all losses stemming from a breach by the total number of records lost. The issue with this approach, Jacobs said, is that cost per record typically tends to be higher with small breaches and drops as the size of the breach increases.

Generally, the more records a company loses, the more it’s likely to pay in associated mitigation costs. But the cost per record itself tends to come down as the breach size increases, because of economies of scale, he said.

Many per-record costs associated with a breach, such as notification and credit monitoring, drop sharply as the volume of records increases. When costs are averaged across millions of records, per-record costs fall dramatically, Jacobs said. For massive breaches in the range of 100 million records, the cost can drop to pennies per record, compared with the hundreds and even thousands of dollars that companies can end up paying per record for small breaches.

“That’s simply how averages work,” Jacobs said. “With the megabreaches, you get efficiencies of scale, where the victim is getting much better prices on mass-mailing notifications,” and on most other contributing costs.
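
To make the arithmetic concrete, here is a toy sketch in Python of the two ways of summarizing the same data; the breach figures are invented for illustration and come from neither report.

```python
# Hypothetical breach figures, chosen only to illustrate the averaging
# effect; they are not from the Verizon or Ponemon data sets.
small_loss, small_records = 250_000, 1_000          # $250 per record
mega_loss, mega_records = 5_000_000, 80_000_000     # ~$0.06 per record

# Averaging the two per-record costs weights both breaches equally.
mean_of_ratios = (small_loss / small_records + mega_loss / mega_records) / 2

# Dividing total losses by total records lets the mega-breach dominate.
ratio_of_totals = (small_loss + mega_loss) / (small_records + mega_records)

print(f"Mean of per-record costs: ${mean_of_ratios:,.2f}")      # ~$125.03
print(f"Total losses / total records: ${ratio_of_totals:.2f}")  # ~$0.07
```

The same two breaches yield either hundreds of dollars or pennies per record depending on which summary is chosen, which is how Ponemon’s per-record figures and Jacobs’s 58 cents can both be honest descriptions of breach data.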

Ponemon’s report does not reflect this because its estimates are only for breaches involving 100,000 records or fewer, Jacobs said. The estimates also include hard-to-measure costs, such as those of downtime and brand damage, that don’t show up in insurance claims data, he said.

An alternative method is to apply a more statistical approach to the available data to develop estimated average loss ranges for different-size breaches, Jacobs said.

While breach costs increase with the number of records lost, not all increases are the same. Several factors can cause costs to vary, such as how robust a company’s incident response plan is and whether it has pre-negotiated contracts for customer notification and credit monitoring, Jacobs said. Companies might want to develop a model that captures these variances in cost as completely as possible and to express potential losses as an expected range rather than as per-record numbers.

Using this approach on the insurance data, Verizon has developed a model that, for example, lets it say with 95% confidence that the average loss for a breach of 1,000 records is forecast to come in at between $52,000 and $87,000, with an expected cost of $67,480. Similarly, the expected cost for a breach involving 100 records is $25,450, but average costs could range from $18,120 to $35,730.

Jacobs said this model is not perfectly accurate because of the many factors that affect breach costs. As the number of records breached increases, the overall accuracy of the predictions begins to decrease, he said. Even so, the approach is more scientific than averaging costs and arriving at per-record estimates, he said.
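
Verizon has not published the model itself, but the approach described (a regression fit in log space, with confidence bounds that widen with breach size) can be sketched. The Python below fits log(loss) against log(records) by ordinary least squares on hypothetical claim data and derives a 95% prediction interval; the data points and the predict_range helper are illustrative assumptions, not the DBIR model.

```python
import numpy as np
from scipy import stats

# Hypothetical (records lost, total loss) pairs standing in for the 191
# insurance claims Verizon analyzed; the real claim data is not public.
records = np.array([100, 500, 1_000, 5_000, 20_000, 100_000, 1_000_000])
losses = np.array([24_000, 45_000, 70_000, 160_000, 350_000, 900_000, 4_200_000])

# Fit log(loss) = a + b * log(records) by ordinary least squares.
x, y = np.log(records), np.log(losses)
b, a = np.polyfit(x, y, 1)  # polyfit returns [slope, intercept] for degree 1

# Ingredients for a standard 95% prediction interval in log space.
n = len(x)
residuals = y - (a + b * x)
s = np.sqrt(residuals @ residuals / (n - 2))  # residual standard error
t = stats.t.ppf(0.975, n - 2)

def predict_range(n_records):
    """Expected loss and 95% prediction interval for a breach of n_records."""
    x0 = np.log(n_records)
    se = s * np.sqrt(1 + 1 / n + (x0 - x.mean()) ** 2 / ((x - x.mean()) ** 2).sum())
    mid = a + b * x0
    return np.exp(mid), np.exp(mid - t * se), np.exp(mid + t * se)

expected, low, high = predict_range(1_000)
print(f"1,000 records: expected ~${expected:,.0f}, 95% range ${low:,.0f} to ${high:,.0f}")
```

Because the fit lives in log space, the interval converted back to dollars is asymmetric and grows multiplicatively with breach size, consistent with Jacobs’s caveat that the predictions lose accuracy as the number of records increases.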

Ponemon’s Larry Ponemon

Larry Ponemon, chairman and founder of the Ponemon Institute, stood by his methodology and said the estimates are a fair representation of the economic impact of a breach.

Ponemon’s estimates are based on actual data collected from individual companies that have suffered data breaches, he said. The methodology considers all the costs a company can incur when it suffers a data breach and includes estimates from more than 180 cost categories in total.

By contrast, the Verizon model looks only at the direct costs of a data breach collected from a relatively small sample of 191 insurance claims, Ponemon said. Such claims often provide an incomplete picture of the true costs incurred by a company in a data breach. Often, the claim limits also are smaller than the actual damages suffered by an organization, he said.

“In general, the use of claims data as surrogate for breach costs is a huge problem, because it underestimates the true costs” significantly, Ponemon said.

Verizon’s use of logarithmic regression to arrive at its estimates is also problematic because of the small data size and the fact that the data was not derived from a scientific sample, he said.

Ponemon said the costs of a data breach are linearly related to the size of the breach. Per-record costs come down as the number of records increases, but not to the extent portrayed by Verizon’s estimates, he said.

“I have met several insurance companies that are using our data to underwrite risk,” he said.

Venture Capital and Tech Start-ups

Unicorns – to some they are just mythical creatures of lore. To today’s tech world, a unicorn is a pre-IPO tech start-up with a billion-dollar market value. These are the companies driving innovation, technology and disruption in every corner of every business, and their impact is truly being felt across the insurance industry.

The number of unicorns is as elusive as the creatures themselves, as the herd is growing rapidly.

“Fortune counts more than 80 unicorns today, but more appear with each passing week. Some even received their horns, so to speak, as the magazine went to press. And they’re getting bigger — there are now at least eight ‘decacorns,’ unicorns valued at $10 billion or more. So much for being mythical.” — Fortune

Recognizing the powerful sway that unicorns have over new technologies, business models and more, insurers are now getting into the unicorn game themselves. They are identifying technology start-ups that can transform insurance and are becoming venture capitalists to tap into this great potential for creating the next generation of insurance.

Different models and approaches are being used to identify, assess and influence these companies’ offerings. By understanding the benefits of outside-in thinking, insurers are finding ways to leverage these innovations. Some insurers are partnering with leading technology firms. Some of the large insurers are setting up their own venture capital firms. Still others are creating consortia to fund new start-ups to help accelerate innovation.

Insurers and Unicorns

The following are a few examples of new partnerships in 2015; the trend is continuing:

AXA – In February 2015, AXA announced the launch of AXA Strategic Ventures, a €200 million fund to boost technology start-ups focused on customer acquisition, climate change, travel insurance and more. The goal is to advance AXA’s digital and customer strategy by connecting with new technologies, new solutions and new ways of thinking. The company anticipates the fund will complement AXA’s major operating investments, across all entities, in research and digital developments that will help transform how customers experience AXA.

XL Insurance – On April 1, 2015, XL Insurance announced the formation of a venture capital fund, XL Innovate, to support insurance technology start-ups, with a focus on developing new capabilities in the insurance sector. XL indicated that this effort would extend its capabilities in existing markets and give it new opportunities to address some of the most pressing and complex risk problems in the global economy. In addition, XL sees it as a critical element to driving focus on innovation forward while securing relevance in the future.

Global Insurance Accelerator – In February 2015, a group of seven Iowa-based insurers announced the formation and launch of the Global Insurance Accelerator (GIA), an insurance accelerator for start-ups. The start-ups receive $40,000 in seed money from the pool to create a minimum viable product to present to the Global Insurance Symposium. The insurers involved believe that the accelerator program will bring potential innovation and technology insights to the insurance industry.

The Future

Innovation, technology and the need to be future-ready are fueling today’s unicorns, and their capital supporters are rapidly expanding the herd. In turn, these new business models and market leaders are spawning challenges and opportunities for all companies.

Today’s forward-thinking insurance companies are running their businesses while simultaneously creating their futures as Next-Gen insurers. It’s critical to recognize the power and benefits of innovation and the role that unicorns play in planning for tomorrow.

This is a decisive time as Next-Gen insurers emerge along with their unicorns to disrupt and redefine insurance and competitive advantage. What is your company’s approach to leveraging and experimenting with emerging technologies, start-ups and unicorns to fuel that potential and enable future market leadership?

The Misconceptions About Millennials

When it comes to successfully engaging with a new generation of customers (and employees), there’s very little doubt that insurers have their work cut out for them. Members of the Millennial generation generally consider insurance boring, and the reputation of insurance brands among this group is low. So how can insurance companies bridge this gap and meet the challenges that this new generation of customers presents?

Perhaps the first thing to do is to challenge existing preconceptions of this group. Many insurers may well be oversimplifying and mythologizing the digital and financial behavior and attitudes of Millennials. Indeed, contrary to popular opinion, the vast majority of Millennials are not technology geeks. What this means for insurers is that developing and offering an app isn’t going to have the impact expected among this group. Technology for technology’s sake will not interest Millennials; they have to see clear value.

More broadly speaking, insurers still have much to do when it comes to connecting Millennials and insurance companies. It’s clear that younger customers view insurance brands as solid, safe and staid: guarantors when something goes wrong. However, they also see insurers as faceless organizations that have little understanding of their needs. The successful insurance brands of the future will be those that can provide the established, safe reputation that Millennials have come to expect from insurers, alongside an understanding of their lifestyles that aligns with the way they interact with one another.

It’s also interesting to consider, in this context, how Millennials make strategic decisions about financial management and, specifically, how they buy insurance. What many insurers may not realize is that many Millennials rely on word-of-mouth recommendations and advice from family and friends, which brings the reputation and brand of the insurer to the fore. For this reason, establishing brand reputation and using word-of-mouth campaigns will be key.

Because the customer journey of the Millennial is less certain, it will also be increasingly important for insurers to invest in a sound omni-channel strategy. Because they are dealing with customers – or at the very least, potential customers – who are savvy across a diverse range of channels, and who will dip in and out of them at regular intervals before they make a purchasing decision, it can be almost impossible for insurers to know exactly which channel they will use or prefer.

What is particularly striking, however, is how small a part social media plays for Millennials when it comes to how they experience customer service. Contrary to popular belief, most seem to have fenced off social media interaction into their personal world and are not convinced that this is where they’ll engage with insurers on customer service issues. Perhaps we should give greater credit to Millennials’ understanding that social media attacks can backfire and that public castigation is a waste of energy.

In any case, when it comes to complaints, insurers should consider that Millennials are probably no different from any other generation. They expect an efficient and effective response to direct complaints. While less concern should be given to Millennials causing reputational damage via social media “flaming,” they are likely to take decisive action earlier, with a third willing to switch immediately.

It’s important to remember that Millennials aren’t looking for digital-only channels, and that they place great value on personalization and self-service. Millennials want just-in-time advice and support, delivered right at the moment they need it. They do not want to get “just in case” advice and support that is delivered at some inappropriate moment (and through an inappropriate channel) and that may not be the right content (which they will have forgotten by the time they need to apply it anyway).

Perhaps the most significant consideration is the extent to which Millennials might be willing to share personal data in exchange for a discount or a reduced premium. This seems to justify experiments in telematics and may be the basis for insurers to innovate around newer technologies like wearables. All of this should strongly influence insurers’ technology choices for making their businesses responsive to customers. Systems that embed consistent best practice into every interaction, giving the optimum outcome for both the insurer and the individual customer, are critical.

In summary, while not all Millennials are the same, they share similar traits – namely, they want what they want, when they want it (just in time), and they want all of it. With this in mind, there seems very little doubt that the most successful insurers when it comes to dealing with Millennials will be those that are authentic and trustworthy and that are able to offer pricing at the “right” level. Those insurers that can incorporate all of these facets into a personalized service, one that sees and leverages every previous interaction and anticipates the next requirement “like magic,” will be those that bridge the generational insurance gap and get ahead.

Stop Being Clueless About Workers’ Comp

Despite the brouhaha over the ProPublica articles that say companies are unfairly denying treatment to injured workers to save on costs, I still regard the high cost of workers’ compensation (for those companies that do have high costs) mostly as a management problem.

The companies I see — which are the ones that have huge problems — are clueless about workers’ comp. They turn their claims and injury process over to their claims administrator or carrier, hardly participating in the process, then they blame the TPA or carrier when costs go up even though they have done nothing internally to manage safety or injuries.

These companies never budget for workers’ comp management and don’t staff the risk department (if there even is a department) properly. THAT would cost money, and headcount would increase, they say. Often, if they do have staff, they do not allow the staff to attend conferences or seminars, join organizations or purchase resources. THAT would cost money, they say.

Sometimes, their brokers offer to help by providing consulting resources, but the companies with high workers’ comp costs do not see the merit in such an approach. I worked with a major entertainment facility, speaking with them once per week on behalf of their broker, hoping to gain insight. I offered to consult with the staff because I am a consultant: Getting to the root of the problem, finding the cost drivers and fixing them is what I do. They did not need a consultant. Then, one day, I said I could “help them develop their training program,” and they accepted instantly! I had used the wrong word — they needed “training help,” not “consulting help.” Within months, the high cost of their workers’ compensation program dropped to almost zero. Problem solved.

Several things employers can do, but usually don’t, are:

1. Contact employees within a week or two after the injury to do a survey of their medical and claims adjuster experience. Speak to them via phone, just as you would ask a good customer about her experience. Jennifer Christian, chief medical officer at Webility, contacts employees to find out if each injured worker felt that care was poor, fair, good or excellent. Often, poor treatment by medical providers and callous indifference by adjusters causes employees to become angry, seek counsel or even delay recovery because of lack of expertise during the initial treatment experience.

2. Have claims reviewed periodically by an independent auditor with a medical provider on the team. Only an MD is qualified to read the medical reports to determine whether treatment was appropriate and sufficient, whether alternate causation has been considered and whether aggressive and excellent (yes, perhaps more expensive) treatment has been provided. Make sure adjusters are not using utilization review (UR) to deny care. Audit, audit, audit. Care, care, care.

Do weekly roundtables with your third-party administrator (TPA) — for instance, every Friday discuss 10 claims, etc. Don’t wait until claims reach $25,000. Discuss them when they are small, BEFORE they get astronomical.

3. Retain an MD to be part of your claims team. This can be an on-site MD, part-time or full-time, who also speaks with treating physicians and injured employees. Adjusters and nurses do not know “medicalese.” Applause to those insurers that have MDs on staff, BUT employers still need to have their own medical advisers on the team. Employers often forget we are talking about medical injuries, not simply “claims.”

4. Assess the key cost drivers of your workers’ compensation costs. Nine out of 10 times, employers misdiagnose the cause of their high workers’ compensation costs. In one case, the employer was ready to fire the insurance company because “they thought” there was too much nurse case management. Upon more detailed analysis, including an independent review by claims experts and an MD, we found the claims were handled well 98% of the time. The cause of the problem was misidentified.

The REAL problem was a lack of a post-injury response — employees and supervisors did not have steps to follow within the first 24 hours after the injury. We then held 19 training sessions over three weeks to improve best practices related to rapid medical care and RTW/SAW (return to work/stay at work) in this mega-entertainment theme park. The workers’ compensation costs dropped 20% in a year-over-year comparison of total incurred losses with the previous 12-month period.

5. Provide tools to guide employees and supervisors. In the above case, we provided an employee brochure, a physician brochure, wallet cards in English/Spanish for supervisors and employees, and other tools.

6. And, most importantly, provide the best quality medical care available. Yes, even if it’s more expensive. Penny-wise is pound-foolish. Get the best, not the cheapest. Pay the doctor more to spend more time with your injured employees, not less.

7. Establish bundled pre-approval of care in account instructions so UR is not necessary — e.g., “All PTP (primary treating physician) treatments and as many as five visits to specialists are pre-authorized by insured. All testing requisitioned by PTP and specialists including physical therapy (PT) and MRIs is to be approved; do NOT submit to UR. If you strongly believe treatment or testing is unwarranted, contact the insured’s medical director before denying request.”

If you don’t manage and monitor it, the process (any process, not only workers’ compensation) will not work well.

It’s time for employers to become involved in their own business! The first step is assessing the problem at your company, not the industry in general or another company. Get that mirror out and have a look. You are most likely looking at the problem.