
Top 6 Myths About Predictive Modeling

Even if you’ve been hiding under a rock for the past 25 years, it’s almost impossible to avoid hearing about how companies are turning around their results through better modeling or how new companies are entering insurance using the power of predictive analytics.

So now you’re ready to embrace what the 21st century has to offer and explore predictive analytics as a mainstream tool in property/casualty insurance. But misconceptions are still commonplace.

Here are the top six myths dispelled:

Myth: Predictive modeling is mostly a technical challenge.
Fact: The predictive model is only one part of the analytics solution. It’s just a tool, and it needs to be managed well to be effective.

The No. 1 point of failure in predictive analytics isn’t technical or theoretical (i.e., something wrong with the model) but rather a failure in execution. This realization shifts the burden of risk from the statisticians and model builders to the managers and executives. The carrier may have an organizational readiness problem or a management and measurement problem. The fatal flaw that’s going to derail a predictive analytics project isn’t in the model, but in the implementation plan.

Perhaps the most common manifestation of this is when the implementation plan around a predictive model is forced upon a group:

  • Underwriters are told that they must not renew accounts above a certain score
  • Actuaries are told that the models are now going to determine the rate plan
  • Managers are told that the models will define the growth strategy

In each of these cases, the plan is to replace human expertise with model output. This almost never ends well. Instead, the model should be used as a tool to enhance the effectiveness of the underwriter, actuary or manager.

Myth: The most important thing is to use the right kind of model.
Fact: The choice of model algorithm and the calibration of that model to the available data are almost never the most important things. Instead, the biggest challenge is merely having a credible body of data upon which to build a model. In “The Unreasonable Effectiveness of Data,” Google research directors Halevy, Norvig and Pereira wrote:

“Invariably, simple models and a lot of data trump more elaborate models based on less data.”

No amount of clever model selection and calibration can overcome the fundamental problem of not having enough data. If you don’t have enough data, you still have some options: You could supplement in-house data with third-party, non-insurance data, append insurance industry aggregates and averages or possibly use a multi-carrier data consortium, as we are doing here at Valen.
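
One standard way to stretch thin in-house data is credibility weighting, in which a carrier’s own experience is blended with an industry aggregate. The sketch below is a minimal illustration of the idea; the figures and the shrinkage constant `k` are invented for the example, and this is not a description of Valen’s consortium method:

```python
def credibility_blend(own_mean, own_n, industry_mean, k=500.0):
    """Blend in-house experience with an industry aggregate.

    Z = n / (n + k) is the credibility weight: with few policies the
    estimate leans on the industry figure; with many it leans on the
    carrier's own data. k is a judgment parameter, not a constant.
    """
    z = own_n / (own_n + k)
    return z * own_mean + (1 - z) * industry_mean

# A carrier observing a 12% claim frequency on only 50 policies stays
# close to the 8% industry benchmark...
small_book = credibility_blend(own_mean=0.12, own_n=50, industry_mean=0.08)

# ...while the same frequency observed on 50,000 policies is trusted.
large_book = credibility_blend(own_mean=0.12, own_n=50_000, industry_mean=0.08)
```

Bühlmann-style credibility formulas of this shape are standard in actuarial practice; the point here is only that a small book should not be priced on its own noisy experience alone.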

Myth: It really doesn’t matter which model I use, as long as it’s predictive.
Fact: Assuming you have enough data to build a credible model, choosing the right model still matters a great deal, though maybe not for the reason you’d think.

The right model is not necessarily the one that delivers the most predictive power; it is the one with a high probability of success in application. For example, you might choose a transparent, intuitive model over one that relies on complex machine-learning techniques, if the intuitive model is the one underwriters will actually use to help them make better business decisions.

Myth: Predictive modeling only works well for personal lines.
Fact: Personal lines were the first areas of success for predictive modeling, owing to the large, homogeneous populations they serve. But predictive modeling is just as powerful in commercial lines. There are successful models producing risk scores for workers’ compensation, E&S liability and even directors & officers risks. One of the keys to deploying predictive models in lines with thin policy data is to supplement that data, either with industry-wide statistics or with third-party (not necessarily insurance) data.

Myth: Better modeling will give me accurate prices at the policy level.
Fact: Until someone invents a time machine, the premiums we charge at inception will always be wrong. For policies that end up being loss-free, we will charge too much. For the policies that end up having losses, we will charge too little. This isn’t a bad thing, however. In fact, this cross-subsidization is the fundamental purpose of insurance and is necessary.

Instead of being 100% accurate at the policy level, the objective we should aim for in predictive analytics is to segment the entire portfolio of risks into smaller subdivisions, each of which is accurately priced. See the difference? Now the low-risk policies can cross-subsidize one another (and enjoy a lower rate), and the high-risk policies will also cross-subsidize one another (but at a higher rate). In this way, the final premiums charged will be fairer.
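
The difference can be made concrete with a toy book of six policies (all loss figures and model segments are hypothetical):

```python
from statistics import mean

# Hypothetical annual losses for six policies.
losses = {"A": 0, "B": 0, "C": 200, "D": 0, "E": 0, "F": 1800}

# A (hypothetical) model score splits the book into two segments.
segments = {"low": ["A", "B", "C"], "high": ["D", "E", "F"]}

# One flat rate: every policy cross-subsidizes every other policy.
flat_rate = mean(losses.values())  # about 333 per policy

# Segment rates: cross-subsidy still happens, but only within a segment.
segment_rates = {
    name: mean(losses[p] for p in members)
    for name, members in segments.items()
}
# "low" comes to about 67 and "high" to 600: loss-free policies in each
# segment still subsidize that segment's losses, but low-risk policies
# no longer fund high-risk ones.
```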

Myth: Good models will give me the right answers.
Fact: Good models will answer very specific questions, but, unless you’re asking the right questions, your model isn’t necessarily going to give you useful answers. Take time during the due diligence phase to figure out what the key questions are. Then when you start selecting or building models, you’ll be more likely to select a model with answers to the most important questions.

For example, there are (at least) two very different approaches to loss modeling:

  • Pure premium (loss) models can tell you which risks have the highest potential for loss. They don’t necessarily tell you why this is true, or whether the risk is profitable.
  • Loss ratio models can tell you which risks are the most profitable, where your rate plan may be out of alignment with risk or where the potential for loss is highest. However, they may not necessarily be able to differentiate between these scenarios.
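
A toy sketch of the contrast (account names, losses and premiums are invented): the same two accounts rank in opposite order under the two approaches.

```python
# Hypothetical book: incurred losses and premium charged per account.
accounts = {
    "Acme": {"losses": 50_000, "premium": 60_000},  # big, adequately priced
    "Beta": {"losses": 45_000, "premium": 30_000},  # smaller, underpriced
}

def rank_by_pure_premium(accts):
    # Loss view: which risks have the highest potential for loss?
    return sorted(accts, key=lambda a: accts[a]["losses"], reverse=True)

def rank_by_loss_ratio(accts):
    # Profitability view: which risks cost the most per premium dollar?
    return sorted(accts,
                  key=lambda a: accts[a]["losses"] / accts[a]["premium"],
                  reverse=True)

# Acme has the larger losses, but Beta is the unprofitable account.
```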

Make sure that the model is in perfect alignment with the most important questions, and you’ll receive the greatest benefit from predictive analytics.

Today’s Digital Customer: It’s Me

I’m a child of the ’80s — to be specific, 1983. Some might say I’m a Generation Xer; some might say I’m from Generation Y; others might describe me as a Millennial. Regardless of what you call me, if you’re going to sell me something, it better involve technology. You see, my life revolves around technology. I grew up with computers, just as a previous generation grew up with TV. I use a branchless bank. I stream my television content. I use social media to communicate with my friends. I don’t like paper. Technology makes my life easier.

So you can imagine, as I looked at career possibilities, I never saw myself working for the insurance industry. I hardly knew anything about it, actually, except that it didn’t seem like an industry that was very innovative or technologically advanced. I remembered:

  • sitting at my insurance agent’s desk watching him use an application that looked like it belonged on the Oregon Trail.
  • having to send faxes to make policy changes.
  • being directed to call the insurer’s corporate headquarters to file a claim, repeating my member ID over and over during the same conversation.
  • having no way to know how a change to my policy would affect my bill.

I’ve watched over the last several years as an entire industry has reevaluated itself and rethought how it does business and how it markets itself to the next generation of consumers. But there’s still a long way to go. I’d like to see:

  • tremendous investment in modern technology and the delivery of useful, self-service capabilities.
  • companies embracing more forward-thinking mobile and social media trends, meeting customers like me where we are.
  • investigation and implementation of innovative technologies involving telematics and other tools for consumers.
  • a more intimate relationship between customers and the carrier, which will leverage advancements in analytics, business intelligence and predictive modeling.
  • the industry attract young, top IT talent so insurers can continue to innovate.

For me and my generation, these will be welcome developments for a couple of reasons. First, we’re digital natives. There aren’t too many facets of our lives that haven’t gone electronic. For me, my church-giving and insurance may be all that remain offline. Second, now that the industry has begun to reverse course and is upping its technology game, my generation has another employment option, one we most likely would not have considered otherwise. No, it’s not true that we all want to work at Apple or Google, but we do want to invest our considerable talents in an industry that has interesting problems to solve and, more importantly, an environment that shares our enthusiasm for and trust in technology.

Although insurance has been a bit slow on the uptake, it’s truly gratifying to see an entire industry take my generation seriously, incorporate our needs into overall strategies, accommodate our lifestyles and view us as something worth investing in. I look forward to watching technology shape insurance innovation. Who knows—maybe this is the year experiments like usage-based insurance will become a reality. The battle for the hearts of my generation is on. Only the tech-savvy carriers and agents will triumph.

The Truth on Workers’ Comp Premiums

As employers try to limit their workers’ compensation premiums, my suspicion is that many do not realize that insurers have traditionally relied on investment income to, in effect, subsidize underwriting costs, and that the subsidy is going away. Insurers are relying on “safer” Treasury bonds and are not realizing the returns they received before the Great Recession.

Employers may feel that workers’ comp insurers are too profitable. In fact, the combined ratio (insurers’ total costs for covering work-related incidents, divided by total premiums) was 106 in 2013, according to the National Council on Compensation Insurance. That means that, for every $100 insurers received in premiums, they paid out $106.
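
The arithmetic is straightforward; in the sketch below, the split of the 106 points between losses and expenses is an invented illustration, not an NCCI figure:

```python
def combined_ratio(losses, expenses, premiums):
    # Combined ratio = (losses + underwriting expenses) / earned premium,
    # conventionally quoted as a percentage.
    return 100 * (losses + expenses) / premiums

cr = combined_ratio(losses=80, expenses=26, premiums=100)
underwriting_margin = 100 - cr  # -6: a $6 underwriting loss per $100

# Historically, investment income on premiums held before claims were
# paid could offset such an underwriting loss; with low bond yields,
# that cushion has shrunk.
```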

Insurers need to shrink the combined ratio to be profitable and need to make up for the diminishing investment income, so premiums have been going up for the past three years. Experts expect this to continue.

So, to control premiums, employers must improve the experience modifier that is used to calculate their rates. Employers need to address the direct and indirect costs of work-related injuries, illnesses and diseases, investing in workplace safety and return-to-work and other initiatives. Insurers must use predictive modeling to produce more sensitive risk measurements. (Here is a blog post that goes into detail.)
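
In rough terms, the experience modifier scales a manual premium by how the employer’s actual losses compare with the losses expected for its class. The sketch below is heavily simplified (real NCCI experience rating splits primary and excess losses and applies ballast and weighting values), and the credibility figure is an invented placeholder:

```python
def experience_mod(actual_losses, expected_losses, credibility=0.9):
    # Simplified: compare actual with expected losses, then shrink
    # toward the neutral mod of 1.0 by a credibility weight.
    raw = actual_losses / expected_losses
    return credibility * raw + (1 - credibility) * 1.0

def modified_premium(manual_premium, mod):
    return manual_premium * mod

# Better-than-expected losses earn a mod below 1.0 and a premium credit.
mod = experience_mod(actual_losses=40_000, expected_losses=50_000)  # about 0.82
premium = modified_premium(100_000, mod)  # about 82,000 instead of 100,000
```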

The good news is that workers’ compensation claim costs are not out of control as they were in the past. Those of us who are old enough to remember the late 1980s and early 1990s remember just how bad it was. Liberty Mutual, often considered the largest workers’ compensation carrier in the nation, quit offering coverage in its home state of Massachusetts in the early 1990s because costs were spiraling out of control. Today, overall costs are going up but more slowly because the frequency of claims has been declining for 20 years.

There are many possible reasons why frequency has declined.

Some point to reforms reducing claims eligibility for workers’ compensation. But, if this were a large factor, I think we would be seeing more work-related claims in the tort system.

Others cite changes in workplace exposure. Some point to the shift in the kind of work Americans are doing. For example, high-risk jobs in manufacturing have been exported in recent years. And while some manufacturing jobs are returning to the U.S. because of lower energy costs here, more work is being automated, making it less risky. Meanwhile, still-high unemployment rates mean there is less risk exposure.

I believe the No. 1 reason frequency has declined is workplace safety. While I cannot prove this quantitatively, I base the observation on 25 years of following workers’ compensation. Back in the early 1990s, employers were discovering how much they could lower their premiums through safety. When I was the lead reporter for BNA’s Workers’ Compensation Report from the mid-1990s to 2000, I spent a lot of time writing about employers that were discovering strategies to contain workers’ compensation costs. Many of those approaches are now used widely.

There still remain, however, many employers who need to get religion.

While medical-cost inflation for workers’ compensation remains a concern, it is not in the double digits as it was 20 years ago; it has run about 3% annually in recent years. Workers’ compensation insurers still pay more for procedures than health care insurers do. Medicare will not pay for opiates dispensed by doctors, but workers’ compensation will in many states. The $1 billion question is how Obamacare will affect workers’ compensation claim costs. Some worry that claims that previously would have been handled under health care insurance will be shifted to workers’ comp, but I doubt it, because workers’ comp is just too complicated. (It could turn out that Obamacare will be more complicated than workers’ compensation, but a worker still needs to prove work-relatedness for a claim.)

As a whole, indemnity claim costs have been relatively flat in recent years. In states where the maximum weekly benefit that workers can receive is relatively low, such as Virginia, indemnity costs are naturally lower than in other jurisdictions, such as the District of Columbia, where the maximum weekly benefit is much higher.
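
A sketch of the mechanics behind that difference: many jurisdictions pay roughly two-thirds of the worker’s average weekly wage, capped at a state maximum. The rate varies by state, and the caps below are invented illustrations, not the actual Virginia or District of Columbia figures:

```python
def weekly_indemnity(avg_weekly_wage, state_max, rate=2 / 3):
    # Roughly two-thirds of the average weekly wage, capped at a
    # state maximum that varies widely between jurisdictions.
    return min(rate * avg_weekly_wage, state_max)

# The same $1,500 wage produces very different benefits under
# different caps (illustrative numbers only):
low_cap_state = weekly_indemnity(1_500, state_max=700)     # capped at 700
high_cap_state = weekly_indemnity(1_500, state_max=1_100)  # about 1,000
```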

Reducing the amount of time workers are on workers’ compensation through quality medical care and return-to-work programs has also helped curtail the financial burden of claims. But, again, more employers still need to get religion, and for reasons that go beyond reducing the time that employees are on workers’ compensation. It is also true that return to work is challenging in the current economy, as there are fewer jobs available.

Besides national economic factors, employer premiums are affected by workers’ compensation conditions in individual states. California’s combined ratio has been in the triple digits, so employers there are seeing bigger premium increases than employers in other states.

Meanwhile, there is always the political wildcard in workers’ compensation that can favor the interests of insurers, employers, organized labor, plaintiffs’ attorneys and others, depending on who is in power.

Employers often feel too busy to be politically involved in the workers’ compensation system but can be a critical voice for change. Employers that want to make a difference should look into joining UWC in Washington, D.C., and the Workers’ Compensation Research Institute. I have worked with both of these groups in various capacities and believe they are worth the investment. (By the way, neither organization knows I am recommending them.)

Making the case for investing in workers’ compensation is a challenge. But because insurers can no longer use investment income to soften the blow of rising workers’ compensation costs, employer investment in curtailing claim costs is more important than ever.

Make Your Data a Work-in-Process Tool

Heard recently: “Our organization has lots of analytics, but we really don’t know what to do with them.”

This is a common dilemma. Analytics (data analysis) are abundant. They are presented in annual reports and published in colorful graphics. But too often the effort ends there. Nice information, but what can be done with it? 

The answer is: a lot. It can change operations and outcomes, but only if it is handled right. A key is producing an analytics delivery system that is self-documenting.

Data evolution

Obviously, the basic ingredient for analytics is data. Fortunately, the last 30 years have been primarily devoted to data gathering.

Over that time, all industries have evolved through several phases of data collection and management. Mainframes and minicomputers produced data, and, with the inception of the PC in the '80s, data gathering became everyone's business. Systems were clumsy in the early PC years, and there were significant restrictions on screen real estate and data volume. Recall the Y2K debacle, caused by limiting year data to two characters.

Happily for the data-gathering effort, progress in technology has been rapid. Local and then wide area networks became available. Then came the Internet, along with ever more powerful hardware. Amazingly, wireless smartphones today are far more powerful computers than were the PCs of the '80s and '90s. Data gathering has been successful.

Now we have truckloads of data, often referred to as big data. People are trying to figure out how to handle it. In fact, a whole new industry is developing around managing the huge volumes of data. Once big data is corralled, analytic possibilities are endless.

The workers’ compensation industry has collected enormous volumes of data — yet little has been done with analytics to reduce costs and improve outcomes.

Embed analytic intelligence

The best way to apply analytics in workers’ compensation is to create ways to translate and deliver the intelligence to the operational front lines, to those who make critical decisions daily. Knowledge derived from analytics cannot change processes or outcomes unless it is embedded in the work of adjusters, medical case managers and others who make claims decisions.

Consulting graphics for guidance is cumbersome: Interpretation is uneven or unreliable, and the effects cannot be verified. Therefore, the intelligence must be made easily accessible and specific to individual workers.

Front line decision-makers need online tools designed to easily access interpreted analytics that can direct decisions and actions. Such tools must be designed to target only the issues pertinent to individuals. Information should be specific.

When predictive modeling is employed as the analytic methodology, certain claims are identified as risky at a single point in time. Instead, all claims data should be monitored electronically and continuously. If all claims are monitored for the events and conditions predetermined by analytics, no high-risk claim can slip through the cracks, and personnel can be alerted to every claim with risky conditions.
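
A minimal sketch of such continuous monitoring; the rule names, fields and thresholds below are invented placeholders for the events and conditions a real analytic model would supply:

```python
# Conditions predetermined by analytics (placeholders for illustration).
RISK_RULES = [
    ("opioid_script",   lambda c: c.get("opioid_days", 0) > 30),
    ("attorney",        lambda c: c.get("attorney_involved", False)),
    ("long_disability", lambda c: c.get("days_off_work", 0) > 90),
]

def monitor(claims):
    """Scan every open claim on each pass, so no risky claim is missed."""
    alerts = []
    for claim in claims:
        hits = [name for name, rule in RISK_RULES if rule(claim)]
        if hits:
            alerts.append((claim["id"], hits))
    return alerts

open_claims = [
    {"id": "WC-1", "opioid_days": 45},
    {"id": "WC-2", "days_off_work": 10},
    {"id": "WC-3", "attorney_involved": True, "days_off_work": 120},
]
alerts = monitor(open_claims)  # WC-1 and WC-3 are flagged; WC-2 is not
```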

The system that is developed to deliver analytics to operations should automatically self-document; that is, keep its own audit trail to continually document to whom the intelligence was sent, when and why. The system can then be expanded to document what action is taken based on the information delivered.

Without self-documentation, the analytic delivery system has no authenticity. Those who receive the information cannot be held accountable for whether or how they acted on it. When the system automatically self-documents, those who have received the information can be held accountable or commended.
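
One way to sketch a delivery function that self-documents as it goes; the field names are illustrative, not a real system's schema:

```python
from datetime import datetime, timezone

audit_trail = []  # the system's own record of its deliveries

def deliver(recipient, claim_id, reason):
    """Send guidance and record to whom, when and why it was sent."""
    record = {
        "to": recipient,
        "claim": claim_id,
        "why": reason,
        "when": datetime.now(timezone.utc).isoformat(),
        "action_taken": None,  # filled in later, for accountability
    }
    audit_trail.append(record)
    # (Actual transmission, e.g. an email or a dashboard task, goes here.)
    return record

def record_action(record, action):
    # The expanded trail: what was done with the information delivered.
    record["action_taken"] = action

rec = deliver("adjuster_jane", "WC-3", "flagged: attorney, long disability")
record_action(rec, "nurse case manager assigned")
```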

Self-documenting systems also create what could be called “additionality”: the extent to which a new input adds to the existing inputs, without replacing them, and results in something greater. When the analytic delivery system automatically self-documents guidance and actions, a new layer of information is created. Analytic intelligence is linked to claims data and layered with directed-action documentation.

A system that is self-documenting can also self-verify, meaning results of delivering analytics to operations can be measured. Claim conditions and costs can be measured with and without the impact of the analytic delivery system. Further analyses can be executed to measure what analytic intelligence is most effective and in what form and, importantly, what actions generate best results.

The analytic delivery system monitors all claims data, identifies claims that match the analytic intelligence and embeds the interpreted information in operations. The data becomes a work-in-process knowledge tool, and analytics are linked directly to outcomes.

Six Key Insurance Business Impacts From Analytics

Recently, I had the privilege of serving as chairman of the inaugural Analytics for Insurance USA conference in Chicago, which was very well organized by Data Driven Business, part of FC Business Intelligence. I am convinced that analytics is not only one of the most valuable and promising technology disciplines ever to find its way into the insurance industry ecosystem, but that its adoption clearly identifies those carriers, and their information technology partners, that will be the most innovative.

Analytics has exceptionally broad enterprise potential, with the ability to permanently change the way carriers think and conduct their business. The future of analytics is even more promising than most can imagine.

The conference — where the excitement was palpable — showed the sheer diversity of carrier types and sizes as well as the many different operational areas in which analytics is being used to drive insight, business outcomes and innovation and create real competitive differentiation. From large carriers such as Chubb, Sun Life, Nationwide, American Family, CNA and CSAA, to smaller insurers including Fireman's Fund, Pacific Specialty, Great American, Westfield, National General and Houston Casualty, presentations demonstrated how broadly analytics should be applied through every function and every level of the organization. Presentations from information technology provider types including Dun & Bradstreet, L&T InfoTech, Fractal Analytics, Megaputer, EagleEye Analytics, Clarity Solutions Group, Dataguise, Quadrant, Actionable Analytics, Earley & Associates and DataDNA laid out the future potential.

Recent research shows that one major application of analytics — predictive modeling — is getting attention in pricing and rating, where more than 80% of carriers use it regularly. However, only about 50% use it today in underwriting, and fewer than 30% do so in reserving, claims and marketing.

Based on information shared during the conference, there are six major thrusts to the analytics trend:

• Analytics liberates and democratizes data, which in turn ignites innovation and change within carriers.

• Analytics is uniting insurance organizations, breaking down information silos and creating collaboration between operating units, even as enterprise data governance policies and practices emerge.

• Investment and M&A activity in information technology companies in data and analytics is surging and will create even greater disruption and innovation as more entrepreneurial thinkers continue blending art with science.

• New “as-a-service” pay-per-use models for delivery and pricing are emerging for software (SaaS) and data (DaaS), which will be appealing and cost-effective, especially for mid-tier and smaller carriers.

• Analytics is driving innovation in products, business processes, markets, competition and business models.

• Carriers will have to innovate or surrender market share and should watch for competition from new players, such as Google and Amazon, which understand data, the cloud, innovation and consumer engagement.

This article first appeared on Insurance & Technology