
The 2 New Realities Because of Big Data

I have some bad news. There are no longer any easy or obvious niches of sustained, guaranteed profits in insurance. In today’s environment of big data and analytics, all the easy wins are too quickly identified, targeted and brought back to par. If you’ve found a profitable niche, be aware that the rest of the industry is looking and will eventually find it, too.

Why? The industry has simply gotten very good at knowing what it insures and being able to effectively price to risk.

Once upon a time, it was sufficient to rely on basic historical data to identify profitable segments. Loss ratio is lower for small risks in Wisconsin? Let’s target those. Today, however, all of these “obvious” wins stand out like beacons in the darkness.

To win in a game where the players have access to big data and advanced analytics, carriers should consider two new realities:

  • You can’t count on finding easy opportunities down intuitive paths. If it’s easy and intuitive, you can bet that everyone else will eventually find it, too.
  • Sustainable opportunities lie in embracing the non-obvious and the counter-intuitive: finding multivariate relationships between variables, using data from novel sources and incorporating information from other coverages.

Just knowing what you insure is only the start. The big trick is putting new information to good use. How can carriers translate information on these new opportunities into action? In particular, how can carriers better price to risk?

We see two general strategies that carriers are using in pricing to risk:

  • Put risks into categories based on predicted profitability level
  • Put risks into categories based on predicted loss

The difference appears subtle at first glance. Which approach a given carrier will take is driven by its ability to employ flexible pricing. As we will now explore, it’s possible for carriers to implement risk-based pricing in both price-constrained and flexible-rate environments.

Predicting Profitability: Triage Model

In the first strategy, carriers evaluate their ability to profitably write a risk using their current pricing structure. This strategy often prevails where there are constraints on pricing flexibility, such as regulatory constraints, and it allows a carrier to price to risk, even when the market-facing price on any given risk is fixed.

The most common application here is a true triage model: Use the predicted profitability on a single risk to determine appetite. Often, the carrier will translate a model score to a “red/yellow/green” score that the underwriter (or automated system) uses to guide her evaluation of whether the risk fits the appetite. The triage model is used to shut off the flow of unprofitable business by simply refusing to offer coverage at prices below the level of profitability.
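A minimal sketch of how such a score-to-color translation might work. The thresholds below are hypothetical, chosen only to illustrate the idea; a real carrier would calibrate them to its own rate adequacy and expense load:

```python
def triage(predicted_loss_ratio, green_max=0.60, yellow_max=0.75):
    """Map a model's predicted loss ratio to a red/yellow/green flag.

    Thresholds are illustrative assumptions, not industry standards.
    """
    if predicted_loss_ratio <= green_max:
        return "green"   # fits appetite: write at current rates
    if predicted_loss_ratio <= yellow_max:
        return "yellow"  # marginal: refer for underwriter review
    return "red"         # predicted unprofitable at current price: decline

print(triage(0.55))  # green
print(triage(0.82))  # red
```

The same function could sit behind an automated system or an agent-facing portal; only the consumer of the flag changes.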

A triage model can also be implemented as an agency-facing tool. When agents get an indication (red/yellow/green again), they start to learn what the carrier’s appetite will be and are more likely to send only business that fits the appetite. This approach has the added benefit of reducing underwriting time and expense for the carrier; the decline rate drops, and the bind-to-quote ratio rises when agents have more visibility into carrier appetite.

A final application carriers are using is in overall account evaluation. It may be that a carrier has little or no flexibility on workers’ compensation prices, but significant pricing flexibility on pricing for the business owners policy (BOP) cover. By knowing exactly how profitable (or unprofitable) the WC policy will be at current rates, the carrier can adjust price on the BOP side to bring the entire account to target profitability.
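As a sketch of the account-level arithmetic: the carrier can solve for the BOP premium that brings the combined account to a target underwriting margin, given the fixed WC premium. The margin definition, expense ratio, and dollar figures here are all hypothetical:

```python
def bop_premium_for_target(wc_premium, wc_expected_loss, bop_expected_loss,
                           target_margin=0.05, expense_ratio=0.30):
    """Solve for the BOP premium that brings the whole account to a
    target underwriting margin when the WC premium is fixed.

    Margin = (total premium - losses - expenses) / total premium,
    with expenses assumed proportional to premium. Illustrative only.
    """
    # Target: total_premium * (1 - expense_ratio - target_margin) = losses
    loss_budget_rate = 1.0 - expense_ratio - target_margin
    total_losses = wc_expected_loss + bop_expected_loss
    total_premium_needed = total_losses / loss_budget_rate
    return max(total_premium_needed - wc_premium, 0.0)

# WC fixed at $100k of premium but expected to produce $75k of loss;
# BOP side expected to produce $30k of loss.
bop = bop_premium_for_target(100_000, 75_000, 30_000)
```

Here an underpriced WC policy is offset by charging more on the flexible BOP side, so the account as a whole hits the target margin.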

Predicting Loss: Pricing Model

If a carrier has pricing flexibility, pricing to risk is more straightforward: Simply adjust price on a per-risk basis. That said, there are still several viable approaches to individual risk pricing. Regardless of approach, one of the key problems these carriers must address is the disruption that inevitably follows any new approach to pricing, particularly on renewal business.

The first, and least disruptive, approach is to use a pricing model exclusively on new business opportunities. This allows the carrier to effectively act as a sniper and take over-priced business from competitors. This is the strategy employed by several of the big personal auto carriers in their “switch to us and save 12%” campaigns. Here we see “know what you insure” being played out in living color; carriers are betting that their models are better able to identify good risks, and offer better prices, than the pricing models employed by the rest of the market.

Second, carriers can price to risk by employing a more granular rate structure. This is sometimes referred to as “tiering” – the model helps define different levels of loss potential, and those varying levels are reflected in a multi-tiered rate plan. One key advantage here is that this might open some new markets and opportunities not in better risks, but in higher-risk categories. By offering coverage for these higher-cost risks, at higher rates, the carrier can still maintain profitability.
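A tiered rate plan of this kind reduces to a lookup from modeled loss potential to a rate multiplier. A minimal sketch, with hypothetical tier boundaries and multipliers:

```python
# Hypothetical tier boundaries on modeled loss cost (per $100 of exposure),
# each paired with the rate multiplier applied on top of the base rate.
TIERS = [
    (0.50, 0.85),          # preferred tier: 15% credit
    (1.00, 1.00),          # standard tier: base rate
    (1.50, 1.20),          # substandard tier: 20% debit
    (float("inf"), 1.50),  # high-risk tier: still written, but at +50%
]

def rate_multiplier(modeled_loss_cost):
    """Return the rate multiplier for a risk's modeled loss cost."""
    for upper_bound, multiplier in TIERS:
        if modeled_loss_cost <= upper_bound:
            return multiplier

# A substandard risk with a $10,000 base premium gets rated up to $12,000.
premium = 10_000 * rate_multiplier(1.20)
```

Note the final tier: rather than declining high-cost risks outright, the plan keeps them insurable at a price that reflects their expected losses.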

Finally, there is the most dramatic and potentially most disruptive strategy: pricing every piece of new and renewal business to risk. This is sometimes called re-underwriting the book. Here, the carrier is putting a lot of faith in the new model to correctly identify risk and identify the correct price for all risks. It’s very common in this scenario for the carrier to place caps on a single-year price change. For example, there may be renewals that are indicated at +35% rate, but annual change will be limited to +10%. Alternatively, carriers may not take price at all on renewal accounts, unless there are exposure changes or losses on the expiring policy.
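A minimal sketch of how such capping logic might look; the cap level and the clean-renewal rule are illustrative assumptions, not a prescribed implementation:

```python
def capped_renewal_rate_change(indicated_change, cap=0.10,
                               exposure_changed=False, had_losses=False,
                               hold_flat_on_clean_renewals=False):
    """Apply a single-year cap to an indicated renewal rate change.

    Mirrors the two approaches described above: cap the annual change
    (an indicated +35% becomes +10% taken), or optionally hold clean
    renewals flat unless exposure changed or losses occurred.
    """
    if hold_flat_on_clean_renewals and not (exposure_changed or had_losses):
        return 0.0
    return max(-cap, min(indicated_change, cap))

capped_renewal_rate_change(0.35)                                    # 0.10
capped_renewal_rate_change(0.35, hold_flat_on_clean_renewals=True)  # 0.0
```

In the capped case, the remaining +25% of indicated need would be taken over subsequent renewal cycles rather than all at once.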

Know What You Insure

Ultimately, the winners in the insurance space are the carriers that best know what they insure. Fortunately, in an environment where big data is becoming more available, and more advanced analytics are being employed, it’s now possible for most carriers to acquire this knowledge. Whether they’re using this knowledge in building strategy, smarter underwriting or pricing to risk, the results are the same: consistent profitability.

Sometimes there are pricing constraints that would, at first glance, make effectively pricing to risk challenging. As we have discussed, there are still some viable approaches for carriers facing price inflexibility. Even for carriers with unlimited price flexibility, pricing to risk isn’t as easy as simply applying a model rate to each account; insurers must take care to avoid unnecessary price disruption. We’ve discussed several approaches here, as well.

Effectively pricing to risk gives carriers the opportunity to win without relying on protecting a secret, profitable niche. In the end, this will give them the ability to profit in multiple markets and multiple niches across the entire spectrum of risk quality.

Are Market Cycles Finally Ending?

The property/casualty industry has been characterized by its market cycles since… well, forever. These cycles are multi-year affairs, where loss ratios rise and fall in step with rising and falling prices. In a hard market, as prices are rising, carriers are opportunistic and try to “make hay while the sun shines” – increasing prices wherever the market will let them. In a soft market, as prices are declining, carriers often face the opposite choice – how low will they let prices go before throwing in the towel and letting a lower-priced competitor take a good account?

Many assume that the market cycles are a result of prices moving in reaction to changes in loss ratio. For example, losses start trending up, so the market reacts with higher prices. But the market overreacts, increasing price too much, which results in very low loss ratios, increased competition and price decreases into a softening market. Lather, rinse, repeat.

But is that what’s really happening?

What’s Driving the Cycles?

Raj Bohra at Willis Re does great work every year looking at market cycles by line of business. In one of his recent studies, a graph of past workers’ compensation market cycles was particularly intriguing.

[Chart: aggregate workers’ compensation industry results, 1987 to present: accident-year loss ratio (blue), price (red), loss rate per dollar of payroll (green)]

This is an aggregate view of the work comp industry results. The blue line is accident year loss ratio, 1987 to present. See the volatility? Loss ratio is bouncing up and down between 60% and 100%.

Now look at the red line. This is the price line. We see volatility in price, as well, and this makes sense. But what’s the driver here? Is price reacting to loss ratio, or are movements in loss ratio a result of changes in price?

To find the answer, look at the green line. This is the historic loss rate per dollar of payroll. Surprisingly, this line is totally flat from 1995 to the present. In other words, on an aggregate basis, there has been no fundamental change in loss rate for the past 20 years. All of the cycles in the market are the result of just one thing: price movement.
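The arithmetic behind this observation can be sketched directly: if losses per dollar of payroll are flat, the loss ratio is simply the loss rate divided by the premium rate, so price movement alone reproduces the loss-ratio swings. The figures below are illustrative, not the actual Willis Re data:

```python
# Losses per $100 of payroll held constant (the flat green line);
# only the premium rate (the red line) moves.
loss_rate = 1.30

for premium_rate in (2.17, 1.86, 1.63, 1.44):  # a softening price line
    loss_ratio = loss_rate / premium_rate
    print(f"rate {premium_rate:.2f} -> loss ratio {loss_ratio:.0%}")
```

With no change at all in underlying loss experience, a softening price line walks the loss ratio from roughly 60% to 90%, which is the volatility band visible in the chart.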

Unfortunately, it appears we have done this to ourselves.

Breaking the Cycle

As carriers move to more sophisticated pricing using predictive analytics, can we hope for an end to market cycles? Robert Hartwig, economist and president of the Insurance Information Institute, thinks so. “You’re not going to see the vast swings you did 10 or 15 years ago, where one year it’s up 30% and two years later it’s down 20%,” he says. The reason is that “pricing is basically stable…the industry has gotten just more educated about the risk that they’re pricing.”

In other words, Hartwig is telling us that more sophisticated pricing is putting an end to extreme market cycles.

The “what goes up must come down” mentality of market cycles is becoming obsolete. We see now that market cycles are fed by pricing inefficiency, and more carriers are making pricing decisions based on individual risks, rather than reacting to broader market trends. Of course, when we use the terms “sophisticated pricing” and “individual risk,” what we’re really talking about is the effective use of predictive analytics in risk selection and pricing.

Predictive Analytics – Opportunity and Vulnerability in the Cycle

Market cycles aren’t going to ever truly die. There will still be shock industry events, or changes in trends that will drive price changes. In “the old days,” these were the catalysts that got the pendulum to start swinging.

With the move to increased usage of predictive analytics, these events will expose the winners and losers when it comes to pricing sophistication. When carriers know what they insure, they can make the rational pricing decisions at the account level, regardless of the price direction in the larger market. In a hard market, when prices are rising, they accumulate the best new business by (correctly) offering them quotes below the market. In a soft market, when prices are declining, they will shed the worst renewal business to their naïve competitors, which are unwittingly offering up unprofitable quotes.

[Chart: portfolio profitability for analytics-driven versus naïve carriers across hard and soft phases of the market cycle]

Surprisingly, for carriers using predictive analytics, market cycles present an opportunity to increase profitability, regardless of cycle direction. For the unfortunate carriers not using predictive analytics, the onset of each new cycle phase presents a new threat to portfolio profitability.

Simply accepting that profitability will wax and wane with market cycles isn’t keeping up with the times. Though the length and intensity may change, markets will continue to cycle. Sophisticated carriers know that these cycles present not a threat to profits, but new opportunities for differentiation. Modern approaches to policy acquisition and retention are much more focused on individual risk pricing and selection that incorporate data analytics. The good news is that these data-driven carriers are much more in control of their own destiny, and less subject to market fluctuations as a result.

What Does ‘Data-Driven’ Really Mean?

The more things change, the more they stay the same. What remains constant are the fundamentals of what makes insurance a well-capitalized, reliable cornerstone of the U.S. economy. The basic model of assessing risk, collecting insurance premiums, investing and paying claims still works. What’s been completely upended is how carriers evaluate and acquire the best risks – and how much more important effective risk evaluation is today.

Advanced data and predictive analytics have changed the customer acquisition and retention game. When an insurer can pinpoint which policies will make up the most profitable 10%, and knows that same small segment delivers 50% of total profit, the rules have changed.

The chart below represents a study of the portfolios from a diverse set of commercial insurers and lines of business. The study shows that this surprising statistic holds true across companies. It helps demonstrate the real advantage — and potential threat — of data analytics. The insurer that can accurately identify the best 10% of the market is going to be able to compete on, and win, this business.
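One way to quantify this kind of concentration is to rank policies by profit and measure the share delivered by the top decile. A minimal sketch, using made-up portfolio figures rather than the study’s actual data:

```python
def top_decile_profit_share(policy_profits):
    """Share of total portfolio profit delivered by the most
    profitable 10% of policies, given per-policy profit figures.
    """
    ranked = sorted(policy_profits, reverse=True)
    top_n = max(1, len(ranked) // 10)
    return sum(ranked[:top_n]) / sum(ranked)

# Illustrative skewed portfolio: a few big winners, many modest
# performers, and some money-losers (figures are invented).
profits = [500, 400, 300] + [50] * 17 + [-20] * 10
share = top_decile_profit_share(profits)
```

On a portfolio this skewed, the top 10% of policies (3 of 30) contribute well over half the total profit, which is exactly the kind of concentration the study describes.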

[Chart: share of total profit contributed by the most profitable policies, across commercial insurers’ portfolios]

What does being data-driven mean in practice?

Information is a business enabler; you don’t need to embark on “big data” or predictive analytics initiatives just for the sake of them. You shouldn’t feel pressured to lead the rallying cry to become a data-driven organization because everyone is talking about it. You consume data to gain insights that will solve problems that matter and achieve specific objectives.

Data-driven decision making is a commitment and a passion to go beyond the limits of heuristics, because you know it’s necessary to reach a new level of understanding of where your business is today and where it’s headed in the near term. Data-driven cultures have a disciplined curiosity and rigor to find credible patterns in the data before finalizing their conclusions – which is why everyone emphasizes how important it is to create a test-and-learn culture. Armed with a solid business case, transparency and good processes, data-driven organizations use analytics in combination with human expertise to make better decisions.

Why is this so urgent?

A recent Bloomberg article reported that the workers’ compensation industry posted its first underwriting profit since 2006, which is welcome news. At the same time, the article noted that this is directly related to how insurers have reacted to the current investment environment. In the absence of meaningful investment returns, insurers are keenly focused on bridging the gap by improving underwriting profits and enhancing operational efficiencies: “The reality is, in today’s interest-rate environment, we need to be driving combined ratios under 100,” said Steve Klingel, CEO of the National Council on Compensation Insurance (NCCI).

This isn’t limited to one line of business. As Robert Hartwig, president of the Insurance Information Institute, noted in a recent interview, “You’re not going to see vast swings you did 10 or 15 years ago, where one year it’s up 30% and two years later it’s down 20%.” The reason he gave: “Pricing is basically stable… The industry has gotten just more educated about the risk that they’re pricing.”

Now what?

No one said implementing data analytics in an underwriting environment is a small task or a quick fix. Many companies focus primarily on selecting the right predictive model. In reality, the model itself is just one part of a larger process that touches many parts of the organization.

Data analytics can only be successful if developed and deployed in the right environment. You may find that you have to retool your people so that underwriters don’t feel that data analytics are a threat to their expertise, or actuaries to their tried-and-true pricing models. Never underestimate the importance of the human element in moving to a data-driven culture.

Given the choice between leading a large-scale change management initiative and getting a root canal, you may be picking up the phone to call the dentist right now. It doesn’t have to be that way: Following a thoughtful, straightforward process that involves all the stakeholders early and often goes a long way.

Best Practices for Predictive Models

There’s little doubt about the proven value in using predictive analytics for risk selection and pricing in P/C insurance. In fact, of the insurers at this year’s Valen Analytics Summit not currently using predictive analytics in underwriting, 56% plan to start within a year. However, many insurers haven’t spent enough energy planning exactly how they can implement analytics to get the results they want. It’s a common misconception that competitive advantage is won by simply picking the right model.

In reality, the model itself is just a small part of a much larger process that touches nearly every part of the insurance organization. Embracing predictive analytics is like recruiting a star quarterback; alone, he’s not enough to guarantee a win. He requires both a solid team and a good playbook to achieve his full potential.

The economic crash of 2008 emphasized the importance of predictive modeling as a means to replace dwindling investment income with underwriting gains. However, insurance companies today are looking at a more diverse and segmented market than pre-2008, which makes the “old way of doing things” no longer applicable. The insurance industry is increasing in complexity, and with so many insurers successfully implementing predictive analytics, greater precision in underwriting is becoming the “new normal.” In fact, a recent A.M. Best study shows that P/C insurers are facing more aggressive pricing competition than any other insurance sector.

Additionally, new competitors like Google, which have deep reservoirs of data and an established rapport and trust with the Millennial generation, mean that traditional insurers must react to new technologies faster than ever. Implementing predictive analytics is the logical place to start.

The most important first step in predictive modeling is making sure all relevant stakeholders understand the business goals and organizational commitment. The number one cause of failure in predictive modeling initiatives isn’t a technical or data problem, but instead a lack of clarity on the business objective combined with a defect in the implementation plan (or lack thereof).


ASSESSMENT OF ORGANIZATIONAL READINESS

If internal conversations are focused solely on the technical details of building and implementing a predictive model, it’s important to take a step back and make sure there’s support and awareness across the organization.

Senior-Level Commitment – Decide on the metrics that management will use to measure the impact of the model. What problems are you trying to solve, and how will you define success? Common choices include loss ratio improvement, pricing competitiveness and top-line premium growth. Consider the risk appetite for this initiative and the assumptions and sensitivities in your model that could affect projected results.

Organizational Buy-In – What kind of predictive model will work for your culture? Will this be a tool to aid in the underwriting process or part of a system to automate straight-through processing? Consider the level of transparency appropriate for the predictive model. It’s usually best to avoid making the model a “black box” if underwriters need to be able to interact with model scores on their way to making the final decisions on a policy.

Data Assets – Does your organization plan to build a predictive model internally, with a consultant or a vendor that builds predictive models on industry-wide data? How will you evaluate the amount of data you need to build a predictive model, and what external data sources do you plan to use in addition to your internal data? Are there resources available on the data side to provide support to the modeling team?

MODEL IMPLEMENTATION

After getting buy-in from around the organization, the next step is to lay out how you intend to achieve your business goals. If it can be measured, it can be managed. This step is necessary to gauge the success or failure post-implementation. Once you’ve set the goals for assessment, business and IT executives should convene and detail a plan for implementation, including responsibilities and a rollout timeline.

Unless you’re lucky enough to work with an entire group of like-minded individuals, this step must be taken with all players involved, including underwriting, actuarial, training and executive roles. Once you’ve identified the business case and produced the model and implementation plan, make sure all expected results are matched up with the planned deliverables. Once everything is up and running, it is imperative to monitor adoption in real time to ensure that the results match the initial model goals put in place.

UNDERWRITING TRAINING

A very important but often overlooked step is making sure that underwriters understand why the model is being implemented, what the desired outcomes are and what their role is in implementing it. If the information is presented correctly, underwriters understand that predictive modeling is a tool that can improve their pricing and risk selection rather than undermine their expertise. But there are still some who rely solely on their own experience and knowledge who may feel threatened by a data-driven underwriting process. In fact, nearly half of the attending carriers at the 2015 Valen Summit cited lack of underwriting adoption as one of the primary risks in a predictive analytics initiative.

Insurers that have found the most success with predictive modeling are those that create a specific set of underwriting rules and showcase how predictive analytics are another tool to enhance their performance, rather than something that will replace them entirely. Not stressing this point can result in resistance from underwriters, and it is essential to have their buy-in. At the same time, it is also important to monitor the implementation of underwriting guidelines, ensuring that they are being followed appropriately.

KEEPING THE END IN MIND

Many of the challenges and complexities in the P/C marketplace are out of an individual insurer’s control. One of the few things insurers can control is their use of predictive modeling to know what they insure. It’s one of the best ways an insurer is able to protect its business from new competitors and maintain consistent profit margins.

Using data and analytics to evaluate your options allows you to test and learn, select the best approach and deliver results that make the greatest strategic impact.

While beginning a predictive analytics journey can be difficult and confusing at first glance, following these best practices will increase your chances of getting it right on the first try and ensuring your business goals will be met.

Big Data in Insurance: A Glimpse Into 2015

Bernard Marr is one of the big voices to pay attention to on the subject of big data. His recent piece “Big Data: The Predictions for 2015” is bold and thought-provoking. As a P&C actuary, I tend to look at everything through my insurance-colored glasses. So, of course, I immediately started thinking about the impact on insurance if Marr’s predictions come to pass this year.

As I share my thoughts below, be aware that the section headers are taken from his article; the rest of the content is my interpretation of the impact on the insurance industry.

The value of the big data economy will reach $125 billion

That’s a really big number, Mr. Marr. I think I know how to answer my son the next time he comes to me looking for advice on a college major.

But what does this huge number mean for insurance? There’s a potential time bomb here for commercial lines because this $125 billion means we’re going to see new commerce (and new risks) that are not currently reflected in loss history – and therefore not reflected in rates.

Maybe premiums will go up as exposures increase with the new commerce – but that raises a new question: What’s the right exposure base for aggregating and analyzing big data? Is it revenue? Data observation count? Megaflops? We don’t know the answer to this yet. Unfortunately, it’s not until we start seeing losses that we’ll know for sure.

The Internet of Things will go mainstream

We already have some limited integration of “the Internet of Things” into our insurance world. Witness UBI (usage-based insurance), which can tie auto insurance premiums to not only miles driven, but also driving quality.

Google’s Nest thermostat keeps track of when you’re home and away, whether you’re heating or cooling, and communicates this information back to a data store. Could that data be used in more accurate pricing of homeowners insurance? If so, it would be like UBI for the house.

The Internet of Things can extend to healthcare and medical insurance, as well. We already have health plans offering a discount for attending the gym 12 times a month. We all have “a friend” who sometimes checks in at the gym to meet the quota and get the discount. With the proliferation of wearable biometric devices (FitBit, Nike Fuel and so on), it would be trivial for the carrier to offer a UBI discount based on the quantity and quality of the workout. Of course, the insurer would need to get the policyholder’s permission to use that data, but, if the discount is big enough, we’ll buy it.

Machines will get better at making decisions

As I talk with carriers about predictive analytics, this concept is one of the most disruptive to underwriters and actuaries. There is a fundamental worry that the model is going to replace them.

Machines are getting better at making decisions, but within most of insurance, and certainly within commercial lines, the machines should be seen as an enabling technology that helps the underwriter to make better decisions, or the actuary to make more accurate rates. Expert systems can do well on risks that fit neatly into a standard underwriting box, but anything outside of that box is going to need some human intervention.

Textual analysis will become more widely used

A recurring theme I hear in talking to carriers is a desire to do claims analysis, fraud detection or claims triage using analysis of text in the claims adjusters’ files. There are early adopters in the industry doing this, and there have emerged several consultants and vendors offering bespoke solutions. I think that 2015 could be the year that we see some standardized, off-the-shelf solutions emerge that offer predictive analytics using textual analysis.

Data visualization tools will dominate the market

This is spot-on in insurance, too. Data visualization and exploration tools are emerging quickly in the insurance space. The lines between “reporting tool” and “data analysis tool” are blurring. Companies are realizing that they can combine key performance indicators (KPIs) and metrics from multiple data streams into single dashboard views. This leads to insights that were never before possible using single-dimension, standard reporting.

There is so much data present in so many dimensions that it no longer makes sense to look at a fixed set of static exhibits when managing insurance operations. Good performance metrics don’t necessarily lead to answers, but instead to better questions – and answering these new questions demands a dynamic data visualization environment.

Matt Mosher, senior vice president of rating services at A.M. Best, will be talking to this point in March at the Valen Analytics Summit and exploring how companies embracing analytics are finding ways to leverage their data-driven approach across the entire enterprise. This ultimately leads to significant benefits for these firms, both in portfolio profitability and in overall financial strength.

There will be a big scare over privacy

Here we are back in the realm of new risks again. P&C underwriters have long been aware of “cyber” risks and control these through specialized forms and policy exclusions.

With big data, however, comes new levels of risk. What happens, for example, when the insurance company knows something about the policyholder that the policyholder hasn’t revealed? (As a thought experiment, imagine what Google knows of your political affiliations or marital status, even though you’ve probably never formally given Google this information.) If the insurance company uses that information in underwriting or pricing, does this raise privacy issues?

Companies and organizations will struggle to find data talent

If this is a huge issue for big data, in general, then it’s a really, really big deal for insurance.

I can understand that college freshmen aren’t necessarily dreaming of a career as a “data analyst” when they graduate. So now put “insurance data analyst” up as a career choice, and we’re even lower on the list. If we’re going to attract the right data talent in the coming decade, the insurance industry has to do something to make this stuff look sexy, starting right now.

Big data will provide the key to the mysteries of the universe

Now, it seems, Mr. Marr has the upper hand. For the life of me, I can’t figure out how to spin prognostication about the Large Hadron Collider into an insurance angle. Well played.

Those of us in the insurance industry have long joked that this industry is one of the last to adopt new methods and technology. I feel we’ve continued the trend with big data and predictive analytics – at least, we certainly weren’t the first to the party. However, there was a tremendous amount of movement in 2013, and again in 2014. Insurance is ready for big data. And just in time, because I agree with Mr. Marr – 2015 is going to be a big year.