
How Do Actuarial, Data Skills Converge?

Our survey of leading carriers shows that insurers are increasingly looking to integrate data scientists into their organizations. This is one of the most compelling and natural opportunities within the analytics function.

This document summarizes our observations on what insurers’ analytics function will look like in the future, the challenges carriers currently face in making this transition and how they can address them.

We base our observations on our experience serving a large portion of U.S. carriers. We supplemented our findings through conversations with executives at a representative sample of these carriers, including life, commercial P&C, health and specialty risk.

We also specifically address the issue of recruitment and retention of data scientists within the confines of the traditional insurance company structure.

The roles of actuaries and data scientists will be very different in 2030 than they are today

Actuaries have traditionally been responsible for defining risk classes and setting premiums. Recently, data scientists have begun building predictive underwriting models that replace traditional intrusive procedures such as blood tests.

By 2030, automated underwriting will become the norm, and new sources of data may be incorporated into underwriting. Mortality prediction will become ever more accurate, leading to more granular premium setting, possibly at the individual level. Data scientists will likely be in charge of assessing mortality risks, while actuaries will set premiums, or “put a price tag on risk” – the very definition of what actuaries do.

Risk and capital management requires extensive knowledge of the insurance business and risks, and the ability to model the company’s products and balance sheet under various economic scenarios and policyholder assumptions. Actuaries’ deep understanding and skills in these areas will make them indispensable.

We do not expect this to change, but by 2030, data scientists will likely play an increased role in setting the assumptions underlying the risk and capital models. These assumptions will likely become more granular, more reliant on real-time data and more plausible.

Actuaries have traditionally been responsible for performing experience studies and updating assumptions for in-force business. The data used for experience studies come from structured data in the administration system, and assumptions are typically set at a high level, varying by only a few variables.

By 2030, we expect data scientists to play a leading role and to incorporate non-traditional data sources, such as call center records or wearable devices, to analyze and manage the business. Assumptions will be set at a more granular level – instead of a 2% overall lapse rate, new assumptions will identify which 2% of policies are most likely to lapse.
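
To make the shift from a single block-level lapse rate to policy-level identification concrete, here is a minimal sketch, assuming scikit-learn, synthetic data and illustrative field names (none of this comes from the report): a classifier is trained on policy-level features, and the in-force block is then ranked so that the 2% of policies most likely to lapse can be flagged directly.

    # Minimal sketch (not from the report), using synthetic, illustrative data:
    # train a lapse classifier on policy-level features, then rank the block so
    # the "2% most likely to lapse" are flagged directly instead of applying one
    # block-level lapse rate to every policy.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 10_000

    # Synthetic policy-level features (stand-ins for admin-system fields and
    # non-traditional data such as call-center contact counts).
    X = np.column_stack([
        rng.integers(1, 30, n),            # policy duration in years
        rng.normal(50_000, 15_000, n),     # account value
        rng.integers(0, 5, n),             # call-center contacts last year
        rng.uniform(0.0, 0.05, n),         # competitor rate minus credited rate
    ])
    # Synthetic lapse indicator with a low base rate, driven by the features.
    logit = -4.5 + 0.4 * X[:, 2] + 40.0 * X[:, 3]
    y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = GradientBoostingClassifier().fit(X_train, y_train)

    # Score the held-out block and flag the 2% of policies with the highest
    # predicted lapse probability.
    scores = model.predict_proba(X_test)[:, 1]
    cutoff = np.quantile(scores, 0.98)
    flagged = scores >= cutoff
    print(f"Flagged {flagged.sum()} of {len(scores)} policies as highest lapse risk")

In practice, the scored list would feed directly into granular lapse assumptions and retention actions rather than a single aggregate rate.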

See also: Wave of Change About to Hit Life Insurers

Actuaries are currently entirely responsible for the development and certification of reserves per regulatory and accounting guidelines, and we expect signing off on reserves to remain their remit.

Data scientists will likely have an increased role in certain aspects of the reserving process, such as assumption setting. Some factor-based reserves, such as IBNR, may also increasingly be established using data-driven, more sophisticated techniques in which data scientists will likely play a role.

Comparing actuarial and data science skills

Although actuaries and data scientists share many skills, there are distinct differences between their competencies and working approaches.

PwC sees three main ways to accelerate integration and improve combined value

1. Define and implement a combined operating model. Clearly defining where data scientists fit within your organizational structure and how they will interact with actuaries and other key functions will reduce friction with traditional roles, enhance change management and enable clearer delineation of duties. In our view, developing a combined analytics center of excellence is the most effective structure to maximize analytics’ value.

2. Develop a career path and hiring strategy for data scientists. The demand for advanced analytical capabilities currently far eclipses the supply of available data scientists. Having a clearly defined career path is the only way for carriers to attract and retain top data science (and actuarial) talent in an industry that is considered less cutting-edge than many others. Carriers should consider the potential structure of their future workforce, where to locate the analytics function to ensure adequate talent is locally available and how to establish remote working arrangements.

3. Encourage cross-training and cross-pollination of skills. As big data continues to drive change in the industry, actuaries and data scientists will need to step into each other’s shoes to keep pace with analytical demands. Enabling knowledge sharing will reduce dependency on certain key individuals and allow insurers to better pivot toward analytical needs. It is essential that senior leadership make appropriate training and knowledge-sharing resources available to the analytics function.

Options for integrating data scientists

Depending on the type of carrier, there are three main approaches for integrating data scientists into the operating model.

Talent acquisition: Growing data science acumen

Data science talent acquisition strategies are top of mind at the carriers with whom we spoke.

See also: Digital Playbooks for Insurers (Part 3)  

Data science career path challenges

The following can help carriers overcome common data science career path challenges.

Case study: Integration of data science and actuarial skills

PwC integrated data science skills into actuarial in-force analytics for a leading life insurer so the company could gain significant analytical value and generate meaningful insights.

Issue

This insurer had a relatively new variable annuity line without much long-term experience for gauging its risk. Uncertainty about excess withdrawals and rising future surrender rates had major implications for the company’s reserve requirements and strategic product decisions. Traditional actuarial modeling approaches offered confidence over only six to 12 months, at a high level and with only a few variables; they could not capture major changes in the economy or in policyholder behavior at a more granular level.

Solution

After engaging PwC’s support, in-force analytics expanded to use data science skills such as statistical and simulation modeling to explore possible outcomes across a wide range of economic, strategic and behavioral scenarios at the individual household level.

Examples of data science solutions include:

  • Applying various machine learning algorithms to 10 years of policyholder data to better identify the most predictive variables.
  • Using statistical matching techniques to enrich the client data with various external datasets and thereby create an accurate household-level view.
  • Developing a model that simulates policyholder behavior in a competitive environment as a sandbox for scenario analysis over a 30-year period (a simplified sketch of this kind of simulation follows the list).
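
The simulation bullet is the easiest to illustrate. Below is a hedged, heavily simplified sketch, not the engagement’s actual model: it assumes a toy surrender-rate function and toy economic scenarios, and runs a 30-year Monte Carlo of surrenders to show the kind of “what if” distribution such a sandbox produces.

    # Hedged, simplified sketch of policyholder-behavior simulation; the
    # surrender-rate function and scenario generator are assumptions for
    # illustration only, not the engagement's actual model.
    import numpy as np

    rng = np.random.default_rng(1)
    n_policies, n_years, n_scenarios = 5_000, 30, 100

    def surrender_rate(moneyness, rate_spread):
        # Assumed behavior: surrenders rise when the guarantee is out of the
        # money and when competing market rates look attractive.
        base = 0.02
        return float(np.clip(base + 0.10 * max(rate_spread, 0.0)
                                  - 0.05 * max(moneyness, 0.0), 0.0, 0.5))

    cumulative_surrenders = np.zeros(n_scenarios)
    for s in range(n_scenarios):
        # Toy scenario generator: random walks for the spread between market
        # and credited rates, and for the guarantee's moneyness.
        spread = np.cumsum(rng.normal(0.0, 0.005, n_years))
        moneyness = np.cumsum(rng.normal(0.0, 0.02, n_years))
        in_force = np.ones(n_policies, dtype=bool)
        for t in range(n_years):
            q = surrender_rate(moneyness[t], spread[t])
            lapses = in_force & (rng.random(n_policies) < q)
            cumulative_surrenders[s] += lapses.sum()
            in_force &= ~lapses

    # "What if" view: distribution of 30-year surrenders across scenarios.
    print(f"Median 30-year surrenders: {np.median(cumulative_surrenders):.0f}, "
          f"95th percentile: {np.percentile(cumulative_surrenders, 95):.0f}")

A production version would replace the toy behavioral function with one fitted to the enriched household-level data and would track cash flows, not just policy counts.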

Benefit

The enriched data factored in non-traditional information, such as household employment status, expenses, health status and assets. The integrated model of policyholder behavior allowed for more informed estimates of withdrawals, surrenders and annuitizations. Modeling “what if” scenarios helped reduce the liquidity risk stemming from uncertainty about excess withdrawals and rising surrender rates.

All of this allowed the client to better manage its in-force business, reserve requirements and strategic product decisions.

This report was written by Anand Rao, Pia Ramchandani, Shaio-Tien Pan, Rich de Haan, Mark Jones and Graham Hall. You can download the full report here.


Secret Sauce for New Business Models?

Insurance companies were built to bring stability to an unstable world.

So, why do factors such as market instability, technological upheaval and consumer pressure seem to throw so many insurers into panic? In many cases, insurers can simply point to their rigid foundations.

It didn’t take many California earthquakes to convince California builders that foundations would need to be built with flexibility in mind. In insurance, it won’t take many disruptive upheavals to teach businesses that current foundations are ripe for disaster. New foundations are needed to support a perpetually shifting business.

In Reinventing Insurance: Leveraging the Power of Data, Analytics and Cloud Core Systems, a Majesco white paper issued in cooperation with Elagy, we look closely at how fundamental changes in the insurance business can be met with a new view of insurance infrastructure. By assembling cloud components into a fully functional virtual infrastructure, insurers remove the lethargy and overhead that bog down everything from data aggregation and analytics to testing and product development. The goal is to build an insurance enterprise that can capitalize on market opportunities.

Risk vs. Time

To assess potential cloud value, Majesco first looked at the relationship between insights and risk assessment and at how insights are traditionally gathered and used. Traditional risk assessment treats claims experience across time and population as the best indicator of risk within any particular insurance product. This kind of risk assessment is proven. Actuarial science has been honed. Insurers have become adept at long-term predictive capabilities, and regulations have kept consumers and insurers protected from failure through adequate margins of error.

Time, however, has become the sticking point. To meet market demands, every insurance process has to be shortened. The new predictive fuel of data provided through real-time digital sources (as well as increasingly insightful technologies) can give insurers a much better view of risk in a much more appropriate timeframe. But even if they can gather and assess data quickly, they will, in most cases, still be held back by a product development and testing infrastructure that isn’t prepared to respond to fast-acting competitive pressure. The transparency that offers such promising opportunity is widely available to anyone, not just insurers, and it is highly coveted by agile, tech-savvy, entrepreneurial disrupters.

Competition vs. Time

Entrepreneurs love innovation and crave a new, marketable idea. They especially enjoy turning age-old processes on end, because these moments are often akin to striking gold. With technology’s rapid application of telematics, sensors, geolocation information and improved data management, nearly anyone can tap into the same data pools. Creative entrepreneurs, educated investors and innovative organizations are teaming up in a new kind of gold rush where rapid opportunity recognition will be met with rapid product development and relevant marketing. At a time when consumers seem to be susceptible to instant-access product messages, disruptive companies will soon be feeding them instant-access products.

Once again, the development time of legacy platforms can’t offer a competitive solution to insurers. The foundation is now susceptible to cracking because of its inflexibility.

Legacy vs. Time

Insurers still maintain dozens of advantages in the industry, the first and foremost being experience. All of today’s new data sources, new channel options and modern infrastructure possibilities have more promise in the hands of insurers than in the hands of non-insurance disrupters. Legacy systems, however, are restrictive. They aren’t plug and play. Most don’t operate in a unified data environment; instead, data sits scattered across multiple databases rather than consolidated and readily available. So, insurers’ opportunities will be found in a system built to fit the new insurance business and infrastructure model.

Majesco’s report discusses how insurers can align cloud solutions with business strategies to capitalize on new risks, new products and new markets. With data aggregation, for example, cloud solutions available through Majesco and data partner Elagy are rewriting analytics and decision-making processes. A cloud data solution can integrate claims experience with third-party data and newly available data sets, relieving the need for additional IT overhead.

A Satellite Office Approach

Small and medium-sized insurers, in particular, stand to gain through a reinvention of their operational model. Market drivers—such as agents’ lack of marketing insights, the availability of relevant data and the need for low-cost process efficiencies—make an excellent case for change. The hurdles are real, however. Many insurers don’t have the needed resources to take advantage of these opportunities, and they are constrained by technology and a lack of operational capability.

The ideal solution would be to transfer the whole pipeline to the cloud, migrating the enterprise infrastructure into a cloud-based infrastructure where partners and innovators can plug their solutions into a cloud-based core administration system.

In the real world, most insurers would be better served by a more measured strategy. When companies in any industry hope to move into a new geographic region, they sometimes open a satellite office. The satellite office is the new footprint in the foreign territory. It’s the place where testing and acclimation happen, and its approach is somewhat analogous to what insurers can do when looking at cloud development.

Insurers will find excitement and freedom running a new and improved model alongside the old model. While the organization practices its newfound agility, it will maintain the stability of legacy systems for as long as they are needed or are practical. A cloud-based insurance platform will quickly bring the insurer to the realm of data-fueled experience and competitive advantage. Its new processes and capabilities will breathe fresh life into insurers that are ready for resilient foundations.

Modeling Flood — the Peril of Inches

“Baseball is a game of inches” – Branch Rickey

Property damage from flooding is quite different from damage caused by any other catastrophic peril, such as hurricane, tornado or earthquake. Unlike those perils, estimating losses from flood requires a higher level of geospatial precision. Not only do we need to know precisely where a property is located and its distance to the nearest potential flooding source, but we also need to know the elevation of the property relative to its surroundings and to the source of flooding. Underwriting flood insurance is a game of inches, not ZIP codes.

With flood, a couple of feet can make the difference between being in a flood zone or not, and a few inches of elevation can increase or decrease loss estimates by orders of magnitude. This realization helps explain the current financial mess of the National Flood Insurance Program (NFIP). In hindsight, even with perfect actuarial knowledge about the risk of flood, the NFIP’s destiny was preordained simply because it lacked the other necessary tools.

This might make the reader believe that insuring flood is essentially impossible. Until just a few years ago, you’d have been right. But, since then, interesting things have happened.

In the past decade, technologies for data storage, processing, modeling and remote sensing (i.e., mapping) have improved incredibly. It is now possible to measure and store all of the topographical features of the U.S. – it has been done. Throw in analytical servers able to process trillions of calculations in seconds, and processing massive amounts of data becomes relatively easy. Meanwhile, the science around flood modeling, including meteorology, hydrology and topography, has developed to the point that the new geospatial information and processing power can be used to produce models with real predictive capabilities. These are not your grandfather’s flood maps. There are now models and analytics that provide estimates for frequency AND severity of flood loss for a specific location, an incredible leap forward from zone or ZIP code averaging. Flood insurance, like baseball, is a game of inches. And now it’s also a game that can be played and profited from by astute insurance professionals.

For the underwriting of insurance, having dependable frequency and severity loss estimates at a location level is gold. No single flood model will provide all the answers, but there are definitely enough data, models and information available to determine frequency and severity metrics for flood and to enable underwriters to segment exposure effectively. Low-, moderate- and high-risk exposures can be discerned and segregated, which means risk-based, actuarial pricing can be confidently implemented. The available data and risk models can also drive the design of flood mitigation actions (with accurate credit incentives attached to them) and marketing campaigns.

With the new generation of models, all three types of flooding can be evaluated, either individually or as a composite, and have their risk segmented appropriately. The available geospatial datasets and analytics support estimations of flood levels, flood depths and the likelihood of water entering a property by knowing the elevation of the structure, floors of occupancy and the relationship between the two.

In the old days, if your home was in a FEMA A or V zone but you were possibly safe from their “base flood” (a hypothetical 1% annual probability flood), you’d have to spend hundreds of dollars to get an elevation certificate and then petition the NFIP, at further cost, hoping to get a re-designation of your home. Today, it’s not complicated to place the structure in a geospatial model and estimate flood likelihood and depths in a way that can be integrated with actuarial information to calculate rates – each building getting rated based on where it is, where the water is and the likelihood of the water inundating the building.
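
To make this concrete, here is a minimal, hedged sketch of the idea, assuming an illustrative depth exceedance curve, depth-damage function and premium loading (the numbers are invented, not any vendor’s model): combine the site’s flood-depth exceedance curve with the structure’s first-floor elevation and a depth-damage curve to get an expected annual loss, then turn that into an indicated premium.

    # Hedged sketch only: the exceedance curve, depth-damage function and
    # loading below are illustrative assumptions, not any vendor's model.
    import numpy as np

    # Assumed annual exceedance probabilities of flood depths at the site,
    # measured relative to local ground level (as a geospatial flood model
    # might supply).
    return_periods = np.array([2, 5, 10, 25, 50, 100, 250, 500])          # years
    flood_depth_ft = np.array([0.0, 0.2, 0.6, 1.2, 2.0, 3.0, 4.5, 6.0])   # feet
    annual_prob = 1.0 / return_periods

    first_floor_elevation_ft = 1.5   # first floor height above local ground
    building_value = 300_000

    def damage_ratio(depth_above_floor_ft):
        # Assumed depth-damage curve: nothing below the floor, then damage
        # rising quickly with the first few feet of water in the building.
        d = np.maximum(depth_above_floor_ft, 0.0)
        return np.minimum(0.25 * d, 0.8)

    # Loss at each modeled return period, given the structure's elevation.
    losses = building_value * damage_ratio(flood_depth_ft - first_floor_elevation_ft)

    # Expected annual loss: trapezoid rule over the exceedance curve.
    order = np.argsort(annual_prob)
    p, loss = annual_prob[order], losses[order]
    expected_annual_loss = np.sum(0.5 * (loss[1:] + loss[:-1]) * np.diff(p))

    expense_and_risk_load = 1.4      # assumed loading for expenses and margin
    indicated_premium = expected_annual_loss * expense_and_risk_load
    print(f"Expected annual loss: ${expected_annual_loss:,.0f}; "
          f"indicated premium: ${indicated_premium:,.0f}")

The point of the sketch is the inches: raising or lowering the first-floor elevation by a foot or two moves the expected annual loss, and therefore the rate, dramatically, which is exactly what zone-level rating cannot capture.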

In fact, the new models have essentially made the FEMA flood maps irrelevant in flood loss analysis. We don’t need to evaluate what flood zone the property is in. We just need an address. Homeowners don’t need to spend hundreds of dollars for elevation certificates; the models already have that data stored. Indeed, much of the underwriting required to price flood risk can be handled with two to three additional questions on a standard homeowners insurance application, saving the homeowner, agent and carrier time and frustration. The process we envision would create a distinctive competitive advantage for the enterprising carrier, one that would create and capture real value throughout the distribution chain, if done correctly. This is what disruption looks like before it happens.

In summary, the tools are now available to measure and price flood risk. Capital is flooding (sorry, we couldn’t help ourselves) into the insurance sector, seeking opportunities to be put to work. While we understand the industry’s skepticism about handling flood, the risk can be understood well enough to create products that people desperately need. Insuring flood would be a shot in the arm for an industry that has become stale at offering anything new. Billions of dollars of premium are waiting for the industry to capture. One thing the current data and analytics make clear is this: There are high-, medium- and low-risk locations waiting to be insured based on actuarial methods. As long as flood insurance is being rated by zone (whether FEMA zone or ZIP code), there is cherry-picking to be done.

Who wants to get their ladder up the cherry tree first? And who will be last?

Where is Real Home for Analytics?

One of the fascinating aspects of technology consulting is having the opportunity to see how different organizations address the same issues. These days, analytics is a superb example. Even though every organization needs analytics, they are not all coming to the same conclusions about where “Analytics Central” lies within the company’s structure. In some carriers, marketing picked up the baton first. In others, actuaries have naturally been involved and still are. In a few cases, data science started in IT, with data managers and analytical types offering their services to the company as an internal partner, modeled after most other IT services.

In several situations that we’ve seen, there is no Analytics Central at all. A decentralized view of analytics has grown up in the void – so that every area needing analytics fends for itself. There are a host of reasons this becomes impractical, so we often find these organizations seeking assistance in developing an enterprise plan for data and analytics. This plan accounts for more than just technology modernization and nearly always requires some fresh sketches on the org chart.

Whichever situation represents the analytics picture in your company, it’s important to note that where analytics begins, or where it currently resides, isn’t necessarily where it will end up.

Ten years ago, if you had asked any senior executive where data analytics would reside within the organization, he or she would likely have said, “actuarial.” Actuaries are, after all, the original insurance analytics experts and providers. Operational reporting, statistical modeling, mortality on the life side and pricing and loss development on the P&C side – all of these functions are the lifeblood that keep insurers profitable with the proper level of risk and the correct assumptions for new business. Why wouldn’t actuaries also be the ones to carry the new data analytics forward with the right assumptions and the proper use of data?

Yet, when I was invited to speak at a big data and analytics conference with more than 100 insurance executives and interested parties recently, there was not one actuary in attendance. I don’t know why — maybe because it was quarter-end — but I can only assume that, even though actuaries may want to be involved, their day jobs get in the way. Quarterly reserve reviews, important loss development analysis and price adequacy studies can already consume more time than actuaries have. In many organizations, the actuarial teams are stretched so thin they simply don’t have the bandwidth to participate in modeling efforts with unclear benefits.

Then there is marketing. One could argue that marketing has the most to gain from housing the new corps of data scientists. If one looks at analytics from an organizational/financial perspective, marketing ROI could be the fuel for funding the new tools and resources that will grow top-line premium. Marketing also makes sense from a cultural perspective. It is the one area of the insurance organization that is already used to blending the creative with the analytical, understanding the value of testing methods and messages and even the ancillary need to provide feedback visually.

The list of possibilities can go on and on. One could make a case for placing analytics in the business, keeping it under IT, employing an out-of-house partner solution, etc. There are many good reasons for all of these, but I suspect that most analytics functions will end up in a structure all their own. That’s where we’ll begin “Where is the Real Home for Analytics, Part II” in two weeks.

How to Prevent IRS Issues for Captives

A regulator of captive insurance is responsible for many aspects of the business of captive insurance companies. He or she must coordinate the application process for obtaining a license, including the financial analysis and financial examination of each captive insurance company. The regulator is also a key marketing person in promoting the domicile as a favorable place to do business, thus fostering economic development for the state.

The captive regulator is not, however, a tax adviser. No statute or regulation in any domestic domicile requires an analysis of the potential tax status of the captives under consideration or under regulation. If an application complies with the stated statutory and regulatory requirements, the regulator must favorably consider it and allow the new company to be licensed as an insurance company under state law.

That new insurance company may not, however, be considered an insurance company under federal tax law. The Internal Revenue Service recently listed captives as one of its annual “Dirty Dozen” tax scams, citing “esoteric or improbable risks for exorbitant premiums.” And at least seven captive managers (and therefore their clients) have been targeted for “promoter” audits for allegedly promoting abusive tax transactions.

Yet all of these captives received a license from a regulator, mostly in the U.S. Obviously these regulators did not consider the pricing of the risks to be transferred to the captive, except perhaps at the macro level.

Should the domicile care about the potential tax status of licensed captives? David Provost, Vermont’s Deputy Commissioner of Captive Insurance, has said, “We do not license Section 831(b) captives; we license insurance companies.” While that statement is technically correct, this paper argues that, with respect to small captives, regulators should care about the tax implications of licenses in extreme cases, consistent, of course, with the laws and regulations under which they operate.

Small captives, i.e. those with annual premiums of no more than $1.2 million, can elect under section 831(b) of the Internal Revenue Code to have their insurance income exempt from federal taxation. This provision, combined with certain revenue rulings and case law, creates a strong tax and financial planning incentive to form such a captive insurance company.

This incentive can lead to an “over-pricing” of premiums being paid to the new captive, to maximize the tax benefits on offer. The premiums may be “over-priced” relative to market rates, even after being adjusted for the breadth of policy form, size and age of the insurance company and, in some cases, the uniqueness of the risk being insured by the captive. But “over-priced” in whose eyes?

Insurance regulators are usually more concerned with whether enough premium is being paid to a captive to meet its policy obligations. From that perspective, “too much” premium can never be a bad thing. Indeed, captive statutes and regulations generally use the standard of being “able to meet policy obligations” as the basis of evaluating captive applications or conducting financial reviews. And actuarial studies provided with captive applications generally conclude that “…the level of capitalization plus premiums will provide sufficient funds to cover expected underwriting results.”

These actuarial studies do not usually include a rate analysis, by risk, because none is required by captive statute or regulation.

Small “831(b)” captives, therefore, may easily satisfy the financial requirements set forth in captive statutes and regulations. If, however, the Internal Revenue Service finds on audit that the premiums paid to that captive are “unreasonable,” then the insured and the captive manager may face additional taxes and penalties, and the captive may be dissolved, to the loss of the domicile.

And, as has happened recently, the IRS may believe that a particular captive manager has consistently over-priced the risk being transferred to its captives and may initiate a “promoter” audit covering all of those captives. Such an action could result in unfavorable publicity for the domiciles that approved those captive applications, even though the regulators followed their own rules and regulations to the letter.

It is that risk of broad bad publicity that should encourage regulators to temper the rush to license as many captives as possible. There should be some level of concern for the “reasonableness” of the premiums being paid to the captives.

One helpful step would be to change captive statutes or regulations to require that actuarial feasibility studies include a detailed rate analysis. Such an analysis would compare proposed premium rates with those of the marketplace and offer specific justifications for any large deviations from market. (Given the competition among jurisdictions for captive business, such a change would only be possible if every domicile acted together, eliminating the fear that a domicile would lose its competitive edge by acting alone.)

Absent such a change, however, regulators still have the power to stop applications that do not pass the “smell test.” Most captive statutes require each applicant to file evidence of the “overall soundness” of its plan of operation, which would logically include its proposed premiums. If the premiums seem unreasonably high for the risks being assumed, the plan of operation may not be “sound,” in that it might face adverse results upon an IRS audit.

Regulators are not actuaries and often have had little or no underwriting experience. They, therefore, could not and should not “nit-pick” a particular premium or coverage. But some applications may be so egregious on their face that even non-insurance people can legitimately question the efficacy of the captive’s business plan.

Insurance professionals know from both experience and nationally published studies that the cost of risk for most companies is less than 2% of revenue. “Cost of risk” includes losses not covered by traditional third-party insurance, which are generally the type of losses covered by “small” captive insurance companies.

If a captive regulator receives an application in which the “cost” of coverage by that captive is, say, 10% to 12% or more of the revenue of the insured, alarm bells should go off. That captive certainly would have plenty of assets to cover its policy obligations! But in the overall scheme of things, including the real world of taxation, that business plan is not likely “sound.”
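
As a rough illustration of this screening logic (the thresholds and example figures below are assumptions, not any regulator’s standard), a simple premium-to-revenue check might look like this:

    # Hedged sketch of the "smell test": flag captive applications whose
    # premium-to-revenue ratio sits far above the roughly 2%-of-revenue
    # cost-of-risk benchmark cited above. Thresholds are illustrative only.
    def premium_screen(proposed_premium: float, insured_revenue: float,
                       benchmark: float = 0.02, alarm: float = 0.10) -> str:
        ratio = proposed_premium / insured_revenue
        if ratio <= benchmark:
            return f"{ratio:.1%}: in line with typical cost of risk"
        if ratio < alarm:
            return f"{ratio:.1%}: above benchmark; ask for supporting rate analysis"
        return f"{ratio:.1%}: alarm; require independent rate analysis or a revised plan"

    # Example: $1.2 million of proposed premium against $10 million of revenue.
    print(premium_screen(1_200_000, 10_000_000))

A check this crude obviously cannot judge the adequacy of any individual rate; it simply separates applications that deserve routine review from those that warrant the additional scrutiny described below.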

At that point, the regulator has a choice of rejecting the applicant, requiring a change in the business plan and premiums or demanding additional support for the proposed plan. We are aware of one case in which the captive regulator, after receiving an application whose premiums did not appear reasonable, required the applicant to provide a rate analysis from an independent actuary.

A rate analysis is not, of course, a guarantee that the IRS will find the premiums acceptable on audit. No one can expect guarantees, but a properly done rate analysis has a better chance of assuring all the parties that the captive has been properly formed as a real insurance company and not simply as a way to reduce the taxable income of the insured and its owners.

Captive insurance regulators have a big job, particularly as the pace of captive formations increases. To protect the domicile from appearing on the front page of the Wall Street Journal, the regulator must consider all aspects of the proposed captive’s business, including, in extreme cases, its vulnerability to adverse federal tax rulings.