
Insurance at a Tipping Point (Part 2)

This is the second in a series of three articles. The first is here.

With the entire insurance industry at a tipping point, where many of the winners and losers will be determined in the next five to 10 years, it’s important to think through all the key strategic factors that will determine those outcomes. Those factors are what we call STEEP: social, technological, environmental, economic and political.

In this article, we’ll take a look at all five.

Social: The Power of Connections

The shifts in customer expectations present challenges for life insurers, many of which are caught in a product trap in which excessive complexity reduces transparency and increases the need for advisers. This creates higher distribution costs.

A possible solution lies in models that shift the emphasis from life benefits to promoting health, well-being and quality of life. In a foretaste of developments ahead, a large Asian life insurer has shifted its primary mission from insurance to helping people lead healthier lives. This is transforming the way the company engages with its customers. Crucially, it’s also giving a renewed sense of purpose and value to the group’s employees and distributors.

Further developments that could benefit both insurers and customers include knowledge sharing among policyholders. One insurer enables customers to share their health data online to help bring people with similar conditions together and help the company build services for their needs. Similarly, a DNA analysis company provides insights on individual conditions and creates online communities to pool the personal data of consenting contributors to support genetic studies.

A comparable shift in business models can be seen in the development of pay-as-you-drive coverage within the P&C sector. In South Africa, where this model is well advanced, insurers are realizing higher policyholder retention and lower claims costs.

This kind of monitoring is now expanding to home and commercial equipment. These developments are paving the way for a move beyond warranty or property insurance to an all-round care, repair and protection service. These offerings move client engagement from an annual transaction to something embedded in customers' everyday lives. Agents could play an important role in helping to design aggregate protection and servicing.

In banking, we’ve seen rapid growth in peer-to-peer lending; the equivalents in insurance are the affinity groups looking to exercise their buying power, pool resources and even self-insure. While most of these schemes cover property, the growth in carpooling could see them play an increasing role within auto insurance.

Technological: Shaping the Organization Around Information Advantage

More than 70% of insurance participants in our 2014 Data and Analytics Survey say that big data or analytics have changed the way they make decisions. But many insurers still lack the vision and organizational integration to make the most of these capabilities. Nearly 40% of the participants in the survey see “limited direct benefit to my kind of role” from this analysis, and more than 30% believe that senior management lacks the necessary skills to make full use of the information.

The latest generation of models is able to analyze personal, social and behavioral data to gauge immediate demands, risk preferences, the impact of life changes and longer-term aspirations. In pension planning, for example, these capabilities can be part of an interactive offering that enables customers to better understand and balance the financial trade-off between how much they want to spend now and their desired standard of living when they retire. In turn, the capabilities could eliminate product boundaries as digital insights, along with possible agent input, provide the basis for customized solutions that draw together mortgages, life coverage, investment management, pensions, equity release, tax and inheritance planning. Once the plan is up and running, it could adjust automatically to changes in income and other circumstances.
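
To make the trade-off concrete, here is a minimal Python sketch of the kind of calculation such an interactive offering might surface; the return and annuity-rate figures are hypothetical assumptions, not anything from the article or a real product.

```python
# A minimal, illustrative sketch (not any insurer's actual model) of the
# spend-now vs. retire-well trade-off such an interactive offering might
# surface. All rates and figures are hypothetical assumptions.

def projected_retirement_income(monthly_saving: float,
                                years_to_retirement: int,
                                annual_return: float = 0.04,
                                annuity_rate: float = 0.05) -> float:
    """Future value of regular monthly saving, converted to a yearly income."""
    months = years_to_retirement * 12
    r = annual_return / 12  # monthly growth rate
    # Future value of an ordinary annuity (level monthly contributions).
    future_value = monthly_saving * (((1 + r) ** months - 1) / r)
    return future_value * annuity_rate  # annual income the pot could sustain

# Show how spending less today changes the retirement outcome.
for saving in (300, 500, 800):
    income = projected_retirement_income(saving, years_to_retirement=30)
    print(f"Save {saving}/month -> roughly {income:,.0f}/year in retirement")
```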

Reactive to preventative

The increasing use of sensors and connected devices as part of the Internet of Things offers ever more real-time and predictive data, which has the potential to move underwriting from “what has happened” to “what could happen” and hence to enable more effective preemption of risks and losses. This in turn could open up opportunities for insurers to gravitate from reactive claims payer to preventative risk adviser.

As in many other industries, the next frontier for insurers is to move from predictive to prescriptive analytics (see Figure 2). Prescriptive analysis would help insurers to anticipate not only what will happen, but also when and why, so they are in a better position to prevent or mitigate adverse events. Insurers could also use prescriptive analytics to improve the sales conversion ratio in automated insurance underwriting by continually adjusting price and coverage based on predicted take-up and actual deviations from it. Extensions of these techniques can be used to model the interaction between different risks to better understand why adverse events can occur, and hence how to develop more effective safeguards.

Figure 2: From predictive to prescriptive analytics
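
As a rough illustration of the continual price-and-coverage adjustment described above, the sketch below nudges price as actual take-up deviates from the predicted take-up. The proportional rule, sensitivity parameter and figures are assumptions for illustration, not a production underwriting model.

```python
# Hypothetical sketch of prescriptive price adjustment in automated
# underwriting: nudge price as actual take-up deviates from the
# predicted take-up. A real system would use a calibrated demand model;
# the proportional rule and sensitivity here are illustrative only.

def adjust_price(price: float,
                 predicted_take_up: float,
                 actual_take_up: float,
                 sensitivity: float = 0.5) -> float:
    """Raise price when conversions beat the forecast; lower it when
    they fall short, in proportion to the deviation."""
    deviation = actual_take_up - predicted_take_up
    return price * (1 + sensitivity * deviation)

price = 100.0
predicted = 0.20  # model predicts 20% of quotes convert
for actual in (0.25, 0.22, 0.18, 0.20):  # observed conversion per period
    price = adjust_price(price, predicted, actual)
    print(f"actual take-up {actual:.0%} -> adjusted price {price:.2f}")
```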

Environmental: Reshaping Catastrophe Risks and Insured Values

Catastrophe losses have soared since the 1970s. While 2014 had the largest number of events of the past 30 years, losses and fatalities were actually below average. Globally, the use of technology, the availability of data and the ability to locate and respond to disasters in near real time are helping to manage losses and save lives, though there are predictions that potential economic losses will be 160% higher in 2030 than they were in 1980.

Shifts in global production and supply are leading to a sharp rise in value at risk (VaR) in under-insured territories; the $12 billion of losses from the Thai floods of 2011 exemplify this. A 2013 report by the UN International Strategy for Disaster Reduction (UNISDR) and PwC concluded that multinationals’ dependencies on unstable international supply chains now pose a systemic risk to “business as usual.”

Environmental measures to mitigate risk

Moves to mitigate catastrophe risks and control losses are increasing. Organizations, governments and UN bodies are working more closely to share information on the impact of disaster risk. Examples include R!SE, a joint UN-PwC initiative, which looks at how to embed disaster risk management into corporate strategy and investment decisions.

Governments also are starting to develop plans and policies for addressing climatic instability, though for the most part policy actions remain unpredictable, inconsistent and reactive.

Developments in risk modeling

A new generation of catastrophe models is ushering in a transformational expansion in both geographical breadth and underwriting applications. Until recently, cat models primarily concentrated on developed market peak zones (such as Florida windstorm). As the unexpectedly high insurance losses from the 2010 Chilean earthquake and the 2011 Thai floods highlight, this narrow focus has failed to take account of the surge in production and asset values in fast-growth SAAAME markets (South America, Africa, Asia and the Middle East). The new models cover many of these previously non-modeled zones.

The other big difference for insurers is their newfound ability to plug different analytics into a single platform. This makes it possible to see where there may be pockets of untapped capacity or, conversely, hazardous concentrations. The result is much more closely targeted risk selection and pricing.
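
As a minimal sketch of the kind of portfolio view this enables, the following aggregates insured values by modeled zone and flags hazardous concentrations or untapped capacity; the zones, values and capacity threshold are fabricated for illustration.

```python
# Minimal sketch of the portfolio view a single analytics platform
# enables: aggregate insured values by modeled zone and flag hazardous
# concentrations or untapped capacity. Zones, values and the capacity
# threshold are fabricated for illustration.

from collections import defaultdict

policies = [
    {"zone": "Florida windstorm", "insured_value": 40_000_000},
    {"zone": "Thailand flood", "insured_value": 5_000_000},
    {"zone": "Florida windstorm", "insured_value": 75_000_000},
]

CAPACITY_PER_ZONE = 100_000_000  # illustrative risk appetite per zone

exposure = defaultdict(int)
for policy in policies:
    exposure[policy["zone"]] += policy["insured_value"]

for zone, total in exposure.items():
    if total > CAPACITY_PER_ZONE:
        print(f"{zone}: hazardous concentration ({total:,})")
    elif total < 0.5 * CAPACITY_PER_ZONE:
        print(f"{zone}: untapped capacity ({total:,})")
```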

The challenge is how to build these models into the running of the business. Cat modeling has traditionally been the preserve of a small, specialized team. The new capabilities are supposed to be easier to use and hence open to a much wider array of business, IT and analytical teams. It’s important to determine the kind of talent needed to make best use of these systems, as well as how they will change the way underwriting decisions are made.

Emerging developments include new monitoring and detection systems, which draw on multiple fixed and drone sensors.

Challenges for evaluating and pricing risk

Beyond catastrophe risks are disruptions to asset/insured values resulting from constraints on water, land and other previously under-evaluated risk factors. There are already examples of industrial plants that have had to close because of limited access to water.

Economic: Adapting to a Multipolar World

Struggling to sustain margins

The challenging economic climate has held back discretionary spending on life, annuities and pensions, with the impact being compounded by low interest rates and the resulting difficulties in sustaining competitive returns for policyholders. The keys to sustaining margins are likely to be simple, low-cost, digitally distributed products for the mass market and use of the latest risk analytics to help offer guarantees at competitive prices.

The challenges facing P&C insurers center on low investment returns and a softening market. Opportunities to seek out new customers and boost revenues include strategic alliances. Examples could include affinity groups, manufacturers or major retailers. A further possibility is that one of the telecoms or Internet giants will want a tie-up with an insurer to help it move into the market.

More than 30% of insurance CEOs now see alliances as an opportunity to strengthen innovation. Examples include the partnership between a leading global reinsurer and a software group, which aims to provide more advanced cyber risk protection for corporations.

Surprisingly, only 10% of insurance CEOs are looking to partner with start-ups, even though such alliances could provide valuable access to the new ideas and technologies they need.

SAAAME growth

Growth in SAAAME insurance markets will continue to vary. Slowing growth in some major markets, notably Brazil, could hold back expansion. In others, notably India, we are actually seeing a decline in life, annuity and pension take-up as a result of the curbs on commissions for unit-linked insurance plans (ULIPs). Further development in capital markets will be necessary to encourage savers to switch their deposits to insurance products.

As the reliance on agency channels adds to costs, there are valuable opportunities to offer cost-effective digital distribution. Successful models of inclusion include an Indian national health insurance program, which is aimed at poorer households and operates through a public/private partnership. More than 30 million households have taken up the smart cards that provide them with access to hospital treatment.

The already strong growth (10% a year) in micro-insurance is also set to increase, drawing on models developed within micro-credit. The challenge for insurers is to make products sufficiently affordable and comprehensible for consumers who have little or no familiarity with the concept of insurance.

Rather than waiting for a market-wide alignment of data and pricing, some insurers have put people on the ground to build up the necessary data sets, often working in partnership with governments, regional and local development authorities, banks and local business groups.

Urbanization

The urban/rural divide may actually be more relevant to growth opportunities ahead than the emerging/developed market divide. In 1800, barely one in 50 people lived in cities. By 2009, urban dwellers had become a majority of the global population for the first time. Now, every week, 1.5 million people are added to the urban population, the bulk of them in SAAAME markets.

Cities are the main engines of the global economy, with 50% of global GDP generated in the world’s 300 largest metropolitan areas. The result is more wealth to protect. Infrastructure development alone will generate an estimated $68 billion in premium income between now and 2030. Urban citizens will be more likely to be exposed to insurance products and have access to them. Urbanization is also likely to increase purchases of life, annuity and pension products, as people migrating into cities have to make individual provision for the future rather than relying on extended family support.

Yet as the size and number of mega-metropolises grow, so does the concentration of risk. Key areas of exposure go beyond property and catastrophe coverage to include the impact of air pollution and poor water quality and sanitation on health.

Tackling under-insurance

A Lloyd’s report comparing the level of insurance penetration and natural catastrophe losses in countries around the world found that 17 fast-growth markets had an annualized insurance deficit of $168 billion, creating threats to sustained economic growth and the ability to recover from disasters.

Political: Harmonization, Standardization and Globalization of the Insurance Market

Government in the tent

At a time when all financial services businesses face considerable scrutiny, strengthening the social mandate through closer alignment with government goals could give insurers greater freedom. Insurers also could be in a stronger position to attract quality talent at a time when many of the brightest candidates are looking for more meaning from their chosen careers.

Government and insurers can join forces in the development of effective retirement and healthcare solutions (although there are risks). Further opportunities include a risk partnership approach to managing exposures that neither insurers nor governments have either the depth of data or financial resources to cover on their own, notably cyber, terrorism and catastrophe risks.

Impact of regulation

Insurers have never had to deal with an all-encompassing set of global prudential regulations comparable to the Basel Accords governing banks. But this is what the Financial Stability Board (FSB) and its sponsors in the G20 now want to see as the baseline requirements for not just the global insurers designated as systemically risky, but also a tier of internationally active insurance groups.

The G20’s focus on insurance regulation highlights the heightened politicization of financial services. Governments want to make sure that taxpayers no longer have to bail out failing financial institutions. The result is an overhaul of capital requirements in many parts of the world and a new basic capital requirement for global systemically important insurers (G-SIIs). The other game-changing development is the emergence of a new breed of cross-state/cross-border regulator, which has been set up to strengthen coordination of supervision, crisis management and other key topics. These include the European Insurance and Occupational Pensions Authority (EIOPA) and the Federal Insurance Office (FIO) in the U.S.

Dealing with these developments requires a mechanism capable of looking beyond basic operational compliance to assess how new regulation will affect the strategy and structure of the organization, and of using this assessment to develop a clear and coherent company-wide response.

Technology will allow risk to be analyzed in real time, and predictive models would enable supervisors to identify and home in on areas in need of intervention. Regulators would also be able to tap into the surge in data and analysis within supervised organizations, creating the foundations for machine-to-machine regulation.

A more unstable world

From the crisis in Ukraine to the rise of ISIS, instability is a fact of life. Pressure on land and water, as well as oil and minerals, is intensifying competition for strategic resources and potentially bringing states into conflict. The way these disputes are playing out is also impinging on corporations to an ever-greater extent, whether through trade sanctions or state-directed cyber-attacks.

Businesses, governments and individuals also need to understand the potential causes of conflict and their ramifications and develop appropriate contingency planning and response. At the very least, insurers should seek to model these threats and bring them into their overall risk evaluations. For some, this will be an important element of their growing role as risk advisers and mitigators. Investment firms are beginning to hire ex-intelligence and military figures as advisers or calling in dedicated political consultancies as part of their strategic planning. More insurers are likely to follow suit.

The final article in this series will look at scenarios that could play out for insurers and will lay out a way to formulate an effective strategy. If you want a copy of the report from which these articles are excerpted, click here.

A How-To on Nurse Case Management

Nurse case management (NCM) has a powerful impact on workers’ compensation claim cost and outcome. Positive results of nurse involvement have long been anecdotally accepted, but widespread evidence of nurse impact has not emerged, and objective proof of value is still missing. Several factors account for this.

Inconsistent Referrals

For one thing, NCMs are usually considered an adjunct to the claims process, called upon in sticky situations. Too often, referral to nurses is a last resort rather than an integral and standardized part of claim management. When claims adjusters have sole responsibility for referring to NCMs, the process can be subjective, uneven and therefore unmeasurable.

Besides receiving referrals for sundry issues at different points in the course of a claim, nurses have not clearly articulated their case management interventions, and claims adjusters sometimes misunderstand their approach. However, consistent referrals and standardized procedures can bring about major change.

Consistent referrals

Referrals to NCM should be made based on specific medical conditions in claims, such as comorbidities like diabetes or problematic injuries like low back strains, that tend to morph into complexity and high cost. Specific risky situations found in claims data should automatically trigger NCM notification.

A recent article published in Business Insurance, “Nurses a linchpin in reducing workers’ comp costs,” points out how Liberty Mutual has developed a tool that notifies claims adjusters of cases that would most benefit from a nurse’s involvement. Decision burdens for claims adjusters are eliminated. Referrals to NCM are automatic, based on specific high-risk situations found in the claim. Inconsistency disappears, and several benefits flow from this approach.
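
A minimal sketch of what such rule-based triggering might look like follows. The Liberty Mutual tool itself is proprietary, so every condition, threshold and field name here is a hypothetical stand-in.

```python
# Hypothetical sketch of rule-based NCM referral triggering along the
# lines described above. Every condition, threshold and field name
# here is an assumption for illustration.

HIGH_RISK_COMORBIDITIES = {"diabetes", "obesity", "hypertension"}
HIGH_RISK_INJURIES = {"low back strain", "shoulder strain"}

def should_refer_to_ncm(claim: dict) -> bool:
    """Flag a claim for automatic nurse case manager referral."""
    if set(claim.get("comorbidities", [])) & HIGH_RISK_COMORBIDITIES:
        return True
    if claim.get("injury_type") in HIGH_RISK_INJURIES:
        return True
    return claim.get("days_of_disability", 0) > 90  # stalled recovery

claim = {"injury_type": "low back strain", "comorbidities": ["diabetes"]}
if should_refer_to_ncm(claim):
    print("Automatic NCM referral triggered")
```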

Process standardization

Dissecting and categorizing an operational process yields a better understanding of its components and their relative importance. Review the data to determine which medical conditions in claims result in longer disability, lower rates of return to work and, of course, higher costs. Select the conditions in claims that should activate an NCM referral.

An example is a mental health diagnosis appearing in the data well into the claim process. Such a diagnosis emerging during a claim for a physical injury, such as a low back strain, is a strong indicator of trouble: the injured worker is not progressing toward recovery. However, the only way to know this diagnosis has occurred in a claim is to electronically monitor claims on a continuous basis.

Data monitoring

To identify problematic medical situations in claims and intervene early enough to affect the outcome, the data should be monitored continually. Clearly, this is an electronic, not a human, function. When the data in a claim matches a select indicator, an automatic notice is sent to the appropriate person.
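
A minimal sketch of this monitoring loop, with hypothetical field names and watch-list codes, might look like this:

```python
# Minimal sketch of continuous claim monitoring: scan each claim's
# latest diagnosis codes against a watch list and send a notice when a
# new match appears. Field names, the example codes and the
# notification step are hypothetical placeholders.

WATCH_LIST = {"F32": "major depressive disorder", "F41": "anxiety disorder"}

def notify_ncm(claim_id: str, description: str) -> None:
    # Stand-in for an email, queue message or workflow task.
    print(f"Claim {claim_id}: new {description} diagnosis -> notify NCM")

def scan_claim(claim: dict, already_seen: set) -> None:
    """Alert once per claim for each newly observed watch-list diagnosis."""
    for code in claim["diagnosis_codes"]:
        key = (claim["claim_id"], code)
        if code in WATCH_LIST and key not in already_seen:
            already_seen.add(key)
            notify_ncm(claim["claim_id"], WATCH_LIST[code])

seen: set = set()
scan_claim({"claim_id": "WC-1041", "diagnosis_codes": ["S39.012", "F32"]}, seen)
```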

Standardized procedures

Catching high-risk conditions in claims is just the first step. NCM procedures must be established to guide responses to each situation triggered. Standardized procedures should describe what the NCM should evaluate and suggest possible interventions. Such processes not only explain the NCM contribution; they also assist in documentation and are the basis for defining value.

Measuring value

NCM has been under-appreciated in the industry because measuring apples-to-apples cost benefit has been impractical. When claims adjusters decide case by case whether to refer to NCMs and individual nurses create their own methodologies, the variables are endless and little is measurable.

In contrast to the subjective approach, specific conditions in claims found through continuous data monitoring can automatically trigger a referral to the NCM. In response, the nurse is guided by the standard procedures of the organization. When referrals are based on specific conditions in claims and response procedures are delineated, outcomes can be analyzed and objectively scored.
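
As a sketch of the kind of objective scoring this enables, the following compares average cost and duration for trigger-matched claims with and without NCM involvement; the records are fabricated examples, not real results.

```python
# Illustrative-only sketch of the apples-to-apples comparison that
# standardized triggers and procedures make possible. The claim
# records below are fabricated examples, not real outcomes.

from statistics import mean

claims = [
    {"ncm_referred": True,  "total_cost": 18_000, "days_open": 120},
    {"ncm_referred": True,  "total_cost": 22_000, "days_open": 150},
    {"ncm_referred": False, "total_cost": 35_000, "days_open": 260},
    {"ncm_referred": False, "total_cost": 41_000, "days_open": 300},
]

for referred in (True, False):
    group = [c for c in claims if c["ncm_referred"] == referred]
    print(f"NCM referred={referred}: "
          f"avg cost {mean(c['total_cost'] for c in group):,.0f}, "
          f"avg days open {mean(c['days_open'] for c in group):.0f}")
```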

The Science (and Art) of Data, Part 2

Given the high need and growing demand for data scientists, there are definitely not enough of them. Accordingly, it is important to consider how an insurer might develop a core talent pool of data scientists. As is often the case when talent is in short supply, acquiring (i.e., buying) data scientist talent is an expensive but fairly quick option. It may make sense to consider hiring one or two key individuals who could provide the center of gravity for building out a data science group. A number of universities have started offering specialist undergraduate and graduate curricula focused on data science, which should help address growing demand relatively soon. Another interim alternative is to “rent” data scientists through a variety of different means: crowdsourcing (e.g., Kaggle), hiring freelancers, using new technology vendors and their specialists or consulting groups to solve problems, and engaging consulting firms that are creating these groups in-house.

The longer term and more enduring solution to the shortage of data scientists is to “build” them from within the organization, starting with individuals who possess at least some of the necessary competencies and who can be trained in the other areas. For example, a business architect who has a computational background and acts as a liaison between business and technology groups can learn at least some of the analytical and visualization techniques that typify data scientists. Similarly, a business intelligence specialist who has sufficient understanding of the company’s business and data environment can learn the analytical techniques that characterize data scientists. However, considering the extensive mathematical and computational skills necessary for analytics work, it arguably would be easier to train an analytics specialist in a particular business domain than to teach statistics and programming to someone who does not have the necessary foundation in these areas.

Another alternative for creating a data science office is to build a team of individuals who have complementary skills and collectively possess the core competencies. These “insight teams” would address high-value business issues within tight time schedules. They initially would form something like a skunk works and rapidly experiment with new techniques and new applications to create practical insights for the organization. Once the team is fully functional and proving its worth to the rest of the organization, then the organization can attempt to replicate it in different parts of the business.

However, the truth is there is no silver bullet for addressing the current shortage of data scientists. For most insurers, the most effective near-term solution realistically lies in optimizing existing skills and in team-based approaches to start tackling business challenges.

Designing a data science operating model: Customizing the structure to the organization’s needs

To develop a data science function that operates in close tandem with the business, it is important that its purpose be to help the company achieve specific market goals and objectives. When designing the function, ask yourself these four key strategic questions:

  • Value proposition: How does the company define its competitive edge?  Local customer insight? Innovative product offerings? Distribution mastery? Speed?
  • Firm structure: How diverse are local country/divisional offerings and go-to-market structures, and what shared services are appropriate? Should they be provided centrally or regionally?
  • Capabilities, processes and skills: What capabilities, processes and skills does each region require? What are the company’s inherent strengths in these areas? Where does the company want to be best-in-class, and where does it want to be best-in-cost?
  • Technology platform: What are the company’s technology assets and constraints?

There are three key considerations when designing an enterprisewide data science structure: (a) degree of control necessary for effectively supporting business strategy; (b) prioritization of costs to align them with strategic imperatives; and (c) degree of information maturity of the various markets or divisions in scope.

Determining trade-offs: Cost, decision control and maturity

Every significant process and decision should be evaluated along four parameters: (a) need for central governance, (b) need for standardization, (c) need for creating a center of excellence and (d) need for adopting local practices. The figure below illustrates how to optimize these parameters in the context of cost management, decision control and information maturity.

This approach encourages the creation of a flexible and responsive hub-and-spoke model that centralizes in the hubs key decision science functions needing greater governance and control, and that harnesses unique local market strengths in centers of excellence. It localizes in regional or country-specific spokes those functions or outputs that require local market data inputs while adhering to central models and structures.
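
One way to operationalize the four parameters is a simple scoring pass over each data science function, as in the hypothetical sketch below; the ratings, weighting and thresholds are illustrative assumptions rather than a formal methodology.

```python
# Hypothetical sketch of scoring a data science function along the four
# parameters named above to suggest where it might sit in a
# hub-and-spoke model. Ratings, weights and thresholds are illustrative
# assumptions, not a formal methodology.

def suggested_placement(scores: dict) -> str:
    """scores: 0-5 ratings for 'governance', 'standardization',
    'center_of_excellence' and 'local_practices'."""
    if scores["center_of_excellence"] >= 4:
        return "center of excellence"
    central_pull = scores["governance"] + scores["standardization"]
    local_pull = 2 * scores["local_practices"]
    return "central hub" if central_pull > local_pull else "regional spoke"

pricing_models = {"governance": 5, "standardization": 4,
                  "center_of_excellence": 2, "local_practices": 1}
print(suggested_placement(pricing_models))  # -> central hub
```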

Designing a model in a systematic way that considers these enterprisewide business goals has several tangible benefits. First, it will help to achieve an enterprisewide strategy in a cost-effective, timely and meaningful way. Second, it will maximize the impact of scarce resources and skill sets. Third, it will encourage a well-governed information environment that is consistent and responsive throughout the enterprise. Fourth, it will promote agile decision-making at the local market level, while providing the strength of heavy-duty analytics from the center. Lastly, it will mitigate the expensive risks of duplication and redundancy, inconsistency and inefficiency that can result from disaggregation, delayed decision-making and lack of appropriate skill sets and insights.