
How Machine Learning Changes the Game

Machine learning will improve compliance, cost structures and competitiveness -- but insurers must overcome cultural obstacles.

Insurance executives can be excused for having ignored the potential of machine learning until now. Truth be told, the idea almost seems like something out of a 1980s sci-fi movie: Computers learn from mankind's mistakes and adapt to become smarter, more efficient and more predictable than their human creators. But this is no Isaac Asimov yarn; machine learning is a reality. And many organizations around the world are already taking full advantage of their machines to create new business models, reduce risk, dramatically improve efficiency and drive new competitive advantages. The big question is why insurers have been so slow to start collaborating with the machines.

Smart machines

Essentially, machine learning refers to a set of algorithms that use historical data to predict outcomes. Most of us use machine learning processes every day. Spam filters, for example, use historical data to decide whether emails should be delivered or quarantined. Banks use machine learning algorithms to monitor for fraud or irregular activity on credit cards. Netflix uses machine learning to serve recommendations to users based on their viewing history.

In fact, organizations and academics have been working away at defining, designing and improving machine learning models and approaches for decades. The concept was originally floated back in the 1950s, but – with no access to digitized historical data and few commercial applications immediately evident – much of the development of machine learning was left to academics and technology geeks. For decades, few business leaders gave the idea much thought.

Machine learning also brings with it a whole new vocabulary: terms such as "feature engineering," "dimensionality reduction" and "supervised and unsupervised learning," to name a few. As with all new movements, an organization must be able to bridge the two worlds of data science and business to generate value.

Driven by data

Much has changed. Today, machine learning has become a hot topic in many business sectors, fueled, in large part, by the increasing availability of data and low-cost, scalable cloud computing. For the past decade or so, businesses and organizations have been feverishly digitizing their data and records – building mountains of historical data on customers, transactions, products and channels. And now they are setting their minds toward putting it to good use.

The emergence of big data has also done much to propel machine learning up the business agenda. Indeed, the availability of masses of unstructured data – everything from weather readings through to social media posts – has not only provided new data for organizations to comb through, it has also allowed businesses to start asking different questions of different data sets to achieve differentiated insights.

The continuing drive for operational efficiency and improved cost management has also catalyzed renewed interest in machine learning. Organizations of all stripes are looking for opportunities to be more productive, more innovative and more efficient than their competitors. Many now wonder whether machine learning can do for information-intensive industries what automation did for manual-intensive ones.

A new playing field

For the insurance sector, we see machine learning as a game-changer. The reality is that most insurance organizations today are focused on three main objectives: improving compliance, improving cost structures and improving competitiveness.
It is not difficult to envision how machine learning will form (at least part of) the answer to all three.

Improving compliance: Today's machine learning algorithms, techniques and technologies can be used on much more than just hard data like facts and figures. They can also be used to analyze information in pictures, videos and voice conversations. Insurers could, for example, use machine learning algorithms to better monitor and understand interactions between customers and sales agents to improve their controls over the mis-selling of products.

Improving cost structures: With a significant portion of an insurer's cost structure devoted to human resources, any shift toward automation should deliver significant cost savings. Our experience working with insurers suggests that – by using machines instead of humans – insurers could cut their claims processing time from a number of months to a matter of minutes. What is more, machine learning is often more accurate than humans, meaning that insurers could also reduce the number of denials that result in appeals they may ultimately need to pay out.

Improving competitiveness: While reduced cost structures and improved efficiency can certainly lead to competitive advantage, there are many other ways that machine learning can give insurers the competitive edge. Many insurance customers, for example, may be willing to pay a premium for a product that guarantees frictionless claim payout without the hassle of having to make a call to the claims team. Others may find that they can enhance customer loyalty by simplifying re-enrollment and client on-boarding processes to just a handful of questions.

Overcoming cultural differences

It is surprising, therefore, that insurers are only now recognizing the value of machine learning. Insurance organizations are founded on data, and most have already digitized existing records. Insurance is also a resource-intensive business; legions of claims processors, adjustors and assessors are required to pore over the thousands – sometimes millions – of claims submitted in the course of a year. One would therefore expect the insurance sector to be leading the charge toward machine learning. But it is not.

One of the biggest reasons insurers have been slow to adopt machine learning clearly comes down to culture. Generally speaking, insurers are not viewed as "early adopters" of new technologies and approaches, preferring instead to wait until technologies have matured through adoption in other sectors. However, with everyone from governments through to bankers now using machine learning algorithms, this obstacle is quickly falling away.

The risk-averse culture of most insurers also dampens their willingness to experiment and – if necessary – fail in the quest to uncover new approaches. The challenge is that machine learning is all about experimentation and learning from failure; sometimes organizations need to test dozens of algorithms before they find the most suitable one for their purposes. Until "controlled failure" is no longer seen as a career-limiting move, insurance organizations will shy away from testing new approaches.

Insurance organizations also suffer from a cultural challenge common in information-intensive sectors: data hoarding. Indeed, until recently, common wisdom within the business world suggested that those who held the information also held the power.
Today, many organizations are starting to realize that it is actually those who share the information who have the most power. As a result, many organizations are now keenly focused on moving toward a "data-driven" culture that rewards information sharing and collaboration and discourages hoarding.

Starting small and growing up

The first thing insurers should realize is that this is not an arms race. The winners will probably not be the organizations with the most data, nor will they likely be the ones that spent the most money on technology. Rather, they will be the ones that took a measured and scientific approach to building their machine learning capabilities and capacities and – over time – found new ways to incorporate machine learning into ever-more aspects of their business.

Insurers may want to embrace the idea of starting small. Our experience and research suggest that – given the cultural and risk challenges facing the insurance sector – insurers will want to start by developing a "proof of concept" model that can safely be tested and adapted in a risk-free environment. Not only will this allow the organization time to improve and test its algorithms, it will also help the designers to better understand exactly what data is required to generate the desired outcome. More importantly, perhaps, starting with pilots and proofs of concept will also give management and staff the time they need to get comfortable with the idea of sharing their work with machines. It will take executive-level support and sponsorship, as well as a keen focus on key change management requirements.

Take the next steps

Recognizing that machines excel at routine tasks and that algorithms learn over time, insurers will want to focus their early proof-of-concept efforts on processes or assessments that are widely understood and add low value. The more decisions the machine makes and the more data it analyzes, the more prepared it will be to take on more complex tasks and decisions.

Only once the proof of concept has been thoroughly tested and potential applications are understood should business leaders start to think about developing the business case for industrialization (which, to succeed in the long term, must include appropriate frameworks for the governance, monitoring and management of the system). While this may – on the surface – seem like just another IT implementation plan, the reality is that machine learning should be championed not by IT but rather by the business itself. It is the business that must decide how and where machines will deliver the most value, and it is the business that owns the data and processes that machines will take over. Ultimately, the business must also be the one that champions machine learning.

All hail, machines!

At KPMG, we have worked with a number of insurers to develop their proof-of-concept machine learning strategies over the past year, and we can say with absolute certainty that the Battle of the Machines in the insurance sector has already started. The only other certainty is that those that remain on the sidelines will likely suffer the most as their competitors find new ways to harness machines to drive increasing levels of efficiency and value. The bottom line is that the machines have arrived. Insurance executives should be welcoming them with open arms.
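To make the "start small" advice concrete, here is a minimal, purely illustrative sketch of the kind of proof-of-concept model described above: a supervised classifier trained on historical claims to flag routine cases for automated handling, leaving everything else with human adjusters. The data is synthetic, and the feature names, labeling rule and library choice (Python with scikit-learn) are assumptions made for illustration, not anything prescribed in the article.

```python
# Illustrative proof-of-concept only: a supervised classifier that learns
# from (synthetic) historical claims which ones were routine enough to be
# handled automatically. Feature names, thresholds and the labeling rule
# are invented for this sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000

claim_amount = rng.lognormal(mean=8.0, sigma=1.0, size=n)   # claim size in $
policy_age_days = rng.integers(30, 3_650, size=n)            # tenure of the policy
prior_claims = rng.poisson(0.3, size=n)                       # earlier claims on record

# Synthetic label: 1 = claim was settled without dispute ("routine").
routine = ((claim_amount < 5_000) & (prior_claims == 0)).astype(int)

X = np.column_stack([claim_amount, policy_age_days, prior_claims])
X_train, X_test, y_train, y_test = train_test_split(
    X, routine, test_size=0.25, random_state=0)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1_000))
model.fit(X_train, y_train)

# In a pilot, only high-confidence "routine" predictions would be
# auto-approved; everything else stays with a human adjuster.
probs = model.predict_proba(X_test)[:, 1]
print(f"share flagged for automated handling: {(probs > 0.9).mean():.1%}")
print(classification_report(y_test, model.predict(X_test)))
```

A real pilot would replace the synthetic data with the insurer's own digitized claims history and would sit inside the governance, monitoring and change-management framework discussed above.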

Gary Richardson

Gary Richardson leads a team of data scientists and data engineers at KPMG focused on the agile development of data science solutions. The team's emphasis is on industrializing data science solutions and getting the science into business-as-usual functions.

Google Applies Pressure to Innovate

Agents and carriers have a level of expertise that is unmatched by the Googles of the world, but it is in danger of being wasted.

This article was first published at re/code.

It's a common thread in nearly every industry: Innovation occurs when consumers' growing needs and expectations converge with intense competition. It's no surprise, then, that insurance — not exactly known for being on the forefront of technology — is one of the last remaining industries to innovate and fully embrace data, analytics and customer communication technologies. Insurance is a complex purchase with a convoluted ecosystem and ever-changing regulatory requirements that have kept the industry in a well-protected bubble, shielded from external competition for decades. Now, in 2015, the announcement of Google Compare for auto insurance pushes the industry to innovate from a technology standpoint but, most importantly, from a structural standpoint, by changing the way insurance companies interact with their customers. The reasons below outline why Google has the greatest chance to succeed where others have not.

A Lesson From Other Industries

Google has previously disrupted numerous industries to great success — think health, travel and navigation — mostly because of its dominance in search. Many of Google's consumer-facing businesses have followed as logical next steps in the Google search process. For example, do you want to use Google to search for the best insurance company, or would you prefer to have Google find you the best insurance company with the cheapest policy? Do you want to use Google to search for the route for your road trip, or would you prefer to have Google find you the best route? Google's constant innovation stems from a simple but effective idea: Eliminate an unnecessary extra step (or steps) in the process, and give consumers what they desire most — ease and simplicity.

There are some who believe that the tech giant may not be doing anything noticeably different from other aggregators in the auto insurance space. However, if its accomplishments in other industries tell us anything, Google will find a way to engage the consumer better than incumbent insurers do. Rather than writing its own business and determining individual risks, Google has teamed up with carriers of all sizes to reach customers efficiently, allowing them to quickly search, get rates and compare policies "pound for pound." Already, this platform has helped shift the insurance industry's emphasis toward the customer by allowing peer-to-peer ratings and allowing consumers to openly disclose any negative or positive experiences, which will breed superior customer service and experience.

Millennials Trust Google

It is highly unlikely that Google will ever become a full insurance company with its own agents and underwriters, but Google brings a brand name that elicits trust and familiarity. This is especially true of Millennials, who are set to overtake Baby Boomers as the largest consumer demographic, at 75.3 million in 2015. When Strategy Meets Action reported in early 2014 that two-thirds of insurance customers would consider purchasing products from organizations other than an insurer — including 23% from online service providers like Google — it created tension in the insurance industry. These findings are largely a reflection of consumer discontent with insurance companies and their seeming lack of transparency. Millennials do not trust insurance companies, but they do trust Google with just about every engagement they have with the Internet.
And consumers trust other consumers: Google Compare's user feedback platform brings transparency to consumers and requires the insurance industry to reevaluate how to effectively engage customers in a tech-driven environment. Pushed by Google's unique insight into Millennials, traditional insurance companies must acquaint themselves with their new consumers, who are often considered impatient, demanding and savvy about social media.

Establishing a Preferred Consumer Platform

An eye-opening Celent study recently found that less than 10% of North American consumers actually choose financial service products based on better results. Instead, a vast majority places higher importance on ease (26%) and convenience (26%). Based on these findings, Google is using a business model that embodies the preferred consumer experience, a notion that is being reinforced by initial pilot results in California. According to Stephanie Cuthbertson, group product manager of Google Compare, millions of people have used Google to find quotes since its launch in March, and more than half received a quote cheaper than their existing policy. Other new entrants, like Overstock, have reported issues with completion of purchase because consumers will browse offerings but still hesitate to complete their purchase online in a single visit to a website. Google's platform is attempting to avoid this issue by announcing agency support through its partnership with Insurance Technologies, giving consumers the peace of mind of speaking to an agent before purchasing a policy — while maintaining the online price quote throughout the buying experience.

Potential for Future Growth

While Google Compare is beginning with auto insurance, its work with CoverHound gives a glimpse into where it may be looking to expand. CoverHound's platform specializes in homeowners' and renters' insurance, the latter of which is growing exponentially with the Millennial generation, who prefer to rent rather than buy. According to a recent TransUnion study, seven out of 10 Millennials prefer to conduct research online with their laptop, computer or mobile device when searching for a new home or apartment to rent. Google Compare has also already shown momentum by recently announcing its expansion of services to Texas, Illinois and Pennsylvania, while adding a ratings system for each company it works with — much like an insurance version of TripAdvisor or Expedia.

The Bottom Line

Nearly every industry undergoes disruption when consumer expectations shift and businesses are forced to adapt and keep up. For decades, insurance didn't have the kind of pressure from outside entrants that it is currently facing. Whether Google fails or succeeds early on makes little difference: Its entrance is a wake-up call. The more tech companies enter the space, the harder traditional insurers must scramble to catch up. These new entrants are helping not only to force innovation from a technology standpoint but also to bring an innovation culture to the industry so insurers can stay ahead of consumers' demands around buying and customer service. Agents and insurance carriers have a level of expertise that is unmatched by the Googles of the world, but it will be wasted if insurers can't figure out a way to integrate that expertise in a modern way and connect with consumers through different social channels. The writing is on the wall, and how traditional insurance reacts will ultimately decide its relevance in the industry of the future.

Dax Craig

Dax Craig is the co-founder, president and CEO of Valen Analytics. Based in Denver, Valen is a provider of proprietary data, analytics and predictive modeling to help all insurance carriers manage and drive underwriting profitability.

What Will Workers' Comp Be in 20 Years?

Should the workers' comp requirement be eliminated for the 75% of employers and occupations with negligible additional risk?

At the 67th annual SAWCA Convention, Frank Neuhauser, executive director of the Center for the Study of Social Insurance (CSSI) at the University of California, Berkeley, opened his keynote by discussing the future of workers' compensation. He noted that the current system does not efficiently address the issues that employers and employees face today. The system was created in 1915 primarily to support a heavily industrialized workforce, but that is no longer the case.

The organizational costs associated with administering $1.00 of medical treatment under workers' comp are estimated to be $1.25. Administering the same level of service under a group health plan costs just 14 cents. In addition, the use of workers' compensation Medicare set-asides (MSAs) causes Medicare to lose between 25% and 40% of the set-aside amount, and the MSA process is inefficient and has the potential to become costly. Neuhauser recommends that states design a more streamlined approach that limits the duration of employers' liability to two years and then assesses insurers and self-insurers a "fair" payment to a Medicare trust fund.

Non-traumatic injuries make up 67% of claims and 75% of claim dollars. So where do these injuries frequently occur? An employee is four times more likely to suffer a fatal injury away from the workplace, and 75% of workers are in occupations that are low-hazard. In the next 20 years, this may lead to the conclusion that the workers' compensation requirement should be eliminated for employers and occupations with negligible additional risk. Based on this conclusion and the ability to administer medical benefits more efficiently, Neuhauser recommends that 75% of employers administer their workers' compensation through a group health plan.
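As a back-of-the-envelope restatement of the cost figures cited above (the arithmetic is mine, not part of Neuhauser's keynote), the administrative load can be expressed as a share of total spending per dollar of treatment:

\[
\text{Workers' comp: } \frac{\$1.25}{\$1.00 + \$1.25} \approx 56\%,
\qquad
\text{Group health: } \frac{\$0.14}{\$1.00 + \$0.14} \approx 12\%.
\]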

Reducing Losses From Extreme Events

Imagine this combination: less damage to property from extreme events, lower costs to insurers and lower costs to the government.

The number of presidential disaster declarations in the U.S. has dramatically increased over the past 50 years. Figure 1 depicts the total number of presidential disaster declarations and those that were triggered by flooding events (inland flood and storm surge from hurricanes). This pattern highlights the need to encourage those at risk to invest in loss reduction measures prior to a disaster rather than waiting until after the event occurs. Insurance coupled with other risk management programs can play an important role, as it is designed to spread and reduce risk. Each policyholder pays a relatively small premium to an insurer, which can then cover the large losses suffered by a few. Ideally, those who invest in loss prevention measures are rewarded by having the price of their coverage reduced to reflect their lower expected claims payments.

[Figure 1: Presidential disaster declarations in the U.S., total and those triggered by flooding events]
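To illustrate the pooling arithmetic with hypothetical round numbers (these figures are not from the text): if 10,000 policyholders each face a 1-in-100 annual chance of a $100,000 loss, the pool expects roughly 100 claims per year, and a premium near the expected loss per policy funds them:

\[
\text{expected loss per policy} = 0.01 \times \$100{,}000 = \$1{,}000,
\qquad
10{,}000 \times \$1{,}000 = \$10\ \text{million in expected annual claims}.
\]

A loading on top of the $1,000 covers administrative costs and the capital the insurer must hold against worse-than-expected years.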
Insurance against low-probability, high-consequence (LP-HC) events presents a special challenge for individuals at risk, insurers and regulators, for good reason. Decision-makers have limited experience with these events, and even experts are likely to conclude that there is considerable uncertainty as to the probability of these events occurring and their resulting consequences. As a result, insurance decisions often differ from those recommended by normative models of choice. Consider the following examples:
Example 1: Most homeowners in flood-prone areas do not voluntarily purchase flood insurance—even when it is highly subsidized—until after they suffer flood damage. If they then do not experience losses in the next few years, they are likely to cancel their policy. Demand for earthquake insurance in California increased significantly after the Northridge earthquake of 1994—the last severe quake in the state; today, relatively few homeowners have coverage.
Example 2: Prior to the terrorist attacks of Sept. 11, 2001, actuaries and underwriters did not price the risk associated with terrorism, nor did they exclude this coverage from their standard commercial policies. Their failure to examine the potential losses from a terrorist attack was surprising given the truck bomb that was detonated below the North Tower of the World Trade Center in 1993, the 1995 Oklahoma City bombing and other terrorist-related events throughout the world. Following 9/11, most insurance companies refused to offer coverage against terrorism, considering it to be an uninsurable risk.

Example 3: State insurance regulators sometimes have restricted insurers from setting premiums that reflect risk, in part to address equity and fairness issues for those in need of homeowners' insurance. For example, following Hurricane Andrew in 1992, the Florida insurance commission did not allow insurers to charge risk-based rates and restricted them from canceling existing homeowners' policies. After the severe hurricanes of 2004 and 2005 in Florida, the state-funded company Citizens Property Insurance Corp., which had been the insurer of last resort, offered premiums in high-risk areas at subsidized rates, thus undercutting the private market. Today, Citizens is the largest provider of residential wind coverage in Florida.
The three examples indicate that insurance today is not effectively meeting two of its most important objectives:
  • providing information to those residing in hazard-prone areas as to the nature of the risks they face;
  • giving incentives to those at risk to undertake loss reduction measures prior to a disaster.
The insurance industry played both of these roles very effectively when the factory mutual companies were founded in the 19th century, as detailed in Box 1. This paper proposes a strategy for insurance to take steps to return to its roots. The examples and empirical data presented here are taken primarily from experience in the U.S.; however, the concepts have relevance to any country that uses insurance to protect its residents and businesses against potentially large losses. The next three sections explore the rationale for the actions taken by each of the interested parties illustrated in the above three examples by focusing on their decision processes prior to and after a disaster. I then propose two guiding principles for insurance and outline a long-term strategy with roles for the private and public sectors if these principles are implemented. Reforming the National Flood Insurance Program (NFIP) to encourage mitigation for reducing future losses while providing financial protection to those at risk is a target of opportunity that should be seriously considered. The concluding section suggests directions for future studies and research so that insurance can play a central role in reducing losses from extreme events.

DECISION PROCESSES
Intuitive and deliberative thinking

A large body of cognitive psychology and behavioral decision research over the past 30 years has revealed that individuals and organizations often make decisions under conditions of risk and uncertainty by combining intuitive thinking with deliberative thinking. In his thought-provoking book Thinking, Fast and Slow, Nobel laureate Daniel Kahneman has characterized the differences between these two modes of thinking. Intuitive thinking (System 1) operates automatically and quickly with little or no effort and no voluntary control. It is often guided by emotional reactions and simple rules of thumb that have been acquired by personal experience. Deliberative thinking (System 2) allocates attention to intentional mental activities where individuals undertake trade-offs and recognize relevant interdependencies and the need for coordination.

Choices are normally made by combining these two modes of thinking and generally result in good decisions when individuals have considerable experience as a basis for their actions. With respect to LP-HC events, however, there is a tendency to either ignore a potential disaster or overreact to a recent one, so that decisions may not reflect expert risk assessments. For example, after a disaster, individuals are likely to want to purchase insurance even at high prices, while insurers often consider restricting coverage or even withdraw from the market. In these situations, both parties focus on the losses from a worst-case scenario without adequately reflecting on the likelihood of this event occurring in the future.

Impact of intuitive thinking on consumer behavior

Empirical studies have revealed that many individuals engage in intuitive thinking and focus on short-run goals when dealing with unfamiliar LP-HC risks. More specifically, individuals often exhibit systematic biases such as the availability heuristic, where the judged likelihood of an event depends on its salience and memorability. There is thus a tendency to ignore rare risks until after a catastrophe occurs. This is a principal reason why it is common for individuals at risk to purchase insurance only after a disaster.

Purchase of flood insurance

A study of the risk perception of homeowners in New York City revealed that they underestimate the likelihood of water damage from hurricanes. This may explain why only 20% of those who suffered damage from Hurricane Sandy had purchased flood insurance before the storm occurred.
An in-depth analysis of the entire portfolio of the NFIP in the U.S. revealed that the median tenure of flood insurance was between two and four years, while the average length of time in a residence was seven years. For example, of the 841,000 new policies bought in 2001, only 73% were still in force one year later. After two years, only 49% were in force, and eight years later only 20%. Similar patterns were found for each of the other years in which a flood insurance policy was first purchased.

One reason that individuals cancel their policies is that they view insurance as an investment rather than a protective activity. Many purchase coverage after experiencing a loss from a disaster but feel they wasted their premiums if they have not made a claim over the next few years. They perceive the likelihood of a disaster as so low that they do not pay attention to its potential consequences and conclude they do not need insurance. A normative model of choice, such as expected utility theory, implies that risk-averse consumers should value insurance, as it protects them against large losses relative to their wealth. Individuals should celebrate not having suffered a loss over a period rather than canceling their policy because they have not made a claim. A challenge facing insurers is how to convince their policyholders that the best return on an insurance policy is no return at all.

Purchase of earthquake insurance

Another example that reveals how the availability bias affects the choice process is the decision of California homeowners on whether to purchase earthquake insurance. Surveys of owner-occupied homes in California counties affected by the 1989 Loma Prieta earthquake showed a significant increase in the purchase of coverage. Just prior to the disaster, only 22% of the homes had earthquake insurance. Four years later, 37% had purchased earthquake insurance—a 64% increase. Similarly, the Northridge earthquake of 1994 led to a significant demand for earthquake insurance. For example, more than two-thirds of the homeowners surveyed in Cupertino county had purchased earthquake insurance in 1995. There have been no severe earthquakes in California since Northridge, and only 10% of those in seismic areas of the state have earthquake insurance today. If a severe quake hits San Francisco in the near future, the damage could be as high as $200 billion, and it is likely that most homeowners suffering damage will be financially unprotected.
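The expected-utility claim above can be written out explicitly. As a standard textbook sketch (the notation is mine, not the paper's): a risk-averse homeowner with wealth $W$, a concave utility function $u$, and a potential loss $L$ that occurs with probability $p$ prefers full coverage at the actuarially fair premium $pL$, since by Jensen's inequality

\[
p\,u(W-L) + (1-p)\,u(W) \;<\; u\big(p(W-L) + (1-p)W\big) \;=\; u(W - pL).
\]

The left side is the expected utility of remaining uninsured; the right side is the certain utility of insuring, so canceling a policy simply because no claim was made forgoes this gain.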
Impact of intuitive thinking on insurer behavior

Two factors play an important role in insurers' behavior with respect to pricing and coverage decisions: the role of experience and the role of ambiguous risk. We examine each of these features in turn.
Role of experience on supply of insurance

When insurers have experienced significant losses from a particular extreme event, there is a tendency for them to focus on worst-case scenarios without adequately considering their likelihood. In some instances, because of extreme losses from hurricanes, floods, earthquakes and terrorist attacks, insurers determined that they could not continue to market coverage in the U.S. without involvement by the public sector. In these situations, either the state or federal government stepped in to fill the void.

Hurricane wind-related losses

Following catastrophic wind losses from hurricanes in Florida, insurers felt they had to significantly raise their homeowners' premiums. Rather than using catastrophe models to justify rate increases, insurers pointed to their large losses following Hurricane Andrew in 1992 as a basis for demanding higher premiums, without considering the likelihood of another disaster of this magnitude. The insurers were denied these rate increases and reduced their supply of new homeowners' policies. By the beginning of 2004, most insurers viewed their Florida rates as being close to adequate except in the highest-risk areas. However, after four major hurricanes battered Florida in 2004 and two more in 2005, many insurers again began to file for major premium increases, and many of these requests were denied or approved at lower levels by the regulators. In 2007, the Florida Office of Insurance Regulation (FLOIR) took a position against any further rate increases for homeowners' insurers and denied requests by all insurers. In December 2008, State Farm asked for a 67% increase in premiums that was denied by the FLOIR, leading the insurer to announce that it would no longer offer homeowners' coverage in Florida. Five years later (March 2014), State Farm announced that it would again begin offering homeowners and renters insurance in the state on a limited basis.

Flood insurance

Following the severe Mississippi floods of 1927 and continuing through the 1960s, there was a widespread belief among private insurance companies that the flood peril was uninsurable by the private sector for several reasons: Adverse selection would be a problem because only particular areas are subject to the risk; risk-based premiums would be so high that no one would be willing to pay them; and flood losses could be so catastrophic as to cause insolvencies or have a significant impact on surplus. This lack of coverage by the private sector triggered significant federal disaster relief to victims of Hurricane Betsy in 1965 and led to the creation of the NFIP in 1968. The NFIP subsidized premiums on existing structures in flood-prone areas to maintain property values; new construction was charged premiums reflecting risk. Even though premiums on existing property were highly subsidized, relatively few homeowners purchased coverage, leading the U.S. Congress to pass the Flood Disaster Protection Act (FDPA) of 1973. This bill required all properties receiving federally backed mortgages to purchase flood insurance. The NFIP has grown extensively in the past 40 years; as of January 2015, it had more than 5.2 million policies in 22,000 communities and provided almost $1.3 trillion in coverage. Insurance tends to be concentrated in coastal states, with Florida and Texas alone accounting for nearly 40% of the entire program (in number of policies, premiums and coverage).
After making claims payments from Hurricane Katrina in 2005, the NFIP found itself $18 billion in debt, so that its borrowing authority had to be increased from $1.5 billion to $20.78 billion. To date, the program has borrowed nearly $27 billion from the U.S. Treasury to meet its claims obligations in the aftermath of the 2004, 2005, 2008 and 2012 hurricane seasons.
In July 2012 (three months before Hurricane Sandy), Congress passed and the president signed the Biggert–Waters Flood Insurance Reform Act of 2012 (BW12), which applied the tools of risk management to the increasingly frequent threat of flooding. Among its many provisions, the legislation required that the NFIP produce updated floodplain maps, strengthen local building code enforcement, remove insurance subsidies for certain properties and move toward charging premiums that reflect flood risk. Soon after becoming law, BW12 faced significant challenges from some homeowners who had reason to complain that the new flood maps overestimated their risk. These residents and other homeowners in flood-prone areas felt that their proposed premium increases were unjustified and that they could not afford the increased premiums they would face. In March 2014, Congress passed the Homeowner Flood Insurance Affordability Act (HFIAA14), which required the Federal Emergency Management Agency (FEMA), which operates the NFIP, to draft an affordability framework based on the recommendations of a National Academy of Sciences study addressing the affordability of flood insurance premiums.

Earthquake insurance

Until the San Fernando earthquake of 1971, few homeowners and businesses in California had purchased earthquake insurance even though coverage had been available since 1916. In 1985, the California legislature passed a law requiring insurers writing homeowners' policies on one- to four-family units to offer earthquake insurance to these residents. The owners did not have to buy this coverage; the insurers only had to offer it. At that time and still today, banks and financial institutions do not require earthquake insurance as a condition for a mortgage. The Northridge earthquake of January 1994 caused insured losses of $20.6 billion, primarily to commercial structures. In the three years following Northridge, demand for earthquake insurance by homeowners increased 19% in 1994, 20% in 1995 and 27% in 1996, leading private insurance companies in California to re-evaluate their seismic risk exposures. Insurers concluded that they would not sell any more policies on residential property, as they were concerned about the impact of another catastrophic earthquake on their balance sheets. The California Insurance Department surveyed insurers and found that as many as 90% of them had either stopped or placed restrictions on selling new homeowners' policies. This led to the formation of a state-run earthquake insurance company—the California Earthquake Authority (CEA)—in 1996.
Terrorism insurance

Following the terrorist attacks of 9/11, most insurers discontinued offering terrorism coverage given the refusal of global reinsurers to provide them with protection against severe losses from another attack. The few that did provide insurance charged extremely high premiums to protect themselves against a serious loss. Prior to 9/11, Chicago's O'Hare Airport had $750 million of terrorism insurance coverage at an annual premium of $125,000. After the terrorist attacks, insurers offered the airport only $150 million of coverage at an annual premium of $6.9 million. This new premium, if actuarially fair, implies that the annual likelihood of a terrorist attack on O'Hare Airport is approximately 1 in 22 ($6.9 million/$150 million), an extremely high probability. The airport was forced to purchase this policy because it could not operate without coverage.

Concern about high premiums and the limited supply of coverage led Congress to pass the Terrorism Risk Insurance Act (TRIA) at the end of 2002, which provided a federal backstop up to $100 billion for private insurance claims related to terrorism. The act was extended in 2005 for two years, in 2007 for seven years and in January 2015 for another six years, with some modification of its provisions each time the legislation was renewed. In return for federal protection against large losses, TRIA requires that all U.S. primary insurance companies offer coverage against terrorism risk on the same terms and conditions as the other perils covered by their commercial insurance policies. Firms are not required to purchase this coverage unless mandated by state law, which is normally the case for workers' compensation insurance. TRIA also established a risk-sharing mechanism between the insurance industry, the federal government and all commercial policyholders in the U.S. for covering insured losses from future terrorist attacks.

Role of ambiguity

After 9/11, insurers determined that they could not offer terrorism insurance because the uncertainties surrounding the likelihood and consequences of another terrorist attack were so significant that the risk was uninsurable by the private sector alone. Because terrorists are likely to design their strategy as a function of their own resources and their knowledge of the vulnerability of the entity they want to attack, the nature of the risk is continuously evolving. This dynamic uncertainty makes the likelihood of future terrorist events extremely difficult to estimate. Empirical evidence based on surveys of underwriters reveals that insurers will set higher premiums when faced with ambiguous probabilities and uncertain losses than for a well-specified risk. Underwriters of primary insurance companies and reinsurance firms were surveyed about the prices they would charge to insure a factory against property damage from a severe earthquake when probabilities and losses were well specified and when they were ambiguous. The premiums the underwriters charged for the ambiguous case were 1.43–1.77 times higher than if underwriters priced a precise risk.
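The O'Hare figure above comes from treating the quoted premium as actuarially fair, i.e., equal to the expected loss on the limit purchased; the ambiguity surveys can be summarized the same way with a multiplicative loading (this notation is mine, not the paper's):

\[
p \;\approx\; \frac{\text{premium}}{\text{coverage limit}} \;=\; \frac{\$6.9\ \text{million}}{\$150\ \text{million}} \;\approx\; 0.046 \;\approx\; \frac{1}{22},
\qquad
\text{premium}_{\text{ambiguous}} \;\approx\; \lambda \, p \, L, \quad \lambda \approx 1.43\text{–}1.77.
\]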
A recent web-based experiment provided actuaries and underwriters in insurance companies with scenarios in which they seek advice and request probability forecasts from different groups of experts and then must determine what price to charge for coverage against flood damage and wind damage from hurricanes. The average premiums that insurers would charge were approximately 30% higher for coverage against either of these risks if the probability of damage was ambiguous rather than well specified and if the experts were conflicted over their estimates. The data reveal that they would likely charge more in the case of conflict ambiguity (i.e., experts disagree on point estimates) than imprecise ambiguity (i.e., experts agree on a range of probability, recognizing that they cannot estimate the probability of the event precisely).

Impact of intuitive thinking on regulator behavior

Rate regulation and restrictions on coverage have had more impact on property insurance than on any other line of coverage, particularly in states that are subject to potentially catastrophic losses from natural disasters.

Homeowners' insurance in Florida

Following Hurricane Andrew in August 1992, Florida regulators imposed a moratorium on the cancellation and nonrenewal of homeowners' insurance policies during the coming hurricane season for insurers that wanted to continue to do any business in Florida. In November 1993, the state legislature enacted a bill providing that these insurers could not cancel more than 10% of their homeowners' policies in any county in Florida in one year and could not cancel more than 5% of their property owners' policies statewide for each of the next three years. During the 1996 legislative session, this phase-out provision was extended until June 1, 1999.

Early in 2007, Florida enacted legislation that sought to increase regulatory control over rates and roll them back based on new legislation that expanded the reinsurance coverage provided by the Florida Hurricane Catastrophe Fund (FHCF). Insurers were required to reduce their rates to reflect this expansion of coverage, which was priced below private reinsurance market rates. This requirement applies to every licensed insurer even if the insurer does not purchase reinsurance from the FHCF. Citizens Property Insurance Corp., Florida's state-funded company, was formed in 2002 and has experienced a significant increase in its share of the residential property market in recent years. Consumers are allowed to purchase a policy from Citizens if a comparable policy would cost 15% more in the private market. The most serious defect of such a system is that it encourages individuals to locate in high-hazard areas, thus putting more property at risk than would occur under a market system. This is the principal reason not to introduce such a system in the first place. Since 2005, there have been no hurricanes causing severe damage in Florida. But should there be a serious disaster that depletes Citizens' reserves, the additional claims are likely to be paid from assessments (taxes) charged to all homeowners in Florida.
Earthquake insurance in California

As pointed out earlier, when insurers refused to continue to offer earthquake insurance in California, the state formed the CEA. The CEA set the premiums in many parts of the state at higher levels than insurers had charged prior to the Northridge earthquake of 1994. At the same time, the minimum deductible for policies offered through the CEA was raised from 10% to 15% of the insured value of the property. There was no consideration by the state or the insurers as to how this change would affect the demand for coverage. This increased-price, reduced-coverage combination was not especially attractive to homeowners in the state.

A 15% deductible based on the amount of coverage in place is actually quite high relative to the damages that typically occur. Most homes in California are wood-frame structures that would likely suffer relatively small losses in a severe earthquake. For example, if a house were insured at $200,000, a 15% deductible implies that the damage from the earthquake would have to exceed $30,000 before the homeowner could collect a penny from the insurer. Given that only 10% of homeowners in California have quake insurance today, if a major earthquake were to occur in California next year and many homes were partially damaged, the uninsured losses could be very high. It is surprising that there has been little interest by private insurers in offering earthquake coverage at rates that compete with or undercut those offered by the CEA, even though there is no regulation preventing them from doing so.

GUIDING PRINCIPLES

The following two guiding principles should enable insurance to play a more significant role in the management and financing of catastrophic risks.

Principle 1—Premiums should reflect risk

Insurance premiums should be based on risk to provide individuals with accurate signals as to the nature of the hazards they face and to encourage them to engage in cost-effective mitigation measures to reduce their vulnerability. Risk-based premiums should also reflect the cost of capital that insurers need to integrate into their pricing to ensure an adequate return to their investors. Catastrophe models have been developed and improved over the past 25 years to more accurately assess the likelihood and damages resulting from disasters of different magnitudes and intensities. Today, insurers and reinsurers use the estimates from these models to determine risk-based premiums and how much coverage to offer in hazard-prone areas.
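One stylized way to express Principle 1 (my notation, not a formula from the paper): a catastrophe model produces a set of scenarios $i$ with annual probabilities $\pi_i$ and insured losses $L_i$, and the risk-based premium is the expected loss plus a loading,

\[
\text{Premium} \;=\; (1+\lambda)\sum_i \pi_i L_i ,
\]

where $\lambda$ captures the insurer's cost of capital, expenses and any ambiguity loading of the kind described earlier. Mitigation that lowers the $L_i$ then translates directly into a lower premium, which is the price signal the principle is meant to deliver.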
If Principle 1 is applied to risks where premiums are currently subsidized, some residents will be faced with large price increases. This concern leads to the second guiding principle.

Principle 2—Dealing with equity and affordability issues

Any special treatment given to low-income individuals currently residing in hazard-prone areas should come from general public funding and not through insurance premium subsidies. Funding could be obtained from several different sources, such as general taxpayer revenue, state government or a tax on insurance policyholders, depending on the response to the question, "Who should pay?" It is important to note that Principle 2 applies only to those individuals who currently reside in hazard-prone areas. Those who decide to locate in these regions in the future would be charged premiums that reflect the risk.

Developing long-term strategies for dealing with extreme events

Given the nature of intuitive thinking for LP-HC events, this section proposes strategies for applying the two guiding principles so that insurance, in combination with other policy tools, can reduce future losses from extreme events. The proposed risk management strategy involves:
  • Choice architecture to frame the problem so that the risks are transparent and key interested parties recognize the importance of purchasing and maintaining insurance while also undertaking protective measures to reduce their losses from the next disaster.
  • Public–private partnerships to assist those who cannot afford to invest in protective measures and to provide financial protection against catastrophic losses for risks that are considered uninsurable by the private sector alone.
  • Multi-year insurance to provide premium stability to policyholders and lower marketing costs to insurers and to reduce cancellation of coverage by those at risk.
Choice architecture

The term choice architecture, coined by Thaler and Sunstein, indicates that people's decisions often depend in part on how different options are framed and presented. Framing in the context of LP-HC events typically refers to the way in which likelihoods and outcomes are characterized. One can also influence decisions by varying the reference point, by changing the order in which alternatives or their attributes are presented, or by setting one option as the no-choice default option.
Framing the risk

People are better able to evaluate low-probability risks when these are presented via a familiar, concrete context. For example, individuals might not understand what a one-in-a-million risk means but can more accurately interpret this figure when it is compared to the annual chance of dying in an automobile accident (1-in-6,000) or of lightning striking your home on your birthday (less than one in a billion). Probability is more likely to be a consideration if it is presented using a longer time frame. People are more willing to wear seat belts if they are told they have a 1-in-3 chance of an accident over a 50-year lifetime of driving, rather than a 1-in-100,000 chance of an accident on each trip they take. Similarly, a homeowner or manager considering earthquake protection over the 25-year life of a home or factory is far more likely to take the risk seriously if told that the chance of at least one severe earthquake occurring during this time is greater than 1-in-5, rather than 1-in-100 in any given year. Studies have shown that even just multiplying the single-year risk so the numerator is larger—presenting it as 10-in-1,000 or 100-in-10,000 instead of 1-in-100—makes it more likely that people will pay attention to the event. Studies have also found that comparisons of risks—rather than just specifying the probability of a loss or an insurance premium—are much more effective in helping decision-makers assess the need for purchasing insurance.

Another way to frame the risk so that individuals pay attention is to construct a worst-case scenario. Residents in hazard-prone areas who learn about the financial consequences of being uninsured if they were to suffer severe damage from a flood or earthquake would have an incentive to purchase insurance coverage and may refrain from canceling their insurance if they have not made a claim for a few years. One could then provide them with information on the likelihood of the event occurring over the next 25 years rather than just next year. Insurers could also construct worst-case scenarios and then estimate the likelihood of the event's occurrence when pricing their insurance policies. They could then determine a premium that reflects their best estimate of their expected loss while at the same time factoring in the uncertainty surrounding the risk.

Default options

Field and controlled experiments in behavioral economics reveal that consumers are more likely to stick with the default option rather than going to the trouble of opting out in favor of some alternative. Many examples of this behavior are detailed in Thaler and Sunstein's important book, Nudge. To date, this framing technique has been applied to situations where the outcome is either known with certainty or where the chosen option (such as a recommended 401(k) plan) has a higher expected return than the other options. It is not clear whether people who failed to purchase coverage would reverse course if having insurance against an extreme event were the default option, given the intuitive thinking that individuals employ for these types of risks. More empirical research is needed to more fully understand the role that default options can play with respect to encouraging insurance protection for LP-HC events.
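The multi-year framing discussed above follows from the standard independence assumption (the arithmetic is mine): if a severe earthquake has an annual probability of 1-in-100, the chance of at least one such event over a 25-year horizon is

\[
1 - (1 - 0.01)^{25} \;\approx\; 0.22 \;>\; \tfrac{1}{5},
\]

which is the "greater than 1-in-5" figure cited for the 25-year life of a home or factory.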
Public–private partnerships
Individuals at risk may be reluctant to invest in cost-effective loss reduction measures when these involve a high, upfront cash outlay. Given budgetary constraints and individuals' focus on short time horizons, it is difficult to convince them that the expected discounted benefits of the investment over the expected life of the property exceed the immediate upfront cost. Decision-makers' resistance is likely to be compounded if they perceive the risk to be below their threshold level of concern. Residents in hazard-prone areas may also be concerned that, if they move in the next few years, the property value of their home will not reflect the expected benefits from investing in loss reduction measures because the new owner will not be concerned about the risk of a disaster.

Mitigation grants and loans

FEMA created the Flood Mitigation Assistance (FMA) program in 1994 to reduce flood insurance claims. FMA is funded by premiums received by the NFIP to support loss reduction measures, such as elevation or relocation of property, flood-proofing commercial structures or demolition and rebuilding of property that has received significant damage from a severe flood. In July 2014, Connecticut initiated its Shore Up CT program, designed to help residential or business property owners elevate buildings, retrofit properties with additional flood protection or wind-proof structures on property that is prone to coastal flooding. This state program, the first in the U.S., enables homeowners to obtain a 15-year loan ranging from $10,000 to $300,000 at an annual interest rate of 2.75%.

More generally, long-term loans to homeowners and businesses for mitigation would encourage individuals to invest in cost-effective risk-reduction measures. Consider a property owner who could pay $25,000 to elevate his coastal property from three feet below Base Flood Elevation (BFE) to one foot above BFE to reduce storm surge damage from hurricanes. If flood insurance is risk-based, the annual premium would decrease by $3,480 (from $4,000 to $520). A 15-year loan for $25,000 at an annual interest rate of 2.75% would result in annual payments of about $2,040, so the savings to the homeowner each year would be roughly $1,440 (that is, $3,480−$2,040).

Means-tested vouchers

One way to maintain risk-based premiums while at the same time addressing issues of affordability is to offer means-tested vouchers that cover part of the cost of insurance. Several existing programs could serve as models for developing such a voucher system: the Food Stamp Program, the Low Income Home Energy Assistance Program (LIHEAP) and the Universal Service Fund (USF). The amount of the voucher would be based on current income and determined by a specific set of criteria, as outlined in the National Research Council's report on the affordability of flood insurance. If the property owner were offered a multi-year loan to invest in mitigation measures, the voucher could cover not only a portion of the resulting risk-based insurance premium but also the annual loan cost, to make the package affordable. As a condition for the voucher, the property owner could be required to invest in mitigation.
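The elevation-loan arithmetic above can be verified with a short calculation. This is just a back-of-the-envelope check: the premium figures and loan terms come from the text, while the monthly amortization convention is my assumption.

```python
# Back-of-the-envelope check of the $25,000 elevation-loan example.
principal = 25_000        # cost of elevating the structure
annual_rate = 0.0275      # 2.75% interest on the 15-year loan
years = 15

# Standard amortization formula, assuming monthly payments (an assumption).
r = annual_rate / 12
n = years * 12
monthly_payment = principal * r / (1 - (1 + r) ** -n)
annual_loan_cost = 12 * monthly_payment                     # about $2,040

premium_unmitigated = 4_000                                 # risk-based, 3 ft below BFE
premium_mitigated = 520                                     # risk-based, 1 ft above BFE
premium_saving = premium_unmitigated - premium_mitigated    # $3,480

net_annual_saving = premium_saving - annual_loan_cost       # about $1,440
print(f"annual loan cost:  ${annual_loan_cost:,.0f}")
print(f"premium saving:    ${premium_saving:,.0f}")
print(f"net annual saving: ${net_annual_saving:,.0f}")
```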
An empirical study of homeowners in Ocean County, N.J., reveals that the amount of the voucher is likely to be reduced significantly from what it would have been had the structure not been mitigated, as shown in Figure 2 for property in a high-hazard flood area (the V Zone) and a lower-hazard area (the A Zone).

[Figure 2: Voucher amounts with and without mitigation for property in the V Zone and the A Zone]
Catastrophe coverage

Insurers' withdrawal from certain markets because of a lack of reinsurance capacity and other risk transfer instruments (e.g., catastrophe bonds) led to the establishment of government-backed programs such as the CEA, NFIP and TRIA. If insurers were permitted to charge risk-based premiums, they would very likely want to market coverage against earthquakes and floods as long as they were protected against catastrophic losses. State reinsurance facilities could play an important role in this regard if premiums were risk-based using data provided by catastrophe models. One such facility exists today—the FHCF. It was established in 1993 following Hurricane Andrew to supplement private reinsurance and reimburse all insurers for a portion of their losses from catastrophic hurricanes.
TRIA provides protection to insurers against catastrophic losses from future terrorist attacks. American taxpayers will not be responsible for any payments until the total commercial losses from a terrorist attack exceed $60 billion. In other words, insurers will cover in full the losses from future terrorist attacks that are not catastrophic. Lewis and Murdock proposed that the federal government auction a limited number of catastrophe reinsurance contracts annually to private insurers to provide them with more capacity to handle truly extreme events. The design of such contracts would have to be specified, and a more detailed analysis would have to be undertaken to determine the potential impact of such an auction mechanism on the relevant stakeholders.

Well-enforced regulations and standards

Given the reluctance of individuals to voluntarily purchase insurance against losses, one should consider requiring catastrophic coverage for all individuals who face risk. Social welfare is likely to be improved under the assumption that individuals would have wanted insurance protection had they perceived the risk correctly, not exhibited systematic biases and not used the simplified decision rules that characterize intuitive thinking. If the public sector were providing protection against catastrophic losses from these extreme events, it could pass regulations requiring insurance coverage for individuals at risk.

Risk-based insurance premiums could be coupled with building codes so that those residing in hazard-prone areas adopt cost-effective loss-reduction measures. Following Hurricane Andrew in 1992, Florida re-evaluated its building code standards, and coastal areas of the state began to enforce high-wind design provisions for residential housing. As depicted in Figure 3, homes that met the wind-resistant standards enforced in 1996 had a claim frequency that was 60% less than that for homes built prior to that year. The average reduction in claims from Hurricane Charley (2004) for each damaged home in Charlotte County built according to the newer code was approximately $20,000.

Homeowners who adopt cost-effective mitigation measures could receive a seal of approval from a certified inspector attesting that the structure meets or exceeds building code standards. A seal of approval could increase the property value of the home by informing potential buyers that damage from future disasters is likely to be reduced because the mitigation measure is in place. Evidence from a July 1994 telephone survey of 1,241 residents in six hurricane-prone areas on the Atlantic and Gulf Coasts provides support for some type of seal of approval. More than 90% of the respondents felt that local home builders should be required to adhere to building codes, and 85% considered it very important that local building departments conduct inspections of new residential construction.

[Figure 3: Claim frequency for homes built to the post-1996 wind-resistant code compared with homes built earlier]

Multi-year insurance
As a complement to property improvement loans, insurers could consider designing multi-year insurance (MYI) contracts of three to five years. The insurance policy would be tied to the structure rather than the property owner and carry an annual premium reflecting risk that would remain stable over the length of the contract. Property owners who cancel their insurance policy early would incur a penalty cost in the same way that those who refinance a mortgage have to pay a cancellation cost to the bank issuing the mortgage. With an MYI contract, insurers would have an incentive to inspect the property over time to make sure that building codes are enforced, something they would be less likely to do with annual contracts. To compare the expected benefits of annual vs. multi-year contracts, Jaffee et al. developed a two-period model where premiums reflect risk in a competitive market setting. They show that an MYI policy reduces the marketing costs for insurers over one-period policies and also eliminates the search costs to policyholders should their insurer decide to cancel their coverage at the end of period 1. Unless the policyholder learns that the cost of a one-period policy is sufficiently low to justify paying a cancellation cost, it is always optimal for the insurer to sell an MYI policy and for a consumer to purchase it. The insurer will set the cancellation cost at a level that enables it to break even on those policies that the insured decides to let lapse before the maturity date. Several factors have contributed to the non-marketability of MYI for protecting homeowners’ properties against losses from fire, theft and large-scale natural disasters. Under the current state-regulated arrangements in which many insurance commissioners have limited insurers’ ability to charge risk-based premiums in hazard-prone areas, no insurance company would even entertain the possibility of marketing a homeowner’s policy that was longer than one year. Insurers would be concerned about the regulator clamping down on them now or in the future regarding what price they could charge. Uncertainty regarding costs of capital and changes in risk over time may also deter insurers from providing MYI.
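To make the economics of the Jaffee et al. comparison concrete, here is a minimal numerical sketch of the two-period trade-off; the premiums, marketing cost, search cost, cancellation penalty and probabilities below are illustrative assumptions, not figures from their model.

```python
# Stylized two-period comparison of annual contracts vs. a multi-year (MYI)
# policy. All numbers are illustrative assumptions, not values from Jaffee et al.

annual_premium = 1000.0   # risk-based premium per period
marketing_cost = 100.0    # insurer's cost to market/issue each one-period policy
search_cost = 150.0       # policyholder's cost of finding a new insurer if dropped
cancel_penalty = 80.0     # penalty for lapsing an MYI policy after period 1

prob_nonrenewal = 0.2     # chance a one-period insurer cancels after period 1
prob_lapse = 0.1          # chance an MYI policyholder lapses after period 1

# Two one-period policies: marketing costs are incurred twice, and the
# policyholder bears a search cost if the insurer does not renew.
annual_total = 2 * annual_premium + 2 * marketing_cost + prob_nonrenewal * search_cost

# One MYI policy: a single marketing cost, a stable premium, and a cancellation
# penalty only in the event of an early lapse.
myi_total = 2 * annual_premium + marketing_cost + prob_lapse * cancel_penalty

print(f"Expected two-period cost with annual contracts: ${annual_total:,.0f}")
print(f"Expected two-period cost with an MYI contract:  ${myi_total:,.0f}")
print(f"Expected saving from the MYI contract:          ${annual_total - myi_total:,.0f}")
```

Under these assumptions, the MYI contract saves roughly the second round of marketing costs plus the expected search cost, which is the intuition behind the model's result.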
Even if the above issues are addressed, the private sector will want to market coverage only if there is sufficient demand to cover the fixed and administrative costs of developing and marketing the product. To empirically test the demand for MYI, a web-based experiment was undertaken with adults in the U.S.; most were older than 30 years, so they were likely to have experience purchasing insurance. The individuals participating in the experiment were offered a choice between one-year and two-year contracts against losses from hurricane-related damage. A large majority of the respondents preferred the two-year contract over the one-year contract, even when it was priced at a higher level than the actuarially fair price. Introducing a two-year insurance policy into the menu of contracts also increased the aggregate demand for disaster insurance. Modifying the National Flood Insurance Program The NFIP provides a target of opportunity to implement a long-term strategy for reducing risk that could eventually be extended to other extreme events. The two guiding principles for insurance would be used in redesigning the rate structure for the program:
  • Premiums would reflect risk based on updated flood maps so that private insurers would have an incentive to market coverage.
  • Means-tested vouchers would be provided by the public sector to those who undertook cost-effective mitigation measures. This would address the affordability issue. Homeowners who invested in loss-reduction measures would be given a premium discount to reflect the reduction in expected losses from floods. Long-term loans for mitigation would encourage investments in cost-effective mitigation measures (a worked example of the loan-and-discount arithmetic follows this list). Well-enforced building codes and seals of approval would provide an additional rationale for undertaking these loss-reduction measures.
  • An MYI policy tied to the property would deter policyholders from canceling their policies if they did not suffer losses for several years.
  • Reinsurance and risk-transfer instruments marketed by the private sector could cover a significant portion of the catastrophic losses from future floods. Some type of federal reinsurance would provide insurers with protection against extreme losses.
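The loan-plus-discount mechanism in the second bullet turns on simple cash-flow arithmetic: if the annual premium discount exceeds the annual payment on a long-term home-improvement loan, mitigation pays for itself. The sketch below works through one such case; the mitigation cost, loan terms and discount are assumptions for illustration, not NFIP figures.

```python
# Illustrative cash-flow check for financing mitigation with a long-term loan
# while receiving a risk-based premium discount. All numbers are assumptions
# for illustration, not figures from the article or the NFIP.

mitigation_cost = 1500.0   # up-front cost of the loss-reduction measure
loan_years = 20
loan_rate = 0.07           # annual interest rate on the home-improvement loan

# Standard annuity formula for the level annual loan payment.
annual_payment = mitigation_cost * loan_rate / (1 - (1 + loan_rate) ** -loan_years)

# Premium discount assumed equal to the reduction in expected annual flood losses.
annual_premium_discount = 220.0

print(f"Annual loan payment:      ${annual_payment:,.0f}")
print(f"Annual premium discount:  ${annual_premium_discount:,.0f}")
print(f"Net annual gain to owner: ${annual_premium_discount - annual_payment:,.0f}")
```

With these assumed numbers the owner comes out ahead every year, which is why coupling risk-based discounts with long-term financing can make mitigation attractive even to budget-constrained households.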
The social welfare benefits of this proposed program would be significant: less damage to property, lower costs to insurers for protecting against catastrophic losses, more secure mortgages and lower costs to the government for disaster assistance. Directions for future studies and research In theory, insurance rewards individuals who undertake loss reduction measures by lowering their premiums. For insurance to play this role, premiums have to reflect risk; otherwise, insurers will have no financial incentive to offer coverage or will not want to reduce premiums when those at risk undertake protective measures. Charging risk-based premiums raises questions of affordability for those low-income residents in hazard-prone areas who are currently paying subsidized prices for coverage or have elected to be uninsured because of budget constraints or misperceptions of the risk. In addition, insurers may elect not to offer coverage if they are concerned about the impact that catastrophic losses will have on their balance sheets, as evidenced by their decisions not to offer flood, earthquake or terrorism insurance in the U.S. without some type of back-up from the state or federal government. Setting risk-based premiums requires more accurate data. In the U.S., FEMA is now updating its flood-risk maps as recommended by a Government Accountability Office (GAO) study and by recent federal legislation on the NFIP. The impact of changing climate patterns on future damage from flooding because of potential sea-level rise and more intense hurricanes also needs to be taken into account. There is evidence that federal agencies and other bodies have underestimated the risks of damage from extreme weather events because of climate change. Hurricane Sandy has stimulated studies on ways that communities can be more prepared for future disaster damage and has highlighted the need for a suite of policy tools, including insurance, to address climate change. Studies are also needed on ways that other policy tools, such as well-enforced building codes to encourage good construction practices, can complement insurance. Enforcing building codes for all residences in Florida could nearly halve risk-based insurance prices under climate change projections for hurricane damage in 2020 and 2040. In this regard, Chile serves as an example for the U.S. to emulate. The country passed a law that requires the original construction company to compensate those who suffer any structural damage from earthquakes and other disasters if the building codes were not followed. Furthermore, the original owner of a building is held responsible for damage to the structure for a decade, and a court can sentence the owner to prison. Well-enforced building codes in Chile account for the relatively low death toll from the powerful earthquake (8.8 on the moment magnitude scale) that rocked the country on Feb. 27, 2010.
The challenge facing the U.S. today is how to capitalize on the concerns raised by hurricanes Katrina and Sandy and discussions on the renewal of the NFIP in 2017. The case for making communities more resilient to natural disasters by investing in loss reduction measures is compelling today, given economic development in hazard-prone areas. For risk-based insurance to be part of such a strategy, there is a need for support from key interested parties. These include real estate agents, developers, banks and financial institutions, residents in hazard-prone areas and public sector organizations at the local, state and federal levels. The principle of risk-based premiums, coupled with concerns regarding affordability and catastrophic losses, applies to all countries that use insurance as a policy tool for dealing with risk. Studies on the role that the private and public sectors play with respect to risk sharing of these losses reveal significant differences between countries. Other countries face similar problems and would do well to consider how to develop long-term strategies that have a chance of being implemented because they address short-term concerns.

Howard Kunreuther

Howard C. Kunreuther is professor of decision sciences and business and public policy at the Wharton School, and co-director of the Wharton Risk Management and Decision Processes Center.

Better Way to Rate Work Comp Doctors?

Evaluating and rating the performance of work comp doctors based on the data is do-able and important.

USA Today recently published a story about ProPublica, a nonprofit news organization that has developed a metric to score surgeons’ performance, comparing them with their peers. The study is intended as a tool for consumers, but it has generated concern among surgeons, who feel they are being treated unfairly. What the article neglects to mention is that rating doctors and hospitals is not new in the general health world. Scoring medical providers has been a practice for decades. The Leapfrog Group, which scores hospitals, has been in business for more than 20 years. Doctor Scorecard scores medical doctors, and a Google search will offer more. What is different about the ProPublica analysis is that it is based entirely on data and singles out surgeons treating the Medicare population. It also adjusts for the difficulty of the cases analyzed, using what it calls an adjusted complication rate. The ProPublica study includes 17,000 doctors performing what are called low-risk, elective surgical procedures derived from Medicare data. The analysis selects cases that are considered low risk, such as gall bladder removal or hip replacement, and looks for complications such as infection or blood clots that require post-operative care, in this case re-hospitalization. The cost of post-operative care requiring hospital readmission amounted to $645 million, which was billed to taxpayers for 66,000 Medicare patients from 2009 to 2013. Logic says that if surgical complications requiring hospitalization are so costly for Medicare patients, the costs must translate to astounding rates in workers’ compensation, as well. However, the ProPublica study does not directly translate to workers’ compensation because it examines Medicare patients only. While some injured workers qualify for Medicare, the majority are healthy, working adults under Medicare age. What does translate from the study is that evaluating and rating medical doctor performance based on the data is do-able and important. However, it should not be limited to surgeons. The analysis of doctor performance must be comprehensive, accurate and fair. Rather than using the limited measure of adjusted complication rate following surgery, a broader view of the claim and claimant is appropriate for workers’ compensation. Analysis is not limited to those cases with complications. Instead, all claims are analyzed. Results are adjusted by the claimant’s age, general health (indicated by co-morbidities), and the type and severity of the injury itself. Administrative management analyses, such as direct medical costs, indemnity costs, return to work and case duration, are also important in workers’ compensation. Case complexity, sometimes presented as case mix adjustment, is important to fairness in rating doctors in workers’ compensation. Also, analyzing a broad scope of data elements smooths the variability, leading to more accuracy. Fortunately, in workers’ compensation, claims have a very wide range of revealing data elements that can be drawn from a payer's multiple data silos. The ProPublica study has created pushback from the physician community for several reasons. For one, gall bladder surgery is often performed in an outpatient setting, so re-hospitalization is a meaningless metric. The same is true for others in the so-called low-risk surgery category. Moreover, the study names names.
Published provider ratings from a national survey caused much of the angst noted in the USA Today article. Names were even published in local papers, identifying physicians well-known in their communities. Doctors cried foul! Expecting the general population of patients to understand what the ratings mean, regardless of their accuracy, is naive. Ratings listed as 2.5 or 1.6 have obscure meanings to the uninitiated. Fortunately, workers’ compensation providers do not face that level of exposure. Doctor ratings in workers’ compensation are not published for the general public or made available for consumer interpretation.
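The case-mix (or "adjusted complication rate") idea described above can be pictured with a minimal observed-versus-expected calculation. The surgeons, cases and predicted risks below are hypothetical, not drawn from the ProPublica data, and the formula is one common form of indirect standardization rather than ProPublica's exact method.

```python
# Minimal sketch of a case-mix adjusted complication rate using indirect
# standardization (overall rate x observed/expected). The surgeons, cases and
# predicted risks are hypothetical, not ProPublica data.

# Each case: (surgeon, predicted complication risk, complication occurred?)
cases = [
    ("Dr. A", 0.10, False), ("Dr. A", 0.15, True),  ("Dr. A", 0.05, False),
    ("Dr. B", 0.08, False), ("Dr. B", 0.07, False), ("Dr. B", 0.05, False),
]

overall_rate = sum(1 for _, _, comp in cases if comp) / len(cases)

def adjusted_rate(surgeon: str) -> float:
    """Overall rate scaled by the surgeon's observed-to-expected ratio."""
    own = [(risk, comp) for s, risk, comp in cases if s == surgeon]
    observed = sum(1 for _, comp in own if comp)
    expected = sum(risk for risk, _ in own)   # sum of predicted case risks
    return overall_rate * observed / expected

for surgeon in ("Dr. A", "Dr. B"):
    print(f"{surgeon}: adjusted complication rate = {adjusted_rate(surgeon):.3f}")
```

The same observed-versus-expected logic extends naturally to the broader workers' compensation measures discussed above (costs, duration, return to work), with the claimant's age, co-morbidities and injury severity feeding the expected values.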

Karen Wolfe

Karen Wolfe is founder, president and CEO of MedMetrics. She has been working in software design, development, data management and analysis specifically for the workers' compensation industry for nearly 25 years. Wolfe's background in healthcare, combined with her business and technology acumen, has resulted in unique expertise.

Better Approach to Soft Tissue Injury

Soft tissue injury is extremely common but hard to diagnose and subject to unnecessary and expensive treatment. EFA provides a better tool.

Musculoskeletal diseases, defined as injuries to the soft tissues, currently affect more than one out of every two persons in the U.S. age 18 and older, and nearly three out of four over the age of 65. Low back pain affects at least 80% of adults at some point, with an estimated annual cost of more than $100 billion. Trauma, back pain and arthritis -- the three most common musculoskeletal conditions -- are the most common reasons for patient visits to physicians’ offices, emergency departments and hospitals every year. With the aging U.S. population, musculoskeletal diseases are becoming a greater burden every year. A determination must be made whether the pain generator is muscular or structural, and incorrect diagnoses can lead to inappropriate treatments and, in the worst case, unnecessary surgeries. About 80% of healthcare and social costs related to low back pain are attributed to just 10% of patients with chronic pain and disability. This statistic suggests that improved interventions to reduce the recurrence of low back pain can underpin significant cost savings and improvement in patient outcomes. The standard approach to managing soft tissue injuries is to obtain a medical history and perform a physical examination. Imaging or testing usually is not needed in the early phases of treatment. In most cases, the natural course of a soft tissue injury is to resolve without intervention. There are excellent tools to diagnose structural abnormalities or nerve injuries. These include imaging studies, nerve conduction tests and discograms. X-rays can be used to assess the possibility of fracture or dislocation. Nerve conduction studies may be used to localize nerve dysfunction. But they are not adequate for soft tissue injury or functional assessments. MRI and CT scans, while excellent tests to evaluate structure, are generally static and not designed to assess muscle function dynamically. In addition, these standard tests all carry a high rate of false positives. There is no magic bullet or one test that does everything. While many tests are good for what they are designed to evaluate, they are not appropriate to diagnose a soft tissue injury. Enter electrodiagnostic functional assessment -- EFA testing. The EFA is a diagnostic tool that combines and enhances five medically accepted tests: electromyography, range of motion, functional capacity evaluation, and pinch and grip strength. The EFA is non-invasive and non-loading. The advantage the EFA presents over performing these tests individually is that it performs all tests simultaneously and in a dynamic fashion. This equipment has a 510(k) registration with the Food and Drug Administration as a Class II diagnostic device. Furthermore, the FDA has recognized in the intended use section that the technology can distinguish between acute and chronic pathology, is able to look at referred pain patterns and is useful with treatment recommendations and baseline testing. Physicians encounter patients daily with complaints of injuries to the soft tissues, particularly the paraspinal muscles. In many cases, objective findings are obvious, but many patients may have injuries that are subtle but continue to cause symptoms. In other cases, the injuries may be less recent, and the physical findings may not be apparent. Direct palpation of soft tissues can, in some cases, reveal the nature or type of injury, but this manner of diagnosis relies on static testing. For some individuals, problems may only be encountered during activity.
Measuring muscle activity during range-of-motion testing is difficult at best. The extent to which a patient exerts herself also presents a subjective bias with soft tissue injury. Better outcomes will be demonstrated by using the correct tools to evaluate the underlying pathology. In Adam Seidner’s paper “Assessing disease and wellness in the occupational setting: Electrodiagnostic Functional Assessment from wired to wireless,” he demonstrated that, when the EFA was implemented as a case management tool, it enhanced the level of discussion among treating providers, injured workers and claim professionals. The study demonstrated that medical and lost wage payments to injured workers and their healthcare providers were 25% lower in the EFA group, for an average savings of $10,000 per claim versus the control group. Most importantly, the average return to work was 213 days in the EFA group versus 275 for the control group, or an average of 62 days sooner. The EFA was able to provide better diagnostic information on soft tissue injuries and return the individual to activities of daily living sooner. Better patient care leads to better outcomes. The EFA results are further demonstrated in the paper “Musculoskeletal disorders early diagnosis: A retrospective study in the occupational medicine setting.” The study found that EFA test results affected the course of treatment, improved clinical and functional outcomes, increased patient satisfaction and decreased dispute litigation. In fact, 98 of the 100 cases resulted in return to maximum medical improvement with no rateable impairment and full release to active duty. Only 2% of the cases were challenged, and 98% of those in the EFA control group returned to their pre-injury jobs. These cases were tracked over a three-year period. The EFA-STM baseline program is just another example of better diagnostics providing better patient care. This book-end solution allows for the best care possible for the work-related injury. If a condition is not deemed to be work-related, the individual can still receive the best care and a quicker resolution. The EFA does not replace the other, well-established diagnostic tests; it is simply a better diagnostic alternative for soft tissue injuries. All the tests can complement one another. At the end of the day, when it comes to workers’ compensation, the issue is providing better patient care. It’s a win-win for all parties.
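A quick back-of-the-envelope check on the study figures cited above: the reported 25% reduction and $10,000 average saving together imply per-claim costs of roughly $40,000 for the control group and $30,000 for the EFA group, and the return-to-work gap is 275 minus 213 days. The snippet below simply reproduces that arithmetic; the implied costs are derived from the reported numbers rather than quoted from the paper.

```python
# Back-of-the-envelope check on the reported EFA study outcomes. The implied
# per-claim costs are derived from the reported 25% reduction and $10,000
# average saving; they are not figures quoted directly from the paper.

saving_per_claim = 10_000.0   # reported average saving per claim
saving_fraction = 0.25        # reported reduction vs. the control group

implied_control_cost = saving_per_claim / saving_fraction    # ~$40,000
implied_efa_cost = implied_control_cost - saving_per_claim   # ~$30,000

rtw_days_efa, rtw_days_control = 213, 275                    # reported averages

print(f"Implied average cost per claim, control group: ${implied_control_cost:,.0f}")
print(f"Implied average cost per claim, EFA group:     ${implied_efa_cost:,.0f}")
print(f"Days sooner back to work with EFA:             {rtw_days_control - rtw_days_efa}")
```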

Frank Tomecek

Frank J. Tomecek, MD, is a clinical associate professor of the Department of Neurosurgery for the University of Oklahoma College of Medicine-Tulsa. Dr. Tomecek is a graduate of DePauw University in chemistry and received his medical degree from Indiana University. His surgical internship and neurological spine residency were completed at Henry Ford Hospital.

$1 Million Reward to Show Wellness Works

We'll settle the wellness debate the old-fashioned way: offering a $1 million reward to anyone who can prove it isn't a horrible investment.

We hope at least a few of you have lamented -- we’ll settle for noticed -- our absence from ITL for the last six months. There are two reasons. First, in the immortal words of the great philosopher Gerald Ford, “When a man is asked to make a speech, the first thing he has to do is decide what to say.” We needed something compelling to say, and at this point yet-another-vendor-making-up-outcomes is old news. In any event, there is now an entire website devoted to that topic. (New news: US Preventive Medicine is NOT making up its outcomes. It is the first wellness vendor to be validated.) Second, we have spent the last six months answering the perennial question: “So what would you do instead?” by developing www.quizzify.com. Quizzify teaches employees that “just because it's healthcare doesn’t mean it’s good for you,” and does it in an enjoyable Jeopardy-meets-health-education-meets-Comedy Central way, as playing the demo game will show. Quizzify’s savings are, uniquely in this industry, 100% guaranteed. But we digress. The news of the day is that we want to settle once and for all the he said-she said debate about whether wellness saves money, and we'll do it the old-fashioned way: by offering a million-dollar reward for anyone who can show that wellness isn’t a horrible investment. All someone has to do is show that the employer community as a whole breaks even on its wellness investment. The inspiration for this reward came when a group calling itself “The Global Wellness Institute Roundtable” released a report criticizing us for “mud-slinging on ROI.” (In other words, “proving that there is no ROI.”) We are not familiar with this group. Their headliner seems to be a Dr. Michael Roizen. If that name sounds familiar, it’s because he used to work with Dr. Oz, though to Dr. Roizen’s credit he was not implicated in the congressional investigation of Dr. Oz. This $1-million reward is -- as an attorney recently posted -- a binding legal contract. It is also totally fair. The “pro” party is allowed to use the wellness industry’s own “official” outcomes report, which was compiled with no input from anyone opposed to wellness. Further, the panel of judges is selected from an independent email list, run by healthcare policy impresario Peter Grant. This is no ordinary independent email list -- this is the invitation-only “A List” of healthcare policymakers, economists, journalists and government officials who make, influence or report the decisions and rules we live by. The “pro” party invites two people, we invite two and those four pick the fifth. This is truly the ultimate in fairness. Unfortunately, “fairness” is perhaps the second-scariest word to a wellness vendor (“validity” being the first), so there is no chance of anyone taking us up on this. (There is a slight risk in challenging us -- whichever party loses has to pay the expenses of the contest, including the panelist fees. This will likely run $100,000. Still, that makes the proposition at worst 10-to-one odds, and the "pro" forces get their $100,000 back if they win.) Not being taken up on this offer is, of course, the entire point of making the offer. The wellness industry’s inaction will prove what numerous gaffes and misstatements have already revealed: Wellness industry leaders know that wellness loses money. For them, wellness is all about maintaining the façade of saving money so that they don’t get fired from the employers they’ve been snookering.

Debunking 'Opt-Out' Myths (Part 4)

Contrary to myth, option programs create competitive pressures that reduce workers’ comp costs and benefit both large and small employers.

I’m aware of no logic, facts or data to support the assertion that options increase workers’ compensation premiums. The exact opposite can be easily demonstrated. Ask yourself, are prices higher or lower when employers have only one product to choose from vs. when they are able to choose among competing products? Texas went from the 10th most expensive workers’ compensation system in the U.S. in 2003 to the 38th most expensive state in 2013 through a combination of workers’ compensation system reforms and competitive pressures from employers electing the Texas “nonsubscriber” option – choosing not to be part of the state’s workers’ compensation system. One-third of all Texas employers have elected the option. Employers representing hundreds of thousands of Texas workers evaluated the impact each system would have on their claim costs, compared insurance premiums and exited the state system between 2003 and 2013. Likewise, Oklahoma simultaneously enacted workers’ compensation reform and option legislation in 2013. Workers’ compensation premiums have since dropped more than 20%, and Oklahoma option programs are saving even more. Further debunking the myth that option programs raise workers’ compensation costs, a 2015 report from the Workers’ Compensation Research Institute studied workers’ compensation claims in 17 states and found that the total average cost per claim for injured workers in Texas was among the lowest. Costs per claim in Texas grew only 2.5% per year from 2008 to 2013, as measured in 2014. In contrast, for National Council on Compensation Insurance (NCCI) states, the average indemnity cost per lost-time claim increased by 4% in 2014, and the average medical cost per lost-time claim increased by 4% in 2014. Texas workers’ compensation is outperforming national averages because Texas employers have a choice. The option creates a greater sense of urgency among regulators and workers’ compensation insurance carriers to manage claims better so they can reduce premium rates and compete with the alternative system. The option also makes implementation of workers’ compensation reforms more manageable, because they happen across a smaller base of claims. Further, consider that most employers that implement option programs have some frequency of injury claims. Very few employers with no injury claims are willing to go to the time, effort and expense of adopting and communicating a special injury benefit plan, buying special insurance coverage, contracting a claims handling specialist and satisfying newly applicable state and federal compliance requirements (which may include a state qualification process and filing fee). Because options take many companies that have injury claim losses out of the workers’ compensation system, workers’ compensation insurance carriers suffer fewer losses and can reduce workers’ compensation premiums. The carriers must compete harder for business, and they have no justification for charging higher premiums when their total loss experience improves. Associations that represent workers’ compensation insurance companies have labeled options an “external threat” to the industry at a time when premium volume and carrier profits are up and losses are at a 17-year low. Calendar-year 2014 underwriting results, combined with investment gains on insurance transactions, produced a workers’ compensation pretax operating gain of 14%. These insurance companies urge state legislators to protect their monopolistic, one-size-fits-all product and its profits.
They also fight to maintain an anti-competitive web of price-setting collaborations that would violate antitrust laws in other industries. As David DePaolo recently noted on WorkCompCentral, in “the business of workers' compensation insurance… investors (the business side) want to know whether they are going to make money, and how much, by financing the system; not whether the system is working ‘correctly’ or not.” This is an important insight in the context of workers’ compensation insurance lobbyist objections to an option. The lobbyists promote the idea that workers’ compensation systems are superior and working fine, but that is not their primary motivation in trying to shut down competitive alternatives. Some insurance association members have defected and embrace free-market competition. More than $150 million in Texas nonsubscriber option insurance premium was written last year alone. The Oklahoma option insurance market is just starting up. Many “A-rated” insurance companies now oversee the successful resolution of approximately 50,000 injury claims per year under option programs. An option can be authorized by a state legislature before, after or at the same time as workers’ compensation reforms are adopted. Legislators suffering from “workers’ comp fatigue” find option legislation to be dramatically less voluminous, time-consuming, confusing and contentious than major workers’ compensation reform. And, as proven in Texas and Oklahoma, the option can slash employer claims costs by 40% or more. A single state (like Tennessee or South Carolina) can see lower government regulatory expense and more than $100 million in annual public and private employer savings. That impact grows exponentially through economic development multipliers. Those are dollars that can be used to create private-sector jobs and invest in education, safety, transportation and other legislative priorities. In contrast, when standing alone, workers’ compensation system reforms typically return single-digit premium rate reductions that do not move the needle on injured employee medical outcomes or economic development. Even the widely referenced Oregon premium ranking study (like many others) questions the ability of traditional workers’ comp reforms to create significant movement in employer costs or employee satisfaction. Options to workers’ compensation have particularly worked to the advantage of small employers, which pay most of the workers’ compensation industry premiums. Small companies that experience few, if any, on-the-job injuries typically purchase workers’ compensation insurance coverage on a guaranteed-cost (zero-deductible) basis. They get competitive quotes on both workers’ compensation and option insurance products, then typically choose to write the workers’ compensation premium check and be done. However, both big and small businesses can benefit from option programs. There are several Texas nonsubscriber insurance carriers that write policies for hundreds, even thousands, of small employers. In fact, the vast majority of Texas and Oklahoma employers that have elected the option are small, local businesses. Many reputable insurance providers sell “bundled” programs for small business that supply all option program components, including the insurance policy, injury benefit plan, employee communications, claims administration and legal compliance.
It is a simple, turnkey service for insurance agents and employers, delivering better medical outcomes and higher employee satisfaction when the rare injury occurs. If an employer that has elected the option does not like it (for whatever reason), it can go back into the workers’ compensation system at any time. These facts are all reflected in the migration of small employers back and forth between workers’ compensation and option programs in Texas, choosing the best route for their companies and employees as workers’ compensation premium rates have moved up and down over the past quarter century. Even if (as seen in Texas) a significant percentage of a state’s employers elect an option, the “pool” of workers’ compensation premiums can still be hundreds of millions of dollars, a figure large enough to spread the risk and absorb catastrophic claims. Those who say that workers’ compensation premium rates will go up when a state legislature authorizes an option need to back up their fear mongering with similar logic, facts and data or admit their true, anti-competitive motivations.

Bill Minick

Bill Minick is the president of PartnerSource, a consulting firm that has helped deliver better benefits and improved outcomes for tens of thousands of injured workers and billions of dollars in economic development through "options" to workers' compensation over the past 20 years.

Seriously? Artificial Intelligence?

Artificial intelligence can create natural dialogue with customers, nurture those leads, prioritize them for agents and follow through.

I don't know about you, but when I think of artificial intelligence, I think of Steven Spielberg and Arnold. That was until I saw a solution offered by Conversica, a Salesforce partner. AI is here, it's happening now and it's a lot more pervasive than you think. The rise of "robo advisers" in financial services, Ikea's "Anna" customer service rep and Alaska Airlines' "Jenn" all point to the growing adoption of technology that personalizes customer experiences... at scale. One of the 5 D's of Disruption in insurance is "Dialogue." And AI is driving it. Today, in insurance, AI is used to create natural dialogue with prospective customers, nurture those leads, prioritize them for agents and follow through as needed. Conversica, for example, gets smarter as it interacts more with customers. And, yes, it has passed the Turing test. It is particularly well-suited for B2C because the volume of interactions with prospects can be overwhelming for insurance agents. As insurers embrace omni-channel, new prospects can be created from any source, whether it be a contact center, social media or a face-to-face meeting. Not only is lead volume increasing, but it can take as many as six attempts before an agent can get a prospect on the phone. This becomes a time and energy suck for agents; they are unable to follow through on every lead, and the quality of interactions goes down. So how are insurers and agents responding? In this webinar, Eric (Conversica) and Alex (Spring Venture Group) explain to me how AI is used to nurture and convert leads. My takeaway: AI is not just a science project. It works. It'll become more invisible to consumers. And it creates real value for both customers and employees. As Marc Benioff, CEO of Salesforce, said recently in Fortune magazine, "We’re in an AI spring. I think for every company, the revolution in data science will fundamentally change how we run our business because we’re going to have computers aiding us in how we’re interacting with our customers."
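To make the lead-nurturing and prioritization workflow described above less abstract, here is a toy prioritization pass an assistant might run over an agent's queue; the Lead fields and the hand-tuned scoring rule are hypothetical stand-ins for a learned model and are not Conversica's or Salesforce's actual API or algorithm.

```python
# Toy illustration of AI-assisted lead prioritization for an agent's queue.
# The fields and the hand-tuned score are hypothetical stand-ins for a learned
# model; this is not Conversica's or Salesforce's actual API or algorithm.

from dataclasses import dataclass

@dataclass
class Lead:
    name: str
    source: str            # e.g. "contact_center", "social", "in_person"
    replies: int           # times the prospect has answered outreach so far
    days_since_touch: int  # days since the last interaction

def score(lead: Lead) -> float:
    """Hand-tuned priority score standing in for a trained model."""
    source_weight = {"in_person": 3.0, "contact_center": 2.0, "social": 1.0}
    recency = max(0.0, 10.0 - lead.days_since_touch)   # fresher leads rank higher
    return source_weight.get(lead.source, 1.0) + 2.0 * lead.replies + 0.5 * recency

leads = [
    Lead("Ana", "social", replies=2, days_since_touch=1),
    Lead("Ben", "contact_center", replies=0, days_since_touch=9),
    Lead("Chi", "in_person", replies=1, days_since_touch=3),
]

# Present the queue to the agent ordered by score, highest priority first.
for lead in sorted(leads, key=score, reverse=True):
    print(f"{lead.name:4s} priority={score(lead):.1f}")
```

In practice the score would come from a model trained on interaction history rather than hand-set weights, but the workflow is the same: the system engages and qualifies leads, and the agent sees only the ranked shortlist.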

Jeffery To

Jeff To is the insurance leader for Salesforce. He has led strategic innovation projects in insurance as part of Salesforce's Ignite program. Before that, To was a Lean Six Sigma black belt leading process transformation and software projects for IBM and PwC's financial services vertical.

Confessions of Sleep Apnea Man

Elements of medical care in the U.S. just plumb confound me. One is the requirement of a prescription for the most mundane of items.

There are elements of medical care in the U.S. that just plumb confound me. One is the requirement of a prescription for the most mundane of items, particularly when you think about where we could be focusing our efforts. Please indulge me a moment while I 'splain the background on this. I went through a sleep study back in 2002, where I was diagnosed with sleep apnea. Apnea is a condition most identified with snoring, although not all snorers are apnea sufferers. After the diagnosis, I was provided with a CPAP machine, the device most commonly used in the treatment of that particular condition. Sleep apnea is described as a potentially serious sleep disorder in which breathing repeatedly stops and starts. What it really was, however, was a condition that kept my wife awake at night. I don't know why the doctors didn't treat her instead. The CPAP (Continuous Positive Airway Pressure) machine is designed to gently pressurize your airway, keeping it open, providing for a more sound sleep. Mostly for your wife. You see, the CPAP literature says the machine is designed to alleviate apnea episodes and reduce potentially fatal risks. The fatal risk it is most likely to alleviate is stopping your spouse from shooting you in the face with a bazooka at 3 am. I have used the same CPAP machine since 2002, and it has performed very well. I do sleep much better using it, as does my wife. I usually take it with me in my travels, and therein lies the conundrum that has produced this missive. My unit, now about 13 years old, is somewhat clunky for the frequent traveler. This is especially true when one does not generally check luggage. Somewhat bigger than a large box of Kleenex, the device either must be packed within my carry-on or in its own travel bag. As a medical device, it does not count as one of my two carry-on items under FAA rules, but it is nevertheless bothersome to have to tote a fairly significant extra bag around. Prior to the advent of PreCheck, it had to come out of the bag and be run through the X-ray equipment on its own. Until about five years ago, it even had to be pulled aside by TSA for explosives testing. If TSA was efficient, that would occur while I was having my prostate checked by Two Finger Lou. If not, the testing added a few minutes to every pass through security. Today, as a government-fingerprinted "Known Traveler" with my very own "Trusted Traveler" ID number (don't get me started on that), I always fly as a PreCheck passenger. The device no longer has to come out of the bag, so for trips of just a few days I pack it inside my carry-on. Of course, as we all really know, size does matter, and this is an issue for trips longer than just a few days. While I have become a very efficient packer and can get four or five days of clothes into a carry-on with the machine, anything longer requires that the unit be carried separately. With that in mind, I ordered a "travel CPAP": a machine about a quarter of the size of the one I have been using. After I placed the order with an online company, it notified me that it required a prescription for the machine to be on file before it could fulfill the order. I have a prescription for CPAP supplies on file with the company, but apparently being able to buy the supplies is different than buying the machine that uses them. According to the FDA, CPAP devices are considered Class II medical devices and require prescription by law. 
The issue is that my sleep specialist, whom I have not seen in more than 12 years, changed practices a decade ago, and records no longer exist with the practice where I was diagnosed. Without those records, no prescription will be forthcoming. I frankly don't know what my options are with the practice. I suppose I could set up an appointment, go through another two-night sleep study, spend a couple hundred in co-pays and have my insurance billed God knows what for the effort, all to get a piece of paper confirming something we already know I have. All for a machine whose basic function is blowing air. If we applied that logic here, you would need a prescription just to read my blog. Can someone in the medical community take a moment to explain this to me, an admitted medical ignoramus? Have these machines been abused in some unimaginable way? Were teens buying these machines in droves to huff air? Are they somehow vital in the making of meth? For Christ's sake, in the hands of evil men, what indeterminate hell could they unleash? What aren't you people telling us???? Someone should tell the FDA that CPAPs don't kill people; drugs kill people. Maybe the FDA should focus some of its enforcement zeal toward those things that really matter. Perhaps the FDA has heard of the need for a national prescription drug monitoring database. Unless, of course, I am mistaken, and rogue CPAPs are slaughtering more than the 20,000 people every year who die from prescription drug overdoses. My solution to this dilemma will, I hope, be found through my primary care physician. I have made an appointment with him for the sole and single purpose of getting that magic prescription. It will cost me $30, and my insurance company significantly more, all to tell the good doc that I'm feeling fine and that there is nothing wrong. I just need one of those air-huffing, meth-cooking, chaos-reigning machines -- but a small one to make my travel schedule easier to bear. There is a chance that he will not be able to authorize one without another complete sleep study, in which event it will represent a colossal waste of resources. In the absence of a logical explanation, this scenario simply serves to show the ridiculous waste of time, effort and resources in a system where common sense often struggles for its moment in the sun. In a world where we are trying to figure out how five or six remaining practicing physicians are going to treat 350 million people, is this really where we need to devote so much effort? It simply makes no sense to me. But then again, there may be reasons of which I am not aware. I am sure some medical wizard out there, or a medical-equipment salesperson, should be able to enlighten me and remove my veil of ignorance on the matter. I encourage you to do so, and you don’t even need to be gentle about it. It certainly won’t be my first time.

Bob Wilson

Bob Wilson is a founding partner, president and CEO of WorkersCompensation.com, based in Sarasota, Fla. He has presented at seminars and conferences on a variety of topics, related to both technology within the workers' compensation industry and bettering the workers' comp system through improved employee/employer relations and claims management techniques.