
How to Make Flood Insurance Affordable

A combination of government vouchers and risk-based pricing shows promise.

Our study of Charleston County, SC, assesses the cost impact of flood insurance if premiums were raised to risk-based prices. We then consider a program that addresses affordability concerns, coupled with cost-effective risk reduction measures. We follow these two principles for flood insurance:
  1. Flood insurance premiums should be priced to accurately reflect risk. Premiums reflecting risk inform individuals as to the hazardousness of the area and encourage investment in cost‐effective adaptation measures.
  2. Issues of affordability should be addressed, but not by subsidizing insurance premiums. Many low‐ and middle‐income homeowners living in older homes in flood‐prone areas are not able to afford flood insurance if premiums are priced to reflect risk.
We determined risk-based premiums for a subset of National Flood Insurance Program (NFIP) policyholders in Charleston County's inland and coastal Special Flood Hazard Areas (SFHAs) who currently receive NFIP premium discounts despite living in high-risk areas. If premiums were risk-based, these subsidized homeowners in Charleston County's FEMA-mapped high-risk flood zones would see their costs increase substantially: premiums would rise from current levels by 108% on average for policies in the high-risk 100-year floodplain (A zone) and by 159% on average in the high-risk 100-year coastal floodplain (V zone).

Elevating a house by a few feet can decrease the risk-based premium by 70% to 80%, saving thousands of dollars annually. However, elevating a house is very expensive. We therefore propose a voucher program with two key aspects: (1) insurance premiums are based on risk; (2) vouchers offset both the premium and the cost of a loan for risk mitigation. Implementing the voucher program with mitigation can reduce government expenditures by more than half relative to a program that does not require mitigation, if the cost of elevating a house is around $25,000 in the A zone. In the high-hazard coastal V zone, cost savings can be achieved even when the cost of elevation is as high as $75,000. Mitigation does not reduce the cost of the voucher if the policyholder's household income is below $10,000.

Elevation is not feasible for all homes, notably those in historic districts. Other actions could also be considered, such as making a higher deductible the standard option. Other mitigation measures might also lead to lower NFIP premiums, such as wet flood-proofing the ground floor or moving all habitable areas to the second floor in multi-story homes.
Risk-based premiums in the NFIP are primarily a function of the designated flood zone, coverage limits and the property's structural features, such as the height of the lowest floor relative to the base flood elevation (BFE). The BFE is the estimated height of floodwaters during a 100-year flood. Charleston County is vulnerable to both inland and hurricane flood risks, so many homeowners have incentives to invest in cost-effective measures that reduce their risk and hence their risk-based insurance premiums.

To elevate the home, the homeowner would take out a 20-year loan at 3% interest. A voucher would offset both the reduced risk-based premium and the cost of the loan to elevate the house. We assume that a household earning $50,000 in gross income per year can contribute 5% ($2,500) to flood insurance; after that contribution, the voucher covers the additional costs. To illustrate, consider a family living in the A zone in a house one foot below BFE, where the risk-based premium is $5,596. As shown in Table 1, at low and medium elevation costs, the annual cost (loan payment plus flood insurance premium) is less than the voucher would have been had the homeowner not elevated the house. In fact, if mitigation were required, no federal expenditure would be incurred when elevation costs are low, because the loan cost plus the risk-based premium would be less than $2,500. Savings generated from risk mitigation are even greater in the V zone; even when elevation costs are high, the reduction in premium justifies the investment, as shown in Table 2.

Table 1: Voucher Costs in the A Zone without and with Elevation (U.S. Dollars)
Insurance voucher – no mitigation
Risk-based premium without elevation           5,596
Homeowner pays 5% of gross income              2,500
Government voucher                             3,096

Insurance voucher – after house elevation    Low cost   Medium cost   High cost
Cost to elevate the house 2 feet               24,635        50,970      74,756
Risk-based premium after elevation                839           839         839
Annual loan payment (3% interest, 20 years)     1,656         3,426       5,025
Total annual cost                               2,495         4,265       5,864
Homeowner pays                                  2,495         2,500       2,500
Government voucher                                  0         1,765       3,364
Table 2: Voucher Costs in the V Zone without and with Elevation (U.S. Dollars)

Insurance voucher – no mitigation
Risk-based premium without elevation          19,218
Homeowner pays 5% of gross income              2,500
Government voucher                            16,718

Insurance voucher – after house elevation    Low cost   Medium cost   High cost
Cost to elevate the house 2 feet               24,635        50,970      74,756
Risk-based premium after elevation              5,304         5,304       5,304
Annual loan payment (3% interest, 20 years)     1,656         3,426       5,025
Total annual cost                               6,960         8,730      10,329
Homeowner pays                                  2,500         2,500       2,500
Government voucher                              4,460         6,230       7,829
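The loan payments in Tables 1 and 2 are consistent with the standard fixed-rate amortization formula, and the voucher is simply whatever annual cost exceeds the homeowner's 5%-of-income contribution. A short sketch reproduces the A-zone figures (the function names are ours, not the study's):

```python
def annual_loan_payment(principal, rate, years):
    """Fixed-rate amortized annual payment: P * r / (1 - (1 + r)**-n)."""
    return principal * rate / (1 - (1 + rate) ** -years)

def government_voucher(premium, loan_payment, homeowner_cap=2500):
    """Voucher covers the annual cost above the homeowner's contribution cap
    (5% of a $50,000 gross income in the study's baseline example)."""
    total_annual_cost = premium + loan_payment
    return max(0, total_annual_cost - homeowner_cap)

# A zone, medium elevation cost (Table 1):
loan = annual_loan_payment(50970, 0.03, 20)      # ≈ 3,426
voucher = government_voucher(839, loan)          # ≈ 1,765

# A zone, low elevation cost: total annual cost is under $2,500,
# so no federal expenditure is needed.
no_voucher = government_voucher(839, annual_loan_payment(24635, 0.03, 20))  # 0
```

The same functions reproduce Table 2 by substituting the V-zone premium of $5,304 after elevation.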
When a family's income is below $50,000, the homeowner's contribution would be less than $2,500, and the government's contribution would therefore be higher than in the examples above. In both A and V zones, we find that when annual household income is below $10,000, it is cost-effective for the government to offer a voucher without requiring the house to be elevated. For households in the $10,000–20,000 income bracket, elevation is cost-effective for the government only when elevation cost is low. In the V zone, for incomes above $20,000, a voucher with a mitigation loan is always financially preferable, even when the elevation cost is high.

State-Level Natural Disaster Programs in South Carolina

South Carolina currently has several natural disaster programs and tax incentives to assist homeowners in purchasing insurance and fortifying homes.
    • South Carolina's Omnibus Coastal Insurance Act of 2007 created the Safe Home grant program for low- and middle-income homeowners to retrofit primary residences against high-wind and hurricane damage. Families making less than 80% of the county median household income, with a home value below $150,000, qualify for as much as $5,000 in non-matching grants. Families with income above that threshold and a home value of less than $300,000 are eligible for as much as a $5,000 matching grant. From 2008 to 2011, the Safe Home program awarded 2,500 grants totaling $12.1 million.
    • The Residential Retrofit Tax Credit provides state income tax credits of as much as $1,000 for expenses incurred when retrofitting a home against natural disasters. From 2008 to 2011, 670 Residential Retrofit Credits were claimed, totaling $781,106.
    • The South Carolina Excess Insurance Premium Tax Credit allows homeowners to claim as much as $1,250 in state income tax credits against excess premiums paid on property and casualty insurance. Excess premium is defined as the portion of the premium greater than 5% of the taxpayer's annual gross income. Additionally, the state offers Catastrophe Savings Accounts, which are interest-bearing accounts not subject to state income taxes if the funds are used for qualified catastrophe expenses.
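The excess-premium credit is a simple capped calculation; a small sketch makes the definition concrete (the function name and example figures are ours, for illustration only):

```python
def excess_premium_credit(premium, gross_income, cap=1250):
    """Excess premium is the portion of the premium above 5% of annual gross
    income; the South Carolina credit covers it up to a $1,250 cap."""
    excess = max(0.0, premium - 0.05 * gross_income)
    return min(excess, cap)

# A household earning $50,000 paying a $5,000 premium has $2,500 of excess
# premium, but the credit is capped at $1,250.
credit = excess_premium_credit(5000, 50000)  # 1,250
```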

Howard Kunreuther


Howard C. Kunreuther is professor of decision sciences and business and public policy at the Wharton School, and co-director of the Wharton Risk Management and Decision Processes Center.

Are Workers' Comp Systems Broken?

Here is a summary of research on how injured employees fare in workers' compensation systems.

As national conversations occur about the direction of workers' compensation (WC) systems, I sometimes hear comments that "WC systems are broken." Rarely are public programs either all black or all white. The material below summarizes relevant information from a variety of published Workers Compensation Research Institute (WCRI) studies about how most workers fare in WC systems. How workers fare is critical to assessing the performance of WC systems, and the effectiveness of these systems affects the competitiveness of American business.

See also: States of Confusion: Workers Comp Extraterritorial Issues

The data from WCRI illustrates that the majority of workers in the workers' compensation system:
  • Return to work within a few weeks of the injury, typically to their pre-injury employers at the same jobs and pay as before the injury
  • Receive their first income benefit payment in 30 days or less from the time that the payer is notified of the claim
  • Report that they were satisfied with the overall medical care received, including the time it took to have the first non-emergency visit with a provider and access to the desired medical services
See also: Return to Work Decisions on a Worker's Comp Claim

Here is more detail:

Return to work and recovery of earnings capacity

The overwhelming majority of injured workers return to work and do so to their pre-injury employer at the same or higher pay:
  • 75% to 85% of workers had less than one week of lost time[1]
  • 80% to 90% of workers had four weeks of lost time or less[2]
  • 85% to 95% of workers had six weeks or less of lost time[3]
  • Only 5% to 10% said they earned a lot less when they first returned to work (for those who returned to work)[4]
  • 87% to 95% said they returned to their pre-injury employer[5]
  • Half of those changing employers reported that the change was not due to the injury[6]
Timely payment

Most workers receive their first indemnity payment without dispute or substantial delay:[7]
  • 40% to 50% in 14 days or less from the time payer was notified of injury
  • 60% to 75% in 30 days or less
Satisfaction and access to medical care

By an overwhelming majority, workers were satisfied with the medical care received (among workers with more than seven days of lost time):
  • 75% to 85% reported somewhat or very satisfied[8]
  • 85% to 90% reported no problems or small problems getting desired care[9]
  • 80% to 90% reported being somewhat or very satisfied with the time it took to have the first nonemergency care[10]
While the systems may serve most workers reasonably well, the data also shows that there are injured workers with certain attributes that make them less likely to receive these good outcomes. Discussions that focus on system changes that improve outcomes for these injured workers have high-impact potential. A subsequent post will highlight the evidence on who these workers are.

Richard Victor


Dr. Richard A. Victor is a senior fellow with the Sedgwick Institute. He is the former president and CEO of the Workers Compensation Research Institute (WCRI), an independent, not-for-profit research organization that he founded.

What Is an Extra Expense? (in English)

You don’t want to find out how your coverage works during a claim or that you’ve been paying for coverage you don’t need.

I am not sure why policy language has to be so confusing. Truly, there are some complicated risks that insurance covers, but even the simple ones seem to be made complicated by the language used. One example is extra expense. The words themselves seem pretty self-explanatory: a policyholder spends extra money because of an occurrence and submits the expenses as part of the claim. Though it sounds straightforward, within a property claim these expenses require different types of measurement, documentation and coverage. To ensure you are buying the right coverage for your risks, it's important to understand the details and the differences.

Per the International Risk Management Institute (IRMI), extra expenses are defined as: "…additional costs in excess of normal operating expenses that an organization incurs to continue operations while its property is being repaired or replaced after having been damaged by a covered cause of loss. Extra expense coverage can be purchased in addition to or instead of business income coverage, depending on the needs of the organization." This is true, but there is another kind of "extra expense" that is included as part of your business income coverage, commonly known as "expense to reduce loss." These expenses meet the definition of extra expense, but they are incurred to reduce the duration or magnitude of the business income loss.

See also: The Most Effective Insurance Policy

Consider this scenario: A manufacturer is shut down because of a covered cause of loss. Despite damaged machinery, it manages to resume operations in the facility by performing work manually with more than normal labor. The extra labor costs enable the insured to maintain some production, which reduces lost sales. Is this a business income loss, an extra expense loss or both? In this case, extra expense coverage in excess of the business income coverage would not be necessary, because the extra expenses reduced the business income loss.
Any sales that were lost could still be recovered as well. If only extra expense coverage had been purchased, the manufacturer could recover the extra expenses but not any lost sales. The distinction between "extra expense" and "expense to reduce loss" is important when you are placing coverage. Quantification and documentation of extra expense exposures depend on the types of expenses and the scenarios envisioned. If the only extra expenses foreseen would be incurred to reduce a greater business income loss, it might not be necessary to purchase the additional coverage. If business income is not at risk, or the loss can be avoided entirely through extra expenses, extra expense coverage may be the way to go.

Another category of coverage that gets confused with extra expense is expediting expense. Per IRMI, expediting expenses are defined as: "…expenses of temporary repairs and costs incurred to speed up the permanent repair or replacement of covered property or equipment." The need for expediting expense coverage dates from a time when boiler and machinery coverage applied to specific objects written on separate policies. Modern all-risk policies include expediting expense as part of expense to reduce loss or extra expense coverage.

See also: Shouldn't Your Insurance Coverage Become More Than An Expense?

Again, it is important to understand how you might incur these loss-related expenses when placing coverage. To the extent that you can save the insurance company money by expediting, you are less likely to meet resistance. If you will need to expedite repairs for other reasons, regardless of cost or time savings, you may need coverage that provides full reimbursement. Understanding the different types of expense coverage and how they apply to your business is critical when buying insurance. You don't want to find out how your coverage works during a claim or that you've been paying for coverage you don't need.
Think through your potential scenarios, consult your broker and a forensic accountant to explore what coverages and limits are best for your risks. Then, share your conclusions with your underwriter to make sure everyone is speaking the same language.

Christopher Hess


Christopher B. Hess is a partner in the Pittsburgh office of RWH Myers, specializing in the preparation and settlement of large and complex property and business interruption insurance claims for companies in the chemical, mining, manufacturing, communications, financial services, health care, hospitality and retail industries.

Distribution Debunked (Part 1)

We try technology solutions, but distribution channels have no motivation to accept them.

Over the past two years, there has been a rapidly accelerating emphasis on insurance technology, data and distribution. But are we as an industry spinning our wheels? I think the answer to that question is a big yes. Why? Because we haven't asked the right questions and are not trying to solve the right problem. All of the major technology, big data and distribution initiatives out there have a few common origination points, namely underwriting profitability and transactional efficiency. A ton of money and resources are spent on these, and then we charge distribution with leveraging them in existing channels and in line with current transactional norms. In other words, we are trying to apply technology solutions to distribution channels that are not motivated or prepared to accept them – and then we scratch our heads and wonder why we are so far behind as an industry.

See also: Fast and Slow: the Changing Landscape

Asking the right questions: To fully leverage our capabilities and move our industry forward, we fundamentally need to start asking different questions – we need to approach the problem from the customers' perspective and then drive the solutions backward. This means having the courage to understand that a distribution infrastructure that is unwilling to change will have to be shelved in favor of distribution outlets that embrace change. Without that realization, there can be no progress. The only technology advancements that can take hold are the ones that support the traditional avenues and solidify the position of the stagnated channels. Until we understand this, we will never improve. Don't believe me? Let's look at the landscape:

Why are we being commoditized? Insurers battle the commoditization of their product – yet distribution insists that the primary customer decision point is price, even though study after study shows that customers will pay a higher price when value and convenience are provided.
Because of this, the traditional distribution channels insist on building comparative quoting infrastructures and "get a quote now" facilities that escalate the commoditization.

What does value mean? We insist on defining value in our own terms instead of on the customers' terms. We continue to hear from insurers that they will not be the lowest price but that they provide significantly better coverage. That's all fine and dandy, but the reality is that other insurers can mimic your offering in less time than it takes for you to educate your distribution and get it to start selling the product. In other words, your competitive advantage is hijacked before it ever gets to market. We fail to recognize that DISTRIBUTION ADOPTION TAKES LONGER THAN CUSTOMER ADOPTION. That has been OK for a lot of years because everyone has been looking at things the same way, but what happens when your competitors wake up and finally "get it"?

See also: A Practical Tool to Connect to Customers

What does the customer want? Isn't it fascinating that this is what is at the bottom of the list? OK, let's dig in…
  • Customers want a process that is not PAINFUL.
  • Customers want to feel like they are buying the right thing from the right company and feel good about the transaction.
  • Customers want more than just a promise to pay.
  • Customers want to get their questions answered quickly and clearly.
  • Customers want to communicate in a way that works best for them.
It’s important to ask the right questions so you can solve for the right problem. In our next installment, we’ll look at each of these and how distribution breaks down.

Donn Vucovich


Donn Vucovich is a managing partner at MVP Advisory Group. Vucovich has more than 25 years of combined financial services industry and consulting experience.

Are Malpractice Claims Fading? (Video)

Do we no longer need to focus so much on tort reform related to medical malpractice?

Healthcare Matters sits down with Dr. Richard Anderson, chairman and CEO of the Doctors Company. In Part 6 of the series, we ask Dr. Anderson whether further tort reform is necessary, given that the medical malpractice insurance industry is experiencing a drop in claims and medical malpractice insurance rates continue to fall, creating what is essentially the longest, softest market in history.

Erik Leander


Erik Leander is the CIO and CTO at Cunningham Group, with nearly 10 years of experience in the medical liability insurance industry. Since joining Cunningham Group, he has spearheaded new marketing and branding initiatives and been responsible for large-scale projects that have improved customer service and facilitated company growth.


Richard Anderson


Richard E. Anderson is chairman and chief executive officer of The Doctors Company, the nation’s largest physician-owned medical malpractice insurer. Anderson was a clinical professor of medicine at the University of California, San Diego, and is past chairman of the Department of Medicine at Scripps Memorial Hospital, where he served as senior oncologist for 18 years.

Is Flood Map Due for a Big Data Make-Over?

The Internet of Things and big data technologies could turn the flood map into a poster child for the idea of smart cities.

One of the staples of many cities' and regions' disaster planning and readiness is the flood map, showing areas and, if you zoom in, structures at risk from floods of a given magnitude. These are published by FEMA in the U.S. and by equivalent government agencies in other countries. Flood maps are not glamorous or technologically exciting things. They have done their work for many years and, provided that they are up to date, are an effective way of communicating a generalized level of risk. However, they are far from perfect, and it is possible to identify a number of improvements that could be made with some of the Internet of Things and big data technologies now available. In so doing, the flood map could become a poster child for the idea of smart cities.

See also: The 2 New Realities Because of Big Data

First, flood maps are regularly not up to date, because they are updated on a five- or 10-year cycle (or, in poorer or less capably governed locations, whenever funds are made available). In the interim, new understanding of weather patterns, sea level rise and the like can change the definition of appropriate flood scenarios to apply, and entirely new settlement and urbanization patterns can emerge. Flood maps would clearly be more useful if they were more dynamic – if the timescale for their updating were compressed.

At the same time, because of their scale, flood maps cannot really capture localized variations in risk. An example from Florida (with thanks to Coastal Risk Consulting, an IBM Business Partner) shows how these variations may apply even at the scale of individual homes. If this local variation applied only to residential properties, that would be one thing (although bad enough for the owners of the higher-risk homes!).
But if the variation made the difference between having part of the local phone or internet system working or not, or if it meant that a hospital that was thought to be safe was actually at risk of its ER wing being under 18 inches of water, that would clearly be something else again, because it could badly derail emergency response. Flood maps clearly need to be more granular – more detailed – as well as more dynamic.

Improvements in dynamism are already being made, as the commercial mapping services from Google, TomTom and others suggest. These are updated rather more frequently than every five to 10 years! There are also considerable improvements in granularity now available, as the Florida example showed – companies like Coastal Risk Consulting will provide LIDAR-based risk assessments at the level of individual properties. Different flood models can be plugged in to allow a city, business or homeowner (or their insurers) to assess risk arising at individual locations from different scenarios.

See also: Flood Insurance at the Crossroads

But the improvements in dynamism and granularity could, in theory, go much further. The concept of elevation (above sea level or above a river) probably brings to mind something that is a given, fixed and invariable, unless you happen to be looking at geological timescales. But there are factors that can mediate the value of elevation on a much shorter timescale. Consider a building that is 10 feet above sea level but protected by a levee 10 feet high. It may be said to have 20 feet of "virtual elevation," inasmuch as it would require a flood crest of more than 20 feet above sea level to flood the property. Similarly, take a property 10 feet above sea level but in the area covered by a flood pump or storm drain that can remove 1.5 feet of water from that area.
The property may be said to have 11.5 feet of "virtual elevation." A property may also have a virtual elevation of less than its physical elevation if, for example, building work or a wall or pavement channels additional water toward it. The point about virtual elevation is that it may change in any given location by the year as, say, gophers undermine the levee; by the month, as an area is paved; by the day, if the flood pump is being maintained; or even by the minute, if the pump suddenly fails (perhaps when its power supply is compromised by flooding elsewhere)!

Virtual elevation is a highly dynamic, highly granular concept that a typical flood map would fail to capture – yet one that may make the difference between a critical asset being operable or not, or an evacuation route being open or not. A city faced with an oncoming storm surge or a rainfall event upstream of its location might therefore need to ask: "What's our virtual elevation – our disposition – right now?" The answer might make a significant difference to its standing emergency management plans and require significant adjustments.

All of which tends to imply that the traditional flood map really needs a makeover. At a minimum, while it still provides the baseline, the structures and urban extents that it shows need to be updated, say, annually; making the flood map part of a more interactive tool that allows different weather scenarios to be applied would also be a step forward. In reality, the flood map would represent one end of a continuum stretching to something much more contemporaneous. Using the same core baseline data, changes to virtual elevation could be assessed as plans are approved or building permits are issued, or as assets are maintained and their records are updated. In this way, the flood map would illustrate the observation that "big data" should really be labeled "small data" – but at enormous scale.
If the extra data flows can be added to improve the flood map's dynamism to, say, a daily or weekly update, and its granularity to the individual property or asset level, it would be transformed from some form of reference baseline that may or may not be up to date at any given point in time into a live tool that supports day-to-day decision-making.
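The arithmetic behind the virtual-elevation examples is simple enough to sketch (the function and its parameters are our illustration of the concept, not a published model):

```python
def virtual_elevation(physical_ft, levee_ft=0.0, pump_drawdown_ft=0.0,
                      channeling_ft=0.0):
    """Virtual elevation = physical elevation plus protective factors (levee
    height, pump/storm-drain drawdown) minus factors that channel additional
    water toward the asset (e.g. new paving or walls)."""
    return physical_ft + levee_ft + pump_drawdown_ft - channeling_ft

# The article's two examples:
virtual_elevation(10, levee_ft=10)           # 10 ft + 10 ft levee -> 20.0
virtual_elevation(10, pump_drawdown_ft=1.5)  # 10 ft + 1.5 ft pump -> 11.5
```

The dynamism problem then shows up as these parameters changing over time: a failed pump zeroes out `pump_drawdown_ft` in minutes, and new paving raises `channeling_ft` over months.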

Peter Williams


Dr Peter Williams is the Chief Technology Officer, Big Green Innovations, at IBM. His focus areas are Smarter Cities, with special reference to resilience to natural disasters and chronic stresses; and technology developments for governments.

I Got 99 Problems, but a Glitch Ain't One

Although the Jay-Z song isn't about workers' comp, the industry needs to see where its problems are -- and aren't.

I have taken some time to review notes from the Workers' Comp Roundtable 2016 WC Summit. The laundry list of glitches and gripes is bountiful, with few surprises. Although the notes themselves do little to move the needle, they clearly show where the needle points. The collected bulk of issues contributed from various corners illuminates a fantastically disjointed hopelessness. If nothing else, this summit is a general acknowledgement of workers' comp as a systemic failure. This is very useful. Accepting failure is essential to force a widened perspective and arrest the status quo. Accepting failure means we don't need a complicated sorting of issues to provide sense and direction. We need to stop glitch-fixing and work from a higher level.

In that spirit, only two items picked from the vast summit notes are necessary to depict the problem and re-align a solution focus:

Item #1: "Every single service provider makes more money if the case goes south."

Item #2: "80% of the system is working appropriately, but 20% needs addressing."

Consider that Item #1 is a truth caused by the incorrect assumption that "20% needs addressing," per Item #2. The WC claims failure rate of between 10% and 20% has been an accepted statistical constant since at least the start of my career in the 1980s. It has not changed. Therefore, I submit that we must realize that this 20% is a societal-social-human element, which no part of the WC vendor arsenal can, nor should be expected to, fix. We need to stop addressing the 20% as if it has any potential for cure, return to work (RTW) and resolution.

See also: States of Confusion: Workers Comp Extraterritorial Issues

Fueling Item #1 is the decades-long growth of various for-profit interventions, managed care controls and other misguided efforts aimed at the 20%. These remain alive and well, all "going south" for profit. No one corner of the industry has an incentive to change.
Each has a value proposition that makes some sense standing alone but falls apart and creates cross-purposes in practice. Consider that most of the other summit notes are a subset of this fact, relating directly to glitches in execution and the lack of human consideration in the process-monster this industry has created. All address the 20%, against a backdrop of legislative pendulums swinging to over-correct and triggering counter-forces that over-react.

Consider the absurdity in this simple example: What if state law required restaurants to prepare food with 20% of their raw ingredients spoiled? Would any of their dishes be fit for consumption once the 20% was blended into recipes with the 80%? What if the restaurant's solution was to charge more money to engage more specialized cooks and more expensive spices and techniques that promised to make the spoiled parts more palatable? What if the restaurant charged even more money to predict which dishes would be the most spoiled, yet served them anyway? What if, over time, the entire restaurant industry saw fit to lose money on the actual menu items and rely for profit entirely on the added services aimed at placating diners' fears over spoiled food?

This absurdity is our workers' comp system. A restaurant should be able to throw away ingredients unfit for use. It is not that simple in WC, yet is it so far-fetched to consider legitimizing the statutory marking of such WC cases early, or at any stage, in real time? Can some escape hatch of "skipping medicine for resolution" be a legitimate mutual position for the claimant and defense sides? Can the system open means for very early strategies and legal methods to dispatch the 20% without a need to pretend to "cure"?

This has happened, in small doses. Remember back when California mandated vocational rehabilitation, and it became mostly an under-the-radar holding pattern to failure and a means to propel claimants into bigger and badder disability positions?
Recall that the solution at one point was to allow the option for claimants to be paid the value of vocational rehabilitation as if they had attended. This situation is a legislative acknowledgement of my main point. Let's expand this thinking on a grander scale.

Let us also agree that employers should have 90% of the responsibility for identifying the 20% – they should know their employees better than any predictive model, and adjusters should have the time and mandate to properly decipher real-time information with employers. Further, employers should strive to reduce the 20% as part of overall workplace culture efforts, just as a restaurant supplier is expected to minimize the delivery of spoiled produce. This is not just about WC. Better employer culture serves better overall productivity.

See also: Are Our Working Patterns Outdated?

The industry needs to eliminate much of what it does that keeps claims churning open. Fees for claims and related services should be based on outcome performance. Eliminate rewards for false notions of "saved" medical dollars or simple transaction fees for late-timed or ill-fitted interventions. Think of how efficient the WC process would be if the 80% with outcome responsibility suddenly became the 99%. Many managed care schemes and other interventions would become unnecessary.

Legislatively, we need an acceptance of the 20% as a human/societal anomaly and need to require judges to account for it in tougher court decisions. We need to craft law reforms that open different avenues to resolve these cases very early under a "nuisance" presumption. We don't need to fix the 20%. We need very big changes that relieve workers' comp of this 20% burden. Once that happens, most every other item from the summit notes will be minimized or vanquished.

Note: PDF downloads of complete summit notes can be found here.

Barry Thompson


Barry Thompson is a 35-year-plus industry veteran. He founded Risk Acuity in 2002 as an independent consultancy focused on workers' compensation. His expert perspective transcends the status quo to build highly effective employer-centered programs.

Telematics: Time to Move Beyond Pricing

North American insurers could offer a wide range of services based on telematics data, like those already offered in other countries.

|
Sometimes it is difficult to believe that vehicle telematics for usage-based insurance (UBI) is 20 years old. While the likes of Norwich Union and Progressive began planning and piloting long ago, most of the real activity in the market has taken place over the last few years. SMA's recently released research report, Telematics in P&C Insurance: The Need to Move Beyond Pricing, profiles the state of the UBI market in North America, with a special emphasis on data and how it translates into value propositions. As the report title suggests, the North American market tends to be stuck in neutral, focusing primarily on offering premium discounts to policyholders who exhibit certain driving behaviors, as tracked by the telematics device. While there are isolated instances of insurers that have gone beyond pricing, the majority of the programs in the market and the pilots underway concentrate on attracting customers through discounts (which are often substantial). In other markets, notably Italy, Brazil, South Africa and the U.K., other value propositions are already in place, including safety advice, theft deterrence and concierge services.

See also: Telematics: No Longer Just For Cars

In theory, there should be a clear market advantage for insurers that can more precisely determine the risk characteristics of a customer and price accordingly, grabbing market share while maintaining profitability in the process. However, in many cases it has not been quite that simple. Once you get beyond the Progressives and Allstates of the insurance world, which have collected data from billions of telematics miles, most of the remaining companies lack the historical data to satisfy actuaries on pricing and profitability. And customer adoption has not been as rapid as expected, either.
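To make the "pricing-only" value proposition concrete, here is a minimal sketch of how a behavior-based discount might be derived from telematics data. The metric names, weights and discount tiers below are illustrative assumptions invented for this example, not any insurer's actual rating model.

```python
# Illustrative UBI discount sketch. All metrics, weights and tiers are
# hypothetical, chosen only to show the shape of a pricing-only program.

def behavior_score(hard_brakes_per_100mi, night_miles_share, annual_miles):
    """Return a 0-100 driving score (higher = safer), capping each penalty."""
    score = 100.0
    score -= min(hard_brakes_per_100mi * 4.0, 40)             # hard braking
    score -= min(night_miles_share * 50.0, 25)                # night driving
    score -= min(max(annual_miles - 10_000, 0) / 1_000, 15)   # high mileage
    return max(score, 0.0)

def premium_discount(score):
    """Map a score to a discount tier (tiers are illustrative)."""
    if score >= 90:
        return 0.25
    if score >= 75:
        return 0.15
    if score >= 60:
        return 0.05
    return 0.0

discount = premium_discount(behavior_score(1.5, 0.10, 12_000))
print(f"discount: {discount:.0%}")  # prints "discount: 15%"
```

The point of the sketch is the limitation the report identifies: the same collected data (location, routes, diagnostics) feeds only a discount tier here, when it could also drive safety, theft-deterrence or concierge services.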
Still, the movement toward increasing usage of UBI programs continues, with almost three in five insurers that write vehicle insurance either having programs in the market or planning them for the next few years.

See also: Telematics: Now a 'Movie,' Not 'Snapshot'

The SMA research confirms that the primary data sources are used for pricing, but it also identifies other types of data that are being collected and could be used for future value propositions. For example, most insurers with UBI programs are collecting data on location, routes driven, vehicle diagnostics and other information. There is a wide range of new services that insurers might offer to both personal and commercial lines customers based on this data, like those already offered in other countries. And the beauty of many of the offerings that go beyond pricing is that insurers will be able to test propositions in the market without requiring regulatory approval.

Mark Breading


Mark Breading is a partner at Strategy Meets Action, a Resource Pro company that helps insurers develop and validate their IT strategies and plans, better understand how their investments measure up in today's highly competitive environment and gain clarity on solution options and vendor selection.

An Underestimated Source of Risk

Human resource risk is often underestimated, and that can be a serious misjudgment -- as recent lawsuits and settlements prove.

|
When directors, CEOs or senior managers think about risk, they generally envision risks associated with the company's finances, manufacturing, data, supply chain and customers. Human resource risk is often underestimated, and that can be a serious misjudgment; recent events, lawsuits and settlements prove the point. It is true that the risk associated with talent, and a lack thereof, has risen in the risk hierarchy of most organizations. However, the many other serious risks associated with managing existing talent are often relegated to the bottom of the risk register. The reasons for this underestimation are varied. Many executives tend to think that: 1) human resource matters are supplemental to the business rather than integral to it, 2) being an "employer at will" protects the company and lets it make human resource decisions however it sees fit, and 3) a single employee, applicant or retiree poses no risk to the organization as a whole (even though a single employee can cause a "class" to be formed under the law). The danger inherent in underestimating HR risk is that it does not get adequately addressed with mitigation plans. Not all organizations have the same exposure to these risks, and even among those that do, some have more safeguards already in place and so warrant a lower risk ranking. The discussion that follows is not meant to imply that all HR risks must be placed in the top right-hand corner of a heat map; it is meant to highlight the potential impact that some HR risks can have on an organization.

Rogue Employee Risk

The rogue employee is one of the most striking phenomena among human resource risk categories. In financial services, rogue employees have wreaked havoc on otherwise solid and long-standing businesses. Two noteworthy examples are Barings Bank, London's oldest merchant bank, and UBS, one of Switzerland's financial giants.
Roughly 20 years ago, Nick Leeson, a Barings Bank derivatives trader, gambled away the equivalent of $1.4 billion of bank money through a secret "error" account. The bank went bust and was bought by ING for a nominal sum. In 2011, UBS announced it had lost $2 billion to unauthorized trades by a director at its global synthetic equities desk. And financial institutions are not the only organizations exposed to rogue-employee actions that create huge risks and large losses. For instance, GNP, parent of Just BARE and Gold'n Plump, just recalled 55,608 pounds of chicken because of what it called a "product tampering incident" at one of its processing plants. Here are some of the ways in which such an employee can create risk in just about any industry sector, and for which organizations need to develop safeguards as part of their mitigation plans:
  • Abetting a data breach affecting customer/employee personal data
  • Sabotaging mechanical or technological equipment
  • Sabotaging products intended for sale
  • Stealing company property, including intellectual property
  • Mishandling customers/patients on purpose
See also: Risk Management, in Plain English

A fundamental safeguard is thorough vetting during the employment process. Others include: 1) active supervision, 2) automatic system alerts when authorities are exceeded or other rogue actions are attempted, and 3) robust internal audits.

Regulatory Violations Risk

Organizations must deal with employee-related regulation at the local, state and federal levels. The number of major federal regulations has grown significantly in the past few decades and now includes such well-known acts as the Fair Labor Standards Act, Title VII, the Age Discrimination in Employment Act, the Americans with Disabilities Act, the Employee Retirement Income Security Act, the Family and Medical Leave Act and the WARN Act. Each of these has numerous elements that must be understood and complied with, including gray areas that need to be thought through before any action regarding an employee is decided on. The Fair Labor Standards Act has been the highest-risk area of late. There have been numerous suits under this act related to: 1) misclassification of employees into exempt and non-exempt categories, which has implications for overtime pay, 2) incorrect calculation of overtime pay for those due it, and 3) mismanagement of paid break time. A $188 million judgment against Walmart, which is being appealed, had to do with paid versus unpaid break time. Interestingly, this case revolves around the company not living up to the policies in its own handbooks, not around a failure to fulfill specific requirements spelled out in the law. The case is, therefore, illustrative of two important points. First, settlements can be financially significant even for the largest of companies. Second, when dealing with human resource matters, formal programs and policies, which can constitute a contractual obligation, have to be considered.
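Several of the FLSA suit categories above come down to simple pay arithmetic. Under the FLSA, non-exempt employees earn at least 1.5 times their regular rate for hours over 40 in a workweek; the sketch below is deliberately simplified and ignores bonuses, multiple pay rates and state rules that can change the regular-rate calculation.

```python
# Simplified FLSA weekly pay sketch -- ignores bonuses, shift
# differentials, multiple rates and state-specific overtime rules.

OVERTIME_THRESHOLD = 40      # FLSA weekly hours threshold
OVERTIME_MULTIPLIER = 1.5    # time-and-a-half

def weekly_pay(hours_worked, regular_rate):
    """Gross weekly pay for a non-exempt employee."""
    regular_hours = min(hours_worked, OVERTIME_THRESHOLD)
    overtime_hours = max(hours_worked - OVERTIME_THRESHOLD, 0)
    return (regular_hours * regular_rate
            + overtime_hours * regular_rate * OVERTIME_MULTIPLIER)

print(weekly_pay(48, 20.0))  # 40*20 + 8*20*1.5 = 1040.0
```

Misclassifying this employee as exempt, or paying the 8 overtime hours at straight time, produces exactly the kind of underpayment the wage-and-hour suits allege.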
See also: Building a Strong Insurance Risk Culture

Wage and hour suits are likely to keep increasing in 2016, given the success of recent plaintiffs, new regulations regarding overtime pay and an overall concern among employees that wages are insufficient or unfair. In an article titled "Why Wage and Hour Litigation Is Skyrocketing," Lydia DePillis writes, "The number of wage and hour cases filed in federal court rose to 8,871 for the year [ended] Sept. 30, up from 1,935 in 2000." Title VII and age discrimination cases have been associated with large dollar losses over the years. Given the many federal, state and local statutes, coupled with a more informed and litigious employee population, organizations can inadvertently step into non-compliance pitfalls rather easily. Organizations should always follow the laws that apply to them. Risk enters the equation because there is always the potential that someone in management is unaware or careless or, worse yet, disrespectful of the laws; thus, the organization is continuously exposed to the risk of violations. Every effort should be made to stay compliant, including: 1) having a clear set of core values that guide lawful behavior, 2) educating management and all employees about the laws and how to comply with them, 3) investing in strong compliance processes and 4) making sure violators are dealt with quickly and appropriately.

HR Program Risk

Human resources professionals create and administer many expensive programs, such as retirement, benefits, compensation and incentive programs. A large error in budgeting or managing such programs could create a sizable financial risk for the organization. Imagine an actuarial error that creates severe pension underfunding, or a poorly managed self-insured medical benefit plan that costs double what benchmarks would suggest. Or consider a new incentive program that produces the antithesis of the behavior it was intended to promote.
The risk can be major, not unlike the size and seriousness of a natural catastrophe or product recall or supply chain debacle. CEOs need to ensure that HR programs and policies are being handled by expert professionals, whether staff or consultants. At the same time, senior management needs to invest the attention and support necessary to ensure these are well-designed and implemented according to specification. The comments in this article are neither meant to be all-inclusive nor to be construed as advice.

Donna Galer


Donna Galer is a consultant, author and lecturer. 

She has written three books on ERM: Enterprise Risk Management – Straight To The Point, Enterprise Risk Management – Straight To The Value and Enterprise Risk Management – Straight Talk For Nonprofits, with co-author Al Decker. She is an active contributor to the Insurance Thought Leadership website and other industry publications. In addition, she has given presentations at RIMS, CPCU, PCI (now APCIA) and university events.

Currently, she is an independent consultant on ERM, ESG and strategic planning. She was recently a senior adviser at Hanover Stone Solutions. She served as the chairwoman of the Spencer Educational Foundation from 2006-2010. From 1989 to 2006, she was with Zurich Insurance Group, where she held many positions both in the U.S. and in Switzerland, including: EVP corporate development, global head of investor relations, EVP compliance and governance and regional manager for North America. Her last position at Zurich was executive vice president and chief administrative officer for Zurich’s world-wide general insurance business ($36 Billion GWP), with responsibility for strategic planning and other areas. She began her insurance career at Crum & Forster Insurance.  

She has served on numerous industry and academic boards. Among these are: NC State’s Poole School of Business’ Enterprise Risk Management’s Advisory Board, Illinois State University’s Katie School of Insurance, Spencer Educational Foundation. She won “The Editor’s Choice Award” from the Society of Financial Examiners in 2017 for her co-written articles on KRIs/KPIs and related subjects. She was named among the “Top 100 Insurance Women” by Business Insurance in 2000.

The REAL Objection to Opt Out

Each and every vendor makes a buck off workers' comp, and each and every one has an interest in maintaining the status quo.

|
I have never really understood why the Property Casualty Insurers Association of America (PCI) has been so vehemently against opt out. While it seemed opt out had returned to the back burner for this year, with constitutional defeats in Oklahoma and political stalemate in other states, PCI has reignited the debate with an inflammatory paper. The basic arguments, which PCI supports with some data, are that opt out results in costs shifting to other systems and that a lack of standards and transparency is detrimental to consumers (i.e., injured workers). PCI also argues that opt out is all about saving employers money to the detriment of consumers by denying more claims earlier and paying less, with capitations and restrictions not found in traditional comp. I get that alternative work injury systems must meet certain standards and need to be more transparent to consumers; to me, that's a no-brainer. But the objections PCI raises are exactly the same complaints made against traditional workers' comp: inadequate benefits, unnecessary delays, cost shifting, etc.

See also: Debunking 'Opt-Out' Myths (Part 6)

Each statistic cited by PCI against opt out can be asserted against traditional workers' comp; just use another study or data source. For instance, just a couple of years ago, Paul Leigh of the University of California, Davis, lead author of the study Workers' Compensation Benefits and Shifting Costs for Occupational Injury and Illness, told WorkCompCentral, "We're all paying higher Medicare and income taxes to help cover [the costs not paid by workers' compensation]." That study, published in the April 2012 edition of the Journal of Occupational and Environmental Medicine, found that almost 80% of workers' compensation costs are covered outside of workers' compensation claims systems. That amounts to roughly $198 billion of the estimated $250 billion in annual costs for work-related injuries and illnesses in 2007.
Just $51.7 billion, or 21%, of those costs were covered by workers' compensation, the study said. Of the $250 billion price tag for work-related injuries, the Leigh study found $67.09 billion came from medical care costs, while $182.54 billion was related to lost productivity. Of the medical costs, $29.86 billion was paid by workers' compensation, $14.22 billion was picked up by other health insurance, $10.38 billion was covered by the injured workers and their families, $7.16 billion was picked up by Medicare and $5.47 billion was covered by Medicaid. The study drew criticism from the workers' comp crowd, which defended its practices, challenged the data and anecdotally attempted to counter-argue, with limited success. If one digs deep enough into the PCI study, I'm sure one could likewise find fault with the data and the reporting on cost shifting, because the truth is that absolutely no one has a firm fix on that topic. My good friend Trey Gillespie, PCI assistant vice president of workers' compensation, told WorkCompCentral that "the fundamental tenets of workers' compensation [are] protecting injured workers and their families and protecting taxpayers. The general consensus is that the way programs should work is to protect injured workers and taxpayers and avoid cost-shifting." Of course! All work injury protection systems should do that. But they don't.

See also: What Schrodinger Says on Opt-Out

That's what the ProPublica and Reveal series of critical articles about workers' compensation programs across the country tells us, both anecdotally and statistically: Injured workers aren't protected, costs are shifted onto other programs, and taxpayers are paying an unfair portion of what workers' comp should be paying. Indeed, in October, 10 federal lawmakers asked the U.S.
Department of Labor for greater oversight of the state-run workers’ compensation system, to counteract “a pattern of detrimental changes to state workers’ compensation laws and the resulting cost shift to public programs.”
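The Leigh study figures quoted above are internally consistent, as a quick arithmetic check confirms (all dollar amounts in billions, as reported in the study):

```python
# Sanity check of the Leigh study figures quoted above (USD billions, 2007).
medical_by_payer = {
    "workers' comp": 29.86,
    "other health insurance": 14.22,
    "workers and families": 10.38,
    "Medicare": 7.16,
    "Medicaid": 5.47,
}

medical_total = sum(medical_by_payer.values())
total_cost = 250.0    # estimated annual cost of work-related injury/illness
paid_by_wc = 51.7     # portion covered inside workers' comp systems

print(round(medical_total, 2))        # 67.09 -- matches the reported medical total
print(round(total_cost - paid_by_wc, 1))  # 198.3 -- the "roughly $198 billion" shifted elsewhere
print(f"{paid_by_wc / total_cost:.0%}")   # 21% -- the share covered by workers' comp
```

Each printed value matches the corresponding figure in the article, so the "almost 80% covered outside workers' comp" headline follows directly from the payer breakdown.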
I started thinking about the one truism that governs human behavior nearly universally: Every person protects their own interests first. And I thought of PCI's name: Property Casualty Insurers Association of America. "Property and casualty." Ay, there's the rub! There's no room for P&C in opt out! ERISA-based opt out uses only health insurance and disability insurance. Workers' comp is the mainstay of the P&C industry, the single biggest commercial line and the gateway to a whole host of much more profitable lines. If opt out spreads beyond Texas, it is hugely threatening to the interests of PCI's members, because they stand to lose considerable business, particularly if opt out migrates to the bigger P&C states. PCI is protecting its own interests (or those of its members) by objecting to opt out. And I don't blame them; their impression of this threat is real. Michael Duff, a professor of workers' compensation law at the University of Wyoming, told WorkCompCentral, "These are interested observers. They're going to have an agenda. They represent insurers who are in the workers' comp business." Bingo. "Every commercial actor that participates in traditional workers' compensation has an interest in seeing traditional workers' compensation continue," Duff went on. "But that traditional workers' compensation imposes costs on employers. There is now a group of employers who would like to pay less, and Bill Minick has developed a commercial product that is in competition with this other conceptual approach to handling things."

Here's THE fact: Traditional workers' compensation and ANY alternative work injury protection plan require vendors pitching wares and services to make the systems work. Insurance companies are as much a vendor in either scenario as physicians, bill review companies, utilization review companies, attorneys, vocational counselors, etc.
Each and every single one makes a buck off workers' comp, and each and every one has an interest in maintaining the status quo.

See also: States of Confusion: Workers Comp Extraterritorial Issues

Arguing that one system is better than the other without admitting one's own special interest is simply hypocrisy. Workers' compensation is going through some soul-searching right now. Employers leading the debate are asking, "Why stay in a system that puts vendors' interests ahead of employers' or workers'?" THAT's the question BOTH the P&C industry and the opt out movement need to answer. Further debate about the merits of one over the other is simply sophistry. This article first appeared at WorkCompCentral.

David DePaolo


David DePaolo is the president and CEO of WorkCompCentral, a workers' compensation industry specialty news and education publication he founded in 1999. DePaolo directs a staff of 30 in the daily publication of industry news across the country, including politics, legal decisions, medical information and business news.