What Is an Extra Expense? (in English)

You don’t want to find out how your coverage works during a claim or that you’ve been paying for coverage you don’t need.

I am not sure why policy language has to be so confusing. Truly, there are some complicated risks that insurance covers, but even the simple ones seem to be made complicated by the language used. One example is extra expense. The words themselves seem pretty self-explanatory: a policyholder spends extra money due to an occurrence and submits the expenses as part of the claim. Though it sounds straightforward, within a property claim these expenses require different types of measurement, documentation and coverage. To ensure you are buying the right coverage for your risks, it's important to understand the details and the differences.

Per the International Risk Management Institute (IRMI), extra expenses are defined as: "…additional costs in excess of normal operating expenses that an organization incurs to continue operations while its property is being repaired or replaced after having been damaged by a covered cause of loss. Extra expense coverage can be purchased in addition to or instead of business income coverage, depending on the needs of the organization."

This is true, but there is another kind of "extra expense" that is included as part of your business income coverage, commonly known as "expense to reduce loss." These expenses meet the definition of extra expense, but they are incurred to reduce the duration or magnitude of the business income loss.

See also: The Most Effective Insurance Policy

Consider this scenario: A manufacturer is shut down because of a covered cause of loss. Despite damaged machinery, the company manages to resume operations in the facility by performing work manually with more than normal labor. The extra labor costs enable the insured to maintain some production, which reduces lost sales. Is this a business income loss, an extra expense loss or both? In this case, extra expense coverage in excess of the business income would not be necessary, since the extra expenses reduced the business income loss.
Any sales that were lost could still be recovered as well. If only extra expense coverage had been purchased, the manufacturer could recover the extra expenses but not any lost sales.

The distinction between "extra expense" and "expense to reduce loss" is important when you are placing coverage. Quantification and documentation of extra expense exposures depend on the types of expenses and the scenarios envisioned. If the only extra expenses foreseen would be incurred to reduce a greater business income loss, it might not be necessary to purchase the additional coverage. If business income is not at risk, or the loss can be avoided entirely through extra expenses, extra expense coverage may be the way to go.

Another category of coverage that gets confused with extra expense is expediting expense. Per IRMI, expediting expenses are defined as: "…expenses of temporary repairs and costs incurred to speed up the permanent repair or replacement of covered property or equipment." The need for expediting expense coverage dates from a time when boiler and machinery coverage applied to specific objects written on separate policies. Modern all-risk policies will include expediting expense as part of expense to reduce loss or extra expense coverage.

See also: Shouldn't Your Insurance Coverage Become More Than an Expense?

Again, it is important to understand how you might incur these loss-related expenses when placing coverage. To the extent that you can save the insurance company money by expediting, you are less likely to meet resistance. If you will need to expedite repairs for other reasons, regardless of cost or time savings, you may need coverage that provides full reimbursement.

Understanding the different types of expense coverage and how they apply to your business is critical when buying insurance. You don't want to find out how your coverage works during a claim or that you've been paying for coverage you don't need.
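To make the manufacturer scenario concrete, here is a minimal sketch in Python of how recovery might differ under the two coverage structures. All figures and the simplified recovery rules are illustrative assumptions for this article's example, not the terms of any actual policy form.

```python
# Hypothetical illustration of the manufacturer scenario: all figures are
# invented for demonstration and the recovery rules are deliberately simplified.

def business_income_recovery(lost_sales, saved_sales, extra_labor):
    """Business income coverage: lost sales plus expenses that reduced the loss.

    Extra labor is recoverable as 'expense to reduce loss' only up to the
    business income loss it avoided (a common, simplified policy limitation).
    """
    expense_to_reduce_loss = min(extra_labor, saved_sales)
    return lost_sales + expense_to_reduce_loss

def extra_expense_only_recovery(lost_sales, saved_sales, extra_labor):
    """Extra expense coverage alone: the extra costs are paid; lost sales are not."""
    # lost_sales and saved_sales are unused here by design -- that is the point.
    return extra_labor

# Shutdown causes $500k in lost sales; $80k of extra manual labor preserves
# another $300k of sales that would otherwise have been lost.
lost, saved, labor = 500_000, 300_000, 80_000
print(business_income_recovery(lost, saved, labor))     # 580000
print(extra_expense_only_recovery(lost, saved, labor))  # 80000
```

Under these assumed numbers, business income coverage with expense to reduce loss makes the insured whole on both fronts, while extra-expense-only coverage reimburses the labor but leaves the lost sales unrecovered.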
Think through your potential scenarios, and consult your broker and a forensic accountant to explore which coverages and limits are best for your risks. Then share your conclusions with your underwriter to make sure everyone is speaking the same language.

Christopher Hess

Christopher B. Hess is a partner in the Pittsburgh office of RWH Myers, specializing in the preparation and settlement of large and complex property and business interruption insurance claims for companies in the chemical, mining, manufacturing, communications, financial services, health care, hospitality and retail industries.

Distribution Debunked (Part 1)

We try technology solutions, but distribution channels have no motivation to accept them.

Over the past two years, there has been a rapidly accelerating emphasis on insurance technology, data and distribution. But are we as an industry spinning our wheels? I think the answer to that question is a big yes. Why? Because we haven't asked the right questions and are not trying to solve the right problem.

All of the major technology, big data and distribution initiatives out there have a few common origination points, namely underwriting profitability and transactional efficiency. There is a ton of money and resources spent on these initiatives, and then we charge distribution with leveraging them in existing channels and in line with current transactional norms. In other words, we are trying to apply technology solutions to distribution channels that are not motivated or prepared to accept them – and then we scratch our heads and wonder why we are so far behind as an industry.

See also: Fast and Slow: the Changing Landscape

Asking the right questions: To fully leverage our capabilities and move our industry forward, we fundamentally need to start asking different questions – we need to go at the problem from the customers' perspective and then drive the solutions backward. This means having the courage to understand that a distribution infrastructure that is unwilling to change will have to be shelved in favor of distribution outlets that embrace change. Without that realization, there can be no progress. The only technology advancements that can take hold are the ones that support the traditional avenues and solidify the position of the stagnated channels. Until we understand this, we will never improve. Don't believe me? Let's look at the landscape:

Why are we being commoditized? Insurers battle the commoditization of their product – yet distribution insists that the primary customer decision point is price, even though study after study shows that customers will pay a higher price when value and convenience are provided.
Because of this, the traditional distribution channels insist on building comparative quoting infrastructures and "get a quote now" facilities that escalate the commoditization.

What does value mean? We insist on defining value in our own terms instead of on the customers' terms. We continue to hear from insurers that they will not be the lowest price but that they provide significantly better coverage. That's all fine and dandy, but the reality is that other insurers can mimic your offering in less time than it takes for you to educate your distribution and get them to start selling the product. In other words, your competitive advantage is hijacked before it ever gets to market. We fail to recognize that DISTRIBUTION ADOPTION TAKES LONGER THAN CUSTOMER ADOPTION. That has been OK for a lot of years because everyone has been looking at things the same way, but what happens when your competitors wake up and finally "get it"?

See also: A Practical Tool to Connect to Customers

What does the customer want? Isn't it fascinating that this is what is at the bottom of the list? OK, let's dig in…
  • Customers want a process that is not PAINFUL.
  • Customers want to feel like they are buying the right thing from the right company and feel good about the transaction.
  • Customers want more than just a promise to pay.
  • Customers want to get their questions answered quickly and clearly.
  • Customers want to communicate in a way that works best for them.
It’s important to ask the right questions so you can solve for the right problem. In our next installment, we’ll look at each of these and how distribution breaks down.

Donn Vucovich

Donn Vucovich is a managing partner at MVP Advisory Group. Vucovich has more than 25 years of combined financial services industry and consulting experience.

Are Malpractice Claims Fading? (Video)

Do we no longer need to focus so much on tort reform related to medical malpractice?

Healthcare Matters sits down with Dr. Richard Anderson, chairman and CEO of The Doctors Company. In Part 6 of the series, we ask Dr. Anderson whether further tort reform is necessary, given that the medical malpractice insurance industry is experiencing a drop in claims and medical malpractice insurance rates continue to fall, creating what is essentially the longest, softest market in history.

Erik Leander

Erik Leander is the CIO and CTO at Cunningham Group, with nearly 10 years of experience in the medical liability insurance industry. Since joining Cunningham Group, he has spearheaded new marketing and branding initiatives and been responsible for large-scale projects that have improved customer service and facilitated company growth.


Richard Anderson

Richard E. Anderson is chairman and chief executive officer of The Doctors Company, the nation’s largest physician-owned medical malpractice insurer. Anderson was a clinical professor of medicine at the University of California, San Diego, and is past chairman of the Department of Medicine at Scripps Memorial Hospital, where he served as senior oncologist for 18 years.

Is Flood Map Due for a Big Data Make-Over?

The Internet of Things and big data technologies could turn the flood map into a poster child for the idea of smart cities.

One of the staples of many cities' and regions' disaster planning and readiness is the flood map, showing areas and, if you zoom in, structures at risk from floods of a given magnitude. These maps are published by FEMA in the U.S. and by equivalent government agencies in other countries. Flood maps are not glamorous or technologically exciting things. They have done their work for many years and, provided that they are up to date, are an effective way of communicating a generalized level of risk. However, they are far from perfect, and it is possible to identify a number of improvements that could be made with some of the Internet of Things and big data technologies now available. In so doing, the flood map could become a poster child for the idea of smart cities.

See also: The 2 New Realities Because of Big Data

First, flood maps are regularly not up to date, because they are updated on a five- or 10-year cycle (or, in poorer or less capably governed locations, whenever funds are made available). In the interim, new understanding of weather patterns, sea level rise and the like can change the definition of appropriate flood scenarios to apply, and entirely new settlement and urbanization patterns can emerge. Flood maps would clearly be more useful if they were more dynamic – if the timescale for their updating were compressed.

At the same time, because of their scale, flood maps cannot really capture localized variations in risk. These variations can apply even at the scale of individual homes, as an example from Florida shows. (With thanks to Coastal Risk Consulting, an IBM Business Partner.) If this local variation applied only to residential properties, that would be one thing (although bad enough for the owners of the higher-risk homes!).
But if the variation made the difference between having part of the local phone or internet system working or not, or if it meant that a hospital that was thought to be safe was actually at risk of its ER wing being under 18 inches of water, that would clearly be something else again, because it could badly derail emergency response. Flood maps clearly need to be more granular – more detailed – as well as more dynamic.

Improvements in dynamism are already being made, as the availability of commercial mapping services from Google, TomTom and others suggests. These are updated rather more frequently than every five to 10 years! There are also considerable improvements in granularity now available – companies like Coastal Risk Consulting will provide LIDAR-based risk assessments at the level of individual properties. Different flood models can be plugged in to allow a city, business or homeowner (or their insurers) to assess the risk arising at individual locations from different scenarios.

See also: Flood Insurance at the Crossroads

But the improvements in dynamism and granularity could, in theory, go much further. The concept of elevation (above sea level or above a river) probably brings to mind something that is given, fixed and invariable, unless you happen to be looking at geological timescales. But there are factors that can mediate the value of elevation on a much shorter timescale. Consider a building that is 10 feet above sea level but protected by a levee 10 feet high. It may be said to have 20 feet of "virtual elevation," inasmuch as it would require a flood crest of more than 20 feet above sea level to flood the property. Similarly, take a property 10 feet above sea level but in the area covered by a flood pump or storm drain that can remove 1.5 feet of water from that area.
The property may be said to have 11.5 feet of "virtual elevation." A property may also have a virtual elevation of less than its physical elevation if, for example, building work or a wall or pavement channels additional water toward it.

The point about virtual elevation is that it may change in any given location by the year as, say, gophers undermine the levee; by the month, as an area is paved; by the day, if the flood pump is being maintained; or even by the minute, if the pump suddenly fails (perhaps when its power supply is compromised by flooding elsewhere)! Virtual elevation is a highly dynamic, highly granular concept that a typical flood map would fail to capture – yet one that may make the difference between a critical asset being operable or not, or an evacuation route being open or not. A city faced with an oncoming storm surge or a rainfall event upstream of where it is located might therefore need to ask: "What's our virtual elevation – our disposition – right now?" The answer might make a significant difference to its standing emergency management plans and require significant adjustments.

All of which tends to imply that the traditional flood map really needs a makeover. At a minimum, while it still provides the baseline, the structures and urban extents that it shows need to be updated, say, annually; making the flood map part of a more interactive tool that allows different weather scenarios to be applied would also be a step forward. In reality, the flood map would represent one end of a continuum stretching to something much more contemporaneous. Using the same core baseline data, changes to virtual elevation could be assessed as plans are approved or building permits are issued, or as assets are maintained and their records are updated. In this way, the flood map would illustrate the observation that "big data" should really be labeled "small data" – but at enormous scale.
If those extra data flows can be added – improving the flood map's dynamism to, say, a daily or weekly update, and its granularity to the individual property or asset level – the map would be transformed from some form of reference baseline that may or may not be up to date at any given point in time into a live tool that supports day-to-day decision making.
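The "virtual elevation" arithmetic above can be sketched in a few lines of Python. The function name and the simple additive mitigation model are assumptions made for illustration; a real hydrological model would be far more nuanced.

```python
# Illustrative sketch only: a naive additive model of 'virtual elevation'
# as described in the article, not an actual flood-risk methodology.

def virtual_elevation(physical_elevation_ft, levee_height_ft=0.0,
                      pump_capacity_ft=0.0, channeling_penalty_ft=0.0):
    """Feet of flood crest required to inundate the property.

    A levee raises the effective elevation, a working flood pump removes a
    fixed depth of water, and paving or walls that channel water toward the
    site reduce the effective elevation.
    """
    return (physical_elevation_ft + levee_height_ft + pump_capacity_ft
            - channeling_penalty_ft)

# The article's examples: a 10-foot-elevation property...
print(virtual_elevation(10, levee_height_ft=10))    # 20.0 with an intact levee
print(virtual_elevation(10, pump_capacity_ft=1.5))  # 11.5 with the pump running
print(virtual_elevation(10))                        # 10.0 the minute the pump fails
```

The interesting design point is that every parameter except physical elevation can change on short timescales, which is exactly why a static map struggles to capture the quantity.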

Peter Williams

Dr. Peter Williams is the chief technology officer, Big Green Innovations, at IBM. His focus areas are Smarter Cities, with special reference to resilience to natural disasters and chronic stresses, and technology developments for governments.

I Got 99 Problems, but a Glitch Ain't One

Although the Jay-Z song isn't about workers' comp, the industry needs to see where its problems are -- and aren't.

I have taken some time to review notes from the Workers' Comp Roundtable 2016 WC Summit. The laundry list of glitches and gripes is bountiful, with few surprises. Although the notes themselves do little to move the needle, they clearly show where the needle points. The collected bulk of issues contributed from various corners illuminates a fantastically disjointed hopelessness. If nothing else, this summit is a general acknowledgement of workers' comp as a systemic failure.

This is very useful. Accepting failure is essential to force a widened perspective and arrest the status quo. Accepting failure means we don't need a complicated sorting of issues to provide sense and direction. We need to stop glitch-fixing and work from a higher level. In that spirit, only two items picked from the vast summit notes are necessary to depict the problem and re-align a solution focus:

Item #1: "Every single service provider makes more money if the case goes south."

Item #2: "80% of the system is working appropriately, but 20% needs addressing."

Consider that Item #1 is a truth caused by the incorrect assumption that "20% needs addressing," per Item #2. The WC claims failure rate of between 10% and 20% has been an accepted statistical constant since at least the start of my career in the 1980s. It has not changed. Therefore, I submit that we must realize that this 20% is a societal-social-human element, which no part of the WC vendor arsenal can, nor should be expected to, fix. We need to stop addressing the 20% as if it has any potential for cure, return to work (RTW) and resolution.

See also: States of Confusion: Workers Comp Extraterritorial Issues

Fueling Item #1 is the decades-long growth of various for-profit interventions, managed care controls and other misguided efforts aimed at the 20%. These remain alive and well, all "going south" for profit. No one corner of the industry has incentive to change.
Each has a value proposition that makes some sense standing alone but falls apart and creates cross purposes in practice. Consider that most of the other summit notes are a subset of this fact, relating directly to glitches in execution and the lack of human consideration in the process-monster this industry has created. All address the 20%, with the backdrop of legislative pendulums swinging to over-correct and triggering counter forces to over-react.

Consider the absurdity in this simple example: What if state law required restaurants to prepare food with 20% of their raw ingredients spoiled? Would any of their dishes be fit for consumption once the 20% was blended into recipes with the 80%? What if the restaurant's solution was to charge more money to engage more specialized cooks and more expensive spices and techniques that promised to make the spoiled parts more palatable? What if the restaurant charged even more money to predict which dishes would be the most spoiled, yet served them anyway? What if, over time, the entire restaurant industry saw fit to lose money on the actual menu items and have profit rely entirely on the added services aimed at placating diners' fears over spoiled food?

This absurdity is our workers' comp system. A restaurant should be able to throw away ingredients unfit for use. It is not that simple in WC, yet is it so far-fetched to consider legitimizing the statutory marking of such WC cases early, or at any stage, in real time? Can some escape hatch of "skipping medicine for resolution" be a legitimate mutual position from the claimant and defense sides? Can the system open means for very early strategies and legal methods to dispatch the 20% without a need to pretend to "cure"?

This has happened, in small doses. Remember back when California mandated vocational rehabilitation, and it became mostly an under-the-radar holding pattern to failure and a means to propel claimants into bigger and badder disability positions?
Recall that the solution at one point was to allow the option for claimants to be paid the value of vocational rehabilitation, as if they had attended. This situation is a legislative acknowledgement of my main point. Let's expand this thinking on a grander scale.

Let us also agree that employers should have 90% of the responsibility to identify the 20% – they should know their employees better than any predictive model, and adjusters should have the time and mandate to properly decipher real-time information with employers. Further, employers should strive to reduce the 20% as part of overall workplace culture efforts, just as a restaurant supplier is expected to minimize the delivery of spoiled produce. This is not just about WC. Better employer culture serves better overall productivity.

See also: Are Our Working Patterns Outdated?

The industry needs to eliminate much of what it does that keeps claims churning open. Fees for claims and related services should be based on outcome performance. Eliminate rewards for false notions of "saved" medical dollars or simple transaction fees for late-timed or ill-fitted interventions. Think of how efficient the WC process would be if the 80% with outcome responsibility suddenly became the 99%. Many managed care schemes and other interventions would become unnecessary.

Legislatively, we need an acceptance of the 20% as a human/societal anomaly, and we need to require judges to account for it in tougher court decisions. We need to craft law reforms that open different avenues to resolve these cases very early under a "nuisance" presumption. We don't need to fix the 20%. We need very big changes that relieve workers' comp of this 20% burden. Once that happens, most every other item from the summit notes will be minimized or vanquished.

Note: PDF downloads of complete summit notes can be found here.

Barry Thompson

Barry Thompson is a 35-year-plus industry veteran. He founded Risk Acuity in 2002 as an independent consultancy focused on workers' compensation. His expert perspective transcends the status quo to build highly effective employer-centered programs.

Telematics: Time to Move Beyond Pricing

North American insurers could offer a wide range of services based on telematics data, like those already offered in other countries.

Sometimes it is difficult to believe that vehicle telematics for usage-based insurance (UBI) is 20 years old. While the likes of Norwich Union and Progressive began planning and piloting long ago, most of the real activity in the market has taken place over the last few years. SMA's recently released research report, Telematics in P&C Insurance: The Need to Move Beyond Pricing, profiles the state of the UBI market in North America, with a special emphasis on data and how it translates into value propositions.

As the report title suggests, the North American market tends to be stuck in neutral, focusing primarily on offering premium discounts to policyholders who exhibit certain driving behaviors, as tracked by the telematics device. While there are isolated instances of insurers that have gone beyond pricing, the majority of the programs in the market and the pilots underway concentrate on attracting customers through discounts (which are often substantial). In other markets, notably Italy, Brazil, South Africa and the U.K., other value propositions are already in the market, including safety advice, theft deterrence and concierge services.

See also: Telematics: No Longer Just For Cars

In theory, there should be a clear market advantage to insurers that can more precisely determine the risk characteristics of a customer and price accordingly, grabbing market share while maintaining profitability in the process. However, in many cases it has not been quite that simple. Once you get beyond the Progressives and Allstates of the insurance world, which have collected data from billions of telematics miles, most of the remaining companies lack the historical data to satisfy actuaries on pricing and profitability. And customer adoption has not been as rapid as expected, either.
Still, the movement toward increasing usage of UBI programs continues, with almost three in five insurers that write vehicle insurance either having programs in the market or planning to launch them in the next few years.

See also: Telematics: Now a 'Movie,' Not 'Snapshot'

The SMA research confirms that the primary data sources are used for pricing, but it also identifies other types of data that are being collected and could be used for future value propositions. For example, most insurers with UBI programs are collecting data on location, routes driven, vehicle diagnostics and other information. There is a wide range of new services that insurers might offer to both personal and commercial lines customers based on this data, like those already offered in other countries. And the beauty of many of the offerings that go beyond pricing is that insurers will be able to test propositions in the market without requiring regulatory approval.

Mark Breading

Mark Breading is a partner at Strategy Meets Action, a Resource Pro company that helps insurers develop and validate their IT strategies and plans, better understand how their investments measure up in today's highly competitive environment and gain clarity on solution options and vendor selection.

An Underestimated Source of Risk

Human resource risk is often underestimated, and that can be a serious misjudgment -- as recent lawsuits and settlements prove.

When directors or CEOs or senior managers think about risk, they generally envision risks associated with the company's finances, manufacturing, data, supply chain and customers. Human resource risk is often underappreciated, and that can be a serious misjudgment. Recent events, lawsuits and settlements prove this point. It is true that the risk associated with talent, and a lack thereof, has risen in the risk hierarchy of most organizations. However, the many other serious risks associated with managing existing talent are often relegated to the bottom of the risk register.

The reasons for this underestimation are varied. Many executives tend to think that: 1) human resource matters are supplemental to the business rather than integral, 2) being an "employer at will" protects the company and enables it to make human resource decisions however it sees fit, and 3) a single employee, applicant or retiree is no risk to the organization as a whole (even though a single employee can potentially cause a "class" to be formed under the law). The danger inherent in underestimating HR risk is that it does not get adequately addressed with mitigation plans.

Not all organizations will have the same exposure to risks. Even if they did, some will have more safeguards already in place and warrant a lower risk ranking than others. The discussion that follows is not meant to imply that all HR risks must be prioritized at the top right-hand corner of a heat map. It is meant to highlight the potential impact that some HR risks can have on an organization.

Rogue Employee Risk

The rogue employee is one of the most amazing phenomena among human resource risk categories. In financial services, rogue employees have wreaked havoc on otherwise solid and long-standing businesses. Two noteworthy examples are Barings Bank, London's oldest merchant bank, and UBS, one of Switzerland's financial giants.
Roughly 20 years ago, Nick Leeson, a Barings Bank derivatives trader, gambled away the equivalent of $1.4 billion of bank money from a secret "error" account. The bank went bust and was bought by ING for a nominal sum. In 2011, UBS announced it had lost $2 billion due to unauthorized trades by a director at its global synthetic equities desk.

Financial institutions are not the only organizations exposed to rogue employee actions that create huge risks and large losses. For instance, GNP, parent of Just BARE and Gold'n Plump, just recalled 55,608 pounds of chicken because of what it called a "product tampering incident" at one of its processing plants. Here are some of the ways in which such an employee can create risk in just about any industry sector, and for which organizations need to develop safeguards as part of their mitigation plans:
  • Abetting a data breach affecting customer/employee personal data
  • Sabotaging mechanical or technological equipment
  • Sabotaging products intended for sale
  • Stealing company property, including intellectual property
  • Mishandling customers/patients on purpose
See also: Risk Management, in Plain English

A fundamental safeguard is thorough vetting during the employment process. Others include: 1) active supervision, 2) automatic system alerts when authorities are exceeded or other rogue actions are attempted, and 3) robust internal audits.

Regulatory Violations Risk

Organizations must deal with employee-related regulation at the local, state and federal levels. The number of major federal regulations has grown significantly in the past few decades and now includes such well-known acts as the Fair Labor Standards Act, Title VII, the Age Discrimination in Employment Act, the Americans with Disabilities Act, the Employee Retirement Income Security Act, the Family and Medical Leave Act and the WARN Act. Each of these has numerous elements that must be understood and complied with, including gray areas that need to be thought through before any action regarding an employee can be decided on.

The Fair Labor Standards Act has been the high-risk area of late. There have been numerous suits under this act related to: 1) misclassification of employees into exempt and non-exempt categories, which has implications for overtime pay, 2) incorrect calculation of overtime pay for those due it, and 3) mismanagement of paid break time. A $188 million judgment against Walmart, which is being appealed, had to do with paid versus unpaid break time. Interestingly, this case revolves around the company not living up to the policies in its own handbooks, not around a failure to fulfill specific requirements spelled out in the law. This case is, therefore, illustrative of two important points. First, settlements can be financially significant even for the largest of companies. Second, when dealing with human resource matters, formal programs or policies, which can constitute a contractual obligation, have to be considered.
See also: Building a Strong Insurance Risk Culture

Wage and hour suits are likely to keep increasing in 2016 due to the success of recent plaintiffs, new regulations regarding overtime pay and an overall concern among employees that wages are not sufficient or not fair. In an article titled "Why Wage and Hour Litigation Is Skyrocketing," Lydia DePillis writes, "The number of wage and hour cases filed in federal court rose to 8,871 for the year [ended] Sept. 30, up from 1,935 in 2000." Title VII and age discrimination cases have also been associated with large dollar losses over the years.

Given the many federal, state and local statutes, coupled with a more informed and litigious employee population, organizations can inadvertently step into non-compliance pitfalls rather easily. Organizations should always follow the laws that apply to them. Risk enters the equation because there is always the potential that someone in management is unaware or careless or, worse yet, disrespectful of the laws. Thus, the organization is continuously exposed to the risk of violations. Every effort should be made to be compliant, including: 1) having a clear set of core values that guide lawful behavior, 2) educating management and all employees about the laws and how to comply with them, 3) investing in strong compliance processes and 4) making sure violators are dealt with quickly and appropriately.

HR Program Risk

Human resources professionals create and administer many expensive programs, such as retirement, benefits, compensation and incentive programs. A large error in budgeting or managing such programs could lead to a sizable financial risk for the organization. Imagine an actuarial error that creates severe pension underfunding, or a poorly managed self-insured medical benefit plan that costs double what benchmarks would suggest. Or consider a new incentive program that produces the antithesis of the behavior it was intended to promote.
The risk can be major, not unlike the size and seriousness of a natural catastrophe or product recall or supply chain debacle. CEOs need to ensure that HR programs and policies are being handled by expert professionals, whether staff or consultants. At the same time, senior management needs to invest the attention and support necessary to ensure these are well-designed and implemented according to specification. The comments in this article are neither meant to be all-inclusive nor to be construed as advice.

Donna Galer


Donna Galer is a consultant, author and lecturer. 

She has written three books on ERM: Enterprise Risk Management – Straight To The Point, Enterprise Risk Management – Straight To The Value and Enterprise Risk Management – Straight Talk For Nonprofits, with co-author Al Decker. She is an active contributor to the Insurance Thought Leadership website and other industry publications. In addition, she has given presentations at RIMS, CPCU, PCI (now APCIA) and university events.

Currently, she is an independent consultant on ERM, ESG and strategic planning. She was recently a senior adviser at Hanover Stone Solutions. She served as chairwoman of the Spencer Educational Foundation from 2006 to 2010. From 1989 to 2006, she was with Zurich Insurance Group, where she held many positions both in the U.S. and in Switzerland, including EVP of corporate development, global head of investor relations, EVP of compliance and governance and regional manager for North America. Her last position at Zurich was executive vice president and chief administrative officer for Zurich’s worldwide general insurance business ($36 billion GWP), with responsibility for strategic planning and other areas. She began her insurance career at Crum & Forster Insurance.

She has served on numerous industry and academic boards, among them the advisory board of NC State’s Poole College of Management Enterprise Risk Management Initiative, Illinois State University’s Katie School of Insurance and the Spencer Educational Foundation. She won “The Editor’s Choice Award” from the Society of Financial Examiners in 2017 for her co-written articles on KRIs/KPIs and related subjects. She was named among the “Top 100 Insurance Women” by Business Insurance in 2000.

The REAL Objection to Opt Out

Each and every vendor makes a buck off workers' comp, and each and every one has an interest in maintaining the status quo.

|
I have never really understood why the Property Casualty Insurers Association of America has been so vehemently against opt out. While it seems that opt out returned to the back burner for this year with constitutional defeats in Oklahoma and political stalemate in other states, PCI has reignited the debate with an inflammatory paper. The basic arguments, which PCI supports with some data, are that opt out results in cost shifting to other systems and that a lack of standards and transparency is detrimental to consumers (i.e., injured workers). PCI also argues that opt out is all about saving employers money to the detriment of consumers by denying more claims earlier and paying less, with capitations and restrictions not found in traditional comp. I get that alternative work injury systems must meet certain standards and need to be more transparent to consumers -- to me, that’s a no-brainer. But the objections that PCI raises are exactly the same complaints made against traditional workers' comp: inadequate benefits, unnecessary delays, cost shifting, etc. See also: Debunking ‘Opt-Out’ Myths (Part 6) Each statistic cited by PCI against opt out can be asserted against traditional workers' comp -- just use another study or data source. For instance, just a couple of years ago, Paul Leigh of the University of California, Davis, lead author of the study Workers' Compensation Benefits and Shifting Costs for Occupational Injury and Illness, told WorkCompCentral, "We're all paying higher Medicare and income taxes to help cover [the costs not paid by workers' compensation]." That study, published in the April 2012 edition of the Journal of Occupational and Environmental Medicine, found that almost 80% of workers' compensation costs are being covered outside of workers' compensation claims systems. That amounts to roughly $198 billion of the estimated $250 billion in annual costs for work-related injuries and illnesses in 2007. 
Just $51.7 billion, or 21%, of those costs was covered by workers' compensation, the study said. Of the $250 billion price tag for work-related injury costs, the Leigh study found $67.09 billion came from medical care costs, while $182.54 billion was related to lost productivity. In terms of the medical costs, $29.86 billion was paid by workers' compensation, $14.22 billion was picked up by other health insurance, $10.38 billion was covered by the injured workers and their families, $7.16 billion was picked up by Medicare and $5.47 billion was covered by Medicaid. The study drew criticism from the workers' comp crowd, which defended its practices, challenged the data and attempted, with limited success, to counterargue anecdotally. If one digs deep enough into the PCI study, I'm sure one could likewise find fault with the data and the reporting on cost shifting -- because the truth is that absolutely no one has a fix on that topic. My good friend Trey Gillespie, PCI assistant vice president of workers’ compensation, told WorkCompCentral that "the fundamental tenets of workers’ compensation [are] protecting injured workers and their families and protecting taxpayers. The general consensus is that the way programs should work is to protect injured workers and taxpayers and avoid cost-shifting.” Of course! All work injury protection systems should do that. But they don't. See also: What Schrodinger Says on Opt-Out That's what the ProPublica and Reveal series of critical articles about workers' compensation programs across the country tell us, both anecdotally and statistically: Injured workers aren't protected, costs are shifted onto other programs, and taxpayers are paying an unfair portion of what workers' comp should be paying. Indeed, in October, 10 federal lawmakers asked the U.S. 
Department of Labor for greater oversight of the state-run workers’ compensation system, to counteract “a pattern of detrimental changes to state workers’ compensation laws and the resulting cost shift to public programs.”
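As a quick back-of-the-envelope check on the Leigh study figures quoted above (the variable names here are my own, not the study's), the reported medical-cost payers do sum to the $67.09 billion medical total, and the share falling outside workers' comp comes to roughly 79% of the $250 billion estimate:

```python
# Reconciling the Leigh study figures cited in the article (all in $ billions).
total_costs = 250.0   # estimated annual costs of work-related injury/illness (2007)
comp_paid = 51.7      # portion paid through workers' compensation
shifted = total_costs - comp_paid  # costs borne outside the comp system

medical_payers = {    # breakdown of the $67.09B in medical care costs
    "workers' comp": 29.86,
    "other health insurance": 14.22,
    "injured workers and families": 10.38,
    "Medicare": 7.16,
    "Medicaid": 5.47,
}
medical_total = sum(medical_payers.values())

print(f"shifted outside comp: ${shifted:.1f}B ({shifted / total_costs:.0%} of total)")
print(f"medical payers sum to ${medical_total:.2f}B")
```

The sums match the article's "roughly $198 billion," "almost 80%" and "$67.09 billion" figures.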
I started thinking about the one truism that governs human behavior nearly universally: Every person protects their own interests first. And I thought of PCI’s name: Property Casualty Insurers Association of America. “Property and casualty.” Ay, there's the rub! There’s no room for P&C in opt out! ERISA-based opt out uses only health insurance and disability insurance. Workers' comp is the mainstay of the P&C industry, the single biggest commercial line and the gateway to a whole host of much more profitable lines. If opt out spreads beyond Texas, it is hugely threatening to the interests of PCI's members because they stand to lose considerable business, particularly if opt out migrates to the bigger P&C states. PCI is protecting its own interests (or those of its members) by objecting to opt out. And I don't blame them. The threat they perceive is real. Michael Duff, a professor of workers’ compensation law at the University of Wyoming, told WorkCompCentral, “These are interested observers. They’re going to have an agenda. They represent insurers who are in the workers’ comp business.” Bingo. “Every commercial actor that participates in traditional workers’ compensation has an interest in seeing traditional workers’ compensation continue," Duff went on. “But that traditional workers’ compensation imposes costs on employers. There is now a group of employers who would like to pay less, and Bill Minick has developed a commercial product that is in competition with this other conceptual approach to handling things.” Here's THE fact: Traditional workers' compensation and ANY alternative work injury protection plan require vendors pitching wares and services to make the systems work. Insurance companies are as much a vendor in either scenario as physicians, bill review companies, utilization review companies, attorneys, vocational counselors, etc. 
Each and every single one makes a buck off workers' comp, and each and every one has an interest in maintaining the status quo. See also: States of Confusion: Workers Comp Extraterritorial Issues Arguing that one system is better than the other without admitting one's own special interest is simply hypocrisy. Workers' compensation is going through some soul searching right now. Employers leading the debate are asking, "Why stay in a system that facilitates vendors' interests ahead of employers or workers?" THAT's the question that BOTH the P&C industry and the opt out movement need to answer. Further debate about the merits of one over the other is simply sophistry. This article first appeared at WorkCompCentral.

David DePaolo


David DePaolo is the president and CEO of WorkCompCentral, a workers' compensation industry specialty news and education publication he founded in 1999. DePaolo directs a staff of 30 in the daily publication of industry news across the country, including politics, legal decisions, medical information and business news.

How to Shrink Employees’ Waistlines

With lack of physical activity during modern office workdays, encouraging exercise is in everyone’s interests. The question is how.

|
A majority of the population of countries in the Organization for Economic Cooperation and Development is now classified as overweight or obese, with weight-related health costs accounting for up to 10% of total healthcare spending. Levels of obesity are also rising in the developing world. Excess weight can lead to multiple health issues, increasing the number of sick days as well as health insurance premiums. In addition, poor health costs U.S. companies $227 billion a year in lost productivity, while U.K. companies lose £29 billion (about $45 billion) a year through sick-leave costs. With lack of physical activity during modern office workdays a core contributing factor to the sedentary lifestyles that are increasing obesity, encouraging exercise is in everyone’s interests. The question is no longer so much whether you should invest in employee wellness, but how.

In Depth

Numerous studies have shown benefits from encouraging employees to exercise:
  • Better problem solving: Want your employees to get better at solving problems and innovating? Aerobic exercise has been shown to boost both positivity and creativity.
  • Improved mental health: Physically active employees are significantly less likely to suffer from depression or job burnout.
  • More capable management: Getting managers exercising not only reduces their stress levels but makes them better managers, according to some studies.
There are several approaches companies can take that can help even the most reluctant employee start adopting a healthier lifestyle. However, it’s important to remember that, to get the benefit, employees must see the exercise as enjoyable and practical, not as a chore. Here are just a few:
  • Calorie-counted staff cafés: Consuming too many calories is the key cause of most weight issues, so helping staff manage their intake by providing healthy, nutritious meals at an on-site café can be a major boost. In addition, several studies have shown that workplace cafés can act as social hubs that boost employee engagement and motivation.
  • On-site gyms: Employees are more likely to exercise if it is convenient, while time lost traveling to an off-site gym can reduce productivity and increase stress. On-site company gyms can save employees an average of $58 a month in membership fees — and make it easy to get the productivity and health benefits of daytime exercise.
  • Discounts for regular workouts: With the rise of wearable fitness tracking devices come new opportunities to monitor employee lifestyles, and reward the healthy ones. The ability to keep track of employee activity is sparking a fresh wave of apps that could help reduce insurance premiums if adopted at scale.
  • Standing desks: Studies have shown standing desks — a popular alternative in modern workplaces — lead to an increased heart rate, improved energy levels and employees burning as much as 20 additional calories an hour. Long periods of sitting, meanwhile, have been associated with increased mortality across a range of illnesses, with some doctors warning that sitting is the new smoking.
  • Cycle-to-work schemes: As well as saving employees money (as much as $7.3 billion a year in the U.S. alone) and being a great way to burn off excess calories, cycling to work is associated with, on average, one fewer sick day per year than non-cycling colleagues take.
  • Group calisthenics: One of the oldest workplace wellness programs (still popular in many Asian countries), organized all-company workouts are starting to make a comeback in the West. Though they can be awkward at first, done right they can boost team spirit and employee health.
It has become a truism that employees are businesses’ biggest asset. Just as you would invest in keeping your machinery operating at its best through regular maintenance, investing in maintaining your staff’s health is increasingly vital. Not only could it be good for productivity, but studies have shown that such programs can be vital in both attracting and retaining top talent. With staff turnover rates increasing across the world, if you want to thrive in the long term, investing in employee health and wellness could be an increasingly important strategy to keep your people active, productive and engaged.

Talking Points

“Instead of viewing exercise as something we do for ourselves — a personal indulgence that takes us away from our work — it’s time we started considering physical activity as part of the work itself. The alternative, which involves processing information more slowly, forgetting more often and getting easily frustrated, makes us less effective at our jobs and harder to get along with for our colleagues.” – Harvard Business Review

“Workplace wellness and community prevention programs are a win-win way to make a real difference in improving our health and bottom line all at once.” – Jeff Levi, executive director, Trust for America’s Health

“Employees are eight times more likely to be engaged when wellness is a priority in the workplace.” – World Economic Forum

This article originally appeared on TheOneBrief.com, Aon’s weekly guide to the most important issues affecting business, the economy and people’s lives in the world today.

Stephanie Pronk


Stephanie Pronk is a senior vice president and leads Aon’s U.S. National Health Transformation team. Pronk combines more than 30 years of experience in developing, implementing and evaluating health improvement and benefit strategies.

AI's Promise Is Finally Upon Us

In the fields in which it is trained, AI exceeds the capabilities of humans. It has advanced more in three years than in the past three decades.

|
We have been hearing predictions for decades of a takeover of the world by artificial intelligence. In 1957, Herbert A. Simon predicted that within 10 years a digital computer would be the world’s chess champion. That didn’t happen until 1997. And despite Marvin Minsky’s 1970 prediction that “in from three to eight years we will have a machine with the general intelligence of an average human being,” we still consider that a feat of science fiction. The pioneers of artificial intelligence were surely off on the timing, but they weren’t wrong; AI is coming. It is going to be in our TV sets and driving our cars; it will be our friend and personal assistant; it will take the role of our doctor. There have been more advances in AI over the past three years than there were in the previous three decades. Even technology leaders such as Apple have been caught off guard by the rapid evolution of machine learning, the technology that powers AI. At its recent Worldwide Developers Conference, Apple opened up its AI systems so that independent developers could help it create technologies that rival what Google and Amazon have already built. Apple is way behind. The AI of the past used brute-force computing to analyze data and present them in a way that seemed human. The programmer supplied the intelligence in the form of decision trees and algorithms. Imagine that you were trying to build a machine that could play tic-tac-toe. You would give the computer specific rules on what move to make, and it would follow them. That is essentially how IBM’s Deep Blue computer beat chess grandmaster Garry Kasparov in 1997: by using a supercomputer to evaluate possible moves far faster than he could. See also: AI: Everywhere and Nowhere (Part 2) Today’s AI uses machine learning, in which you give it examples of previous games and let it learn from those examples. The computer is taught what to learn and how to learn and makes its own decisions. 
What’s more, the new AIs model the human mind itself, using techniques similar to our own learning processes. Before, it could take millions of lines of computer code to perform tasks such as handwriting recognition. Now it can be done in hundreds of lines. What is required is a large number of examples so that the computer can teach itself. The new programming techniques use neural networks, which are modeled on the human brain: information is processed in layers, and the connections between those layers are strengthened based on what is learned. This is called deep learning because of the increasing number of layers of information processed by increasingly faster computers. Deep learning is enabling computers to recognize images, voice and text — and to do human-like things. Google searches used to rely on a technique called PageRank, which used rigid proprietary algorithms to analyze the text and links on web pages to determine what was most relevant and important. Google is replacing this technique in searches and most of its other products with algorithms based on deep learning, the same technology it used to defeat a human champion at the game Go. During that extremely complex game, even expert observers were confused as to why the computer had made the moves it did. In the fields in which it is trained, AI is now exceeding the capabilities of humans. AI has applications in every area in which data are processed and decisions are required. Wired founding editor Kevin Kelly likened AI to electricity: a cheap, reliable, industrial-grade digital smartness running behind everything. He said that it “will enliven inert objects, much as electricity did more than a century ago. 
Everything that we formerly electrified we will now ‘cognitize.’ This new utilitarian AI will also augment us individually as people (deepening our memory, speeding our recognition) and collectively as a species. There is almost nothing we can think of that cannot be made new, different or interesting by infusing it with some extra IQ. In fact, the business plans of the next 10,000 start-ups are easy to forecast: Take X and add AI. This is a big deal, and now it’s here.” See also: AI: The Next Stage in Healthcare AI will soon be everywhere. Businesses are infusing AI into their products and using it to analyze the vast amounts of data they are gathering. Google, Amazon and Apple are working on voice assistants for our homes that manage our lights, order our food and schedule our meetings. Robotic assistants such as Rosie from “The Jetsons” and R2-D2 of “Star Wars” are about a decade away. Do we need to be worried about a runaway “artificial general intelligence” that goes out of control and takes over the world? Yes — but perhaps not for another 15 or 20 years. There are justified fears that, rather than being told what to learn and complementing our capabilities, AIs will start learning everything there is to learn and will know far more than we do. Though some people, such as futurist Ray Kurzweil, see us using AI to augment our capabilities and evolve together, others, such as Elon Musk and Stephen Hawking, fear that AI will usurp us. We really don’t know where all this will go. What is certain is that AI is here and making amazing things possible.
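The rules-versus-learning contrast described above can be sketched in a few lines of code. This is a hypothetical illustration of my own, not anything from IBM or Google: the first function hard-codes a programmer's tic-tac-toe priorities (the "old" AI), while the second learns a simple behavior (logical OR) purely from labeled examples, which is the essence of machine learning.

```python
# --- Rule-based approach: the programmer supplies the intelligence ---
def rule_based_move(board):
    """Pick a tic-tac-toe move from hand-written priorities (cells 0-8)."""
    center, corners = 4, [0, 2, 6, 8]
    if board[center] is None:           # rule 1: take the center
        return center
    for c in corners:                   # rule 2: otherwise take a corner
        if board[c] is None:
            return c
    # rule 3: otherwise take any open cell
    return next(i for i, v in enumerate(board) if v is None)

# --- Learning approach: behavior comes from examples, not rules ---
def train_perceptron(examples, epochs=20, lr=0.1):
    """Learn weights for a single neuron from (inputs, label) pairs."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # nudge weights toward the examples
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Teach the neuron logical OR purely from examples.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(data)
print([predict(w, b, x1, x2) for (x1, x2), _ in data])  # → [0, 1, 1, 1]
```

The point of the contrast: in the first function, every behavior was typed in by a programmer; in the second, the weights that produce the behavior emerge from the examples alone, and the same training code would learn a different behavior if given different examples.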

Vivek Wadhwa


Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.