
Flogging the Data Until It Confesses

When you hear boasts of big short-term impacts of wellness programs, beware. Someone is likely flogging the data.

Did you ever hear the joke where the boss says floggings will continue until morale improves? In healthcare, flogging the data until results improve or the data confesses is not uncommon. Too bad.
Over the course of my career, I’ve worked with companies with more than a hundred thousand covered lives, the claim costs of which could swing widely from year to year, all because of a few extra transplants, big neonatal ICU cases, ventricular assist cases, etc. Here are just a few of the single case claims I’ve observed in recent years:
  • $3.5 million for one cancer case
  • $6 million for one neonatal intensive care case
  • $8 million for one hemophilia case
  • $1.4 million for one organ transplant
  • $1 million for one ventricular assist device
These big numbers aren't a complaint. After all, health insurance should be about huge, unbudgetable health events. But they raise an important point about the lumpiness of costs and about claims that are made about reducing health expenditures. All health insurance plans must cover one organ transplant every 10,000 or so life years, which will cost about $1 million over six years. So, a plan with 1,000 covered lives will have such an expense every 10 years, on average. Of course, the company may have none for 15 years and then two in the 16th year. The same timing applies to $500,000-plus ventricular assist device surgeries.

Looking at claims data for small groups is perilous, and sometimes for large groups, too. Because of the high cost and relative infrequency of so-called "shock" claims (those of more than $250,000), you need about 100,000 life years for the claims data to be even approximately 75% credible. When a group with 5,000 lives says it did something that cut claims costs, you can't really know whether the change made a significant difference for a couple of decades.

Here's an example. A smallish group with about 3,000 covered lives asked me to help calculate how much its wellness plan was saving. It had all its employees listed in three tiers: active wellness participants, moderate participants and non-participants. I warned the company it didn't have enough data to be credible, but it proceeded anyway. It expected active users would have the lowest claim costs, and so on. When the data was reviewed, there was a perfect reverse correlation: Active wellness users had the highest claim costs, moderate users had the next highest, and non-participants had the lowest. In the final report, which I had nothing to do with preparing and from which I had recused myself, the company subtracted big claims by the active and moderate users to get the results it wanted. In short, the company flogged the data until it confessed. Alas.
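The arithmetic above can be made concrete with a small simulation. This sketch (illustrative numbers only) draws shock claims for a 1,000-life plan at the article's rate of one per 10,000 life years and shows how wildly individual decades can vary:

```python
# Sketch: why small-group claims data is lumpy. A shock claim that
# occurs about once per 10,000 life-years (as with organ transplants,
# per the article) hits a 1,000-life plan roughly once a decade on
# average, but individual decades vary wildly. Numbers are illustrative.
import random

def decades_of_shock_claims(lives=1000, rate_per_life_year=1 / 10_000,
                            decades=20, seed=42):
    """Simulate shock-claim counts per decade for a small plan."""
    rng = random.Random(seed)
    counts = []
    for _ in range(decades):
        life_years = lives * 10
        # Each life-year independently produces a shock claim with small
        # probability -- a binomial that is close to Poisson(1) here.
        claims = sum(rng.random() < rate_per_life_year
                     for _ in range(life_years))
        counts.append(claims)
    return counts

print(decades_of_shock_claims())  # some decades 0 claims, some 2 or 3
```

The spread across decades is the point: with an expected count of one per decade, a run of zero-claim decades followed by a multi-claim decade is entirely normal, not evidence that anything changed.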
One large company claimed huge reductions in plan costs from adding a wellness program. It turned out that, during the period in question, the company had also implemented an "early out" incentive. Upon examination, the early-out program resulted in a big reduction in the number of older employees, which more than accounted for the reduction in claims costs.

See Also: 6 Limitations of Big Data in Healthcare

Here is yet another example. I was at a conference a few years ago where a presenter from a small company (about 1,000 covered lives) claimed to have kept its health costs flat for five years through wellness initiatives. While the presenter got a big ovation, his numbers just didn't add up. I asked him a few questions after his speech about the other changes he had made during that period. He said the company had lowered its "stop loss" limit from $100,000 to $50,000 a few years earlier. Then he admitted to excluding his stop-loss premium costs, which were skyrocketing, from his presentation. With a bit of mental arithmetic, I added those costs back in, which revealed that his company's total health costs were going up at the same rate as everyone else's, perhaps even a little faster. Hmmm. I don't think he deliberately misled the audience; he just didn't know better.

When you hear boasts of big short-term impacts of wellness programs, beware of confirmation bias. When a company claims it implemented something that caused its health plan costs to drop 15% or so, ask a few questions:
  1. Did the company adjust for plan design changes—such as raising deductibles and co-pays—that merely shifted costs to employees?
  2. Did the changes really save claim dollars?
  3. Did the company factor in stop-loss premiums?
  4. How many life years of data did the company observe?
  5. Did the company exclude large or “shock” claims? (This isn't uncommon, especially among wellness vendors.)
  6. Did the company experience any big changes in demographics, such as through implementation of an early retirement program or layoffs that, particularly, had a large impact on older workers?
When I’ve asked those kinds of questions of a small company, I’ve almost never seen a big claim of cost reductions hold up under scrutiny. And that goes for some big companies, too.
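Question 3, the stop-loss premium, is worth a concrete sketch. With hypothetical figures, "flat" retained claims plus a skyrocketing stop-loss premium still add up to a rising total cost:

```python
# Sketch of the stop-loss adjustment from the conference example:
# claims under the stop-loss limit may look flat, but total plan cost
# must include the (rising) stop-loss premium. All figures hypothetical.

def total_trend(claims_by_year, stoploss_premium_by_year):
    """Year-over-year growth of total cost (claims + stop-loss premium)."""
    totals = [c + p for c, p in zip(claims_by_year, stoploss_premium_by_year)]
    return [totals[i] / totals[i - 1] - 1 for i in range(1, len(totals))]

claims = [4.0, 4.0, 4.0, 4.0]        # $M: "flat" retained claims
premiums = [0.4, 0.55, 0.75, 1.0]    # $M: skyrocketing stop-loss premium
trend = total_trend(claims, premiums)
print([f"{t:.1%}" for t in trend])   # total cost is rising, not flat
```

Lowering the stop-loss limit shifts cost from reported claims into premium; only the combined figure tells you whether anything was actually saved.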
Today, flogging the data to get the desired results is all too common. That's no surprise: Academics and big pharma keep getting caught doing the same thing. Skepticism is a good thing.

Tom Emerick

Tom Emerick is president of Emerick Consulting and cofounder of EdisonHealth and Thera Advisors. Emerick’s years with Wal-Mart Stores, Burger King, British Petroleum and American Fidelity Assurance have provided him with an excellent blend of experience and contacts.

Technology and the Economic Divide

It is time to start a nationwide dialogue on how we can distribute the new prosperity that we are creating with advancing technologies.

Yelp Eat24 customer-support representative Talia Jane recently wrote a heart-wrenching blog post about the difficulties she faced in living on her meager salary. “So here I am, 25 years old, balancing all sorts of debt and trying to pave a life for myself that doesn’t involve crying in the bathtub every week,” she wrote. Her situation was so dire that, on one occasion, she could not even come up with the train fare to work. She lived on the junk food provided at her office. Her message was addressed to Yelp CEO Jeremy Stoppelman. What did the company do? It fired her on the spot.

Yes, Jane made a mistake in posting this message on Medium rather than sending an email to Stoppelman. But her situation isn’t unique. She outlined the contours of a life that are familiar to many of the people working on the lowermost rungs of technology’s corporate ladder. After a social media backlash, Stoppelman acknowledged that the cost of living in San Francisco is too high and tweeted that there needs to be lower-cost housing. But the problem is more complex than San Francisco’s housing costs. The problem is the growing inequality and unfair treatment of workers. And technology is about to make this much worse and create a cauldron of unrest.

Silicon Valley is a microcosm of the problems that lie ahead. Sadly, some of its residents would rather brush away the poverty than face up to its ugly consequences. This was exemplified in a letter that Justin Keller, founder of Commando.io, wrote to San Francisco Mayor Ed Lee and Police Chief Greg Suhr. He complained that the “homeless and riff-raff” who live in the city are wrecking his ability to have a good time. The Valley’s moguls may not overtly treat the bitter life trajectories behind such complaints as mere inconveniences to themselves, but they have largely been in denial about the effects of technology.
Other than a recent essay by Paul Graham on income inequality, there is little discussion about its negative impacts. The fact is that automation is already decimating the global manufacturing sector, transforming a reliable mass employer providing middle-class income into a much smaller employer of people possessing higher-level educations and skills. The growth of the “Gig Economy”—ad hoc work—is shifting businesses toward the goal of part-time, on-demand employment, with aggressive avoidance of obligations for health insurance and longer-term benefits. And the tech industry has a winner-takes-all nature, which is why only a few giant digital companies compete with each other to dominate the global economy. A substantial part of the value they capture is concentrated at the center and mostly benefits a few shareholders, executives and employees.

With technology advances and convergence, we are in the middle of a gold rush that is widening inequality. Already, in Silicon Valley, the Google bus has become a symbol of this inequity. These ultra-luxurious, Wi-Fi–connected buses take workers from the Mission district to the GooglePlex, in Mountain View. The Google bus is not atypical; most major tech companies offer such transport now. But so divisive are they that in usually liberal San Francisco, activists scream angrily about the buses using city streets and bus stops, completely ignoring the fact that they also take dozens of cars off the roads. Teslas, too, have become symbols of the obnoxious techno-elite—rather than being celebrated as environmentally game-changing electric vehicles. In short, there’s very little logic to the emotionally charged discussions—which is the same as what we are seeing at the national level with the presidential primaries. Intellectuals are trying to build frameworks to understand why the divide, which first opened up in the 1990s, continues to worsen.
Thomas Piketty explained in his book Capital in the Twenty-First Century that the economic inequality gap widens if the rate of return on invested capital is superior to the rate at which the whole economy grows. His proposed response is to redistribute income via progressive taxation. A competing theory, by an MIT graduate student, holds that much of the wealth inequality can be attributed to real estate and scarcity. Silicon Valley has both: an explosion in wealth for investors and company founders, and a real-estate market constrained by limits on development.

We need to immediately address San Francisco’s housing crisis and raise wages for lower-skilled workers. Both are possible; the region has enough land, and the industry has enough wealth. In the longer term, we will also need to develop safety nets, retrain workers and look into the concept of a universal basic income for everyone. It is time to start a nationwide dialogue on how we can distribute the new prosperity that we are creating with advancing technologies.
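Piketty's condition, r > g, can be illustrated with a toy calculation (the rates and the starting capital-to-income ratio below are hypothetical assumptions, not Piketty's data):

```python
# Toy illustration of Piketty's r > g dynamic. Capital returns are
# reinvested and compound at r; the whole economy's income grows at g.
# When r > g, the capital-to-income ratio climbs year after year.
# All figures are hypothetical.

def capital_income_ratio(r, g, years, capital=3.0, income=1.0):
    """Capital/income ratio after `years` of compounding."""
    for _ in range(years):
        capital *= 1 + r  # invested capital compounds at r
        income *= 1 + g   # the economy grows at g
    return capital / income

print(capital_income_ratio(r=0.05, g=0.02, years=0))   # starting ratio
print(capital_income_ratio(r=0.05, g=0.02, years=30))  # noticeably higher
```

In this framing, progressive taxation of capital returns amounts to pushing the effective r back down toward g.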

Vivek Wadhwa

Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.

How to Make IT Efforts Strategic

Here are six steps that business owners must take to ensure that IT investments drive strategic value that can be measured.

Has your IT come out of the proverbial (and actual) basement to become an integral part of your business strategy? Too often, business leaders assign IT a task and expect an initiative to be delivered. End of story. The truth is, business owners must engage with and own the outcomes of their IT investments, driving them to a strategic value that can be measured.

What is IT strategy? Think about any infrastructure initiative (building highways, public transportation or urban development). Without the requisite strategic investment of time, funding and planning, these initiatives face delays, cost overruns, diversion from the desired strategy and failure. True partnerships between IT and business operations ensure that the best thinking of both can be applied to a given situation to produce strategic results.

See Also: The 7 Colors of Digital Innovation

Business value

IT should be viewed as a business strategy. Today, virtually every discussion in the workers’ compensation industry relating to claims management or medical management includes IT. As workers’ comp focuses on outcomes (both cost and quality), IT is the only genuinely new strategy around. Moreover, it is the most effective and efficient strategy for achieving business goals. The following six elements are necessary to generate business value by leveraging the IT strategy.

1. Define the project—Describing how new technology or a new data application will function is only the first step in integrating IT into the business strategy. However, defining the project can be tricky. Remember, IT professionals talk a different language and appreciate different measures of success than those involved in operations. Business owners cannot assume their IT requests are understood as they were intended. Even slight misinterpretations of requests can result in frustration, cost overruns and a useless tool. I recall one time, early in my career, when I submitted specifications for a development project.
I used the word “revolutionary” to describe the powerful impact it would have on the business. However, the IT person, who was younger and male, interpreted "revolutionary" in an aggressive, military sense, which was not even close to what I had in mind. Always verify understanding and clarity of all elements of the IT project.

2. Design for simplicity—If the IT project outcome is complicated or requires too many steps, people will not use it.

3. Define the expected business value—As part of defining the IT project, define its expected business value. Both the business unit involved and the IT team need to align their expected outcomes. Not unlike evaluating ROI (return on investment), identify the financial investment in, and rewards of, the IT project. Make sure to also describe the anticipated collateral outcomes of the IT project, such as PR, business growth or client involvement. Figure out how to measure the expected business outcomes when the project is complete, and design the project's outcome value measures at the beginning. Too often, business leaders do not articulate their expectations of value and, therefore, can never prove them. If you do not know where you are going, you could end up somewhere else.

4. Commit resources—Funding and other resources, such as personnel, should be allocated at the beginning; short-shrifting resources will guarantee less-than-satisfactory results. Know from the beginning how the IT project will be implemented and who will do, and be responsible for, the work. Establish accountabilities and create procedures for follow-up.

5. Monitor progress—Continuously monitor and manage the project throughout the IT development process. Discovering deviations from the plan early minimizes damage and rework. Obviously, rework means cost and delay.

6. Measure value—Once the project is accepted and implemented, begin continuous outcome evaluation. Execute the value measures outlined at the beginning.
Make the necessary adjustments and keep your eye on the business value. Not everyone can be an IT expert, but everyone can become an expert in how IT advances the strategies of their domain.
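A minimal sketch of steps 3 and 6, defining the expected business value up front and then executing the same measure after implementation (the figures and the simple ROI formula are illustrative assumptions, not from the article):

```python
# Sketch of "define the expected business value" (step 3) and
# "measure value" (step 6): agree on the investment and expected
# rewards at the start, then re-run the identical measure once the
# project is live. All figures are hypothetical.

def project_roi(investment, annual_benefit, years):
    """Simple ROI: net benefit over the horizon divided by investment."""
    net = annual_benefit * years - investment
    return net / investment

# Defined at project start (expected) and re-run at measurement time (actual):
expected = project_roi(investment=500_000, annual_benefit=220_000, years=3)
actual = project_roi(investment=560_000, annual_benefit=190_000, years=3)
print(f"expected ROI {expected:.0%}, actual ROI {actual:.0%}")
```

The design point is that the same function is agreed on by the business unit and the IT team before the project starts, so the after-the-fact measurement cannot be redefined to flatter the result.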

Karen Wolfe

Karen Wolfe is founder, president and CEO of MedMetrics. She has been working in software design, development, data management and analysis specifically for the workers' compensation industry for nearly 25 years. Wolfe's background in healthcare, combined with her business and technology acumen, has resulted in unique expertise.

What You Must Know on Machine Learning

Even with the emphasis on analytics and big data, many companies are unaware of the powerful opportunities in machine learning.

I've yet to hear a financial services executive focus on machine learning as a key part of his company's insight strategy. "Analytics" still dominates Google searches, not only ahead of "machine learning" but far ahead of even "big data." Yet businesses are increasingly looking to hire data scientists, who leave universities having been taught machine learning together with a mixture of statistics and computer science. When I spoke with data science students at an event in Edinburgh, it was clear they saw machine learning as a key part of their specialty, even if most businesses rarely mention the term.

In the 20 years since I was an R&D manager developing artificial intelligence pilots, I've seen few businesses even attempt to apply the techniques I found so powerful (including case-based reasoning, neuro-fuzzy logic and genetic algorithms). But perhaps data science finally has enough momentum to take AI into mainstream commercial application.

So, if you're looking to keep up with the developing data science or (wider) customer insight professions, what should you know about machine learning? Is it too late for you to learn? Do you need to return to university? Although the social life options of the latter may sound appealing, most leaders don't have time to put their corporate careers on hold while they retrain. Luckily, there are online resources to help you get up to speed and, at least, understand the language being used by your latest hires. In this post, I'll share a few online resources and reviews I hope you'll find useful.

See Also: How Machine Learning Changes the Game

What better place to start than an online tutorial that claims to be the world's easiest introduction? With the catchy headline "Machine Learning is Fun!", this two-part blog post, published on Medium by Adam Geitgey, is perhaps not as simple as some would like, but it does provide a useful overview of techniques.
To balance the data science perspective on machine learning, I thought it might also be interesting to share a market research perspective. This balanced and useful review by Kevin Gray in Quirks provides such a perspective. It should help researchers consider where AI algorithms could also be applicable to their quant work.

If all that education and advice has made you keen to get your hands dirty and try machine learning, the next question is how you can get started. If you are an R coder, or have analysts on your team with R programming skills, here's a handy starting point shared by Jason Brownlee.

Don't worry if you can't, or prefer not to, use R. As well as a plethora of machine learning tools, there are some useful heuristics, too. In this quick-start guide from the same site as above, Brownlee also shares how to understand any machine learning tool quickly. (The information is so good I had to include this second link from the same blog.)

Finally, to really get you ruminating on the subject, consider this more philosophical piece by Christopher Nguyen, which explores our relationship with AI the other way around: What can the ways machines learn teach us about our own brains, imaginations and the role of intuition? Thought-provoking stuff.

I hope this post was of interest. If you've discovered other great content online that can help us all better understand machine learning, please do share. Have a great time learning more!
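If you would like to see the core idea of supervised learning in miniature before diving into those tools, here is a toy 1-nearest-neighbour classifier in pure Python (the data points and labels are made up for illustration; real work would use a library such as R's caret or Python's scikit-learn):

```python
# Toy supervised learning: a 1-nearest-neighbour classifier in pure
# Python. Given labelled points, predict the label of a new point by
# finding its closest labelled neighbour. Data is made up for
# illustration only.

def nearest_neighbour(train, query):
    """Predict the label of `query` from labelled (point, label) pairs."""
    def dist(a, b):
        # Squared Euclidean distance; the square root is unnecessary
        # for ranking neighbours.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda pair: dist(pair[0], query))
    return label

# Hypothetical training data: two clusters labelled "low" and "high".
train = [((1.0, 1.0), "low"), ((1.2, 0.8), "low"),
         ((5.0, 5.2), "high"), ((4.8, 5.1), "high")]
print(nearest_neighbour(train, (1.1, 0.9)))   # near the "low" cluster
print(nearest_neighbour(train, (5.1, 5.0)))   # near the "high" cluster
```

Everything else in the field, from random forests to deep networks, is a more sophisticated answer to the same question this sketch asks: given labelled examples, what label best fits a new case?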

Paul Laughlin

Paul Laughlin is the founder of Laughlin Consultancy, which helps companies generate sustainable value from their customer insight. This includes growing their bottom line, improving customer retention and demonstrating to regulators that they treat customers fairly.

Commercial Insurers Face Tough Times

One executive said, “The odds of this long of a lucky streak [on catastrophe losses] occurring is less than 1%.”

Beyond the secular forces we described in our "Future of Insurance" series, more immediate and cyclical issues will be shaping the insurance executive agenda in 2016. Commercial insurers (including reinsurers) face tough times ahead, with underwriting margins that are being pressured by softening prices and a potentially volatile interest rate environment. Recently, reserve releases, generally declining frequency and severity trends, as well as lower-than-average catastrophe losses have allowed commercial insurers to report generally strong underwriting results. However, redundant reserves are being (or have been) depleted, and the odds of a continued benign catastrophe environment are low. For example, one insurance executive recently observed, “The odds of this long of a lucky streak occurring is less than 1%.”

The commercial insurance market has, in recent years, had generally strong underwriting results, but this could change—potentially, very soon. With varying degrees of focus, commercial P&C insurers have been mitigating the risk environment by taking a variety of strategic actions. In 2016 and beyond, they will need to accelerate their strategic efforts in four key areas: 1) core systems and data quality, 2) new products, pricing discipline and terms and conditions, 3) corporate development and 4) talent management.

Core systems and data quality

93% of insurance CEOs—a higher percentage than anywhere else in financial services—see data mining and analysis as more strategically important for their business than any other digital technology. Nevertheless, many commercial insurers operate with networks of legacy systems that complicate the timely extraction and analysis of data. This is no longer deemed acceptable, and leading insurers continue to transform their system environments as a result. Significantly, these transformations do not focus solely on specific systems for policy administration, claims, finance, etc.
To ensure timely, quality data across the entire commercial P&C value chain, commercial insurers also focus on how the various systems are integrated with one another. To put this into context, when a dollar of premium is collected, it not only “floats” across time until it is paid out in claims, it also “floats” across a variety of functions and their related systems: Billing systems process premium dollars; ceded reinsurance systems process treaty and facultative transactions; policy administration systems (PAS) process endorsement changes; claims systems process indemnity and expense payments. Actuarial systems in disconnected data environments prevent the timely and efficient extraction and analysis of internal data and also complicate the focused and efficient use of external data, especially unstructured data.

“Big data” is becoming increasingly popular considering the insights that insurers and reinsurers can derive from it. However, such insights only become actionable to the extent that companies can assess the external environment in the context of the internal environment—in other words, to the extent that big data can enhance (or otherwise inform) the internal data’s findings. If all functional and systemic codes are not rationalized on an enterprise-wide basis, it is very difficult to efficiently accumulate and analyze data.

New products, pricing discipline and terms and conditions

Commercial insurers and reinsurers are not generally known as product innovators, but they can be. For example, as the profile of cyber-related risks increases, the need for cyber-related commercial insurance grows, thereby offering numerous opportunities for product innovation. Because cyber is a relatively new exposure, frequency and severity data are nascent; therefore, both pricing and risk accumulation models are in various stages of development.
As a result, prescient insurers are carefully tracking and comparing their cyber pricing practices and coverage grants with those of key competitors. To be effective, such practices should be consistent with existing price, terms and conditions monitoring processes. For example, tracking actual-to-expected premiums and rates is a common practice, which leading insurers perform regularly (i.e., at least quarterly, with monthly tracking common). Insights from this kind of analysis apply to both new and existing products. The underwriting cycle is inherently a pricing phenomenon, and insurers and reinsurers that have greater and more timely product and pricing insights have a competitive advantage relative to those that do not. In addition to lower rates, the “soft” parts of the underwriting cycle tend to be characterized by the loosening of policy terms and conditions, which can erode profitability as quickly as inadequate prices. Therefore, the most competitive insurers and reinsurers carefully and continuously track the adequacy of policy terms and conditions. Recurring actuarial analyses and standardized reporting can monitor changes in pricing as well as in terms and conditions. However, identifying emerging underwriting risks is inherently qualitative, so this analysis can be time-consuming, especially for insurers with suboptimal PAS environments. Even so, almost all companies find the analysis well worth the effort.

Corporate development

The combination of historically low interest rates, favorable frequency and severity trends and the relative lack of severe catastrophes has resulted in record policyholder surplus across P&C commercial insurance. Executives have a number of options for deploying surplus, one of which is corporate development. Commonly, “corporate development” means mergers and acquisitions, but it can also encompass book purchases/rolls, renewal rights and runoff purchases.
Determining the best option depends on many factors, including purchase price, competitive implications and an assessment of how the acquired assets and any related capabilities can complement or enhance existing underwriting capabilities. Accordingly, some insurers are beginning to augment traditional due diligence processes (such as financial diligence, tax diligence and IT diligence) with underwriting-specific diligence to help ensure value realization over time. If a corporate development opportunity offers underwriting capabilities that at least align to—and preferably enhance—existing capabilities, it can help facilitate a smooth integration, thereby mitigating underwriting risk (a key cycle management consideration).

Talent management

For the most part, commercial underwriting decisions cannot be fully automated, because they require judgment. Therefore, it is natural for underwriting talent to be a top priority. However, insurance executives have lamented that attracting and retaining knowledgeable personnel is a major challenge for the industry. Two trends make commercial insurance talent management particularly challenging. First, experienced underwriters are leaving the industry. According to one study, “The number of employees aged 55 and over is 30% higher than any other industry—and that, coupled with retirements, means the industry needs to fill 400,000 positions by 2020.” Second, underwriting talent is relatively difficult to attract. For example, according to the Wall Street Journal, insurance ranks near the top of the list of least-desirable industries, according to recent graduates. The image of the insurance industry is that it is generally behind the times and offers little in terms of career development. Therefore, developing a performance-driven culture that enables the recruitment, development and retention of underwriting talent is more crucial than ever.
To help accomplish this, insurers should employ, and continuously assess, tools and resources that educate and empower underwriters through all phases of their careers. This is important because the expectations in commercial underwriting are high, and the nature of the job requires a diverse range of skills (e.g., analytical, relational, sales, financial and risk). Furthermore, the best commercial underwriters are entrepreneurial, which employers should highlight as they recruit and manage their underwriting staffs. Commercial insurers face a looming talent crunch and have to find ways to present themselves as—and actually be—a place where young people can have rewarding careers.

Implications
  • The relatively strong underwriting results of recent years are likely to soften in the coming year. Accordingly, commercial underwriters will need to accelerate their strategic efforts in:
  1. Core systems and data quality,
  2. New products, pricing discipline and terms and conditions,
  3. Corporate development and
  4. Talent management.
  • Core systems transformations go beyond individual system competencies. To ensure timely, quality data across the entire commercial P&C value chain, insurers are also focusing on how the various systems are integrated with one another to facilitate the timely and efficient extraction and analysis of internal data and the focused and efficient use of external data (especially unstructured data).
  • There are opportunities to create new products, but, to be profitable, insurers must exercise pricing discipline and must carefully and continuously track the adequacy of policy terms and conditions.
  • Current surplus levels have enabled insurers to invest in corporate development, and some insurers have augmented traditional due diligence processes (such as financial diligence, tax diligence and IT diligence) with underwriting-specific diligence to help promote value realization over time.
  • Commercial insurers have an aging workforce and are facing an impending talent crunch. Automation cannot replace the judgment that is required for effective underwriting. Therefore, it is vital for insurers to develop a performance-driven culture that enables the recruitment, development and retention of underwriting talent over time.
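As one illustration of the pricing-discipline point, the actual-to-expected premium tracking mentioned earlier might be sketched like this (the lines of business and figures are hypothetical):

```python
# Sketch of quarterly actual-to-expected (A/E) premium tracking:
# compare booked premium against plan, by line of business. A ratio
# below 1.0 flags a line that is underperforming plan and may warrant
# pricing or terms-and-conditions review. All figures hypothetical.

def actual_to_expected(actual, expected):
    """A/E ratios by line of business."""
    return {line: actual[line] / expected[line] for line in expected}

quarter = actual_to_expected(
    actual={"property": 9.2, "casualty": 11.8, "cyber": 1.4},    # $M booked
    expected={"property": 10.0, "casualty": 11.5, "cyber": 1.0}, # $M plan
)
for line, ratio in sorted(quarter.items()):
    print(f"{line}: {ratio:.2f}")
```

Running the same comparison every quarter (or month) is what turns pricing discipline from an aspiration into a standing report.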

Joseph Calandro

Joe Calandro, Jr., is a managing director with Strategy&, part of the PwC network. Calandro has broad experience, in the U.S. and globally, across the disciplines of strategy, analytics, M&A, risk management, underwriting and claims.


Francois Ramette

Francois Ramette is a partner in PwC's Advisory Insurance practice, with more than 15 years of strategy and management consulting experience with Fortune 100 insurance, telecommunications and high-tech companies.

Active Shooter Scenarios

Planning for an active shooter threat has become an unfortunately necessary part of institutional safety and risk management best practices.

Campus safety and security is a topic of increasing concern on both a personal and an institutional level. On-campus shootings can no longer be viewed as singular, isolated events. The good news is that the chance of an active shooter incident taking place on any given campus is small. However, because of the random nature of such events, all institutions need to be prepared. Planning for an active shooter threat has become an unfortunately necessary part of the framework of institutional safety and risk management best practices.

Active Shooter Defined

According to the U.S. Department of Homeland Security, an active shooter is an individual actively engaged in killing or attempting to kill people in a confined and populated area; in most cases, active shooters use firearm(s), and there is no pattern or method to their selection of victims. Active shooter situations are unpredictable and evolve quickly. Typically, the immediate deployment of law enforcement is required to stop the shooting and mitigate harm to victims. Because active shooter situations are often over within 10 to 15 minutes, before law enforcement arrives on the scene, individuals must be prepared both mentally and physically to deal with one.

Colleges and universities understand the need for emergency response plans for many different types of disasters and typically already have processes and procedures in place to address them. Planning for an active shooter threat can and should be integrated into an institution’s overall emergency and disaster preparedness plans. While many of the components are similar for most natural and man-made disasters, the inclusion of an active shooter plan generates an even greater immediacy of response. There are several considerations when it comes to the development and implementation of an emergency response plan to address any threat.
These include the three Ps: Prevention, Preparedness and Post-Event Management and Recovery, each of which is discussed in greater detail below.

See Also: "Boss, Can I Carry While I'm Working?"
  • PREVENTION
Engage in Threat Assessment

Probing how threats develop can mitigate, defuse or even eliminate a situation before it occurs. Active shooters do not develop in a vacuum. A joint study by the U.S. Department of Education, the Secret Service and the Federal Bureau of Investigation concluded that individual attackers do not simply "snap" before engaging in violence; rather, they often exhibit behaviors that signal an attack is coming. The study recommends the use of threat assessment teams to identify and respond to students and employees who may pose a threat.

As part of the threat identification and assessment process, an institution may elect to conduct pre-employment background checks to identify past patterns of violent behavior. While the background check process may not be a perfect indicator of future behavior, it does provide a useful mechanism for vetting a prospective employee. If triggering behavior is found, the threat assessment team can evaluate the information and determine whether further action or intervention is needed.

Encourage Training and Education

An essential component of prevention is training the campus community to identify both trigger behaviors and events that may precipitate a potential incident.

Supervisor and Faculty Training: Train supervisors and faculty to recognize early warning signs of individuals in distress. Supervisors and faculty should be aware of major personal events in the lives of their employees, as many incidents of violence occur in close proximity to such events.

Student/Community Training: Educate the campus community on how to recognize warning signs of individuals in distress and provide a mechanism for sharing that information.

Develop and Communicate Reporting Procedures

All employees and students should know how and where to report violent acts or threats of violence. Information about the function of the threat assessment team or similar programs should be provided to the entire campus community.
The institution should also have an internal system for tracking all threats and incidents of violence.

Continuing Staff and Student Evaluations

When appropriate, obtain psychological evaluations for students or employees exhibiting seriously dysfunctional behaviors.
  • PREPAREDNESS
Leverage Community Relationships

Many community programs and resources can assist with the development of active shooter response plans. Include local law enforcement agencies, SWAT teams and fire and emergency responders in the early stages of plan development, both to promote good relations and to help those agencies become more familiar with the campus environment and facilities. The police can explain the actions they typically take during threat and active violence situations, so those actions can be reflected in the institution's plan. Provide police with floor plans and the ability to access locked and secured areas. Invite law enforcement agencies, SWAT teams and security experts to educate employees on how to recognize and respond to violence on campus. Such experts can provide crime prevention information, conduct building security inspections and teach individuals how to react and avoid becoming a victim.

Review Resources and Security

Periodic review of security policies and procedures will help minimize the institution's vulnerability to violence and other forms of crime.
  • Routinely inspect and test appropriate physical security measures such as electronic access control systems, silent alarms and closed-circuit cameras in a manner consistent with applicable state and federal laws.
  • Conduct risk assessments to determine mitigation strategies at points of entry.
  • Develop, maintain and review systems for automatic lockdown. Conduct lockdown training routinely.
  • Place active shooter trauma kits in various locations on the campus. Train employees on how to control hemorrhaging, including the use of tourniquets.
  • Provide panic or silent alarms in high-risk areas such as main reception locations and the human resources department.
  • Implement an emergency reverse 911 system to alert individuals both on and off campus. Periodically test the system to serve as training and verification that the equipment is functioning properly.
  • Equip all doors so that they lock from the inside.
  • Install a telephone or other type of emergency call system in every room.
  • Install an external communication system to alert individuals outside the facility.
Develop and Communicate Lockdown Procedures

Lockdown is a procedure used when there is an immediate threat to building occupants. Institutions should have at least two levels of lockdown, sometimes called "hard lockdown" and "soft lockdown."

Hard Lockdown: This is the usual response when there is an intruder inside the building or another serious, immediate threat. In a hard lockdown, students, faculty and staff are instructed to secure themselves in the room they are in and not to leave until the situation has been resolved. This allows emergency responders to secure the students and staff in place, address the immediate threat and remove any innocent bystanders to an area of safety.

Soft Lockdown: This is used when there is a threat outside the building but no immediate threat to individuals inside. During a soft lockdown, the building perimeter is secured, and staff members are stationed at the doors to make sure no one goes in or out of the facility. Depending on the situation, activities may proceed as usual. A soft lockdown might be appropriate if the police are looking for a felon in the area or if there is a toxic spill or other threat where individuals are safer and better managed inside.

Evacuation Procedures: Communication and Training

Evacuation of the facility can follow the same routes used for fire evacuation if the incident is confined to a specific location; otherwise, other exits may need to be considered. Designate a floor or location monitor to assist with the evacuation and to keep an inventory of evacuees for accountability to authorities. Establish a meeting point away from the facility.

Develop a Communication System

Perhaps the most crucial component of an active shooter response plan is the network of communication systems. Immediate activation of these systems is critical to saving lives, because many mass shootings are over, and bystanders injured or dead, before police can respond.
Create a Crisis Response Box

A crisis response box has one primary purpose: to provide immediate information to designated campus staff for effective management of a major critical incident. When a crisis is in progress, there is no time to collect information; it is the time to act on it. Knowing what information to collect, how to organize it and how to use it during a crisis can mean a faster response.

Create an Incident Command Center Plan

The National Incident Management System (NIMS) is a nationally recognized emergency operations plan adapted for large critical incidents where a multi-agency response is required. NIMS facilitates priority-setting, interagency cooperation and the efficient flow of resources and information. The incident command center should be located in a secure area within sight and sound of potential incidents, with staging areas nearby.

See Also: Thought Leader in Action: At U. of C.
  • POST-EVENT MANAGEMENT AND RECOVERY
To ensure a smooth transition from response to recovery, plans that went into effect during the event should be de-escalated and integrated into the plan for moving forward. This will include aspects such as:
  • Media and information management
  • Impact assessment
  • Facility and environmental rebuilding
  • Restoring student, staff and community confidence
Conclusion

Though an active shooter situation is unlikely to occur at most colleges and universities, it is still essential to be prepared. Failure to prepare can cost lives and cause severe financial and reputational damage that could take years to reverse. Additional resources for university risk managers and administrators are available in the complete Encampus Active Shooter Resource Guide, which is available for download here.

Mya Almassalha


Mya Almassalha joined the Encampus team in early 2016; she brings with her more than a decade of general insurance and risk management expertise, with a strong focus on higher education and organizational risk management.

The Sorry Spectacle of Defensive Medicine

As this infographic shows, as much as $850 billion a year is wasted on unnecessary, defensive medical tests and procedures in the U.S.


Erik Leander


Erik Leander is the CIO and CTO at Cunningham Group, with nearly 10 years of experience in the medical liability insurance industry. Since joining Cunningham Group, he has spearheaded new marketing and branding initiatives and been responsible for large-scale projects that have improved customer service and facilitated company growth.


Richard Anderson


Richard E. Anderson is chairman and chief executive officer of The Doctors Company, the nation’s largest physician-owned medical malpractice insurer. Anderson was a clinical professor of medicine at the University of California, San Diego, and is past chairman of the Department of Medicine at Scripps Memorial Hospital, where he served as senior oncologist for 18 years.

The Coming Changes in Regulation

The developing International Capital Standard will require close monitoring by globally active insurers.

Like the rest of the financial services industry, insurers are subject to increasingly complex and prescriptive regulations and standards. In the coming year, insurers will need to focus on the new U.S. Department of Labor fiduciary standard, which is likely to have a significant effect on how insurance products are sold. Moreover, global developments, especially those related to the developing International Capital Standard, will require insurers to closely monitor (and, ideally, contribute to) official discussions about how globally active insurers should manage capital.

DOL Fiduciary Standard

In 2015, the U.S. Department of Labor (DOL) proposed regulation of the way investment advisers and brokers are compensated. Under the proposal, recommendations to an employee retirement benefit plan or an individual retirement account (IRA) investor will be considered "fiduciary" investment advice, thus requiring the advice to be in the "best interest" of the client, rather than merely "suitable." As a result, insurance brokers and agents who provide investment advice will face limits on receiving commission-based (as opposed to flat-fee) compensation.

The proposed compensation limitation does not apply to general investment education. Furthermore, the proposal's amended Prohibited Transaction Exemption 84-24 (PTE 84-24) allows commission-based compensation for the sale of certain insurance products. However, to continue receiving commissions for certain products (e.g., variable annuity (VA) sales to IRAs), insurance brokers would need to use a separate best interest contract (BIC) exemption.

In addition to compensation structure, the new fiduciary standard will have significant operational and strategic impacts, especially in technology, compliance, product pricing and development. There are four main considerations for insurance company and broker-dealer (BD) compliance:
  1. The BIC allows certain forms of commission-based compensation, but there are enhanced disclosure and contract requirements: financial institutions will have to disclose to retirement investors the total projected cost of each new investment over holding periods of one, five and 10 years, before the execution of the transaction. This could delay new transactions, especially if the company's current technology does not store this cost information in a central location. In addition, insurers must wait for customers to acknowledge the projected costs before a transaction can be completed. There are also new contractual BIC obligations, including the requirement of a three-way written contract among the financial adviser/insurance agent, the financial institution and the investor, indicating the fiduciary status of the adviser and describing the fiduciary compliance program.
  2. Insurance companies have the option of moving to an “advisory” model. This means a flat fee for advisory services, and sales incentives would need to be adjusted to address the move from commission-based to flat fee compensation. (Clients could resist paying a standard flat fee for what they consider to be minimal investment advice.)
  3. Insurance companies could also move to a self-directed/order taker model. This would allow insurers to maintain their current fee structure but rely on customers’ direct orders (i.e., requests for broker advice and assistance).
  4. The PTE 84-24 exemption would allow insurers to sell certain products to fund retirement plans and IRAs, but it: prohibits commission-based compensation for sales of VA contracts to fund IRA vehicles; prohibits the payment of certain types of fees to insurance advisers; and requires that conflicts of interest be disclosed and that the insurer act in the "best interest" of the plan, plan participant or IRA.
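The cost disclosure in item 1 is, at bottom, a compounding calculation: project the fees a client would pay over one-, five- and 10-year holding periods. The sketch below illustrates that arithmetic only; the fee rate, expected return and investment amount are made-up assumptions, not figures from the DOL rule.

```python
# Hypothetical illustration of a 1/5/10-year projected-cost disclosure.
# All inputs (fee rate, expected return, amount) are illustrative
# assumptions, not figures taken from the DOL proposal.

def projected_costs(principal, annual_return, annual_fee_rate, years):
    """Total fees paid over a holding period, compounding the balance
    at the expected return and charging the fee on each year's balance."""
    balance = principal
    total_fees = 0.0
    for _ in range(years):
        balance *= 1 + annual_return      # gross growth for the year
        fee = balance * annual_fee_rate   # fee charged on the balance
        total_fees += fee
        balance -= fee                    # balance continues net of fees
    return round(total_fees, 2)

# Disclose projected costs over one-, five- and 10-year holding periods.
for years in (1, 5, 10):
    cost = projected_costs(100_000, 0.05, 0.01, years)
    print(f"{years:>2}-year holding period: projected fees ${cost:,.2f}")
```

The point of the projection is that the disclosure requires forward-looking, compounded totals per holding period, which is why storing fee and cost data in a central location (noted below) becomes a technology issue.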
All of the above will have a significant impact on insurance company profitability and the competitive landscape. Insurers will have to invest in employee training and technological enhancements. If companies leverage the BIC exemption, they will need to enhance their systems to detect and block disallowed products.

See Also: If the Regulations Don't Fit, You Must...

Because insurers often do not have a central repository of relevant fee and cost data, new user interface tools may be necessary to produce timely pre- and post-sale customer disclosures. This increased disclosure and communication burden could lead companies to reconsider the appropriateness of maintaining smaller client accounts. In addition, new market entrants, such as low-cost (often automated) fee-based service providers, could disrupt future business. Accordingly, for insurance companies to remain competitive in the middle market, they may need to develop a new class of simple, low-cost products.

Making the transition from the suitability standard to the "best interest" standard will also be significant for insurance-affiliated broker-dealers and their agents/registered representatives because of restrictions on providing investment advice to prospective clients. Agents will need to enhance their client profiling to refresh and verify clients' objectives on a continuing basis to determine what is in clients' best interests. Retail financial advisers and non-affiliated broker-dealers will also see adjustments to compensation structures and will be subject to training that ensures registered representatives know which product recommendations will subject them to the new fiduciary standard.

Policy riders that could have prevented or delayed the standard's promulgation were not in the final draft of the omnibus appropriations bill that Congress passed in December 2015. The White House is pushing for the rule to be issued as early as March 2016, with a 2017 compliance deadline.
International Capital Standard (ICS)

The proposed International Capital Standard (ICS) is intended to be a consistent capital measure for globally active insurers. The ICS's advocates promote it as a way for groupwide supervisors to gain a better understanding of how insurers manage capital allocation in an international business.

In the wake of the 2008 financial crisis, the Financial Stability Board directed regulators to improve the regulatory system, particularly capital standards, for all financial services. While the banking industry has received the majority of the attention, the insurance industry also faces calls for wider change. Initiatives have included the development of methodologies to identify, and determine accompanying capital requirements for, global systemically important insurers (G-SIIs), as well as insurers that are active in multiple jurisdictions (internationally active insurance groups, or IAIGs) but are not necessarily globally systemically important. The ICS is intended to be a truly global group measure, unlike any current regulatory practice.

Potential Effects

Many insurers are concerned the ICS could force them to adopt "foreign" calculations that differ from current regulatory processes and conflict with existing capital practices. In addition, there has been considerable regulatory change in recent years, and the ICS is yet another initiative insurers would have to address. If the final ICS calculation differs from current practices, then all functional areas could be affected because of knock-on effects on product portfolio, pricing and investment strategy. Accordingly, as the ICS continues to develop, insurers should begin to consider the potential impact it may have on available capital reserves, required capital levels and capital management.
Insurers should consider how new capital standards will interact with current regulatory capital requirements and should prepare to identify additional capital resources; understand changing stakeholder and investor reporting expectations; and assess the wider business impacts, such as insurance product pricing and risk appetite. Furthermore, insurers should already be taking an active role in influencing capital standards development, becoming involved in industry groups and forums and regularly communicating with stakeholders to manage expectations and ensure appropriate treatment of company-specific issues. Financial reporting teams should consider the need for updated or new capital disclosures, the communication of capital ratios and rating agency concerns.

See Also: The Rise of Panopticon Regulation?

If enacted, the ICS is also likely to increase the need to adapt, modernize and enhance the efficiency of core operations. To prepare for this eventuality, insurance groups should complete readiness assessments and review their key systems, data flows, processes and internal controls to determine whether they need new systems and processes. More specifically, insurers may need to develop and implement internal models and adjusted calculation methods, including incorporating new risk margin calculations and alternative methods of classifying available capital. The calculation of required capital could pose more granular technical issues in regimes where an economic capital assessment has not previously formed part of the regulatory framework or served as a common ancillary metric. Compliance, risk management and finance functions will have to assess emerging changes in reporting requirements, determine their roles and decide how to educate the business and monitor the impacts moving forward.
Insurers will need their ERM functions to identify, measure, aggregate, report and manage risk exposures within predetermined tolerance levels, across all activities of the insurance group, with clearly defined and documented structures, frameworks and procedures. In relation to organizational structure, insurance holding groups will need to assess the potential impact of the ICS on the classification of their separate legal entities. They should review their legal entity organization charts and be prepared to assign and categorize the regulatory classification of each operating legal entity within the structure (as various capital frameworks across multiple jurisdictions could apply).

Overall, the ICS is only part of ComFrame, the broader regulatory framework for globally active insurers. Other aspects of ComFrame, such as governance, risk management policies and ORSA, will also have a significant impact on many areas of an insurer's business, regardless of what becomes of the ICS. It's too early to say for certain what the final ICS will look like, but even the regulators who question its necessity seem reconciled to the notion that a common standard will eventually become reality. The big debate is what the one true ICS should entail and what the nature of the calculations supporting it should be.

Implications

DOL Fiduciary Standard
  • The “best interest” standard is likely to restrict certain investment advice to prospective clients and will certainly have an impact on how insurers approach and conduct sales. In particular, insurers will need to distinguish what is considered investment advice and what is not. One related development to watch is if the fiduciary standard increases insurers’ implementation of robo-advice for routine transactions and research.
  • There will be significant operational and strategic impacts, especially in the areas of technology, compliance, employee training, product pricing and development. Moreover, it appears that compliance with the standard will be mandatory as of next year, which means insurers, affiliated and independent brokers and agents have to address all of these considerations in a very short time.
ICS
  • The ICS has the potential to affect the entire organization, not just risk and capital management. Product portfolio, pricing and investment strategy will all feel the standard’s effect, with resulting pressure to modernize and enhance core operations. To prepare, insurance groups should complete readiness assessments and review their key systems, data flow, processes and internal controls to determine whether they need new systems and processes.
  • Because the ICS is still in the developmental stage, we strongly encourage insurers to take an active role in influencing capital standards development, to become involved in industry groups and forums and to communicate regularly with stakeholders to manage expectations and ensure appropriate treatment of company-specific issues.

Henry Essert


Henry Essert serves as managing director at PWC in New York. He spent the bulk of his career working for Marsh & McLennan. He served as the managing director from 1988-2000 and as president and CEO, MMC Enterprise Risk Consulting, from 2000-2003. Essert also has experience working with Ernst & Young, as well as MetLife.


Ellen Walsh


Ellen Walsh is a partner in the financial services risk and regulatory advisory practice of PwC and provides risk management and regulatory advisory services to PwC’s leading insurance clients. She currently leads PwC's efforts related to the impact of the regulatory change on financial institutions, specifically on insurance companies.

Dear Founders: Are You Listening?

Founders: Here is a framework for when and how to talk to users about your innovations. You can't just wait for your turn to talk.


Since my last post, "Distribution is 80% of your problem," I have had the opportunity to speak in depth with several terrific start-up founders about some of the incredible things they are doing, and about why things are still not going well. Several of their stories remind me of another big lesson I have learned over the years: We entrepreneurs often mistake "waiting to talk" for "listening," until it's too late.

A Little Knowledge (About Your Users) Is a Dangerous Thing

All the stories have a similar theme: We launched our product, and we got 10,000-plus users (or 100-plus small paying customers) through unscalable means. Now, we are not sure what to do next.

One founder I communicated with had talked to hundreds of her paying users and managed to convince herself that her market was women who want to make sure their kids don't get too much unsupervised screen time. We talked to the company's users and discovered that, in fact, the core group that loved the app was working women who want to keep track of their kids and know they are safe after school. Whenever this start-up had spoken to its users, it heard the answer it wanted to hear, not what the users were saying. The lesson: stop telling users what they "should" be doing with the app, and start listening to what they actually do.

Another app, one that got to 20,000 users quickly with a small amount of seed money, found, once we dug deep, that fewer than 150 of its users were active weekly. The start-up had no idea who these 150 users were or what, specifically, they were doing with the product. After 20 user interviews, we discovered that the start-up's core use case was far from what the company thought it was and that the product was too hard to use. For far too long, the start-up had been convinced its technology would change the world, simply because 20,000 users seemed to be using the product.

A third, B2B-focused start-up I recently spent time with has more than 100 paying users but stalled growth and usage numbers. When I asked the company to tell me who its users were and what pain point it was solving, I kept getting back a laundry list of features and user personas instead. When the company dug deeper and spoke to users, it found that, of its 27 features, users were using two, and that no one had discovered the three the company considers its real killer benefits. We realized the company needed to shed the mindset that "my users are using the wrong features and should have discovered the right ones." As a start-up, you don't get to tell users which scenarios and features to use your product for; consumers will tell you, by using whatever they find useful.

Apple May Not Need to Talk to Users, But the Rest of Us Do

As a founder, you start with a hypothesis. You have all these incredible suppositions about how you will change the world with your product. You may think you can get away with: "My users do not know what they are doing. I will tell them what they should do. It works for Apple (or so goes the myth), so it will work for me; let's just ignore users." Believe me, those kinds of companies are black swans. For the rest of us, our users matter: who they are, what they use our products for and what they ignore.

This is for two basic reasons:

  1. Product/Market Fit: Unless we know and understand our users (or potential users), our incoming hypothesis of the value our product provides is literally that: a hypothesis. Sure, some people may not get it, and some may just dismiss it. But without a group of people who buy into the value we hypothesize we can provide and who go on to become ecstatic users of our product, we probably did not have a real hypothesis to begin with, just a supposition that is wrong.
  2. Go-to-market: The more detail we can find out about users, the more we can figure out how to go after them in a tight, focused way. Going after moms who want to limit unsupervised screen time is very different from attracting busy working moms who really want to know where their kids are after school. The two are different products, have different features and have a different go-to-market.

One potential red herring during the early days comes when you manage to attract a chunk of users quickly. You can easily be deluded by the numbers; like inventory, they hide a lot of problems. You convince yourself that what you're doing can't be wrong if 20,000 users think you're right. The fact is that these 20,000 people do not think you are right; you somehow managed to "get" them, and they experimented with your product hoping to find something of use. Maybe 200 of those users think you are onto something, but you don't know who those 200 are. If you understood what those 200 really like about your product, you might be able to find the next 20,000 users who are really right for you.

What to Avoid When You Do Decide to Talk to Users

  1. Don't defend what you have built or try to convince them you are right.
  2. Don't keep coming back to your vision and what will come later, or focus on the product features they should be using.
  3. Don't make a sales pitch about your company and yourself; make it about them and their real reaction to your product, even if it means you have to throw everything away and start over.

If you do these things, you have not really listened to your users; you have just waited for your turn to talk and convinced yourself you understand them.

A FRAMEWORK FOR WHEN TO LISTEN TO USERS, AND HOW

Here’s a framework I have developed over the years about when and how to listen to users:

The First 500 Users

Those first 500 users are the most important people in your journey. You need to do more than just talk to them; you need to build a solid relationship with them. They are the foundation of your product.

In my previous start-up, a career marketplace, I personally introduced my early adopters to friendly hiring managers at many companies and helped them land jobs. A lot of those early customers are now my Facebook friends. Some of them even became our ambassadors and held equity in the company.

Those first users add immense value. They validate your hypothesis, refine your ideas, recruit more users and test new features, on top of a whole lot more. And they are very forgiving of defects, crashes, bad user experience (UX), everything.

I used to schedule as many phone calls with them as I could. In every conversation, I would first show what we were working on (in detail) and get their feedback. I would then open up and ask what they were doing with the product, why they chose it over others, how they found it added value and what related issues they had that we could help with, among other questions. I logged every conversation.

Listening Is Hard to Do—For Founders in Particular

Most of the time when we think we’re listening, we are actually just waiting for our turn to talk. Here are three reasons why:

  1. We are always busy talking — to ourselves. Even when we are obviously talking to someone else, we are also internally talking to ourselves. So listening genuinely — muting your internal conversation and giving someone your full attention — is hard.
  2. For founders, genuine listening is even harder. Most entrepreneurs have their product, features, ideas and vision so deeply ingrained that, when they talk to users, they end up defending whatever users have problems with. ("But you didn't see the profile page; the settings let you change this," "There are so many cool things you can do, didn't you see this feature?," "We'll get to that in Version 3," "Wait, no, you don't understand, that's where the puck is going," etc.)
  3. It is not easy for people to articulate what they are thinking. To really understand what users are saying, you have to read between the lines. Even if you lead with your own world view, listen to users' views carefully, both to what is said and to what is not.

Talking to users requires real effort. Be aware of that, and start focusing on your first 500 users. Treat your early adopters with special respect; make them feel special, and take care of them beyond just the product.

Beyond the First 500 Users

Moving forward with your customer base requires other techniques (in addition to real conversations) that are still important. One such tactic is talking through the product: provoking conversations with product experiments.

An example would be radically changing your onboarding (drop everything and get users straight in) for a small set of users and seeing what happens. Remove a feature you think is not useful and wait for users to complain. Removing things temporarily is the best way to test whether they are really valuable.
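One low-risk way to run a "remove it and wait for complaints" experiment is a deterministic holdback: hash each user ID so a fixed small slice of users consistently sees the product without the feature. A minimal sketch; the "share_button" feature name and the 5% holdback are hypothetical values invented for illustration, not anything from the article.

```python
# Deterministic holdback bucketing for a feature-removal experiment.
# Hashing on (feature, user ID) gives each user a stable bucket, so the
# same small slice of users consistently sees the feature hidden across
# sessions. The feature name and 5% holdback are illustrative assumptions.
import hashlib

def in_holdback(user_id: str, feature: str, holdback_pct: float = 5.0) -> bool:
    """Return True if this user should have `feature` hidden."""
    key = f"{feature}:{user_id}".encode()
    bucket = int(hashlib.sha256(key).hexdigest(), 16) % 10_000
    return bucket < holdback_pct * 100  # 5.0% -> buckets 0..499

# Hide the feature for the holdback group, then watch for complaints or
# engagement drops to judge whether the feature actually matters.
shown = [uid for uid in (f"user{i}" for i in range(1000))
         if not in_holdback(uid, "share_button")]
print(f"feature shown to {len(shown)} of 1000 users")
```

Keying the hash on both the feature name and the user ID keeps cohorts stable across sessions while keeping different experiments statistically independent of one another.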

It also helps to create ancillary products (quick prototypes) to test value outside your core product. As you learn more about your users, you will start to see more value propositions, some that align with your vision and some that don't.

Until you are truly convinced you have product-market fit, do not be shy about running small experiments on the side to keep testing different ideas. Use conversations to create hypotheses, and experiment quickly.

Another technique is to always ask, "What else would you want this product to do for you?" in every support email. My start-up once introduced a critical defect in our iPhone app that led to hundreds of support emails. Adding that one question uncovered several hundred feature requests, including many we had not thought of.

Talking to users as you scale is about more than just having conversations. Lead with a hypothesis, measure, iterate, and continuously run side experiments to keep testing.

Dear founder, do not wait until it's too late to talk to your users.

And when you do, listen. Don’t just wait to talk.

Texas Work Comp: Rising Above Critics

Critical articles tend to be an accumulation of plaintiff attorney opinions and confusion by out-of-state persons.

Recently, published articles have been critical of the Texas workers' compensation system and of the "Option" available to Texas employers. Such articles tend to be an accumulation of plaintiff attorney opinions and confusion from out-of-state persons who do not have sufficient working knowledge of the subject matter. This article will address two recent examples:
  1. “The Status of Workers’ Compensation in the United States — A Special Report” by the Workers’ Injury Law and Advocacy Group (“WILG”).[1]
This is a group of attorneys supposedly "dedicated to representing the interests of millions of workers and their families."[2] Their paper is at best a rant against the original "grand bargain" that was struck to create each state's statutory workers' compensation system, and it offers virtually no legal citations or other support. It does, however, illustrate the wisdom of the Texas system in offering a choice: (1) workers' compensation insurance or (2) the "nonsubscriber" Option. For example, the article notes that many physicians will not take workers' compensation patients, that 33 states have cut workers' compensation benefits and that insurance companies continually clamor for reform by lobbying legislatures to cut medical costs or implement other cost savings — all of which, the authors say, makes it more difficult for the injured worker to recover.

See Also: Who Is to Blame on Oklahoma Option?

The article then changes course, suddenly declaring that an option to workers' compensation is at fault, boldly proclaiming that "opt-out is bad for everyone" and claiming that intervention by the federal government is "immediately needed." WILG criticizes the no-fault workers' compensation insurance system and in the same breath castigates those who elect the Texas Option and thrust themselves into the tort system, which plaintiff attorneys have long claimed to love. This is particularly perplexing for Texas readers in view of two factors: (1) the need for plaintiff attorneys has been largely eliminated from the Texas workers' compensation system,[3] and (2) the Texas Option gives the injured employee the right to sue for negligence and recover actual and punitive damages.[4] Responsible employers that elect the Texas Option establish injury benefit plans for medical, lost wage and other benefits.
The benefits are subject to the Employee Retirement Income Security Act (ERISA), which provides numerous employee protections, including communication of rights and responsibilities, fiduciary requirements, and access to state and federal courts.[5] Only negligence liability claims against the employer can be forced into arbitration, which many employers insist on as a more efficient method of dispute resolution, one sanctioned for decades by both the U.S. Supreme Court and the Texas Supreme Court. Arbitration even supports awards for pain and suffering and punitive damages. In those cases, plaintiff attorneys have the advantage, because an Option employer loses the defenses of contributory/comparative negligence, assumption of the risk and negligence of a fellow employee.

Perhaps the real reason WILG is fighting Options to workers' compensation is a combination of not wanting to learn how to succeed at ERISA litigation and fear that other workers' compensation systems will become as efficient as the Texas system. Why aren't these self-serving lawyers touting a workers' compensation Option that embraces the tort system they have so righteously pledged to protect?
  2. “Worse Than Prussian Chancellors: A State’s Authority to Opt-Out of the Quid Pro Quo” by Michael C. Duff, University of Wyoming College of Law, Jan. 9, 2016. http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2713180
Professor Duff complains primarily about “compulsory arbitration” of workplace claims. He erroneously and boldly declares, "In states both retaining the exclusive remedy rule and allowing employers to opt-out of the workers’ compensation system, employees of opt-out employers are left with no legal remedy for workplace injury."[6] This apparently refers to the Oklahoma Option, in which an employer can adopt an injury benefit plan with benefits equal to or greater than traditional system levels. Such a high benefit mandate may merit application of the “exclusive remedy rule,” which prevents an injured employee from suing the employer. However, Professor Duff overlooks the fact that ERISA provides injury claimants with extensive legal rights, including causes of action for wrongful denial of benefits, failure to produce documents, breach of fiduciary duty and discrimination (including retaliatory discharge). He also overlooks the fact that ERISA claims generally cannot be made subject to mandatory arbitration.[7] So, is this really a matter of having “no legal remedy,” or a case of Professor Duff (who is also a WILG member) trying to support the positions and business of his plaintiff attorney friends?

See Also: Strategic Implications of the Oklahoma Option

Professor Duff makes it clear in the rest of his long article that the real enemy is an employer’s ability to implement an employment dispute resolution system that includes arbitration. He omits the fact that he is bucking almost the entire U.S. judicial system, which endorses arbitration. For example, the U.S. Supreme Court clearly maintains that arbitration is right, proper and allowed.[8] By agreeing to arbitrate a statutory claim, a party does not forgo the substantive rights afforded by the statute; it merely submits resolution of the claim to “an arbitral, rather than judicial, forum.”[9] The Texas Supreme Court has been even more clear, stating that “. . . 
an agreement to arbitrate is a waiver of neither a cause of action nor the rights provided under [The Texas Labor Code]” and is not the denial of a right but rather simply “an agreement that those claims should be tried in a specific forum.”[10]

Oddly, when discussing his arguments against arbitration, Professor Duff states emphatically that it might be “acceptable if employees have knowingly signed pre-injury waivers of workers’ compensation benefits.”[11] What Professor Duff apparently does not understand is that pre-injury waivers of negligence claims were essentially outlawed in Texas long ago, and an injured employee under the Texas Option retains the right to sue the company for negligence.[12] The Texas Labor Code also specifies strict conditions under which an employee may even settle such a tort claim: only after at least 10 days have passed following the employee’s receipt of a medical evaluation from a non-emergency care doctor, only when the agreement is in writing with the “true intent of the parties as specifically stated in the document” and only when its provisions are “conspicuous and appear on the face of the agreement.”[13]

Like the WILG report, Professor Duff confuses benefit and liability exposures in Texas and Oklahoma, overlooks available legal remedies in both states and refuses to accept well-established public policy and judicial precedent favoring arbitration of employment-related claims.

Conclusion

The publication titled “Non-Subscription: The Texas Advantage”[14] states that, “Fortunately . . . legislators who drafted the first workers’ compensation laws in 1913 were farsighted enough to provide an option.” Employers that elect the Texas Option to workers’ compensation are subjected to an extra measure of liability as they cut out the middleman and decrease the taxpayer expense of the governmentally prescribed workers’ compensation system.
Those Option employers that are operating legally and responsibly should be credited with advancements such as improving worker access to better medical care, offering modified-duty job availability and oftentimes providing better wage replacement benefits. At the end of the day, perhaps WILG and Professor Duff should invest their time in learning how ERISA protects injured workers and how to litigate an ERISA dispute. These authors should also further consider the ample remedies under Texas law for employer negligence liability, an exposure that provides added incentive to maintain a safe workplace. No doubt, pursuing such claims requires some effort. Perhaps, therefore, the real objective of these two papers is to maintain the profitability of legal work favoring injured workers while reducing the attorney effort required.

[1] https://s3.amazonaws.com/membercentralcdn/sitedocuments/wp/wp/0245/745245.pdf?AWSAccessKeyId=0D2JQDSRJ497X9B2QRR2&Expires=1458161961&Signature=7rYh45nzAcLAZ%2BYqPx1HilrodQ4%3D&response-content-disposition=inline%3B%20filename%3D%22WILG%20Grand%20Bargain%20Report%201%2D16%2Epdf%22%3B%20filename%2A%3DUTF%2D8%27%27WILG%2520Grand%2520Bargain%2520Report%25201%252D16%252Epdf
[2] www.wilg.org
[3] St. Mary’s Law Journal 2000, “Texas Workers’ Compensation: A Ten-Year Survey – Strengths, Weaknesses and Recommendations,” Phil Hardberger, Chief Justice, Court of Appeals, Fourth District of Texas, San Antonio, pages 3, 41, citing research of an oversight counsel on workers’ comp, an examination of strengths and weaknesses of the Texas Workers’ Compensation System (August 1998).
[4] Tex. Lab. Code § 406.033 and Carlson’s Texas Employment Laws Annotated, 2015 Edition.
[5] 29 USC Chapter 18, Subtitle B, Parts 1, 4 and 5.
[6] Page 3 of Professor Duff’s treatise.
[7] 29 C.F.R. 2560.503-1(c)(4); see also Professor Duff’s footnote 26, stating that “in Texas and Oklahoma, employers are able to combine opt-out with arbitration.”
[8] Scherk v. Alberto-Culver Company, 417 U.S. 506, 519, 94 S. Ct. 2449, 41 L. Ed. 2d 270 (1974) (holding that arbitration clauses are, “in effect, a specialized kind of forum-selection clause”).
[9] Id. at 631, quoting Mitsubishi Motors Corp. v. Soler Chrysler-Plymouth, Inc., 473 U.S. 614, 628, 105 S. Ct. 3346, 87 L. Ed. 2d 444 (1985).
[10] In re Golden Peanut Company, LLC, 298 S.W.3d 629 (Tex. 2009).
[11] Professor Duff at page 2.
[12] Tex. Lab. Code § 406.033(e) and (f) (a legal cause of action by an employee who claims to be injured on the job “may not be waived by an employee before the employee’s injury or death”); Tex. Lab. Code § 406.033(a).
[13] Tex. Lab. Code § 406.033(f) and (g).
[14] Published by the Texas Association of Business.

Gary Thornton


Gary Thornton is a partner at Jackson Walker, focusing on non-subscriber tort litigation and employment law defense. He wrote the foundation article for the Texas Bar Journal on non-subscriber litigation. He has represented companies all over Texas in both non-subscriber litigation and all areas of employment law.