
How Bureaucracy Drives WC Costs

Workers’ compensation is one of the most highly regulated lines of insurance. Every form filed and every payment transaction is an opportunity for a penalty. Claims can stay open for 30 years or longer, leading to thousands of transactions on a single claim. Each state presents different sets of compliance rules for payers to follow. This bureaucracy is adding significant cost to the workers’ compensation system, but is it improving the delivery of benefits to injured workers?

Lack of Uniformity

Workers’ compensation is regulated at the state level, which means every state has its own set of laws and rules governing the delivery of indemnity and medical benefits to injured workers. This state-by-state variation also exists in the behind-the-scenes reporting of data. Most states now require some level of electronic data interchange (EDI) from payers (carriers or self-insured employers). There is no common template across the states, so carriers must set up separate data feeds for each state. The task is made even more complex by the multiple sources from which payers must gather data for their EDI reporting, including employers and bill review and utilization review vendors. The data from all of these sources must be combined into a single feed for each state, and if a state changes its reporting fields, every vendor in the chain must change its feed as well.
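To make the integration burden concrete, here is a minimal sketch (in Python) of what a per-state feed builder might look like. The state field mappings, record fields and vendor sources are hypothetical illustrations, not any state’s actual layout; real feeds follow the IAIABC EDI standards as adopted, and modified, by each jurisdiction.

```python
# Minimal sketch: combining records from multiple vendor sources into a
# state-specific feed. All field names and mappings are hypothetical;
# real EDI feeds follow state-adopted IAIABC release standards.

# Hypothetical per-state layouts: each state wants different field names,
# and a change by one state ripples through every vendor in the chain.
STATE_FIELD_MAPS = {
    "CA": {"claim_id": "CLM_NBR", "paid": "PAID_AMT", "procedure": "PROC_CD"},
    "TX": {"claim_id": "CLAIM_NO", "paid": "AMT_PAID", "procedure": "CPT_CODE"},
}

def merge_vendor_records(employer, bill_review, utilization_review):
    """Combine records from the employer, bill review and utilization
    review sources into one record per claim, keyed on claim_id."""
    merged = {}
    for source in (employer, bill_review, utilization_review):
        for record in source:
            merged.setdefault(record["claim_id"], {}).update(record)
    return list(merged.values())

def build_state_feed(state, records):
    """Rename fields to one state's required layout; a separate feed must
    be produced for every state, each with its own mapping."""
    mapping = STATE_FIELD_MAPS[state]
    return [
        {mapping[k]: v for k, v in record.items() if k in mapping}
        for record in records
    ]

if __name__ == "__main__":
    employer = [{"claim_id": "C-100", "paid": 0.0}]
    bill_review = [{"claim_id": "C-100", "paid": 1250.75, "procedure": "99213"}]
    utilization_review = [{"claim_id": "C-100"}]
    combined = merge_vendor_records(employer, bill_review, utilization_review)
    print(build_state_feed("CA", combined))  # CA layout
    print(build_state_feed("TX", combined))  # TX layout
```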

Variation also exists in the forms that must be filed and notices that must be posted in the workplaces. This means that payers must constantly monitor and update the various state requirements to ensure they stay in full compliance with the regulations.

Unnecessary Burden

Much of the workers’ compensation compliance efforts focus on the collection of data, which is ultimately transmitted to the states. The states want this information to monitor the system and ensure it is operating correctly, but is all this data necessary? Some states provide significant analytical reports on their workers’ compensation systems, but many do little with the data that they collect. In a world concerned about cyber risk, collecting and transmitting claims data creates a significant risk of a breach. If the data is not being used by the states, the risk associated with collecting and transmitting it seems unnecessary.

Another complication is that there are multiple regulators involved in the system for oversight in each jurisdiction. Too often, this means payers have to provide the same information to multiple parties because information sent to the state Department of Insurance is not shared with the state Division of Workers’ Compensation and vice versa.

Some regulations are also outdated given current technology. Certain states require physical claims files to be handled within that state, but with many payers now going paperless, there are no physical claims files to provide. Other states require checks to be issued from a bank within the state; electronic banking makes this requirement obsolete.

How Is This Driving Costs?

All payers devote a significant amount of staffing and other resources to compliance efforts. From designing systems to gathering and entering data, this is a very labor-intensive process. No studies have quantified the actual cost of these compliance efforts to the system, but it easily runs to millions of dollars each year.

States also impose penalties for a variety of infractions, including late filing of forms and late or improper payment of benefits. The EDI process makes it possible to automate these penalties, but automation raises the question of what purpose the penalties serve. They are issued on a strict liability basis: either the form was filed in a timely manner or it was not. A payer could be 99% compliant on one million records, yet it would be automatically penalized for the 1% of records that were incorrect. In this scenario, are the penalties encouraging compliance, or are they simply a source of revenue for the state? A fairer system would acknowledge where compliance efforts are being made: rather than penalize every payer for every error, reserve penalties for those that fall below certain compliance thresholds (say, 80% or 90% compliance).
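To put numbers on the difference, here is a minimal sketch comparing the two penalty models using the example above. The per-record fine and the 90% threshold are hypothetical values, not any state’s actual penalty schedule.

```python
# Minimal sketch contrasting strict liability penalties with a
# threshold-based alternative. The fine amount and threshold are
# illustrative assumptions, not any state's actual schedule.

PENALTY_PER_RECORD = 100.00  # hypothetical fine per late/incorrect filing

def strict_liability_penalty(total_records, error_rate):
    """Every noncompliant record is fined, regardless of overall effort."""
    return total_records * error_rate * PENALTY_PER_RECORD

def threshold_penalty(total_records, error_rate, threshold=0.90):
    """Fines apply only when overall compliance falls below the threshold."""
    compliance = 1.0 - error_rate
    if compliance >= threshold:
        return 0.0
    return total_records * error_rate * PENALTY_PER_RECORD

# The payer from the example above: one million records, 99% compliant.
# Strict liability fines all 10,000 erroneous records; a 90% threshold
# recognizes the compliance effort and fines nothing.
print(strict_liability_penalty(1_000_000, 0.01))  # 1000000.0
print(threshold_penalty(1_000_000, 0.01))         # 0.0
```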

The laws themselves can be vague and open to interpretation, which leads to unnecessary litigation expenses. Terms such as “reasonable” and “usual and customary” are intentionally vague, and often states will not provide further definition of these terms.

How Can We Improve?

One of the goals of workers’ compensation regulation is to ensure that injured workers are paid benefits in a timely manner at the correct rate and that they have access to appropriate medical treatment. There was a time when payers had offices located in most states, with adjusters handling only that state. Now, with most payers utilizing multi-state adjusters, payers must constantly train and educate their adjusters to ensure they understand all of the nuances of the different states they handle.

The ability to give input to regulators is also invaluable, and payers should seek opportunities to engage with organizations to create positive change. Groups such as the International Association of Industrial Accident Boards and Commissions (IAIABC) and the Southern Association of Workers’ Compensation Administrators (SAWCA) give workers’ compensation stakeholders the opportunity to interact with regulators on important issues and to seek uniformity where it makes sense (EDI, for example).

There needs to be better transparency and communication between all parties in the rule-making process so that regulators have a better understanding of the impact these rules have on payers and the effort required to achieve compliance.

Developing standards in technology would be helpful for both the payers and the states. If your systems cannot effectively communicate with the other systems, you cannot be efficient. Upgrading technology across the industry, particularly on the regulatory side, has to become a priority.

Finally, we need to give any statutory reforms time to make an impact before changing them again, because constant change adds to confusion and drives costs. In the last 10 years, more than 9,000 bills related to workers’ compensation have been introduced in various jurisdictions. Of those, about 1,000 have actually been enacted into law. People expect these reforms to produce the desired results immediately, when in reality they often take time to reach their full impact.

These issues were discussed in depth during an “Out Front Ideas With Kimberly and Mark” webinar on Feb. 9, 2016. View the archived webinar at http://www.outfrontideas.com/archives/.

Should You Offshore Your Analytics?

There has been much discussion on LinkedIn and Twitter in recent years about the shortfall in analytical talent, especially in the U.S. and UK.

Several years ago, I had the learning experience of attempting to offshore part of my analytics function to India (Bangalore, to be precise). It was all very exciting at first, traveling out there and working with the team as they spent some time in the UK. On paper, offshoring looked like a good way to address the peaks and troughs of demand for analysis and modeling.

The offshoring pitch successfully communicated the ease of accessing highly trained Indian graduates at a fraction of UK wages. However, as with all software demos, the experience after purchase was a little different.

I always expected the model to take a while to bed down, and any new analysts need time to get up to speed with our ways of working. However, after a few months, the cracks began to show. Analysts in India failed to understand what was required unless individual pieces of work were managed like mini-projects, with requirements specified in great detail. There also appeared to be little ability to improvise or deal with “dirty data,” so significant data preparation was still required from my UK team (who were beginning to question the benefit).

Once propensity modeling was attempted, a few months later, it became even more apparent that a lack of domain knowledge and rote learning from textbooks caused problems in the real world. Several remedies were tried: further visits to the UK and upgrading the team to even more qualified analysts (we started with graduates but ended up working solely with those who held master’s degrees and, in some cases, PhDs). Even after this, none of my Bangalore team could “think on their feet” like even my less qualified analysts in the UK, and there were still no signs that domain knowledge (about insurance, our business, our customer types, previous insight learned, etc.) was being retained.

After 18 months, with a heavy heart (as all those I had worked with in Bangalore sincerely wanted to do the best job they could), I ended this pilot and instead recruited a few more analysts in the UK. Several factors drove the final decision, including:

  1. Erosion of labor arbitrage (the most highly skilled cost more);
  2. Inefficiency (i.e. need for prep and guidance) affecting the UK team;
  3. Cost and effort to comply with data security requirements.

Since that time, a few customer insight leaders have suggested that it is worth trying again (nowadays with China, Eastern Europe or South Africa), but I am not convinced. On reflection, my biggest concerns are the importance of analysts understanding their domain (business/customers) and doing their own data preparation (as so much is learned during the exploratory data analysis phase). The “project-ization” of analysis requests does not suit this craft.

So, for me, the answer is no. Do you have any experience of trying this?

Solvency 2: An Outcome Very Different Than Planned

The original intention of the EU's Solvency 2, the regulatory requirement for capital held by insurers, was to create a framework that inspired policyholder confidence and restored trust. The real outcome was to force insurers to undertake massive programs of data management at costs that, for some Tier 1 insurers, have exceeded $200 million. Some insurers said they would pass the cost on to their customers, which I’m sure wasn’t the intention.

Arguably worse, the cost became so great that other useful programs were put on hold because of this burning regulatory platform. The knock-on effect has been delay, especially in customer-facing activities (which would have had a far better impact on improving confidence and trust).

Some international insurers suggested that the requirements might prevent them from trading in Europe – creating a “Fortress Europe” – but Solvency 2 seems to be emerging in multiple guises around the globe, in China, Latin America, South Africa and of course the U.S. in the form of RMORSA.

There’s a lot written on this topic (see, for example, http://www.solvencyiiwire.com/), and I won’t bore you. But as I looked out at the faces at a major conference in the U.S. where I spoke recently, I recognized the look I saw in many insurers in Europe in 2008: that of not really knowing what was going to hit them.

Insurers were to discover that more than 80% of both cost and implementation time was absorbed by data management, 15% by analysis and the small balance by risk reporting. Yet the reporting element proved to be the only visible part – reminding me of an iceberg, with reporting being the part of the ‘berg visible above the waterline.

Comparing risk and regulation to an iceberg is interesting, and as I looked around the room at the conference, I wondered how many attendees were ready for what would be, for them, a long and difficult passage. But not, I hope, a Titanic one…