
Coverage Risks From the ‘Internet of Things’

The “Internet of Things” is here.

According to Cisco, sometime during 2008, the number of things connected to the Internet exceeded the number of people. Cows, corn, cars, fish, medical devices, appliances, power meters — practically any item imaginable has been or can be connected. Eventually, we will be able to, for instance, “sync” an entire home so that its heating system is programmed to adjust to weather patterns and inhabitants’ activities.

Businesses ranging from small start-ups to long-standing conglomerates are now embedding adaptive “smart” technologies into even mundane products, including window shades, light bulbs and door locks.

While Internet of Things (IOT) devices create obvious value, they also expand risk. In effect, we are creating an “infrastructure for surveillance” that constantly generates critical, sometimes exceptionally private, data transmitted for use on servers perhaps thousands of miles away.

If an IOT device malfunctions, or if data or software is compromised or lost, individuals and  businesses may suffer devastating losses. Dosages of critical medication might be missed, for instance, or needed medical treatments omitted. In fact, the risks posed by IOT have already attracted the attention of regulatory authorities. The U.S. Food and Drug Administration has surveyed the industry and decided to update its guidance on cybersecurity for IOT medical devices, and the Federal Trade Commission has held a symposium addressing IOT issues.

As use of these products continues to expand, such risks will be realized, and manufacturers will look to their insurers for defense and indemnity protection. Coverage for product liability is typically provided under liability policies, which can be written on an occurrence or claims-made basis. Liability of the manufacturer of a malfunctioning fire alarm that fails to alert homeowners of a fire should be covered under such policies, as should bodily injuries or property damage caused by other defective products, including products that are part of the IOT. Injuries from such products may result not only from a device’s failure to work but also from a network’s failure to provide communications as needed. These failures, as well as the more traditional product failures, should continue to be covered if insurance is to continue to serve its function and transfer financial risk.

Liability policies generally define the product risk to include all bodily injury and property damage occurring away from premises you own or rent and arising out of your product or your work except:

  1. products that are still in your physical possession; or
  2. work that has not yet been completed or abandoned.

The policies define “your products” to be any property (other than real property) manufactured, sold, handled, distributed or disposed of by the insured and to include warranties or representations made at any time with respect to the fitness, quality, durability, performance or use of your product; and the providing of or failure to provide warnings or instructions.

Liabilities for malfunctions of IOT products appear to fit squarely within this definition. There are, however, some complications that insurers might put forward were they interested in denying coverage, and policyholders will need to examine their insurance to avoid the uncertainty and cost of litigation.

Coverage for IOT risk is complicated by the fact that the devices add value and efficiency by communicating with each other and distant servers on which data is stored and algorithms run. Indeed, this interoperability is the critical and promoted feature of IOT products. To see how this can complicate the coverage question, let us take a concrete example.

Let us imagine a refrigerator — the eFridge — that communicates data concerning the products it holds. When combined with complementary devices — called eShelves — it is able to keep track of all food in the kitchen. The refrigerator also keeps track of its own state, including its internal temperature, and transmits that state data, along with the inventory of stocked food, to a server maintained by smartKitchens at a distant location. On this server, the data is stored and analyzed by an algorithm designed by smartKitchens’ software engineers. The algorithm, based upon eFridge state data and data on stocked food, generates recommended recipes for the week so that all food is used before it spoils. The recommendations sent from the server to the eFridge appear on a screen on the refrigerator’s front door.
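To make the data flow concrete, here is a minimal sketch, in Python, of the kind of state message such an appliance might assemble and transmit. The eFridge and smartKitchens are this article’s own hypothetical, so every name, field and address below is an illustrative assumption rather than a real product API.

```python
# Hypothetical sketch only: the eFridge, its fields and the telemetry address
# are illustrative assumptions, not an actual vendor API.
import json
import socket
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class FoodItem:
    name: str
    quantity: str
    stocked_on: str   # ISO date the item went into the fridge
    expires_on: str   # ISO date after which the item is assumed spoiled


@dataclass
class FridgeState:
    device_id: str
    reported_at: str       # ISO timestamp of this reading
    temperature_c: float   # internal temperature at the time of the reading
    contents: list = field(default_factory=list)   # list of FoodItem


def report_state(state: FridgeState, host: str, port: int) -> None:
    """Serialize the state to JSON and send it as a single UDP datagram."""
    payload = json.dumps(asdict(state)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        # Fire-and-forget: UDP gives no acknowledgment that the server got it.
        sock.sendto(payload, (host, port))


if __name__ == "__main__":
    state = FridgeState(
        device_id="efridge-001",
        reported_at=datetime.now(timezone.utc).isoformat(),
        temperature_c=3.5,
        contents=[FoodItem("mussels", "1 lb", "2014-06-02", "2014-06-05")],
    )
    report_state(state, host="127.0.0.1", port=9999)   # placeholder address
```

Because the send is a bare UDP datagram, nothing in this path tells the appliance whether a reading ever arrived, which is exactly the property the next paragraph turns on.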

The two principal Internet transport protocols are TCP and UDP. The latter is often used when broadcasting within a network is needed (as it is here, so that the eShelves can be configured) and can be cheaper to implement, but it is also less reliable, because communicating devices receive no notice when UDP datagrams — the electronic containers of transmitted data — are lost or dropped. The eFridge is designed to use UDP, and the software engineers have developed their algorithm to deal with the problem of dropped datagrams as follows: Rather than generating a warning that there is incomplete information, the algorithm assumes that the refrigerator’s state is consistent with the average state maintained over the prior two weeks. This is done to avoid repeated “error” messages on the eFridge’s door screen and to increase customer satisfaction.
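A minimal sketch of that server-side fallback, again with hypothetical names and data shapes, makes the design choice visible: a day with no datagrams is silently replaced by an optimistic average rather than a warning.

```python
# Hypothetical sketch of the fallback described above; the function name and
# data shapes are assumptions made for illustration.
from statistics import mean
from typing import Optional


def effective_temperature(
    todays_reading_c: Optional[float],
    recent_readings_c: list,
) -> float:
    """Return the temperature the recipe algorithm will reason about.

    todays_reading_c is None when the day's datagrams were dropped in transit;
    recent_readings_c holds readings received over the prior two weeks.
    """
    if todays_reading_c is not None:
        return todays_reading_c
    # No data received: assume the fridge held its recent average temperature.
    # This keeps "error" messages off the door screen, but it also hides any
    # real excursion, such as the spoiled-mussels scenario discussed below.
    return mean(recent_readings_c)


# Example: Monday's datagrams were lost, so the algorithm "sees" roughly 3.6 C
# even if the fridge actually warmed past room temperature that day.
monday_estimate = effective_temperature(None, recent_readings_c=[3.4, 3.5, 3.8, 3.6])
```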

Now imagine that one week the server fails to receive datagrams regarding the state of the refrigerator on Monday, a day on which, for some unknown reason, the temperature inside the refrigerator exceeded room temperature. Unfortunately, as of Monday, the refrigerator contained a pound of mussels, which, as a result of the temperature excursion, have spoiled. Data concerning this temperature increase were not received by the server, and therefore the algorithm, having been designed to assume that the temperature was maintained at its average, recommends mussels provençale for Wednesday. The consumer sustains a very serious case of food poisoning and seeks compensation from smartKitchens, which demands coverage from its insurer. Is smartKitchens covered?

The event appears to be squarely within the sort of product liability coverage that product manufacturers and distributors expect. There is a product away from the insured’s premises that made a “defective” recommendation and caused bodily injury. As such, there should be coverage.

But an aggressive insurer could construct an argument to the contrary. It might contend that the injury was caused by the algorithm, not the refrigerator, and that the algorithm constitutes “work that has not been completed or abandoned,” because the engineers retain the ability to change the algorithm to address the possibility of spoiled mussels, and that the risk therefore falls outside the product’s coverage.

Such an argument should ultimately fail. The fact that smartKitchens’ software engineers can update the algorithm does not mean that they have “not completed or abandoned” it for purposes of the insurance policy. Moreover, liability policies generally provide that “work which requires further … correction … because of defect or deficiency, but which is otherwise complete, shall be deemed completed.” Here, smartKitchens let the algorithm run as designed, and it performed exactly as designed. Nonetheless, although the insured should eventually obtain the benefit of coverage, that may well come only after protracted and expensive litigation, reducing the value of the insurance purchased.

There is another argument as well that the insurer might make. Since about 2003, liability policies have generally included an exclusion — exclusion p, on the Insurance Services Office Inc. form — barring coverage for damages arising out of the loss of, loss of use of, damage to, corruption of, inability to access or inability to manipulate electronic data.

As used in this exclusion, electronic data means information, facts or programs stored as or on, created or used on, or transmitted to or from computer software, including systems and applications software, hard or floppy disks, CD-ROM[s], tapes, drives, cells, data processing devices or any other media which are used with electronically controlled equipment.

An insurer might contend that the problem was created not by the eFridge but by the loss of electronic data when the datagrams were dropped, and on that basis argue that coverage is barred.

Again, the insured should prevail were the insurer to make such an argument. The algorithm functioned as it was designed. It did not fail to process data, but processed data exactly as intended. It was merely responding as designed to an unfortunate consequence of the decision to implement the UDP protocol.

But here, too, the insured is likely to find itself in an expensive coverage dispute, depriving the insured of the value of the insurance purchased.

As always, new technologies create new risks, and new risks create the possibility of coverage disputes. These disputes should be resolved in the insured’s favor, as it is the responsibility of an insurer to draft policy language to clearly and unequivocally exclude risks. This rule has special force where, as in our example, there is an expectation that liability for products would be covered. It should, in other words, be the responsibility of underwriters to understand the products they insure and clearly state if they do not desire to cover an attendant risk. Nonetheless, as the use of IOT devices continues and expands, the past has taught that we can expect to see risks expand and insurers attempt to restrict coverage.

New Data Strategies for Workers’ Comp

Workers’ compensation is widely recognized as one of the most challenging lines of business, having suffered years of poor results. Insurance companies are under increasing pressure to achieve profitability by focusing on their operations, such as underwriting and claims.

Insurers can no longer count on cycles, where a soft market follows a hard one. The traditional length of a hard or soft market is evolving in a global economy where capital moves faster than ever and competitors are using increasingly sophisticated growth, segmentation and pricing strategies.

Insurance executives also cite regulatory and legislative pressures, such as healthcare and tax reform, as inhibitors of growth. Furthermore, medical costs continue to rise, making it particularly difficult to price for risk exposure. The long tail of a workers’ compensation claim means that the cost to treat someone continues to increase as time elapses and becomes a compounding problem.

Despite recent improvements in combined ratios, there are still many challenges within workers’ compensation that have to be reconciled. The savviest insurers are evaluating the availability of technologies, advanced data and analytics to more accurately price risk – and ultimately ensure profitability.

The ‘Unknown’ in Workers’ Compensation

Information asymmetry has made it difficult for insurers to accurately determine who is a high-risk customer and who is low-risk. At the point of new business, a workers’ compensation insurer is likely to have the least amount of information about those it is insuring, and it’s easy to understand why. The insured knows exactly who is on the payroll and what types of duties employees have. Some of that information is relayed to an agent, and then finally to the carrier, but, as in any game of telephone, the final message becomes distorted from the original.

This imbalance of information is one reason why fraud is rampant, and why insurers ultimately pay the price. Payroll misclassification – or “premium fraud” – occurs when businesses pay salaries off the books, misrepresent the type of work an employee does or purposely misclassify employees as independent contractors. Some misclassifications are not nefarious. But whatever the cause, they create significant revenue and expense challenges for carriers that rely on self-reporting.

The ‘Silent’ Killer

Without the right insight or analytical tools, insurance companies have a hard time differentiating among their policyholders and making consistent and fair decisions about how much premium to charge on each policy. When an insurer begins to use predictive analytics, competitors that are still catching up run the risk of falling victim to adverse selection. When we hear executives say things like, “The competition has crazy pricing,” it raises a red flag. We immediately begin looking for warning signs of adverse selection, such as losing profitable business and an increasing loss ratio.

The problem is that it takes time to recognize that a more sophisticated competitor is stealing your good business by lowering prices while also sending you the worst-performing business. By the time you recognize adverse selection is occurring, you’re falling behind and have to respond quickly.

The Power of Actionable Data

Fortunately, there are technologies available for insurers of all sizes to make more informed, evidence-based decisions. But when it comes to data, there is still some confusion: Is more data always better? And how can carriers turn data into actionable results?

It’s not always about the volume of data that an insurer has; it’s about the business value you can derive from it.

If an insurer is just beginning to store, govern and structure its data, it is likely not receiving actionable insights from historic data assets. Accessing a more holistic data set with multiple variables (from states/geography, premium size, hazard groups, class codes, etc.) through a third party or partner can help to avoid selection bias, while encouraging rigorous testing and cataloging of data variables. Having access to a variety of information is key when it comes to making data-driven decisions.

The conundrum insurers face in delivering actionable intelligence to underwriters is that they know only the business they write. They know very little about business they quote and nothing about business they don’t even see. What complicates this picture is that an insurer’s data is skewed by its specific risk appetite and growth strategies. It’s up to the insurer to fill in the blind spots in its own data set to ensure accurate pricing and risk assessment. As we know, what an insurance company doesn’t know can hurt it.

As insurers increasingly turn to advanced data and analytics, the next question that keeps insurers up at night is, “When everything looks good, how do I know what isn’t really good?” One way that Valen Analytics is helping insurers answer that question is by providing companies with a “Risk Score,” a standard measure of risk quality. By tapping into Valen’s contributory database, workers’ compensation underwriters can have better insight into all the policies they write – even historically loss-free policies. In fact, the Risk Score accurately identifies a 30% loss ratio difference between the best- and worst-performing loss-free policies. This is one example of how the power of data can push the industry forward.
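Valen’s Risk Score methodology is proprietary, so the following Python sketch is only a generic illustration of the kind of comparison behind a figure like that 30% spread: bucket historically loss-free policies by a score and compare the subsequent loss ratio of the best and worst bands. All field names, thresholds and numbers are assumptions.

```python
# Generic illustration only; not Valen's methodology. Each policy record is
# assumed to carry a risk score (lower is better), earned premium and the
# losses incurred after the score was assigned.
from collections import defaultdict


def loss_ratio_by_band(policies, score_cutoff=50):
    """Split policies into 'best' and 'worst' score bands and return each
    band's loss ratio (incurred losses divided by earned premium)."""
    premium = defaultdict(float)
    losses = defaultdict(float)
    for p in policies:
        band = "best" if p["risk_score"] <= score_cutoff else "worst"
        premium[band] += p["earned_premium"]
        losses[band] += p["incurred_losses"]
    return {band: losses[band] / premium[band] for band in premium}


# Example with made-up numbers: both policies were loss-free at inception,
# but their subsequent experience diverges by score band.
example = loss_ratio_by_band([
    {"risk_score": 32, "earned_premium": 100_000, "incurred_losses": 48_000},
    {"risk_score": 81, "earned_premium": 100_000, "incurred_losses": 72_000},
])
# example -> {'best': 0.48, 'worst': 0.72}
```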

Despite its many challenges, the workers’ compensation industry is becoming more analytically driven and improving its profitability. A comprehensive data strategy drives pricing accuracy and business growth while allowing insurers to achieve efficiencies in underwriting decision-making. By keeping up with technological advances, insurers can use data and analytics to grow into new markets and areas of business, while also protecting their profitable market share.

While NCCI’s annual “State of the Line” report labeled workers’ compensation as “balanced” this year, we may soon see the integration of data and analytics push the industry to be recognized as “innovative.”

What the Next-Gen Insurer Will Look Like

Innovation is a crucial strategic mandate that is defining a new era of winners and losers. From retail to entertainment and everything in between, decades of business traditions and assumptions are toppling because of change – change that runs the gamut from customer behaviors and expectations to the use of new technologies. This level of change and disruption is unprecedented in the history of the insurance industry. And the pace just doesn't slow down: new technologies, the mash-up of technologies, new uses for these technologies, new competition, new customer behaviors, needs and expectations. These changes are demanding a new and responsive insurance industry.

At the same time, the impact of influencers is escalating — from both inside and outside the industry — and the explosion of data, the lifeblood of insurance, is creating new challenges as well as opportunities. This blitz is challenging and disrupting sacred business and operational models and assumptions, requiring new thinking, experimentation, the adoption of new technologies and yes … innovation. Many insurers, large and small, are grappling with getting their heads around how the business of insurance will change in the next three to five years.

While looking to the future has long been a part of our very culture, our ability to envision the future for insurance companies is often stymied by the priorities and challenges of today. However, if we want a future, we must rethink how we embrace innovation as the core of the Next-Gen Insurer.

A Next-Gen Insurer must reimagine the core components of insurance – the business models, products and services, infrastructures and customers. All need to be underpinned by a culture that embraces collaboration, transformation and innovation. Forward-thinking insurers are defining what they will look like three, five and 10 years from now, planning how they will respond to influencers within and outside the industry, the path they will take to get there and the relationships that will fuel the journey.

Many insurers are on the journey, but they are going at different speeds and focusing on the different priorities that will uniquely differentiate and position them as market leaders. Some are reimagining the fundamentals of insurance, while others are retooling products, services, distribution and processes. Regardless of the approach, becoming a Next-Gen Insurer is a long-term, enterprisewide endeavor. It’s important to think big even though actions may start small.

So how to begin?

First, recognize that the innovation journey has started, with or without you. The longer you wait, the more difficult it becomes, and the more likely it is to be detrimental to your long-term business. Insurers must define their unique vision for how they will evolve into a Next-Gen Insurer by examining the fundamentals of the insurance business and determining how new levels of agility, flexibility, creativity and competitiveness can be created. There are four critical business components that insurers must reshape in their Next-Gen Insurer model: the customer, products and services, infrastructure and business model.

At the same time, companies must identify, track, assess and define how to respond to or leverage key influencers and trends. They must prioritize them, developing scenarios and plans of action, experimenting and collaborating. This is paramount, not just for competitive advantage but for long-term survival. The coming years promise unparalleled opportunity for insurers to increase their value to their customers. Those that best capitalize on the key influencers will realize the most in rewards. In contrast, those that do not prepare for the future will find themselves falling behind, losing both competitive position and financial stability.

Equally critical is recognizing that no business, regardless of size, can go it alone and expect to lay hold of all the possibilities and reap all of the benefits. Most insurers lack the time, expertise and resources to track all of the influencers unless they engage outside industry resources. Insurers must identify partners who can mobilize an ecosystem of both internal and external relationships and resources to capture potential, change legacy cultures and enable the ideas and technologies that can be uniquely deployed within their companies to create their Next-Gen Insurer.

But most importantly, create and nurture a culture of innovation that starts at the top and is seen, heard and acted upon each and every day. Begin by identifying those within your organization who are the outside-the-box thinkers: those renegades and dreamers who can be advocates on the journey.

The innovation journey toward reinventing the business of insurance has started. Don’t delay, because what is innovative today will be expected tomorrow.

Begin your journey today — to ensure that you have a tomorrow.

A detailed report on the Next-Gen Insurer is available. To learn more about where the leaders in the industry are in their innovation journey, consider attending the 2014 SMA Summit in Boston on Sept. 15, 2014.