
Why Is Data on U.S. Property So Poor?

How a building is constructed and maintained and where it is located all have a massive impact on its potential to be damaged or destroyed. That knowledge is as old as insurance itself.

So why do so many underwriters still suffer from lack of decent data about the buildings they insure?

And when better data does get collected for U.S. properties, why does it seem to get lost as it crosses the Atlantic?

London is an important marketplace for insuring U.S. risks. It provides over 10% of the capacity for specialty risks — those that are hard, or impossible, to place in their home market through admitted carriers. Reinsurers of admitted carriers, insurers of homeowners and small businesses in the excess and surplus markets and facultative reinsurers of large corporate risks all need property data.

The emergence and growth of a new type of property insurer in the U.S., such as Hippo and Swyfft, has been driven by an expectation of having access to excellent data. They are geared up to perform fast analyses. They believe they can make accurate assessments and offer cheaper premiums. The level of funding for ambitious startups shows that investors are prepared to write large checks, tolerate years of losses and have the patience to wait in the expectation that their companies will displace less agile incumbents. If this works, it’s not just the traditional markets in the U.S. that will be under threat. The important backstop of the London market is also vulnerable. So what can established companies do to counter these new arrivals?

Neither too hot nor too cold

The challenge for any insurer is how to get the information it needs to accurately assess a risk, without scaring off the customer by asking too many questions. The new arrivals are bypassing the costly and often inaccurate approach of asking for data directly from their insureds, and instead are tapping into new sources of data. Some do this well, others less so. We’re already seeing this across many consumer applications. They lower the sales barrier by suggesting what you need, rather than asking you what you want. Netflix knows the films you like to watch, Amazon recommends the books you should read, and soon you’ll be told the insurance you need for your home.

Health insurers such as Vitality are dramatically improving the relationship with their clients, and reducing loss costs, by rewarding people for sharing their exercise habits. Property insurers that make well-informed, granular decisions on how and what they are underwriting will grow their book of business and do so profitably. Those that do not will be undercharging for riskier business. Not a viable long-term strategy.

Fixing the missing data problem would be a good place to start.

We recently brought together 28 people from London Market insurers to talk about the challenges they have with getting decent quality data from their U.S. counterparts. We were joined by a handful of the leading companies providing data and platforms to the U.S. and U.K. markets. Before the meeting, we’d conducted a brief survey to check in on the trends. A number of themes emerged, but the two questions we kept coming back to were: 1) Why is the data that is turning up in London so poor, and 2) what can be done about it?

This is not just a problem for London. If U.S. coverholders, carriers or brokers are unable to provide quality data to London, they will increasingly find their insurance and reinsurance getting more expensive, if they can get it at all. Regulators around the world are demanding higher standards of data collection. The shift toward insurers selling direct to consumer is gathering momentum. Those that are adding frictional costs and inefficiencies will be squeezed out. This is not new. Rapid systemic changes have been happening since the start of the industrial revolution. In 1830, the first passenger rail service in the world opened between Liverpool and Manchester in the northwest of England. Within three months, over half of the 26 stagecoaches operating on that route had gone out of business.

See also: Cognitive Computing: Taming Big Data  

Is the data improving?

Seventy percent of those surveyed believed that the data they are receiving from their U.S. partners has improved little, if at all, in the last five years. Yet the availability of information on properties had improved dramatically in the preceding 15 years. Why? Because of the widespread adoption of catastrophe models in that period. Models are created from large amounts of hazard and insurance loss data. Analyses of insured properties provide actionable insights and common views of risks beyond what can be achieved with conventional actuarial techniques. These analytics have become the currency of risk, shared across the market between insurers, brokers and reinsurers. The adoption of catastrophe models accelerated after Hurricane Andrew in 1992. Regulators and rating agencies demanded better ways to measure low-frequency, high-severity events. Insurers quickly realized that the models, and the reinsurers that used the models, penalized poor-quality data by charging higher prices.

By the turn of the century, information on street address and construction type, two of the most significant determinants of a building’s vulnerability to wind and shake, was being provided for both residential and commercial properties being insured for catastrophic perils in the U.S. and Europe. With just two major model vendors, RMS and AIR Worldwide, the industry only had to deal with two formats. Exchanging data by email, FTP transfer or CD became the norm.

Then little else changed for most of the 21st century. Information about a building’s fire resistance is still limited to surveys and then only for high-value buildings, usually buried deep in paper files. Valuation data on the cost of the rebuild, another major factor in determining the potential scale of loss and what is paid to the claimant, is at the discretion of the insured. It’s often inaccurate and biased toward low values.

If data and analytics are at the heart of insurtech, why does access to data appear to have stalled in the property market?

How does the quality of data compare?

We dug a bit deeper with our group to discover what types of problems they are seeing. In some locations, such as those close to the coast, information on construction has improved in the last decade, but elsewhere things are moving more slowly.

Data formats for property are acceptable for standard, homogeneous property portfolios being reinsured because of the dominance of two catastrophe modeling companies. For non-admitted business entering the excess and surplus market, or high-value, complex locations, there are still no widely adopted standards for insured properties coming into the London market, despite the efforts of industry bodies such as ACORD.

Data is still frequently re-keyed multiple times into different systems. Spreadsheets continue to be the preferred medium of exchange, and there is no consistency between coverholders. It is often more convenient for intermediaries to aggregate and simplify what may have once been detailed data as it moves between the multiple parties involved. At other times, agents simply don’t want to share their client’s information. Street addresses become zip codes, detailed construction descriptions default to simple descriptors such as “masonry.”

Such data chaos may be about to change. The huge inefficiency of multiple parties cleaning up and formatting the same data has been recognized for years. The London Market Group (LMG), a powerful, well-supported body representing Lloyd’s and the London company market, has committed substantial funds to build a new Target Operating Model (TOM) for London. This year, the LMG commissioned London company Charles Taylor to provide a central service to standardize and centralize the cleaning up of the delegated authority data that moves across the market. Much of it is property data. Once the project is complete, around 60 Lloyd’s managing agents, 250 brokers and over 3,500 global coverholders are expected to finally have access to data in a standard format. This should eliminate the problem of multiple companies doing the same tasks to clean and re-enter data but still does nothing to fill in the gaps where critical information is missing.

Valuation data is still the problem

Information on property rebuilding cost that comes into London is considered “terrible” by 25% of those we spoke to and “poor quality” by 50%.

Todd Rissel, the CEO of e2Value, was co-hosting our event. His company is the third-largest provider of valuation data in the U.S. Today, over 400 companies are using e2Value information to help their policyholders get accurate assessments of the replacement costs after a loss. Todd started the company 20 years ago, having begun his career as a building surveyor for Chubb.

The lack of quality valuation data coming into London doesn’t surprise Todd. He’s proud of his company’s 98% success in accurately predicting rebuilding costs, but only a few states, such as California, impose standards on the valuation methods that are being used. Even where high-quality information is available, the motivation may not be there to use it. People choose their property insurance mostly on price. It’s not unknown for some insurers to recommend the lowest replacement value, not the most accurate, to reduce the premium, and the discrepancy gets worse over time.

Have the losses of 2017 changed how data is being reported?

Major catastrophes have a habit of exposing the properties where data is of poor quality or wrong. Companies insuring such properties tend to suffer disproportionately higher losses. No companies failed after the storms and wildfires of 2017, but more than one senior industry executive has felt the heat for unexpectedly high losses.

Typically, after an event, the market “hardens” (rates get more expensive), and insurers and reinsurers are able to demand higher-quality data. 2017 saw the biggest insurance losses for a decade in the U.S. from storms and wildfire — but rates haven’t moved.

Insurers and reinsurers have little influence in improving the data they receive.

Over two-thirds of people felt that their coverholders, and in some cases insurers, don’t see the need to collect the necessary data. Even if they do understand the importance and value of the data, they are often unable to enter it into their underwriting systems and pipe it digitally direct to London. Straight-through processing, the transfer of information from the agent’s desk to the underwriter in London with no manual intervention, is starting to happen, but only the largest or most enlightened coverholders are willing or able to integrate with the systems their carriers are using.

We were joined at our event by Jake Hampton, CEO of Virtual MGA. Jake has been successful in hooking up a handful of companies in London with agents in the U.S. This is creating a far stronger and faster means to define underwriting rules, share data and assess key information such as valuation data. Users of Virtual MGA are able to review the e2Value data to get a second opinion on information submitted from the agent. If there is a discrepancy between the third-party data that e2Value (or others) are providing and what the agent provides, the underwriter can either change the replacement value or accept what the agent has provided. A further benefit of the dynamic relationship between agent and underwriter is the removal of the pain of monthly reconciliation. Creating separate updated records of what has been written in the month, known as “bordereaux,” is no longer necessary; they can be generated automatically from the system.
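A review step like this can be sketched in a few lines. The function name, field values and the 10% tolerance threshold below are illustrative assumptions, not Virtual MGA's actual rules:

```python
# Sketch of an underwriter-side valuation check: compare the agent's
# submitted replacement value against a third-party estimate and flag
# submissions that fall outside an agreed tolerance for manual review.

def review_valuation(agent_value: float, third_party_value: float,
                     tolerance: float = 0.10) -> str:
    """Return 'accept' if the agent's value is within tolerance of the
    third-party estimate, otherwise 'review' so the underwriter can
    decide whether to override the replacement value."""
    if third_party_value <= 0:
        return "review"  # no usable benchmark; refer to the underwriter
    deviation = abs(agent_value - third_party_value) / third_party_value
    return "accept" if deviation <= tolerance else "review"
```

In this sketch, an agent submission of 450,000 against a third-party estimate of 500,000 deviates by exactly 10% and passes, while 300,000 against the same estimate is routed to the underwriter for a decision.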

Even though e2Value is generating very high success rates for the accuracy of its valuation data, there are times when the underwriter may want to double-check the information with the original insured. In the past, this required a lengthy back and forth discussion over email between the agent and the insured.

JMI Reports is one of the leading providers of surveys in the U.S. Tim McKendry, CEO of JMI, has partnered with e2Value to create an app that provides near-real-time answers to an underwriter’s questions. If there is a query, the homeowner can be contacted by the insurer directly and asked to photograph key details in their home to clarify construction details. This goes directly to the agent and underwriter, enabling fast, accurate assessment of rebuild value.

What about insurtech?

We’ve been hearing a lot in the last few years about how satellites and drones can improve the resolution of data that is available to insurers. But just how good is this data? If insurers in London are struggling to get data direct from their clients, can they, too, access independent sources of data directly? And does the price charged for this data reflect the value an insurer in London can get from it?

Recent entrants, such as Cape Analytics, have also attracted significant amounts of funding. They are increasing the areas of the U.S. where they provide property information derived from satellite images. EagleView has been providing photographs taken from its own aircraft for almost 20 years. CEO Rishi Daga announced earlier this year that its photographs now have 16 times the resolution of the best previously available. If you want to know which of your clients has a Weber barbeque in the backyard, EagleView can tell you.

Forbes McKenzie, from McKenzie Insurance Services, knows the London market well. He has been providing satellite data to Lloyd’s of London to assist in claims assessment for a couple of years. Forbes started his career in military intelligence. “The value of information is not just about how accurate it is, but how quickly it can get to the end user,” Forbes says.

See also: How Insurtech Helps Build Trust  

The challenges with data don’t just exist externally. For many insurance companies, the left hand of claims is often disconnected from the right hand of underwriting. Companies find it hard to reconcile the losses they have had with what they are being asked to insure. It’s the curse of inconsistent formats. Claims data lives in one system, underwriting data in another. It’s technically feasible to perform analyses to link the information through common factors such as the address of the location, but it’s rarely cost-effective or practical to do this across a whole book of business.
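Linking the two systems usually comes down to normalizing a shared key such as the street address. A minimal sketch of the idea, with record layouts assumed for illustration (real record linkage would lean on a geocoder or a dedicated matching service):

```python
import re
from collections import defaultdict

def normalize_address(addr: str) -> str:
    """Crude address key: lowercase, strip punctuation, collapse
    whitespace and shorten a few common street-type words so that
    'Main Street' and 'main st.' produce the same key."""
    addr = re.sub(r"[^\w\s]", " ", addr.lower())
    replacements = {"street": "st", "avenue": "ave", "road": "rd"}
    words = [replacements.get(w, w) for w in addr.split()]
    return " ".join(words)

def link_claims_to_policies(claims, policies):
    """Join claims records to underwriting records on the normalized
    address. Returns {policy_id: [claim_ids]}; unmatched claims are
    dropped (in practice they would go to a manual-review queue)."""
    by_addr = {normalize_address(p["address"]): p["id"] for p in policies}
    linked = defaultdict(list)
    for claim in claims:
        policy_id = by_addr.get(normalize_address(claim["address"]))
        if policy_id is not None:
            linked[policy_id].append(claim["id"])
    return dict(linked)
```

The point of the sketch is the cost driver the paragraph describes: even this toy version needs per-record cleaning rules, and scaling it across a whole book of business with inconsistent formats is where the expense lies.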

One of the barriers for underwriters in London in accessing better data is that companies that supply the data, both new and old, don’t always understand how the London market works. Most underwriters are taking small shares of large volumes of individual properties. Each location is a tiny fraction of the total exposure and an even smaller fraction of the incoming premium. Buying data at a cost per location, similar to what a U.S. domestic insurer is doing, is not economically viable.

Price must equal value

Recently, the chief digital officer of a London syndicate traveled to InsureTech Connect in Las Vegas to meet the companies offering exposure data. He is running a proof of concept (POC) against a set of standard criteria, looking for new ways to identify and price U.S. properties. He’s already seeing a wide range of approaches to charging. U.K.-based data providers, or U.S. vendors with local knowledge of how the information is being used, tend to be more accommodating to the needs of the London insurers. There is a large potential market for enhanced U.S. property data in London, but the cost needs to reflect the value.

Todd Rissel may have started his career as a surveyor and now be running a long-established company, but he is not shy about working with the emerging companies and doesn’t see them as competition. He has partnerships with data providers such as drone company Betterview to complement and enhance the e2Value data. It is by creating distribution partnerships with some of the newest MGAs and insurers, including market leaders such as Slice and technology providers like Virtual MGA, that e2Value is able to deliver its valuation data to over a third of the companies writing U.S. business.

Looking ahead

It is widely recognized that the London market needs to find ways to meaningfully reduce the cost of doing business. The multiple organizations through which insurance passes, whether brokers, third-party administrators or others, increase the friction and hence cost. Nonetheless, once the risks do find their way to the underwriters, there is a strong desire to find a way to place the business. Short decision chains and a market traditionally characterized by underwriting innovation and entrepreneurial leaders mean that London should continue to have a future as the market for specialty property insurance. It’s also a market that prefers to “buy” rather than “build.” London insurers are often among the first to try new technology. The market welcomes partnerships. The coming generation of underwriters understands the value of data and analytics.

The London market cannot, however, survive in a vacuum. Recent history has shown that those companies with a willingness to write property risks with poor data get hit by some nasty, occasionally fatal surprises after major losses. With the increasing focus by the regulator and Lloyd’s own requirements, casual approaches to risk management are no longer tolerated. Startups with large war chests from both the U.S. and Asia see an opportunity to displace London.

Despite the fears that data quality is not what it needs to be, our representatives from the London market are positive about the future. Many of them are looking for ways to create stronger links with coverholders in the U.S. Technology is recognized as the answer, and companies are willing to invest to support their partners and increase efficiency in the future. The awareness of new perils such as wildfire and the opening up of the market for flood insurance are creating opportunities.

Our recent workshop was the first of what we expect to be more regular engagements between the underwriters and the providers of property information. If you are interested in learning more about how you can get involved, whether as an underwriter, MGA, data provider, broker or other interested party, let me know.

Time for E-Signatures, Doc Management

If you want to know why insurance companies need electronic signatures and document management, you must first look at the regulatory landscape.

In the past 10 years, this climate has changed considerably, and most insurance companies are struggling to do one of two things to handle these changes: 1) make internal policies to comply with these changes without sacrificing profitability; and 2) find creative ways to outpace competitors looking for the same solutions to these problems.

Neither is an easy feat.

The National Association of Insurance Commissioners (NAIC) has even devoted a large portion of its industry report to addressing one of the myriad ways insurance companies are striving to transcend regulatory difficulties—through the efficiency of the internet.

This is a major reason why insurance companies need both electronic signatures and document management. Used separately, they are ineffective at delivering the solutions insurance companies need. Together, their interplay makes navigating regulatory changes easy, especially those administered and upheld by the Federal Insurance Office (FIO) and NAIC.

Understanding E-Commerce and Insurance Sales Problems

Most states in the U.S. require those applying for insurance services over the internet to complete an electronic signature, whether it is used as a standalone technology or integrates with document management technologies. Although the approach may seem like common sense, its advent does away with the use of a witness or notary and brings into question the legitimacy of signatures.

See also: The Most Valuable Document That Money Can Buy  

Despite digital signatures being more efficient (after all, if e-signatures had existed in 1776, all 56 U.S. delegates could’ve signed the document on the day our nation was founded; instead, it took roughly a month to collect all the signatures), they require additional authentication steps, which can be automated by document management tools.

Legitimizing Electronic Insurance Applications

ACORD, the Association for Cooperative Operations Research and Development, achieved this automation by making digital forms available on its domain. Electronic signature technology embedded in document management solutions then only needs to be applied during the final stages of the process.

Why the Need Is Paramount

Above all else, these are the features that create an effective interplay between document management technologies and electronic signatures.

Authentication Procedures

Inclusion of a knowledge-based authentication (KBA) challenge question helps authenticate the digital signing process. This ensures that the party attempting to sign a document is who he or she claims to be.

IP Address Verification

IP address verification is an extra layer that can bolster the legitimacy of a signed document if a legal dispute over its authenticity ever arises.

Form Fill Automation

There are new and exciting ways to automate the form fill process for recurring client-based and document related processes. Zonal OCR makes this possible, eliminating manual processes and reducing document workload to a bare minimum.
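The idea behind zonal OCR is that only text falling inside predefined regions of a form is captured and mapped to named fields. A self-contained sketch of that zoning step, using hand-written positioned text tokens in place of real OCR engine output (a production system would get the tokens, with coordinates, from an OCR library):

```python
# Zonal extraction sketch: each zone is a named rectangle on the page;
# OCR tokens whose positions fall inside a zone are gathered into that
# field. The Token values below stand in for real OCR output.

from typing import NamedTuple

class Token(NamedTuple):
    text: str
    x: float  # token position on the page
    y: float

def extract_zones(tokens, zones):
    """zones: {field_name: (x0, y0, x1, y1)}. Returns {field_name: text}
    with the tokens inside each rectangle joined in reading order."""
    result = {}
    for name, (x0, y0, x1, y1) in zones.items():
        hits = [t.text for t in tokens
                if x0 <= t.x <= x1 and y0 <= t.y <= y1]
        result[name] = " ".join(hits)
    return result
```

Because the zones are fixed per form template, the same layout definition can be reused for every recurring document of that type, which is what removes the manual re-keying.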

See also: E-Signatures: an Easy Tech Win  

Bar Code Authentication

Although a bar code authentication in an electronic signature should never be a standalone backup, it does add a layer of legitimacy. A bar code is a stamp of individuality that reveals its purpose and origins quite clearly.

Ensuring Data in Documents Is Unaltered

It becomes obvious that electronic signatures are more useful if applied through document management technologies, as these technologies ensure documentation is not altered.

What’s more, the role-based user permissions of a document management system can trace who changed what within a system, ensuring that those who alter data without authorization can be held accountable for their actions.
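One common way a document management system proves a signed document has not been altered is to store a cryptographic hash at signing time and recompute it on each retrieval. A minimal sketch of that check using Python's standard library (the function names are illustrative):

```python
import hashlib

def fingerprint(document: bytes) -> str:
    """SHA-256 digest of the document, stored at signing time."""
    return hashlib.sha256(document).hexdigest()

def is_unaltered(document: bytes, stored_digest: str) -> bool:
    """Recompute the digest and compare; changing even one byte of
    the document produces a different digest."""
    return fingerprint(document) == stored_digest
```

Paired with role-based permissions, this gives both a tamper check (the digest) and an audit trail (who touched the record and when).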

4 Ways to Improve Agent Experience

While tech innovation is quickly transforming how people buy insurance, most innovation is focused on direct-to-consumer (DTC). Yet the majority of P&C insurance sales still happen through an agent. Our own research indicates that while 80% of consumers do not trust insurance carriers, they do trust agents, and, for this reason, still prefer to buy insurance through an agent, rather than online.

If these numbers are any indication, insurance agents are not going away yet. Still, the process of buying insurance through an agent–for customers, carriers and agents alike–remains cumbersome, with long lapse times, large margins of error and a lot of paperwork. These problems are deciding factors in why agents and customers choose to bring their business elsewhere and can lead to millions of dollars of lost premiums for carriers and a general distrust among agents and customers. Focusing on improving the agent-driven sales process is a win-win-win–for customers, agents and carriers. Below are a few simple ways that carriers can incorporate agents into their customer experience strategy.

1. Ensure accurate quotes

Carriers must do everything they can to ensure that quotes are as accurate as possible and issued without rework. Quote inaccuracy leads to a breakdown of trust on all sides, causing both consumers and agents to lose trust in carriers, creating friction within the quoting process and extending and convoluting the sale.

By exposing risk ratings, ordering reports early and ensuring that the level of accuracy is the same between comp-raters and proprietary systems, carriers can speed the sales process and, through transparency, create trust between agents and consumers.

See also: The New Agent-Customer Relationship  

2. Align technology with agent needs, your rules and ACORD standards

Ensuring that, as a carrier, you understand the information, systems and tools agents need to service customers and that agents understand your rules can help eliminate unnecessary confusion on both sides of the sales process. Sales enablement tools and SaaS technology that are designed using contextual research on agents’ real selling processes help greatly.

Make sure your technology accurately represents your rules and that it is transparent to the agent. Moreover, ensuring that any technology you implement is aligned with ACORD standards will create consistency between your technology and other systems agents are used to, further eliminating barriers. Remember, you want to make it as easy as possible for them to use your systems so they can go about their jobs.

3. Make servicing and claims as transparent as possible

When hailing an Uber with your phone, you know exactly where it is on the map, what it looks like, who the driver is and when it is arriving, to the minute. Agents and customers want similar transparency into what is going on with applications and with claims.

Invest in the technology needed to service the customer efficiently and fairly and inform the agent of what’s happening with the customer. Agents appreciate knowing key actions like a late payment or claim so they can manage expectations with the customer.

4. Bring agents along the journey of global transformation initiatives

Digital transformation in insurance is about more than just technology. It’s about creating a better experience for everyone and developing the processes and the technology needed to support it. Digitizing business as usual is not the solution. Rather, invest in deep user research and involve all the players in an experience to develop the best possible solution for the customer, the carrier and the agent.

As you are developing new sales enablement technology, share plans and get feedback from agents on in-progress work. This will not only provide you with valuable feedback but can convert agents into your initiative’s most powerful marketing tool with their colleagues and customers.

See also: How to Enhance Customer Service  

According to a survey of 5,000 independent agents, 99.5% of insurance agents say that ease of doing business is critical in choosing which carrier gets their business. Being easy to do business with goes a long way to support agents in their relationship with customers. Agents know when your quotes are not accurate and are frustrated by manual corrections. They will do whatever they can to shield their customers from those frustrations, and sometimes that means choosing a different carrier for their customer’s business.

As carriers plan their global transformation initiatives, considering independent agents as a critical kind of user–as critical as, say, a customer–can provide an important competitive edge over other carriers, making them a carrier of choice for independent agents and the first carrier they go to when writing new business.

This article first appeared on the Cake & Arrow website. For more on improving the agent customer experience, watch this on-demand webinar.  

UBI Has Failed, but Telematics…Wow!

Insurance telematics has been out there for more than 20 years. Many insurers have tried to play with the technology, but few have succeeded in using the data available from connected telematics devices. The potential of this technology was misunderstood, and best practices have remained almost unknown, as it was not common in the insurance sector to look for innovation in other geographies, such as Italy, where progress has been made.

But the insurance sector is being overtaken by a desire to change, and it’s becoming more common to see innovation scouting taking place on an international level. In the last two years, billions of dollars have been invested in insurance startups; innovation labs and accelerators have popped up; and many insurance carriers have created internal innovation units.

On the other hand, I’m starting to hear a new wave of disillusion about the lack of traction of insurtech initiatives, the failure of some of them, or insurtech startups radically changing from their original business models.

In a world that tends toward hyperconnectivity and the infiltration of technology into all aspects of society, I’m firmly convinced all insurance players will be insurtech—meaning they all will be organizations where technology will prevail as the key enabler for the achievement of strategic goals.

See also: Telematics Has 2 Key Lessons for Insurtechs  

Starting from this premise, I’d like to focus on two main points:

  1. The ability of the insurance sector to innovate is far higher than is commonly perceived.
  2. While not all insurtech innovations will work, a few of them will change the sector.

In support of the first point, consider the trajectory of digital insurance distribution. The German Post Office first experimented with remote insurance sales at the beginning of the 1980s in Berlin and Düsseldorf using Bildschirmtext (data transmitted through the telephone network and the content displayed on a television set). Almost 60% of auto insurance coverage is now sold online in the U.K., and comparison websites are the “normal” way to purchase an auto insurance policy. In few other sectors is one able to see comparable penetration of digital distribution.

In the health insurance sector, the South African insurer Discovery demonstrates incredible innovation, as well. Over the last 20 years, the insurer has introduced new ways to improve policyholders’ lives using connected fitness devices to track healthy behaviors, generate discounts and deliver incentives for activities supporting wellness and even healthy food purchases. Discovery has been able to replicate this “Vitality” model in different geographies and different business lines and to incorporate ever more connected-device usage into the model. Vitalitydrive by Discovery rewards drivers for driving knowledge, driving course attendance and behavior on the road with as much as 50% back on fuel purchases at certain stations.

More than 12 months ago, I published my four Ps approach for selecting the most interesting initiatives within the crowded insurtech space. I believe initiatives will have a better chance to win if they can improve:

  • Productivity (generate more sales).
  • Profitability (improve loss or cost ratios).
  • Proximity (improve customer relationships through numerous customer touchpoints).
  • Persistency (account retention, renewal rate increase).

Those insurtech initiatives will make the insurance sector stronger and more able to achieve its strategic goal: to protect the way people live.

One trend able to generate a concrete impact on all four Ps is connected insurance. This is a broad set of solutions based on sensors for collecting data on the state of an insured risk and on telematics for remote transmission and management of the collected data.

In a survey of ACORD members by the North American Connected Insurance Observatory, 93% of respondents stated this trend will be relevant for the North American insurance sector. It’s easy to understand why. We live in a time of connected cars, connected homes and connected health. Today, there is more than one connected device per person in the world, and by some estimates the figure will reach seven devices per person by 2020. (Cisco Internet Business Solutions Group, “The Internet of Things: How the Next Evolution of The Internet Is Changing Everything,” April 2011, estimates seven per person; AIG/CEA, 2015, estimates five per person.) Others put the number at 50 devices for a family of four by 2022, up from 10 in 2014. The insurance sector cannot stop this trend; it can only figure out how to deal with it.

Moving to concrete insurance uses of connected devices, the common perception of usage-based insurance (UBI) is not positive at all. This is the mood after years of dongle-based customer acquisition pilots, in which the customer installs a piece of hardware in the car for a few months and the insurer proposes a discount based on an analysis of the trips. This partially connected car approach—connected only for a few months—uses the data to identify good drivers, with the aim of keeping them as clients through a competitive price offer. In 2015, around 3.3 million cars in the U.S. sent data to an insurance company in some way, representing less than 1.5% of the market.

In contrast, another market used telematics in a completely different way—and it succeeded. According to IVASS data, almost 20% of auto insurance policies sold or renewed in Italy in the last quarter of 2016 had an insurer-provided telematics device. The European Connected Insurance Observatory—the European chapter of the insurance think tank I created, comprising more than 30 European insurers, reinsurers and tech players active in the discussion through their Italian branches—estimated that 6.3 million Italian customers had a telematics policy at the end of 2016.

Some insurers in this market were able to use the telematics data to create value and share it with customers. The most successful product with the largest traction is based on three elements:

  • A hardware device provided by the insurer with auto liability coverage, self-installed by the customer on the battery under the car’s hood.
  • A 20% upfront flat discount on annual auto liability premium.
  • A suite of services that goes beyond support in the case of a crash to many other different use cases—stolen vehicle recovery, car finder, weather alerts—with a service fee around €50 charged to the customer.

This approach introduces no usage-based pricing elements, yet it clearly satisfies the most relevant customer needs:

  • Saving money on a compulsory product. Research shows that pricing is relevant in customer choice.
  • Receiving support and convenience at the moment of truth—the claims moment. Insurers are providing a better customer experience after a crash using the telematics data. Just think of how much information can be gathered directly from telematics data without having to question the client.
  • Receiving services other than insurance. That’s something roughly 60% of insurance customers look forward to and value, according to Bain’s research on net promoter scores published last year.

Let’s analyze this approach from an economic perspective:

  • The fee to the customer roughly covers the annual technology cost of the hardware and services. The €50 fee mentioned above represents less than 5% of the insurance premium for the risky clients paying an annual premium higher than €1,000—a cluster representing less than 5% of the Italian telematics market. The fee is more than 10% of the premium for the customers paying less than €400, a cluster representing more than 40% of the Italian telematics market.
  • The product is a constant, daily presence in the car, with the driver, with no possibility of turning it off. While the product ensures support in case of a crash, it is also a tremendous deterrent for anyone tempted to make a fraudulent claim, as well as for drivers engaging in risky behavior otherwise hidden from the insurer.
  • The telematics portfolio has shown on average 20% lower claims frequency on a risk-adjusted basis than the non-telematics portfolio, based on the analysis done by the Italian Association of Insurers.
  • Insurer best practices have achieved additional savings on the average cost of claims by introducing a claims management approach as soon as a crash happens and by using the objective reconstruction of the crash dynamic to support the claim handler’s decisions.
  • A suite of telematics services is delivered to the customer, along with a 25% upfront discount on the auto liability premium.
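The fee-to-premium ratios described in the first bullet follow from simple arithmetic. As a sketch (the €50 fee and the premium bands come from the text; the specific premium values passed in below are illustrative boundary cases):

```python
# Service fee charged to the customer in the Italian telematics product
# described above (per the text).
SERVICE_FEE_EUR = 50.0

def fee_share_of_premium(annual_premium_eur: float) -> float:
    """Return the service fee as a fraction of the annual liability premium."""
    return SERVICE_FEE_EUR / annual_premium_eur

# Risky cluster: premiums above EUR 1,000, so the fee is under 5% of premium.
assert fee_share_of_premium(1000.0) == 0.05      # boundary of the band
assert fee_share_of_premium(2000.0) < 0.05

# Low-premium cluster: premiums below EUR 400, so the fee exceeds 12.5%.
assert fee_share_of_premium(400.0) == 0.125      # boundary of the band
```

The point the arithmetic makes is that the same flat €50 fee weighs very differently on the two clusters, which is why the service suite matters most to the large, low-premium segment.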

So, best practices allowed carriers to maximize return on investment in telematics technology by using the same data coming from the black box to activate three different value creation levers: value-added services paid for by the customer, risk selection and loss control. The value created was shared with the customer through the upfront discount. The successful players obtained a telematics penetration larger than 20% and experienced continuous growth of their telematics portfolios.

See also: Telematics: Moving Out of the Dark Ages?  

These insurers were able to orchestrate an ecosystem of partners to deliver a “customer-centric” auto insurance value proposition, satisfying the three main needs of customers—or at least those of “good” customers. Compared with many approaches currently being experimented with in different business lines around the world, where the insurance value proposition is simply enlarged by adding some services, this insurtech approach is also leveraging the insurers’ unique competitive advantage—the insurance technical P&L—to create a virtuous value-sharing mechanism based on the telematics data.

The story of the Italian auto telematics market shows how insurtech adoption will make the insurance sector stronger and better able to achieve its strategic goals: to protect the ways in which people live and organizations work.

This article originally appeared on Carrier Management.

We’re Being Luddites About Verification

There are several seminal moments when I first experienced something that forever changed the trajectory of my life:

–While senior director of research and development at ACORD in 1992, I worked with New Science, a research firm. The insurance industry was heavily investing in the development of AL3 batch data standards via point-to-point dial-up connections. New Science, however, was looking way down the path toward global, online, real-time transactions through a single network connection. Working with New Science was the first time I heard the word “internet.”

–I distinctly remember walking through the Indianapolis airport and seeing someone holding a "brick" next to his head. It was the first time I saw a mobile phone in operation, a Motorola DynaTAC 8000X. Priced at $3,995 and weighing in at 28 ounces, the phone took roughly 10 hours to charge fully and offered only about 30 minutes of talk time on a highly limited analog network. Some of us remember running off airplanes to banks of payphones to check voice mail and to make calls between connections. Now it's almost impossible even to find a pay phone.

–My first date with Mary Ann Hildebrand was Oct. 9, 1971. Game 1 of the 1971 World Series featured the Pittsburgh Pirates against our hometown favorites, the Baltimore Orioles. A week later, I held her hand and kissed her for the first time. And, after 41 years of marriage, the rest is history, as they say.

I also clearly remember listening to and meeting Thornton May in 1992 after his scathing commentary, "Luddism Looms Large," appeared in ComputerWorld. It was the first time I heard the term "Luddite." The term goes back to the followers of Ned Ludd, the British antitechnology figure who protested the replacement of human labor and skill with machines. In the early 19th century, Ludd's name energized a movement throughout the textile industry, as his followers protested by destroying machines and property. Luddism today is a more general term for opposition to technological change.

See also: Key to Digitizing Customer Experience  

When it comes to online verification, the insurance industry is filled with Luddites, compared with other industries. Every time an insurance policy or business relationship changes anywhere in the world, verification of insurance and compliance checking is required. This should happen digitally, right? In this day and age…. Instead, verification is delivered via a form, whether paper, fax or PDF.

All have the same problem: The information in them is as of a point in time. The information is locked, and the receiver can’t do anything with it.

Compare that with these industries:

  • To verify stock price information, you don't have someone send you a form saying what it was last week or last month. You don't even have to log onto individual company websites or go to the NYSE or NASDAQ. You just search for the company, and you see today's price as it dynamically changes, along with historical pricing and a raft of other information.
  • To verify the status of a flight, you don't have to log onto individual airline websites. You just search for the airline and flight number, and you see the schedule, whether the flight is on time, and city and gate information.
  • To verify ability to pay, no one takes impressions of credit cards anymore. I don't show paper or PDF versions of three-month-old credit card or bank statements to prove that I can pay, and no one takes a picture of a check, my face and my driver's license. The party I'm paying reads my card or check electronically, automatically verifying that funds are available.
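The contrast the bullets draw is between a locked, point-in-time form and a live, machine-readable answer. A minimal sketch of what the latter could look like for insurance follows; the function name, fields and statuses are illustrative assumptions, not a real carrier API:

```python
# Hypothetical shape of a continuous insurance-verification answer, in contrast
# to a static certificate form. Everything named here is an assumption for
# illustration only.
from dataclasses import dataclass
from datetime import date

@dataclass
class CoverageStatus:
    policy_number: str
    carrier: str
    in_force: bool   # a live answer, not a snapshot from weeks ago
    as_of: date      # the moment this answer was generated
    limits_ok: bool  # whether coverage meets the requester's requirements

def verify_coverage(policy_number: str) -> CoverageStatus:
    """Stand-in for a real-time lookup against a carrier's system of record.

    In a real system this would be an authenticated API call; here a fixed
    example is returned so the shape of the answer is clear.
    """
    return CoverageStatus(
        policy_number=policy_number,
        carrier="Example Mutual",
        in_force=True,
        as_of=date.today(),
        limits_ok=True,
    )

status = verify_coverage("POL-123456")
print(status.in_force, status.as_of)
```

The receiver gets structured data it can check, store and re-query at any time, rather than a PDF whose contents were true only on the day it was issued.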

I was reminded of this on my most recent speaking engagement. At 4 a.m., I arrived at my destination city. I stepped into a cab and was efficiently whisked away to my meeting location. Cabs no longer take a physical impression of my credit card. Instead, my card with an onboard chip was inserted, read and charged. Boom! Verified.

Unlike other industries that have online verification available, today’s convoluted and wildly expensive verification of insurance is a vortex of manual effort, paper, email, faxes and procedures. Data is both late and locked in certificate forms (paper or PDF). To begin getting our arms around the size of this opportunity, here are three sets of statistics to reflect on:

  • $1 trillion-plus of vehicle loans in the U.S. require verification at least once a year — twice a year if the policy is six months, and perhaps 12 times a year if the insured is paying monthly.
  • 1.2 million companies with 28.8 million commercial trucks and 3 million drivers provide forms as proof of insurance. How many do you think are out of date? Fraudulent?
  • 42.6 million independent contractors provide form-driven proof of insurance when they bid on a job.

Companies that receive data on forms have no assurance that the information is real or accurate or complies with their needs. Even with extensive and expensive manual checking, no one really knows if the data on the form is valid.

We have an expensive, lose-lose proposition.

Trying to fix the problem by addressing the form is like trying to fix cigarettes with a new type of cigarette. Problems with the underlying technology preclude a solution.

See also: Secret to Finding Top Technology Talent  

When a form-based proof/certificate of insurance is shared today, no one asks for a nondisclosure agreement. There is also no password or encryption beyond what the PDF format provides. Insurance rates, rules and forms are filed with and approved by state agencies, which by nature makes them available to the public. You can also search for and view insurance carrier forms on their websites.

Insurance verification is not just at origination or signing of a contract. Insurance verification is continuous.

Once it goes on, it goes on and on.