
Lemonade’s Crazy Market Share

We have 27% share among newcomers to insurance! You don’t need clairvoyance to see the predictive power of that metric.

sixthings
It’s the craziest thing: In the State of New York, Lemonade appears to have overtaken Allstate, GEICO, Liberty Mutual, State Farm and the others in what is probably the single most critical market share metric of all. But I’m getting ahead of myself.

Our story starts a few months back, when a few digits in a tedious insurance report woke me with a jolt: “723,030.” Why the drama? 723,030 was the number of New Yorkers with renters insurance, and Lemonade had sold way more than 7,230 renters policies to New Yorkers. The upshot: We captured more than 1% market share in just a few months. That seemed crazy. In homeowners insurance in the U.S., a 1.6% market share makes you a top 10 insurance company. And this exclusive club has been at it, on average, for 104 years. Lemonade launched in September.

See also: Lemonade Reports: ‘Our First 100 Days’

I went to my shelf, pulled my copy of "Microtrends" and highlighted its punchline:
“It takes only 1% of people making a dedicated choice — contrary to the mainstream’s choice — to create a movement that can change the world.” (xiv)
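The 1% figure above comes down to simple arithmetic, which can be checked in a couple of lines (a sketch using only the figures quoted from the regulator's report):

```python
# Figure quoted above from the New York regulatory report.
ny_renters_policyholders = 723_030            # New Yorkers with renters insurance
one_percent_threshold = ny_renters_policyholders / 100

# Selling more than ~7,230 policies therefore means more than 1% share.
print(round(one_percent_threshold))  # -> 7230
```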
Then It Got Crazier

No sooner had we come back down to Earth than a new study suggested that our "movement" was on the move. This survey, dated April 2017, updated Lemonade’s NY market share to a crazier 4.2% (E: +2.1/-1.4). Note that while our market share numbers are from dependable sources (reports by regulators, surveys by Google), differing methodologies and timeframes make a conclusive number hard to pin down. That’s just fine by us. For one, we’re growing fast, making any precise number passé by the time it’s computed. For another, "overall market share" — whatever the number — misses the craziest part.

The Craziest Part

Most New Yorkers got their insurance policy before Lemonade existed. That means that "overall market share" pits our few months of sales against sales made by legacy carriers in the decades before we launched. Which raises the question: What’s our market share among New Yorkers who entered the market since we did? What’s our share of brand new policies?

Looks Like We’re Number One

It’s totally crazy but also totally logical. Given that about 90% of the market bought their policy before we launched, it stands to reason that our "brand new" market share will be about 10x our "overall" market share. Logic is nice, of course, but it’d be better if there were some empirical evidence to back it up. There is. A second survey broke down market share based on when people first bought insurance and found that Lemonade’s market share among first-time buyers is more than 27%! 27% share among newcomers to insurance! You don’t need clairvoyance to see the predictive power of that metric. Nothing foretells tomorrow’s "overall" market share like today’s "brand new" market share. Note that the margin of error in the survey is wide (+12.6/-9.8), so our true "brand new" market share could be as little as 18%. Again, I’m not spending any time narrowing the range. Pick any point within the margin of error, and the thrust of the story is unchanged: It’s crazy.
Crazy Is the New Normal

Lemonade is growing exponentially, and today’s subscriber base is more than 2X what it was when those surveys ran 10 weeks ago. In fact, new bookings have doubled every 10 weeks since launch and show no sign of letting up. But exponential growth isn’t the craziest part. The craziest part is that, even if that acceleration stopped, even if we just maintained the status quo from April, within a few years our overall market share would automatically climb to match our "brand new" market share.
That’s what "brand new" market share means; and that’s why it’s probably the single most critical metric of all. Today’s crazy is tomorrow’s normal.
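The convergence claim can be illustrated with a toy model: if a fixed slice of the market consists of brand new buyers each year, and Lemonade keeps its 27% share of them, overall share drifts toward 27%. This is a sketch; the 10% annual market turnover is an illustrative assumption, not a figure from the article.

```python
# Toy model of "today's brand-new share becomes tomorrow's overall share."
brand_new_share = 0.27   # share among first-time buyers (survey figure above)
turnover = 0.10          # ASSUMED fraction of the market that is new each year

overall = 0.0
for year in range(30):
    # Each year a `turnover` slice of the market is replaced by new buyers,
    # of whom Lemonade captures `brand_new_share`.
    overall = overall * (1 - turnover) + brand_new_share * turnover

print(f"{overall:.3f}")  # climbs toward 0.27 as the years pass
```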
See also: Lemonade: From Local to Everywhere

I know: We’re still tiny, and incumbents won’t stand idly by as we coast from #1 in "brand new" to #1 nationwide. But that’s the trajectory we’re on. And with a nod to Newton’s first law, we’ll keep moving along that trajectory unless stopped by an external force. Game on.

Daniel Schreiber

Daniel Schreiber is CEO and co-founder at Lemonade, a licensed insurance carrier offering homeowners and renters insurance powered by artificial intelligence and behavioral economics. By replacing brokers and bureaucracy with bots and machine learning, Lemonade promises zero paperwork and instant everything.

How Tech Created a New Industrial Model

With a connected device for every acre of inhabitable land, we are starting to remake design, manufacturing, sales. Really, everything.


With little fanfare, something amazing happened: Wherever you go, you are close to an unimaginable amount of computing power. Tech writers use the line “this changes everything” too much, so let’s just say that it’s hard to say what this won’t change.

It happened fast. According to Cisco Systems, in 2016 there were 16.3 billion connections to the internet around the globe. That number, a near doubling in just four years, works out to 650 connections for every square mile of Earth’s inhabitable land, or roughly one every acre, everywhere. Cisco figures the connections will grow another 60% by 2020.
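The density arithmetic checks out: a square mile is 640 acres, so 650 connections per square mile is indeed roughly one per acre. A quick sketch:

```python
# Quick check of the density figures quoted above.
connections_per_sq_mile = 650     # Cisco-derived figure for inhabitable land
acres_per_sq_mile = 640           # by definition

connections_per_acre = connections_per_sq_mile / acres_per_sq_mile
print(f"{connections_per_acre:.2f} connections per acre")  # ~1.02
```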

Instead of touching a relatively simple computer, a connected smartphone, laptop, car or sensor in some way touches a big cloud computing system. These include Amazon Web Services, Microsoft Azure and my employer, Google (which I joined from the New York Times earlier this year to write about cloud computing).

Over the decade since they started coming online, these big public clouds have moved from selling storage, network and computing at commodity prices to also offering higher-value applications. They host artificial intelligence software for companies that could never build their own and enable large-scale software development and management systems, such as Docker and Kubernetes. From anywhere, it’s also possible to reach and maintain the software on millions of devices at once.

For consumers, the new model isn’t too visible. They see an app update or a real-time map that shows traffic congestion based on reports from other phones. They might see a change in the way a thermostat heats a house, or a new layout on an auto dashboard. The new model doesn’t upend life.

For companies, though, there is an entirely new information loop, gathering and analyzing data and deploying its learning at increasing scale and sophistication.

Sometimes the information flows in one direction, from a sensor in the Internet of Things. More often, there is an interactive exchange: Connected devices at the edge of the system send information upstream, where it is merged in clouds with more data and analyzed. The results may be used for over-the-air software upgrades that substantially change the edge device. The process repeats, with businesses adjusting based on insights.
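The interactive loop described above can be sketched in a few lines. This is an illustrative toy, not a real API: every name here is hypothetical, and the "analysis" is just a fleet-wide mean pushed back down as an update.

```python
# Minimal sketch of the cloud-based information loop: edge devices report
# upstream, the cloud merges and analyzes, and the result returns as an
# over-the-air update. All names are illustrative assumptions.

def edge_report(device_readings):
    """Edge -> cloud: collect each device's latest sensor reading."""
    return list(device_readings)

def cloud_analyze(readings):
    """Cloud: merge the data and derive an insight (here, a simple mean)."""
    return sum(readings) / len(readings)

def ota_update(devices, new_setpoint):
    """Cloud -> edge: push the learned parameter back to every device."""
    return [dict(d, setpoint=new_setpoint) for d in devices]

devices = [{"id": i, "reading": 20 + i, "setpoint": 21} for i in range(3)]
readings = edge_report(d["reading"] for d in devices)
devices = ota_update(devices, cloud_analyze(readings))
print(devices[0]["setpoint"])  # -> 21.0, the fleet-wide mean
```

In practice the loop then repeats: the updated devices generate new data, and the business adjusts again based on the next round of insights.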

See also: ‘Core in the Cloud’ Reaches Tipping Point  

This cloud-based loop amounts to a new industrial model, according to Andrew McAfee, a professor at M.I.T. and, with Erik Brynjolfsson, the co-author of “Machine, Platform, Crowd,” a new book on the rise of artificial intelligence. AI is an increasingly important part of the analysis. Seeing the dynamic as simply more computers in the world, McAfee says, is making the same kind of mistake that industrialists made with the first electric motors.

“They thought an electric engine was more efficient but basically like a steam engine,” he says. “Then they put smaller engines around and created conveyor belts, overhead cranes — they rethought what a factory was about, what the new routines were. Eventually, it didn’t matter what other strengths you had, you couldn’t compete if you didn’t figure that out.”

The new model is already changing how new companies operate. Startups like Snap, Spotify or Uber create business models that assume high levels of connectivity, data ingestion and analysis — a combination of tools at hand from a single source, rather than discrete functions. They assume their product will change rapidly in look, feel and function, based on new data.

The same dynamic is happening in industrial businesses that previously didn’t need lots of software.

Take Carbon, a Redwood City, Calif., maker of industrial 3D printers. More than 100 of its cloud-connected products are with customers, making resin-based items for sneakers, helmets and cloud computing parts, among other things.

Rather than sell machines, Carbon offers them like subscriptions. That way, it can observe what all of its machines are doing under different uses, derive conclusions from all of them on a continuous basis and upgrade the printers with monthly software downloads. A screen in the company’s front lobby shows total consumption of resins being collected on AWS, the basis for Carbon’s collective learning.

“The same way Google gets information to make searches better, we get millions of data points a day from what our machines are doing,” says Joe DeSimone, Carbon’s founder and CEO. “We can see what one industry does with the machine and share that with another.”

One recent improvement involved changing the mix of oxygen in a Carbon printer’s manufacturing chamber, which improved drying time by 20%. Building sneakers for Adidas, Carbon was able to design and manufacture 50 prototype shoes in less time than it previously took to produce half a dozen test models. It manufactures novel designs that were previously theoretical.

The cloud-based business dynamic raises a number of novel questions. If using a product is now also a form of programming a producer’s system, should a company’s avid data contributions be rewarded?

For Wall Street, which is the more interesting number: the revenue from sales of a product, or how much data the company derives from the product a month later?

Which matters more to a company, a data point about someone’s location, or its context with things like time and surroundings? Which is better: more data everywhere, or high-quality and reliable information on just a few things?

Moreover, products are now designed to create not just a type of experience but a type of data-gathering interaction. A Tesla’s door handles emerge as you approach it carrying a key. An iPhone or a Pixel phone comes out of its box fully charged. Google’s search page is a box awaiting your query. In every case, the object is yearning for you to learn from it immediately, welcoming its owner to interact, so it can begin to gather data and personalize itself. “Design for interaction” may become a new specialization.

 The cloud-based industrial model puts information-seeking responsive software closer to the center of general business processes. In this regard, the tradition of creating workflows is likely to change again.

See also: Strategist’s Guide to Artificial Intelligence  

A traditional organizational chart resembled a factory, assembling tasks into higher functions. Twenty-five years ago, client-server networks enabled easier information sharing, eliminating layers of middle management and encouraging open-plan offices. As naming data domains and rapidly interacting with new insights move to the center of corporate life, new management theories will doubtless arise as well.

“Clouds already interpenetrate everything,” says Tim O’Reilly, a noted technology publisher and author. “We’ll take for granted computation all around us, and our things talking with us. There is a coming generation of the workforce that is going to learn how we apply it.”


Quentin Hardy

Quentin Hardy is the head of editorial at Google Cloud, writing about the ways that cloud computing technology, and by extension the advent of computer intelligence at every point on the planet, is reshaping society.

It's Time to Accelerate Digital Change

Companies have started, but addressing narrowly defined problems or one specific part of the business has delivered limited value.

For global insurers, digital transformation and disruptive innovation have gone from being vague futuristic concepts to immediate action items on senior leaders’ strategic agendas. New competitive threats, continuing cost pressures, aging technology, increasing regulatory requirements and generally lackluster financial performance are among the forces that demand significant change and entirely new business models. Other external developments — the steady progress toward driverless cars, the rapid emergence of the Internet of Things (IoT) and profound demographic shifts — are placing further pressure on insurers. A common fear is that new market entrants will do to insurance what Uber has done to ride hailing, Amazon has done to retail and robo-advisers are doing to investment and wealth management.

Yes, "digital transformation" has become an overused term beloved by industry analysts, consultants and pundits in the business press. Yes, it can mean different things to different companies. However, nearly every insurer on the planet — no matter its size, structure or particular circumstances — should undertake digital transformation immediately. This is true because of ever-rising consumer expectations and the insurance sector’s lagging position in embracing digital.

The good news is that many early adopters and fast followers have already demonstrated the potential to generate value by embedding digital capabilities deeply and directly into their business models. The bad news is that even successful pilot programs have been of limited scope. By addressing narrowly defined problems or one specific part of the business, they have delivered limited value. Formidable cultural barriers also remain; most insurers are simply not accustomed or equipped to move at the speed of digital. Similarly, few, if any, insurers have the talent or workforce they need to thrive in the industry’s next era.
Because the value proposition for digital transformation programs reaches every dimension of the business, it can drive breakthrough performance both internally (through increased efficiency and process automation) and externally (through increased speed to market and richer consumer and agent experiences). Therefore, insurers must move boldly to devise enterprise-scale digital strategies (even if they are composed of many linked functional processes and applications) and “industrialize” their digital capabilities — that is, deploy them at scale across the business. This paper will explore a range of specific use cases that can produce the breakthrough performance gains and ROI insurers need.

From core transformation to digital transformation

Recognizing the need to innovate and the limitations of existing technology, many insurers undertook core transformation programs. These investments were meant to help insurers set foot in the digital age, yet represented a very first step or foundation so insurers could use basic digital communications, paperless documents, online data entry, mobile apps and the like. These were necessary steps, as the latest EY insurance consumer research shows that more than 80% of customers are willing to use digital and remote contact channels (including web chat, email, mobile apps, video or phone) in place of interacting with insurers via agents or brokers. More advanced technologies, which can enable major efficiency gains and cost improvements for basic service tasks, also require stronger and more flexible core systems. Chatbot technology, for instance, can deliver considerable value in stand-alone deployments (i.e., without being fully integrated with core claims platforms). However, the full ROI cannot be achieved without integration. For many insurers, core transformation programs are still underway, even as insurers recognize a need to do more.
Linking digital transformation programs to core transformation can help insurers use resources more effectively and strengthen the business case. Waiting for core transformation programs to be completed before taking up digital transformation would likely result in many missed performance improvement and innovation opportunities, as well as higher implementation costs. One key challenge is the industry’s lack of standardized methodologies and metrics to assess digital maturity. Without clear visibility, insurance leaders will have a difficult time knowing where to prioritize investments or recognizing the most compelling parts of the business case for digital transformation. But, because digital transformation is a long journey, most insurers are best served by a phased or progressive approach. This is not to suggest that culturally risk-averse insurers be even more cautious. Rather, it is to acknowledge that complete digital transformation cannot be managed in one go; there are simply too many contingencies, dependencies and risks that must be accounted for.

See also: The Key to Digital Innovation Success

Insurers must be focused and bold within their progressive approach to digital transformation, as it is the way to generate quick wins and create near-term value that can be invested in the next steps. Each step along the digital maturity curve enables future gains. Rather than waiting to be disrupted, truly digital insurers move boldly, testing and learning in pursuit of innovation, redesigning operations, engaging customers in new ways and seeking out new partners.

Digital transformation across the insurance value chain: a path to maturity and value creation

Digital transformation delivers tangible and intangible value across the insurance value chain, with specific benefits in six key areas. It’s important to emphasize speed and agility as essential attributes of the digital insurer.
Even the most innovative firms must move quickly if they are to fully capitalize on their innovations — a concept that applies across the entire value chain. The idea is to launch microservices faster and embrace modernized technology where possible. For instance, deploying cloud infrastructures will enable some parts of the business to scale up and scale down faster, without disrupting other parts of the business with “big dig” implementations. The dependencies and limitations of legacy technology are also worth reiterating. Insurers that can integrate process innovations and new tools with existing systems — and do so efficiently and without introducing operational risk — will gain a sustainable competitive advantage. The following digital transformation scorecards reflect how the benefits apply to different technologies and initiatives.

Omni-channel

Today’s consumers are naturally omni-channel, researching products online, recommending and talking about them with friends and contacts on social media and then buying them via mobile apps or at brick-and-mortar retail locations. Basically, they want a wide range of options — text, email, web chat, phone and sometimes in-person. A better omni-channel environment may also enable insurers to place new products in front of potential customers sooner and more directly than in the past. Insurers must look beyond merely supporting multiple channels and find the means to allow customers to move seamlessly between channels, or even within channels (such as when they move from chatting with a bot to chatting with a human agent). It is difficult to overstate how challenging it is to create the capabilities (both technological and organizational) to recognize customers and what they are seeking to do, without forcing them to re-enter their passwords or repeat their questions. There are many other subtleties to master, including context.
For example, a customer trying to connect via social media to voice concerns is not likely to respond well to a default ad or up-sell offering. Omni-channel is increasingly a baseline capability that insurers must establish to achieve digital maturity.

Big data analytics

The application of advanced analytical techniques to large and ever-expanding data sets is also foundational for digital insurers. For instance, predictive analytics can identify suitable products for customers in particular regions and demographic cohorts that go far beyond the rudimentary cross-selling and up-selling approaches used by many insurers. Big data analytics also hold the key to creating personalized user experiences. Analytics that “listen” to customer inputs and recognize patterns can identify opportunities for new products that can be launched quickly to seize market openings. Deep analysis of the customer base may make clear which distribution channels (including individual agents and brokers) are the best fit for certain types of leads, leading to increased sales productivity. The back-office value proposition for big data analytics can also be built on superior recognition of fraudulent claims, which are estimated at around 10% of all submitted claims, with an impact of approximately $40 billion in the U.S. alone. Reducing that number is an example of how digital transformation efforts can be self-funding. Plus, the analytics capabilities established in anti-fraud units can be extended into other areas of the business. Big data is also reshaping the risk and compliance space in important ways. As insurers move toward more precise risk evaluations (including the use of data from social channels), they must also be cognizant of shifting regulations regarding data security and consumer privacy. It won’t be easy ground to navigate.

Internet of Things (IoT)

The onset of smart homes gives insurers a unique opportunity to adopt more advanced and effective risk mitigation techniques.
For instance, intelligent sensors can monitor the flow of water running through pipes to protect against losses caused by a broken water pipe. Similar technology can be used to monitor for fire or flood conditions or break-ins at both private homes and commercial properties. The IoT clearly illustrates the new competitive fronts and partnership opportunities for insurers; leading technology and consumer electronics providers have a head start in engaging consumers via smart appliances and thermostats. Consumers, therefore, may not wish to share the same or additional data with their insurers. Insurers may also be confronted by the data capture and management challenges related to IoT and other connected devices.

Telematics

Sometimes grouped with IoT, data from sensors and telematics devices have applications across the full range of insurance lines:
  • Real-time driver behavior data for automotive insurance
  • Smart appliances — including thermostats and security alarms — within homeowners insurance
  • Fitness trackers for life and health insurance
  • Warehouse monitors and fleet management in commercial insurance
The data streams from these devices are invaluable for more precise underwriting and more responsive claims management, as well as product innovation. Telematics data provides the foundation for usage-based insurance (UBI), which is sometimes called “pay-as-you-drive” or “pay-as-you-live.” Premium pricing could be based on actual usage and driving habits, with discounts linked to miles driven, slow or moderate speeds and safe braking patterns, for instance. Consider, too, how in-vehicle devices enable a fully automated claims process:
  • Telematics data registers an automobile accident and automatically triggers a first notice of loss (FNOL) entry.
  • Claims information is updated through text-based interactions with drivers or fleet managers.
  • Claimants could be offered the opportunity to close claims in 60 minutes or less.
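The "pay-as-you-drive" pricing described above can be sketched as a simple rating function. This is an illustrative assumption from start to finish: the base premium, thresholds and discount weights are made up for the example, not any insurer's actual filed rates.

```python
# Sketch of usage-based insurance (UBI) pricing: discount a base premium
# for low mileage and smooth braking. All numbers are illustrative.

def ubi_premium(base, annual_miles, hard_brakes_per_100mi):
    """Return an annual premium adjusted for driving behavior."""
    mileage_discount = 0.15 if annual_miles < 5_000 else 0.0
    braking_discount = 0.10 if hard_brakes_per_100mi < 1.0 else 0.0
    return base * (1 - mileage_discount - braking_discount)

# A low-mileage driver with smooth braking earns both discounts.
print(ubi_premium(1_000, 4_200, 0.5))  # -> 750.0
```

Real UBI rating models are continuous and actuarially calibrated rather than threshold-based, but the structure (behavioral telemetry in, adjusted premium out) is the same.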
Such data could also be used to combat claims fraud, with analysis of the links between the severity of the medical condition and the impact of the accident. Some insurers are already realizing the benefits of safe driving discounts and more effective fraud prevention. These telematics-driven processes will likely become standard operating procedure for all insurers in the near future.

Voice biometrics and analysis

Audio and voice data may be the most unstructured data of all, but it, too, offers considerable potential value to those insurers that can learn to harness it. A first step is to use voice biometrics to identify customers when they call into contact centers, saving customers the inconvenience of entering policy numbers and passwords, information that may not be readily at hand. Other insurers seeking to better understand their customers may convert analog voice data from call center interactions into digital formats that can be scanned and analyzed to identify customer emotions and adjust service delivery or renewal and cross-selling offers accordingly. Manual quality control typically checks less than 1% of recordings, which is insufficient. Through automation, every recording can be assessed to identify improvement areas.

See also: 4 Rules for Digital Transformation

Drones and satellites

Early-adopting insurers are already using drones and satellites to handle critical tasks in underwriting and claims. In commercial insurance, for instance, drones can conduct site inspections, capturing thermal imagery of facilities or work sites. Their reviews can be as specific as looking for roof cracks, old or damaged boilers and other physical plant defects that can pose claims risks. Within homeowners lines, satellites can capture data to analyze roofs, chimneys and surrounding terrain so that insurers can determine which homeowners they want to underwrite, as well as calculate competitive and profitable premiums.
When linked to digital communications tools, drone and satellite data can even trigger notifications to customers of new price options or policy adjustments. Within claims, drones and satellites can handle many tasks previously handled by human adjusters across all lines of business. Such remote assessments can reduce claims processing time by a considerable degree. This method is particularly effective in situations such as floods, fires and natural disasters, where direct assessment is not possible. While many transformation programs that use drones and satellites remain in the experimental stages due to operational challenges, it is possible that they can improve the efficiency and accuracy of underwriting and claims information gathering by 40%.

Blockchain

Blockchain provides a foundation for entirely new business models and product offerings, such as peer-to-peer insurance, thanks to its ability to provide virtual assistance for quoting, claims handling and other tasks. It also provides a new level of information transparency, accuracy and currency, with easier access for all parties and stakeholders in an insurance contract. With higher levels of autonomy and attribution, blockchain’s architectural properties provide a strong digital foundation to drive mobile-to-mobile transactions; swifter, more secure payment models; improved data transparency; and reduced risk of duplication or exposure management. Insurance companies are interested in converting selected policies from an existing book to a peer-to-peer market. A blockchain network is developed as a mechanism for integrating this peer-to-peer market with a distributed transaction ledger, transparent auditability and “smart” executable policies. E-aggregators are another emerging business model that is likely to gain traction, because it appeals to both insurers and customers.
Insurers can offer better pricing due to reduced commissions compared with a traditional agent-based distribution model, while customers gain freedom to compare different policies based on better information. Of course, e-aggregators (whether fully independent or built through an existing technology platform) will require a sophisticated and robust digital platform for gathering information from different insurance companies to present it to consumers in the context of a clear, intuitive experience. It is also important for insurance companies to transfer information to e-aggregators rapidly; otherwise, there is the risk they will miss out on sales opportunities. This is why blockchain is the right technology for connecting e-aggregators and insurers. To see the full report from EY, click here.

David Connolly

David Connolly is a global insurance digital leader at EY. He leads the EY global insurance digital practice. EY has defined a multitude of technology-enabled business offerings that help insurers quickly launch digital solutions to remain competitive. Connolly is based in Silicon Valley, California.

3 Technology Trends Worth Watching

Many insurers think 20% of their business could soon be lost to insurtech startups, so staying ahead of technology trends is vital.

At a time when many insurers believe that 20% of their business could soon be lost to insurtech startups, and when roughly one-third of insurance industry CIOs say that, if given an extra $5 million to spend, they would spend it on big data or increased data collection, understanding technology trends is critical to gaining an edge. So, let’s look at three of the emerging technologies affecting the insurance sector.

1. Sensors and other data-tracking technologies

In the past, insurance companies and actuaries based their pricing on aggregated data from large numbers of customers. Today, innovations in internet-connected devices such as wearables, auto devices and smart homes are giving insurance companies meaningful data that is specific to individual policyholders. For example, Progressive Insurance created Snapshot, a device a policyholder can install in his or her car that allows Progressive to monitor certain data about the customer's driving habits and to adjust pricing accordingly. Progressive claims to have distributed $600 million in discounts to its policyholders, largely because of data from Snapshot.

See also: 10 Trends at Heart of Insurtech Revolution

Snapshot is just one example of how sensors and data-tracking technology can generate savings for policyholders while, at the same time, making insurers more efficient. As this technology continues to gain adoption, many more sensors will be available to monitor policyholder data on health, autos, homes and more.

2. Drones

Drone technology is a rapidly growing niche in the insurance industry, with some predicting it will reach a yearly value of $6.8 billion in the coming years. This growing interest in drone technology was a driving force behind a recent panel discussion on drones at the Contractor Connection conference in St. Louis. WeGoLook’s COO, Kenneth Knoll, participated in this panel, which was attended by more than 3,000 industry professionals.
Knoll noted that drone technology applies to a wide range of insurance services — roof inspections, underwriting, disaster relief, crop inspections and much more. Consider an order recently received by WeGoLook requesting a scene inspection at a commercial location where an injury occurred. Compared with photos taken from the ground, aerial imagery captured by one of WeGoLook’s licensed drone operators offered the insurer client a much more effective representation of the scene in question.

3. Paperless solutions

Evolving technology also makes it possible for insurers to onboard new clients, handle claims and send notifications in a completely paperless manner. The increased digitization of insurance solutions has the potential to dramatically improve the speed and efficiency with which insurance companies operate. For example, Lemonade, an insurtech company, allows clients to sign up for policies and file claims in less than three minutes, using only a mobile device. Mobile is the new paper: Smartphone use among millennials is extremely high (97%). Carriers that can best cater to paperless, mobile solutions will gain a strong competitive advantage.

See also: The Story Behind the Lemonade Hype

Final Thoughts

Some have argued that we are currently experiencing a fourth industrial revolution powered, in part, by the developments noted above. Sensors, drones and paperless solutions are just a few of the technologies driving this revolution. Carriers must make these types of innovations a priority because they are fundamentally changing the expectations of clients. It’s time for all insurance professionals to acknowledge and embrace this digital transformation.

Robin Roberson


Robin Roberson is the managing director of North America for Claim Central, a pioneer in claims fulfillment technology with an open two-sided ecosystem. As previous CEO and co-founder of WeGoLook, she grew the business to over 45,000 global independent contractors.

Complexity Theory Offers Insights (Part 1)

The conceptual framework best suited to understanding our networked world is complexity science. It shows how insurance must evolve.

In the first of this series of four segments, we will look at the current state of the risk markets and the insurance industry; the emerging peer-to-peer (P2P) segment of the risk markets; how blockchain technology is enabling a new taxonomy in the risk markets; and what changes may occur as a result of these new technologies and methods. The purpose of this series stems from the open source movement in the software industry. Key to the open source philosophy is the transparent and voluntary collaboration of all interested parties. While this work has been kept fairly close to the vest for the past few years, I have taken meetings with two Fortune 500 insurance companies' strategy and venture teams, both of which asked for a proof of concept — as well as with a handful of other large international insurance companies and one of the big four accounting firms. At the other end of the spectrum, I have also spoken with other founders of P2P insurance startups around the world, and I have participated in the communities surrounding blockchain technology. I feel that this handful of folks has already enjoyed early access to these concepts, and my motivation with this series is to achieve a more level playing field for all parties interested in the future of the risk markets. There are links at the bottom of this article to join the conversation via a LinkedIn group and get access to the whole series. To begin, let's take a look at the current state of risk markets. It is important to distinguish between drivers of economic systems and the impact they have on business models in the industrial age vs. in the information age. See also: Should We Take This Risk?   Hardware and technology were key drivers throughout the industrial age, which saw a growing batch of new technologies — from cars and planes, to computers and smartphones, to industrial robots, etc. Industrial age business models were almost always “extractionary” in their nature. 
The business model engages with some market, and it profits by keeping some portion of the market's value: it extracts value from the market. The strategies of the industrial age were:
  • Standardization — interchangeable parts
  • Centralization — big factories, vertical integration, economies of scale
  • Consolidation — an indication that an industry is about to experience a phase change
In the information age, business models almost always embody some creation of “network effect.” When the business model engages with a market, the individual actors all benefit as more actors engage with the business model. The value creation is usually tied to a network's graph, and the value creation will grow exponentially as the network's density grows: creating value for the market, not extracting value from it. The strategies and efficiency-drivers in the information age are:
  • Cheap connections — enabling multiple paths through the network's graph
  • Low transaction cost — in terms of time, effort and money
  • Lateral scaling — not vertical structures, which will be flattened out (“top down” increases network fragility)
  • Increase in node diversity — and in the ways each node can connect
All of these drivers lead to increasing network density and flow. Things are moving away from large, brittle centralized organizational structures and toward “distributed,” P2P, “crowd” or “sharing economy” types of organizational structures. Moving away from command-and-control organizational structures is almost impossible for organizations that profit from efficiency gains derived from a centralized effort. It is this attribute of their business model that necessitates startups and new business models coming in and bringing improvements to the market — challenging incumbent economic and business models. The information age is all about networks (not technology), and building graphs that create positive network effects. The conceptual framework best suited to understanding networks and the networked world we now live in is complexity science. The study of complex adaptive systems has grown out of its roots in the 1940s and has proliferated since the 1990s and the explosion of computer networks and social networks. Here is an introduction: When looking at complex systems, we start by looking at the system’s graph. To get an idea of what a graph is, let’s look at a few examples of “graph companies.”
  • Facebook built the “social graph” of acquaintances; it did not create acquaintances.
  • LinkedIn built the “professional graph” of coworkers and colleagues; it did not create coworkers and colleagues.
  • Google built the “link graph” for topics searched; it did not create back links for the topics searched.
Notice that, in each of these cases, the company built and documented the connections between the things or nodes in the network and did not create the things or nodes themselves. Those already existed. To start looking at the risk markets, we must first understand what is being connected or transferred between the nodes (a.k.a. the users). It should be of little surprise that, in the risk markets, it is risk that is being transferred between nodes, like a user transferring risk to an insurance company. In terms of risk graphing, there are currently two dominant graphs. A third is emerging. Let’s take a look at the graphs that make up the risk markets and the insurance industry.
  1. Insurance — is the “hub and spoke” graph.
  2. Reinsurance — is the decentralized graph connecting risk hubs.
  3. P2P Coverage — will be formalized in a distributed graph. (This is the one that obviously does not exist formally, but, informally, you see people calling parents/friends and using GoFundMe/their church/their office/other community organizations to spread risk out laterally.)
In today’s risk markets, insurance companies act as centralized hubs where risk is transferred to and carried through time. The reinsurance industry graph is enabling second-degree connections between insurance companies, creating a decentralized graph. In the current industry's combined graph structure or stack, only these two graphs formally exist. While an insurance company’s ledgers remain a hub where risk is transferred to and carried through time, reinsurance enables those risk hubs to network together, achieving a higher degree of overall system resilience. See also: Are Portfolios Taking Too Much Risk?   The P2P distributed graph currently exists via informal social methods. Stack all three graphs, and you can observe how total risk is addressed across all three graph types. Each has its strengths and weaknesses, which leads to its existing in its proper place within the risk markets. The fact that insurance as a financial service gets more expensive per $1,000 of coverage as coverage approaches the first dollar of loss means that there is a boundary where insurance's weaknesses will outweigh its strengths. My expectation is that much of the risk currently being carried on the hub-and-spoke insurance graph will accrue to the P2P distributed graph because of improved capital efficiency on small losses via a trend of increasing deductibles. This may lead to some of the risk currently carried on the reinsurance decentralized graph being challenged by centralized insurance. The proportion of total risk — or “market share” — that each graph carries will shift in this phase change. When people say insurance is dropping the ball, they are expressing that there is a misunderstanding or poor expectation-setting about how much of total risk the first two graphs should be absorbing. Users are unhappy that they end up resorting to informal P2P methods to fully cover risk. 
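To make the three graph shapes concrete, here is a minimal Python sketch that builds a hub-and-spoke graph and a distributed peer-to-peer graph, removes one node from each, and checks whether the survivors can still reach one another. All node names and edges are invented for illustration:

```python
from collections import deque

def connected(nodes, edges):
    """BFS check: can every surviving node reach every other one?"""
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in edges:
        if a in adj and b in adj:
            adj[a].add(b)
            adj[b].add(a)
    start = next(iter(nodes))
    seen = {start}
    queue = deque([start])
    while queue:
        for nxt in adj[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen == set(nodes)

# Hub-and-spoke (insurance): every policyholder connects only to one carrier.
hub_nodes = {"carrier", "p1", "p2", "p3", "p4"}
hub_edges = [("carrier", p) for p in ["p1", "p2", "p3", "p4"]]

# Distributed (P2P): peers connect laterally, a ring plus one chord.
p2p_nodes = {"p1", "p2", "p3", "p4", "p5"}
p2p_edges = [("p1", "p2"), ("p2", "p3"), ("p3", "p4"),
             ("p4", "p5"), ("p5", "p1"), ("p1", "p3")]

# Remove one node from each graph and see whether risk can still flow.
survives_hub = connected(hub_nodes - {"carrier"},
                         [e for e in hub_edges if "carrier" not in e])
survives_p2p = connected(p2p_nodes - {"p1"},
                         [e for e in p2p_edges if "p1" not in e])
print(survives_hub, survives_p2p)  # False True
```

Removing the hub disconnects every policyholder at once, while the distributed graph keeps flowing after losing a peer. That single-point-of-failure brittleness is what the reinsurance and P2P layers exist to offset.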
To increase the resilience of society’s risk management systems and fill the gaps left by the insurance and reinsurance graphs, we need the third risk distribution graph: a distributed P2P system. Society needs a distributed system that enables the transfer of risk laterally from individual to individual via formalized methods. This P2P service must be able to carry un-insurable risk exposures, such as deductibles, or niche risk exposures that insurance is not well-suited to cover. Much of this activity already occurs today and, in fact, has been occurring since the dawn of civilization. KarmaCoverage.com is designed to formalize these informal methods and enable end users to benefit from financial leverage created by the system’s network effect on their savings. When observing a system through the complexity paradigm, another key measure to observe is a system’s level of resilience vs. efficiency. Resilience and efficiency sit on opposite sides of a spectrum. A system that is 100% resilient will exhibit an excess of redundancy and wasted resources, while a system that is 100% efficient will exhibit an extreme brittleness that lends itself to a system collapse. When we look at the real world and natural ecosystems as an example, we find that systems tend to self-organize toward a balance of roughly 67% resilient and 33% efficient. Here is a video for more on this optimum balance. Industrial-age ideas have driven economics as a field of study to over-optimize for efficiency, but economics has, in recent years, begun to challenge this notion as the field expands into behavioral economics, game theory and complexity economics — all of which shift the focus away from solely optimizing for efficiency and toward optimizing for more sustainable and resilient systems. In the risk markets, optimizing for resilience should have obvious benefits. Now, let’s take a look at how this applies practically to the risk markets, by looking at those three industry graphs. 
Centralized network structures are highly efficient. This is why a user can pay only $1,000 per year for home insurance and, when her home burns down, get several hundred thousand dollars to rebuild. From the user’s point of view, the amount of leverage she was able to achieve via the insurance policy was highly efficient. However, like yin and yang, centralized systems have an inherent weakness — if a single node in the network (the insurance company) is removed, the entire system will collapse. It is this high risk of system collapse that necessitates so much regulation. In the risk markets, we can observe two continuing efforts to reduce the risk of an insurance system collapse. We observe a high degree of regulation, and we see the existence of reinsurance markets. The reinsurance markets function as a decentralized graph in the risk markets, and their core purpose is to connect the centralized insurance companies in a manner that ensures their inherent brittleness does not materialize into a “too big to fail” type of event. Reinsurance achieves this increase in resilience by insuring insurance companies on a global scale. If a hurricane or tsunami hits a few regional carriers of risk, those carriers can turn to their reinsurance for coverage on the catastrophic loss. Reinsurance companies are functionally transferring the risk of that region’s catastrophic loss event to insurance carriers in other regions of the globe. By stacking the two systems' graphs (insurance and reinsurance), the risk markets' ability to successfully transfer risk across society has improved overall system resilience while still retaining a desired amount of efficiency. Observations of nature reveal what appears to be a natural progression of networks that grow in density of connections. Therefore, it makes sense that the reinsurance industry came into existence after the insurance industry, boosting the risk markets' overall density of connections. 
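The "density of connections" can be made precise: density is the share of possible pairwise connections actually present in a network. A quick sketch, with illustrative numbers:

```python
def density(n_nodes, n_edges):
    """Fraction of possible undirected connections actually present."""
    possible = n_nodes * (n_nodes - 1) / 2
    return n_edges / possible

n = 10
star_edges = n - 1                 # hub-and-spoke: one link per spoke
complete_edges = n * (n - 1) // 2  # fully distributed: every pair linked

print(density(n, star_edges))      # 0.2
print(density(n, complete_edges))  # 1.0
```

With ten nodes, a hub-and-spoke graph realizes only 20% of the possible connections, while a fully distributed graph realizes all of them. That gap is the density increase described here.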
Along the same line of thought, we would expect to see the risk markets continue to increase in the density of connections from centralized to decentralized and further toward distributed. A distributed network in the risk markets will materialize as some form of financial P2P, "crowd” or “sharing economy” coverage service. A network's density is defined by the number of connections between the nodes. More connections between nodes mean the network has a higher density. For example, a distributed network has a higher density of connections than a centralized network. However, a higher density of connections requires more intense management efforts. There is a limit to how much complexity a centralized management team can successfully organize and control. See also: 5 Steps to Profitable Risk Taking   When a network’s connections outgrow centralized management’s capacity to control, the network will begin to self-organize or exhibit distributed managerial methods. Through this self-organization, a new graph structure of the network’s connections will begin to emerge. As this process unfolds, an entirely new macro system structure will emerge that shows little resemblance to the system’s prior state, much like a new species through evolution. What emerges is a macro phase change (aka “disruption”) that does not necessitate any new resource inputs, only a reorganization of the resources. For example, the macro state of water can go through a phase change and become ice. The micro parts that make up water and ice are the same. The macro state, however, has undergone a phase change, and the nature of the connections between the micro parts will have been reorganized. In his book “Why Information Grows: The Evolution of Order from Atoms to Economies,” MIT’s Cesar Hidalgo explains that, as time marches forward, the amount of information we carry with us increases. That information ultimately requires a higher density of connections as it grows. 
This can be understood at the level of an individual who grows wiser with experiences over time. However, as the saying goes, “The more you know, the more you know you don’t know.” In the history of human systems, we have observed the need for families to create a tribe, tribes to create a society and societies to organize firms that achieve cross-society economic work. We are now at the point of needing these firms to create a network of firms that can handle increased complexity and coordination. It is this network of firms that will be achieved via distributed methods because no individual firm will ever agree to let another single firm be the centralized controller of the whole network — nor could a single firm do so. In the next segment of this series, we will look more closely at the distributed graph that will become formalized, creating a P2P system in the risk markets. I have started a LinkedIn group for discussion on blockchain, complexity and P2P insurance. Feel free to join here: https://www.linkedin.com/groups/8478617 If you are interested in exploring working with KarmaCoverage, please feel free to reach out to me.

Ron Ginn


Ron Ginn is a financial engineer who has focused on “peer-to-peer insurance” since 2013 and who sees blockchain as the enabling technology for scalable trust networks.

The Next Step in Underwriting

Lenders draw data on individuals from all three credit bureaus. Why don't insurers do the same with the three sources of hazard data?

When a person applies for a mortgage in the U.S., credit reports are pulled from all three bureaus -- Equifax, Experian and TransUnion. Why? Because a single bureau does not provide the whole story. When you’re lending hundreds of thousands or millions of dollars, it makes sense to find out as much as you can about the people borrowing the money. The lender wants the whole story. When you’re underwriting the property, doesn’t it make sense to get more than one perspective on its risk exposure? Everyone in the natural hazard risk exposure business collects different data, models that data differently, projects that data in different ways and scores the information uniquely. While most companies start with similar base data, how it gets treated from there varies greatly. When it comes to hazard data, there are also three primary providers: HazardHub, CoreLogic and Verisk. Each company has its team of hazard scientists and its own way of providing answers to whatever risks underwriting and actuarial teams could be concerned with. While there are similarities in the answers provided, there are also enough differences -- usually in properties with questionable risk exposure -- that it makes sense to mitigate your risk by looking at multiple answers. Like the credit bureaus, each company provides a good picture of risk exposure, but, when you combine the data, you get as complete a picture as possible. See also: Next Generation of Underwriting Is Here   Looking at risk data is becoming more commonplace for insurers. However, if you are looking at a single source of data, it is much more difficult to use hazard risk data to limit your risk and provide competitive advantage. Advances in technology (including HazardHub’s incredibly robust APIs) make it easier than ever to incorporate multi-sourced hazard data into your manual and automated underwriting processes. As an insurer, your risk is enormous. 
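The credit-bureau analogy can be sketched in a few lines. The provider names come from the article, but the scores, scales and review threshold below are invented for illustration:

```python
# Hypothetical per-property hazard scores on a 0-100 scale (higher = riskier).
scores = {"HazardHub": 82, "CoreLogic": 75, "Verisk": 91}

composite = sum(scores.values()) / len(scores)        # simple blended score
spread = max(scores.values()) - min(scores.values())  # disagreement between sources

# Large disagreement between sources is itself a signal: route to manual review.
needs_review = spread > 10
print(round(composite, 1), needs_review)  # 82.7 True
```

A simple average is only one way to blend the sources; the larger point is that disagreement between providers is itself useful underwriting information.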
Using hazard data -- especially multi-sourced hazard data -- provides you with a significantly more robust risk picture than a single source. At HazardHub, we believe in the power of hazard information and the benefits of multi-sourcing. Through the end of July, we’ll append our hazard data onto a file of your choice absolutely free, to let you see for yourself the value of adding HazardHub data to your underwriting efforts. For more information, please contact us.

John Siegman


John Siegman is the co-founder of Hazard Hub, a property risk data company that was acquired by Guidewire in mid-2021. He is now a senior executive at Guidewire helping to lead the direction of the HazardHub solution and guiding P&C insurance clients in innovating their data integration into critical processes.

Key Trends in Innovation (Part 6)

The ability to combine innovation with a model that improves the way individuals perceive and interact with insurance is critical.

This article is the fifth in a series on key forces shaping the insurance industry. Parts One, Two, Three and Four/Five can be found here, here, here and here. Trend #6: Delivering on the customer promise is the key The ability to dynamically innovate (new risk pools, new segments, new channels) and deliver on the customer promise will become the most important competitive advantage as known risks continue to get commoditized and move to the direct channels. One of the key forces driving the growth of insurtech is the current lack of engagement and, in some instances, the lack of trust between consumers and the industry. In our view, the ability to combine innovation with a model that improves the way individuals perceive and interact with insurance is critical in driving value creation. Claims is at the heart of the customer promise At the heart of the customer promise is the claims process. In many ways, it’s surprising that innovation in this area has been relatively modest given its economic importance (representing 60% to 70% of the overall cost base) and its importance in driving customer satisfaction. Our analysis has shown that the difference in renewal rates between a poor and a positive claims experience is as high as 50%. See also: Can Insurance Innovate?   In most cases, claims acceptance rates are very high (often more than 90%). Despite this, individuals are skeptical. This is driven, in part, by the approach taken by the industry. As a minor example, often while you wait for your call to a claims center to be answered, you are reminded of the consequences of making a fraudulent claim. The insurance company seems to imply the most likely scenario is a fraudulent one. The interaction is off to a bad start before it’s even properly begun. We have positioned claims innovation as one of our four key pillars with a solution built around a customer-managed claims platform called RightIndem. 
RightIndem allows an inefficient analog process to be converted into a digital one, resulting in significant cost savings for the insurer — both in claims handling costs and cost of claim. More importantly, though, the platform significantly enhances customer satisfaction by placing him or her at the heart of the process and by being easy, mobile-enabled, transparent and quick. (As an example, total loss motor claims were settled in a matter of days rather than the 21-day average that exists in the U.K.). New models, new channels, new risk pools Customer engagement is another important area of innovation, with several new approaches being tested. It’s important that insurance becomes more relevant and tailored to individuals' needs and circumstances. As this happens, insurance moves from being a “sell” decision to a “buy” decision. Our earlier article on just-in-time insurance explored some of these trends in more detail. Innovation is also allowing new ways to interact with customers, and we see potential in solutions that enable insurance at point of sale or point of demand. In addition, new risk pools that allow niche or tailored solutions make insurance more relevant to individuals. Finally, there are a number of models that are looking to change the nature of the customer promise. Insure A Thing (IAT), for example, has turned the traditional insurance process on its head. Rather than pay an upfront premium, customers are placed in affinity groups where cost of claims is shared among members, with a safety net provided by an insurance carrier. All parts of the value chain are aligned; IAT only earns fees when it pays claims. Innovation, if insurers embrace it, will allow them to fundamentally change the customer dynamic and to create a new value proposition that is truly appreciated and valued by the customer. 
See also: InsurTech: Golden Opportunity to Innovate   We hope you enjoy these insights, and we look forward to collaborating with you as we create a new insurance future. Next article in the series: Trend #7: Internal innovation, incubation and maturing of capabilities will no longer be the optimal option; dynamic innovation will require aggressive external partnerships and acquisitions.

Sam Evans


Sam Evans is founder and general partner of Eos Venture Partners. Evans founded Eos in 2016. Prior to that, he was head of KPMG’s Global Deal Advisory Business for Insurance. He has lived in Sydney, Hong Kong, Zurich and London, working with the world’s largest insurers and reinsurers.

Forget Big Data; You Need Fast Data

You need to be able to apply big data analytics in near-real or even real time while engaging with a customer or another computer.

In 1989, Queen released a very successful single called “I Want It All.” The opening repeats the song title twice, then changes subtly to “and I want it now!” This could be a battle cry for today’s fast-moving society. We’ve all come to expect a rapid response to our requests for service, and we’ve become impatient with those who can’t deliver. We even watch kettles heat up and wonder why they take so long to boil, and we stand and complain about queue lengths. Whereas consumers might take some comfort (or the opposite) in knowing that most companies they deal with hold vast amounts of data about them, all of this data is historic and, actually, very little is used productively. Yet we are increasingly engaged in real-time conversations with companies either via a mobile app, our PCs or the good old-fashioned telephone, providing real-time data about a need or a problem. So why aren’t companies, by and large, capturing and acting on that data in real time while they are interacting with their customers? The simple explanation is that acting on data captured in real time is beyond the means of most of the systems built by these companies, and it’s not a trivial matter to change, given that this inevitably means tinkering with legacy systems. See also: Producing Data’s Motion Pictures   But there is a solution in sight, and it’s called "fast data." Fast data is the application of big data analytics to smaller data sets in near-real or real time to solve a problem or create business value while engaging with a customer or another computer. Fast data is not a new idea, but embracing it is about to become very important. A Fast Data Architecture What high-level requirements must a fast data architecture satisfy? They form a triad:
  1. Reliable data ingestion.
  2. Flexible storage and query options.
  3. Sophisticated analytics tools.
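As a toy illustration of the triad, here is a sketch that ingests events, stores a rolling window and runs analytics while the customer is still engaged. An in-process queue stands in for a real ingestion layer such as Kafka, and the payment amounts and flagging rule are invented:

```python
import queue
import statistics

# 1. Reliable ingestion: a bounded in-process queue stands in for a message broker.
events = queue.Queue(maxsize=1000)
for amount in [120.0, 45.5, 3200.0, 18.9, 75.0]:
    events.put({"type": "card_payment", "amount": amount})

# 2. Flexible storage: drain into a rolling window of recent events.
window = []
while not events.empty():
    window.append(events.get())

# 3. Analytics in near-real time: flag unusually large payments
#    against the median of the window, on the spot.
amounts = [e["amount"] for e in window]
median = statistics.median(amounts)
flagged = [a for a in amounts if a > 3 * median]
print(flagged)  # [3200.0]
```

In a production system, each of the three stages would be a separate, independently scalable component; the point here is only the shape of the triad, not the technology choices.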
The components that meet these requirements must also be reactive, meaning they scale up and down with demand, are resilient against the failures that are inevitable in large distributed systems (we don’t want any failures on autonomous cars!), always respond to service requests even if failures limit the ability to deliver services and are driven by messages or events from the world around them. The chart below shows an emerging architecture that can meet these requirements. The good news is that you can graft such an architecture on top of legacy systems, which is exactly what ING has been doing. Unlocking valuable intelligence Back in the halcyon days, banks were very close to their customers. They knew customers intimately and treated them personally. With the proliferation of customers, products and channels, though, this intimacy has been lost. ING wanted to recapture the "golden era" with a global strategy to make the bank more customer focused, "mobile first" and altogether more helpful. A typical bank these days captures and processes billions of customer requests, instructions and transactions. In doing so, they capture and store vast amounts of customer data – but, and here’s the startling truth, few (if any) of the major banks use this data effectively for the benefit of their customers. ING appointed a manager of fast data, Bas Geerdink, to address this problem. His broad international remit is to create a truly customer-friendly, omni-channel experience. To kick start this process, he turned his attention to ING's vast but disparate data stores, as he was convinced they could unlock valuable intelligence. Historical data can often reveal customer behaviors and trends that are crucial to predictive analytics. For example, past data can be used to plot future pressure points on personal finances – e.g., key payment events can be anticipated and mitigated with predictive analytics. However, mining this data presents major challenges. 
Most banks are hampered by disparate and disconnected legacy applications that cannot operate in real time. Confronted with this dysfunctional problem, ING made some fundamental decisions:
  1. Create a single, secure data lake.
  2. Employ a variety of open source technologies (along the lines of those shown in the chart above). These technologies were used to build the over-arching notifications platform to enable data to be captured and acted on in real time.
  3. Work with the legacy application teams to ensure that critical events (during a customer’s "moment of truth") are notified to this fast data platform.
  4. Trigger two vital platform responses: a. Instantly contact the customer to establish whether help is urgently needed (for example, to complete a rapid loan application); b. Run predictive analytics to decide whether the customer needs to be alerted.
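Step 4 might look something like the following sketch. The event types, field names and thresholds are hypothetical, not ING's actual schema:

```python
# Hypothetical "moment of truth" event handler routing to the two responses.
def handle_event(event):
    """Return the platform actions triggered by one customer event."""
    actions = []
    # Response (a): instantly contact the customer to offer help.
    if event["type"] == "loan_application_abandoned":
        actions.append("contact_customer_now")
    # Response (b): predictive check for an upcoming pressure point.
    balance_after = event["balance"] - event.get("upcoming_payments", 0)
    if balance_after < 0:
        actions.append("send_predictive_alert")
    return actions

evt = {"type": "loan_application_abandoned",
       "balance": 150.0, "upcoming_payments": 400.0}
print(handle_event(evt))  # ['contact_customer_now', 'send_predictive_alert']
```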
The future role of banks Partly in response to the Open Banking directive, the bank is now opening up its data to third parties who have been authorized by customers to process certain transactions on their behalf (e.g. paying bills). This is a fascinating development with potentially far-reaching implications. It raises a question about the future role of banks. For example, would the rise of nimble, tech-driven third parties reduce banks to mere processing utilities? ING is determined not to be marginalized, which is why it has invested in this fast data platform and is building real-time predictive apps – both on its own and with third parties (such as Yolt). It is a bold and very radical strategy – and, not surprisingly, it raises some searching questions. Hearing this story made me wonder what types of customer would most welcome this kind of service, and whether there was any risk of alienating less technology-literate customers. The bank doesn’t yet have definitive answers to these questions. However, ING is adamant that all technology-driven initiatives must have universal appeal, and that is why ING is introducing change on a very gradual, phased basis. See also: When Big Data Can Define Pricing (Part 2)   In the first instance, ING is testing these services on employees of the bank and then on beta test groups of (external) customers. To date, feedback has been extremely positive, and this has encouraged the bank to keep investing. However, Bas emphasizes the need to appreciate customer sensitivities and preferences. For example, there is a fine line between providing a valued service and becoming intrusive – that is why the bank specifically considers factors such as the best, most receptive time of day to make interventions (if at all). Fraud detection is another intriguing development where fast data is having a significant impact. At the moment, traditional fraud detection systems often lack finesse. 
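A quick base-rate calculation shows why fraud alerts skew so heavily toward false positives. The rates below are illustrative assumptions, not ING's figures:

```python
# Why most fraud alerts are false positives: a base-rate sketch.
fraud_rate = 0.001          # 1 in 1,000 transactions is actually fraudulent
sensitivity = 0.95          # the detector catches 95% of real fraud
false_positive_rate = 0.02  # and wrongly flags 2% of legitimate transactions

alerts_true = fraud_rate * sensitivity                 # genuine fraud caught
alerts_false = (1 - fraud_rate) * false_positive_rate  # innocent customers flagged
share_false = alerts_false / (alerts_true + alerts_false)
print(round(share_false, 3))  # 0.955
```

Even a detector that catches 95% of real fraud and wrongly flags only 2% of legitimate transactions produces alerts that are roughly 95% false, simply because genuine fraud is so rare. Reducing the false-positive rate, not just catching more fraud, is where a fast data platform earns its keep.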
When a customer attempts to use a credit card, it can trigger a false positive 90% of the time (or even more). This can be inconvenient both for the bank and especially for the customer (although a false positive is not always perceived in a negative way – it shows the bank is checking money flows). ING is hopeful that its fast data platform will radically reduce the level of false positives as well as the level of fraud. Other applications of fast data I’m aware that Capital One has deployed a fast data service and is now able to authorize a car loan in seconds – instant on-the-line confirmation that massively improves the customer experience. Yet I’ve also heard of instances where data is anything but fast! Take the Lloyd’s insurance market. Currently, some full risk assessments for specialist insurance are completed two weeks after prices have been quoted – quite clearly, this is a risk too far! We can also see applications in places like the police and military, who often have to capture and act upon a variety of data sources, in real time, in often hazardous and difficult circumstances. Fast data analytics could be used, for example, to predict when supplies of ammunition will run out and to trigger immediate air drops to front-line troops. The opportunities to change lives with fast data are enormous. Luckily, it’s becoming easier and easier to achieve. The time to start is now.

Robert Baldock


Robert Baldock has been conceiving and delivering innovative solutions to major institutions for all of his 40 working years. He is a serial entrepreneur in the IT field. Today, he is the managing director of Clustre, an innovation broker.

The Big Lesson From Amazon-Whole Foods

As grocers just saw, the incursion by Amazon is the new nature of disruption: Disruptive competition comes out of nowhere.

I doubt that Google and Microsoft ever worried about the prospect that a book retailer, Amazon, would come to lead one of their highest-growth markets: cloud services. And I doubt that Apple ever feared that Amazon’s Alexa would eat Apple’s Siri for lunch. For that matter, the taxi industry couldn’t have imagined that a Silicon Valley startup would be its greatest threat, and AT&T and Verizon surely didn’t imagine that a social media company, Facebook, could become a dominant player in mobile telecommunications. But this is the new nature of disruption: Disruptive competition comes out of nowhere. The incumbents aren’t ready for this and, as a result, the vast majority of today’s leading companies will likely become toast in a decade or less. Note the march of Amazon. First it was bookstores, publishing and distribution, then cleaning supplies, electronics and assorted home goods. Now, Amazon is set to dominate all forms of retail as well as cloud services, electronic gadgetry and small-business lending. And the proposed acquisition of Whole Foods sees Amazon literally breaking the barriers between the digital and physical realms. See also: Huge Opportunity in Today’s Uncertainty   This is the type of disruption we will see in almost every industry over the next decade, as technologies advance and converge and turn the incumbents into toast. We have experienced the advances in our computing devices, with smartphones having greater computing power than yesterday’s supercomputers. Now, every technology with a computing base is advancing on an exponential curve—including sensors, artificial intelligence, robotics, synthetic biology and 3-D printing. And when technologies converge, they allow industries to encroach on one another. Uber became a threat to the transportation industry by taking advantage of the advances in smartphones, GPS sensors and networks. Airbnb did the same to hotels by using these advancing technologies to connect people with lodging. 
Netflix’s ability to use internet connections put Blockbuster out of business. Facebook’s WhatsApp and Microsoft’s Skype helped decimate the costs of texting and roaming, causing an estimated $386 billion loss to telecommunications companies from 2012 to 2018. Similarly, having proven the viability of electric vehicles, Tesla is building batteries and solar technologies that could shake up the global energy industry.

Now, tech companies are building sensor devices that monitor health. With artificial intelligence, these will be able to provide better analysis of medical data than doctors can. Apple’s ResearchKit is gathering so much clinical-trial data that it could eventually upend the pharmaceutical industry by correlating the effectiveness and side effects of the medications we take.

Meanwhile, Google, Facebook, SpaceX and OneWeb are in a race to provide Wi-Fi internet access everywhere through drones, microsatellites and balloons. At first, they will use the telecom companies to provide their services; then they will turn the telecom companies into toast. The motivation of the technology industry is, after all, to have everyone online all the time. The industry's business models are to monetize data rather than to charge cell, data or access fees. These companies will also end up disrupting electronic entertainment—and every other industry that deals with information.

The disruptions don’t happen within an industry, as business executives have been taught by gurus such as Clayton Christensen, author of the management bible “The Innovator’s Dilemma”; rather, the disruptions come from where you would least expect them. Christensen postulated that companies tend to ignore the markets most susceptible to disruptive innovations because these markets usually have very tight profit margins or are too small, leading competitors to start by providing lower-end products and then scale them up, or to go for niches in a market that the incumbent is ignoring.
But the competition no longer comes from the lower end of a market; it comes from other, completely different industries. The problem for incumbents, the market leaders, is that they aren’t ready for this disruption and are often in denial. Because they have succeeded in the past, companies believe that they can succeed in the future, that old business models can support new products.

Large companies are usually organized into divisions and functional silos, each with its own product development, sales, marketing, customer support and finance functions. Each division acts from self-interest and focuses on its own success; within a fortress that protects its ideas, it has its own leadership and culture. And employees focus on the problems of their own divisions or departments—not on those of the company. Too often, the divisions of a company consider their competitors to be the company’s other divisions; they can’t envisage new industries or see the threat from other industries.

This is why the majority of today’s leading companies are likely to go the way of Blockbuster, Motorola, Sears and Kodak, which were at the top of their game until their markets were disrupted, sending them toward oblivion.

See also: How to Respond to Industry Disruption

Companies now have to be on a war footing. They need to learn about technology advances and see themselves as a technology startup in Silicon Valley would: as a juicy target for disruption. They have to realize that the threat may arise in any industry, with any new technology. Companies need all hands on deck — with all divisions working together, employing bold new thinking to find ways to reinvent themselves and defend against the onslaught of new competition. The choice that leaders face is to disrupt themselves—or to be disrupted.

Vivek Wadhwa


Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.

Has Insurtech Jumped the Shark?

If we haven't reached peak-hype yet, then we surely can't be that far off. But the need to change is still very real.

sixthings
On Sept. 20, 1977, Happy Days broadcast its season five premiere. The central characters visited Los Angeles and, having had his bravery questioned, Fonzie took to the water (still wearing his leather jacket, of course) on water skis. And jumped over a shark. Even at the time, the scene was immediately seen for what it was -- a credulity-stretching ratings ploy that revealed the network's desperation to win back an audience for a show that had run out of ideas. Over the years, the concept of "jumping the shark" evolved into an idiom to describe that moment when any idea, brand, design or franchise demonstrably loses its way. Could it be applied to the insurtech industry today, I wonder?

Certainly, the numbers seem to be pointing in the wrong direction. Insurtech investment was down 35% in 2016 vs. 2015, from $2.6 billion to $1.9 billion, according to CB Insights. The trend accelerated in the first quarter of 2017, with insurtech funding down 64% year-over-year to $283 million. The market's collective pulse can hardly be said to be racing.

For those of us who lived through the dot-com boom (and bust), there is also a depressingly familiar echo between how corporates reacted to the emergence of the internet then and what is happening now. Hardly a day goes by, it seems, without yet another corporate incubator or venture fund being announced or a newly minted chief digital officer (whatever that is) being appointed. And while the (often dumb) money continues to pour into ever more outlandishly named startups, the media is falling over itself to write the incumbents’ obituaries and crown their sneaker-wearing young pretenders. If we haven't reached peak hype yet, then we surely can't be that far off.

Of course, we shouldn't ignore the insurance industry's ability to remain resolutely analog in a digital world, insulated from reality thanks to the formidable barriers to disruption that are regulation, brand, customer base and balance sheet.
I am reliably told that two trucks a day still leave Lloyd's for an offsite document storage facility, loaded to the gunwales with paper, while another comes back the other way…

Dig a little deeper, however, and a different picture emerges. The 2015 numbers were arguably distorted by two huge one-off investments (totaling $1.4 billion) in Zenefits and Zhong-An. Ignore those, and the underlying growth story remains compelling, with insurtech investments between 2010 and 2016 growing at a CAGR of more than 50%.

See also: FinTech: Epicenter of Disruption (Part 1)

Importantly, insurtech, for so long fintech's poor relation, is closing the investment gap. Analysis by CB Insights shows that the ratio of total fintech to insurtech investments more than halved, from 9.1 to 1 in 2014 to 4.5 to 1 in 2016, as investors wake up to the opportunities on offer. Also encouraging is where that insurtech investment is being made. While 67% of insurtech investments between 2012 and 2016 were in the U.S., that proportion shrank to 47% in the first quarter of 2017. One swallow does not a summer make, but other data suggests this is consistent with a growing diversification of insurtech investment away from the U.S. to other insurance markets, in particular Europe.

Of course, investment is only one window on the insurtech story. And if there is a surprise, it is perhaps that the numbers are not much, much larger, given the size of the industry, the opportunities on offer for new entrants and the stakes at play for the incumbents. There is some confusion, however, as to exactly what the nature of the insurtech opportunity is, particularly on the P&C side of the industry, which is arguably where the most immediate focus should be. Some talk of the potential for robotics to drive operational efficiency, particularly in the claims process. This may well be true, but to my mind it isn't really insurtech.
This is just the insurance industry waking up to the potential of process automation; most other parts of the financial services industry got there at least 10 years ago.

Others talk of the impact of driverless cars and how this will slash motor premiums, as vehicles become inherently less prone to crash and the liability burden shifts to software manufacturers. Or how 3-D printing will decimate the trade indemnity market as products are printed locally rather than shipped internationally. This may well be true, but it isn't insurtech. This is simply the impact of new technologies on different parts of the global insurance premium pool.

Some talk of the rise of cyber risk and drones and how these will create new categories of risk. Again, this may well be true, but it isn't insurtech. This is just the emergence of new classes of risk that the market will assess, price and refine over time, as it has always done.

To understand what the P&C insurtech opportunity truly represents, you have to strip the insurance industry back to its fundamentals. On this basis, insurance, at its core, could be said to be simply the flow of money and data: money to pay premiums and pay claims; data to price risk and analyze claims. Accept this, and the beating heart of the insurtech opportunity lies in three main areas: distribution, underwriting and claims.
  1. Distribution, in terms of i) using technology to identify, attract and convert clients far more effectively than before and ii) delivering a far better customer experience, one that more closely matches the expectations of convenience, access and transparency formed through people's interactions with leading online brands and services. Look at the rise of peer-to-peer insurance platforms such as Lemonade or Guevara, for example, and the emergence of products based on actual usage rather than an annual policy, such as those sold by Trov and Metromile. The change in distribution will be marked by an increasing shift from insurance being viewed as a grudge purchase to being truly optional.
  2. Underwriting, in terms of a revolution in the way data is used to price risk accurately at the individual level, using not just historic information but a continuous stream of data that enables live pricing based on actual risk and usage. Gone are the days when a risk might be underwritten based on five data points and a couple of tickets to Wimbledon. An MGA I met the other day is using more than 1,000 data points in its rating engine, sourced for free from public information, to make tens of thousands of individual underwriting decisions in milliseconds.
  3. Claims, in terms of using technology to deliver significant efficiencies in how quickly claims are handled and resolved, and through the application of advanced analytics to reduce fraud. And, of course, if you underwrite better, you will have fewer claims anyway.
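To make the underwriting point concrete, the "1,000 data points in milliseconds" idea can be sketched as a toy rating engine. This is purely illustrative: the feature count matches the article's anecdote, but every weight, feature name, clamp bound and base rate below is invented for the example, and real engines use far richer statistical models than a linear score.

```python
# Illustrative sketch only: a toy rating engine that prices a policy from a
# large vector of risk features in one pass. All numbers here are invented.
import random

NUM_FEATURES = 1_000  # echoes the article's "more than 1,000 data points"

# Hypothetical learned weights; in practice these would be fitted from
# historical claims data, not drawn at random.
random.seed(42)
WEIGHTS = [random.uniform(-0.01, 0.01) for _ in range(NUM_FEATURES)]
BASE_PREMIUM = 300.0  # arbitrary annual base rate for the example


def price_risk(features):
    """Return an annual premium from a vector of normalized risk features.

    features: list of NUM_FEATURES floats in [0, 1], e.g. normalized
    public-record signals. A linear risk score scales the base premium.
    """
    if len(features) != NUM_FEATURES:
        raise ValueError("expected %d features" % NUM_FEATURES)
    risk_score = sum(w * x for w, x in zip(WEIGHTS, features))
    # Clamp the multiplier so a noisy data feed can't produce absurd prices.
    multiplier = min(max(1.0 + risk_score, 0.5), 3.0)
    return round(BASE_PREMIUM * multiplier, 2)


# "Live pricing": re-quote whenever the continuous data stream updates
# the feature vector; each call is a cheap vector product.
quote = price_risk([0.5] * NUM_FEATURES)
```

The design point the article is making survives even in this toy version: once pricing is a function of a feature vector rather than a form filled in by a broker, re-quoting on every data update is computationally trivial, which is what makes "tens of thousands of individual underwriting decisions in milliseconds" plausible.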
The interesting thing is how little the industry still appears to have shifted its ways of working in reaction or anticipation, outside the well-publicized activities of some of the larger players such as Axa, Munich Re, Allianz and MassMutual.

There are many potential explanations for this corporate heel-dragging: leadership teams who are on the wrong side of the digital divide and therefore simply don't get it; a lack of organizational agility; a fear of upsetting existing distribution channels, cannibalizing sales or upsetting staff; the difficulty of running a traditional model alongside a new one; network dependency (i.e., you can only go at the speed of the slowest); a lack of investment capital; and the uncertain ROI of any technology investment. After all, why invest in a speculative digital strategy when you can hire a couple of extra brokers and almost guarantee a few hundred thousand in extra commission?

Further, those that are taking action are arguably placing a disproportionate amount of effort into leveraging technology to improve their internal efficiency and reduce costs. The problem is that, while easier and more tangible to tackle, internal operations represent only about 10% of the average insurer's cost base. Compare that with 20% to 40% for distribution and 40% to 60% for claims, and companies appear to be fishing in the wrong pond.

This observation hasn't been lost on the PE/venture industry, or on the more progressive corporate venture funds. But even here, a disproportionate amount of investment capital appears to have gone toward distribution alone. McKinsey's Panorama Insurtech Database suggests that 17% of P&C startups are focused on distribution, vs. 10% on pricing and only 7% on claims. At one level, this is understandable: Data is boring and incremental, while distribution is higher-profile, easier to target and quicker to monetize.
But, by the same token, this means that more and more players are trying to disrupt the same, increasingly crowded part of the value chain, in slightly different ways. They can’t, and won’t, all survive. A few big winners will no doubt emerge from this feeding frenzy. However, the vast majority of today's media darlings will fail or find themselves overtaken (or perhaps taken over, if they are lucky) by incumbents informed (at their and others' expense) by the success or otherwise of all these live "pilots" and armed with deeper pockets, a balance sheet, a trusted brand and actual customers. Students of history will again see a clear parallel to the winners and losers of the dot-com years. Anyone else remember clickmango.com and its pink inflatable boardroom?

What conclusions to draw from all this? Well, firstly, the real, lasting disruptive opportunity for the P&C insurance industry is far more likely to lie within underwriting and claims. It may be unsexy, but that is where the bulk of the cost sits and where it is hardest to get at. It is also, therefore, where there is a real opportunity for new entrants to build something of meaningful, differentiated and sustainable value, rather than on the distribution side, where the barriers to entry are far lower and it is far easier for incumbents to simply adapt their offerings and compete away efficiency savings in the form of price reductions.

Secondly, if the above is true, then whoever has the most data wins. This favors incumbents, in particular scale players (on both the broking and the insurer side) or medium-sized players willing to work collaboratively with others to pool resources and know-how and access third-party services. But it also creates opportunities for players (hardware (e.g. telematics), software or consulting) that supply, enrich or analyze data on a partnership basis with those who would not otherwise have the resources to go it alone.
Those unable or unwilling to be part of these collaborative networks will fail or have to sell out.

Thirdly, as a consequence of the above, the real risk to incumbents is perhaps less from disruptive new entrants, which in many cases will be more interested in partnering with them than in eating their lunch, and more from their traditional competitors stealing a decisive march on them. In today's kinetic world, being a fast follower may no longer be good enough. Hence the logic of all these corporate incubators and venture funds (as long as they are investing in the right things, of course).

Fourthly, much of what we read in the media about the incubator/venture activities of the major players should be seen for what it is -- noise that more than likely is designed to conceal their real focus: investing in machine learning and advanced analytics that promise to utterly transform their ability to accurately price and distribute risk. This smoke screen is hardly surprising, given the potentially seismic implications for their existing broker relationships and staff, not to mention their customers. Indeed, the potential consequences for those no longer able to secure coverage, or only able to do so at rates far beyond what they pay today, are serious and will surely trigger regulation to avoid vast societal imbalances.

Finally, partly as a result of the above as well as related technological innovation, the one thing that is absolutely certain is that the size of the overall insurance revenue pool will shrink significantly. Driverless cars and sensor/IoT technology mean that there will simply be far fewer losses than before. McKinsey's base-case scenario sees a 30% drop in global motor premiums, and as much as a 70% fall under some conditions. What's more, better data doesn't just lead to better underwriting; it also enables better risk prevention and avoidance. Wearable tech, for example, drives healthier living; telematics, safer driving.
This promises to drive further consolidation across every part of the value chain toward truly value-added players, to shift fundamentally the role of the broker (and arguably the insurer, too) toward risk consultancy, and to trigger the rise of a range of complementary services, platforms and product offerings that fuel and profit from this trend.

See also: Insurtech Is Ignoring 2/3 of Opportunity

For an industry already reeling under the combined impact of increasing regulation, an unrelentingly soft rating cycle, overcapacity, terminally low interest rates and vicious competition, the rise of insurtech could perhaps not come at a worse time. And yet, ironically, it is precisely this combination of factors that means the industry has finally come round to the realization that, if it doesn’t change soon, the world will change around it. Or, as the Chinese philosopher Lao Tzu said, "If you do not change direction, you may end up where you are heading."

Global insurance premiums stood at around $3.6 trillion last year, according to Swiss Re. This is a huge, unreformed global market crying out for change. Forget jumping the shark. Now is the time to grab the insurtech bull by the horns. And hold on if you can.