
Six Innovators to Watch - June 2017

This month's list of 6 Innovators to Watch is mostly full of data plays and focused on life and health, not P&C (with a decidedly non-U.S. profile).


While much of the coverage of the insurtech world focuses on innovations in the distribution part of the process and in the P&C arena, the 1,400 insurtechs that we monitor as part of our Innovator's Edge service are showing plenty of sharp thinking in other areas, too. This month's list of 6 Innovators to Watch, for instance, is mostly full of data plays and focused on life and health, not P&C (with a decidedly non-U.S. profile). There seems to be plenty of innovation to go around.

The June 6 Innovators to Watch honorees are:

Atidot aims to help life insurance companies make better decisions by unlocking the power of customer data in existing books of business. Based in Tel Aviv, Israel, the company combines life insurance expertise with data science, delivering greater efficiency and accuracy. Using machine learning, the company can structure data from a variety of sources in a client company and analyze it to find signals that it says lead to new customer insights, predictive models and faster go-to-market strategies for products. Atidot is currently serving a South African life insurer and has several pilots under way with other life insurers. To learn more about Atidot, click here.

GeneYouIn offers a product called Pillcheck designed to deliver personalized medicine by ensuring a better match between a person's prescription medicine and his or her genetic profile. The Toronto-based company uses a saliva sample to develop a full genetic profile of a person's suitability for more than 100 medications. Matching drugs to a person's DNA can avoid harmful side effects, improve the efficacy of treatment and avoid the standard trial-and-error process of finding the right medication and dosage. GeneYouIn, currently working with the Canadian military, also targets benefit management companies and disability management companies. To learn more about GeneYouIn, click here.

Jornaya provides insurance companies with an in-depth view of where a customer lead is along the consumer buying journey to more effectively convert leads to customers. Jornaya technology places code on more than 30,000 websites, such as insurance company websites and quote comparison sites, letting it view consumer activity and score leads according to age, behavior and other metrics. The technology also spots fraud and provides users with records to ensure compliance with consumer protection laws. Insurance is one of Jornaya's fastest-growing verticals, and it currently is working with personal lines insurers in the property/casualty, life and health sectors. To learn more about Jornaya, click here.

Lapetus Solutions provides an innovative way for life insurers to quickly conduct a health risk assessment. Using a mobile device selfie and a brief questionnaire, Lapetus not only can estimate a person's longevity but also identify key health markers, such as age, smoking status and certain disease markers. The Lapetus solution aims to give life insurers access to information as reliable as blood chemistry but in a way that is less invasive, costs less and delivers results more quickly. Wilmington, N.C.-based Lapetus was created by a public health researcher with a focus on longevity and an academic specialist in facial analytics, and is working on several pilots with insurance and reinsurance companies. To learn more about Lapetus, click here.

Safe Beyond offers what it calls the first "emotional life insurance" platform, designed to let its customers deliver important information and personal messages to designated beneficiaries after death. Customers can store important information in a digital safe—such as documents, passwords and more—as well as record video messages to be delivered at designated times or upon the occurrence of certain events in the future after their death, such as the marriage of a child or graduation of a grandchild. The Tel Aviv-based company has identified life insurance and financial advisers as markets that would use the product as a new way to engage with their customers. To learn more about Safe Beyond, click here.

Vericred wants to be the utility company that powers your innovative health plan data and analytics products and services. The New York-based company currently focuses on three main data sets: health plan design and rate data; provider network data; and formulary data. The goal is to make it easier for small group health insurers and innovative tech companies to build tools for searching providers, selling benefit plans, quoting coverage and enrolling policyholders, among other things. Several innovators within Innovator's Edge are customers for Vericred's data as a service, the company says. To learn more about Vericred, click here.

The June honorees are drawn from among the nearly 1,400 insurtech companies that are featured in Innovator’s Edge, a technology platform created by ITL to drive strategic connections between insurance providers and insurtech innovators. From this pool, only those companies that have completed their Market Maturity Review—a series of modules designed to help insurers conduct baseline due diligence on the innovator and make a more informed connection—are eligible to be considered for Innovators to Watch.

For information on previous honorees, click here: May, April and March.

Cheers,

Paul Carroll,
Editor-in-Chief


Paul Carroll

Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

Insurtech Is an Epic Climb: Can You Do It?

Insurtech is like the Tour de France. What’s good enough for now will likely be the equivalent of a 40-pound bike in five years.

“If you try to win, you might lose, but if you don't try to win, you lose for sure!” —Jens Voigt, cyclist

Alpe d’Huez is a legendary climb, world-renowned by cyclists. A relentless 8.5 miles with 21 hairpin bends and an 8.1% gradient, it’s been a stage that can make or break the Tour de France for riders and determine the outcome of the entire race. What Alpe d'Huez is for cyclists, insurtech is for insurers.

See also: 10 Trends at Heart of Insurtech Revolution

Right now, insurers are faced with an epic climb: insurtech. A new breed of insurance technology has changed the game and disrupted an industry that’s been largely status quo. A very large bump (actually, more like a mountain) has appeared in the road — making it more critical than ever to “see over the horizon,” according to Jon Bidwell, former Chubb chief innovation officer and now SVP and underwriting transformation leader at QBE North America. However, to see beyond the horizon, you first must climb to the top.

“When we look back at today, the winners and losers will be defined by those that did and did not embrace an insurtech digital implementation strategy.” —Insurance Thought Leadership, “Death of Core Systems”

The only way to compete is with technology that evens the playing field. Over the past 110-plus years, the Tour de France has gone from 40-pound, fixed-gear road bikes (and no helmets!) to sub-15-pound carbon bikes and electronic drivetrains. Innovation, technology and engineering have played a role in the evolution of the sport of cycling. Think about it: If the 22 teams that compete in the Tour didn’t progress with some equality in the equipment they employ, there would be a very large gap on the field. It would be abundantly clear who’s still pedaling 40-pound bikes up Alpe d’Huez.

Insurtech is a game changer. What worked in the past will not work in the future. Insurance technology and innovation are undoubtedly moving at race pace. And what’s “good enough” for now will likely be a 40-pound bike in five years.

See also: Why AI Will Transform Insurance

From the Internet of Things (IoT) to vehicle telematics and, especially, advanced data and analytics — which are fast becoming a key competitive differentiator — insurtech presents the opportunity to evolve and compete. But if we don’t get on the bike and climb, there’s no possibility of winning, no possibility of moving the industry forward. With the right partner, or, in true Tour de France fashion, “domestique,” insurers can create a slipstream that accelerates the insurtech climb. It won’t be long before we start seeing players screaming down the backside, trying to catch the next horizon.

Bret Stone

Bret Stone is president at SpatialKey. He’s passionate about solving insurers' analytic challenges and driving innovation to market through well-designed analytics, workflow and expert content. Before joining SpatialKey in 2012, he held analytic and product management roles at RMS, Willis Re and Allstate.

Key Trends in Innovation (Part 7)

In the current environment, the sales cycle for onboarding at an insurer averages 12 months. That is an awfully long time for a startup.

This article is the sixth in a series on key forces shaping the insurance industry. The other parts can be found at these links: Part One, Part Two, Part Three, Parts Four and Five, and Part Six.

Trend #7: Partnerships and alliances are the way forward in internal innovation; incubation and maturing of capabilities will no longer be the optimal option.

Dynamic innovation will require aggressive external partnerships and acquisitions. As such, a model that encourages collaboration and embraces partnerships and alliances with third parties has the best chance of driving successful innovation. Incumbents can innovate from within, but this is often difficult because of the challenges of innovating within an existing business, the risk of disrupting business-as-usual (BAU) activities and the different motivations that drive individuals.

See also: The One Thing to Do to Innovate on Claims

At Eos, one of the core principles underlying our vision is that insurtech will deliver the most value through a collaborative approach between incumbents and startups. Identifying common goals will be key to ensuring the collaboration is a success, particularly given significant cultural differences.

Another challenge in the current environment is that the sales cycle is painfully long. The average is 12 months, an awfully long time for a typical startup. Even with senior buy-in and a decision to proceed, it can still take six months to get a startup to launch. In many instances, the process adopted by an insurer to onboard a large technology provider — like Guidewire or SAP for a major transformation project — seems to be the same as for a startup. In response, an increasing number of startups have decided to apply for a license and set up as a full-stack insurer. This is a challenging model and will require significant investment capital, but many are succeeding, and others will, too.
However, it would be a real shame if insurers cannot find a way to become more open, agile and responsive. They bring customers, distribution, products, underwriting capacity and a wealth of experience that can be applied. They are also working on internal innovation projects that can play a key role.

The Eos model has been designed to address this challenge; we have created a bridge between incumbents and startups. The investors in our fund come from within the insurance sector: reinsurers, insurers and brokers. They make the investment for two reasons: the prospect of strong financial returns and, more importantly, an opportunity to create a strategic partnership that gives them the ability to access and engage with cutting-edge innovation. By creating an ecosystem that supports collaboration and embraces development, we significantly shorten the adoption cycle.

One of the interesting dynamics as we embrace new technology is that AI sits at the center of three exponential forces:
  • Moore’s law refers to the fact that computing increases in power and decreases in relative cost at an exponential pace;
  • Kryder’s law refers to the rapid increases in density and the capability of hard-drive storage media over time; and
  • Metcalfe’s law refers to the community value of a network, which grows as the square of the number of its users.
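The compounding effect of these three laws can be illustrated with a toy calculation; the doubling period and user counts below are arbitrary figures for demonstration, not measurements.

```python
# Toy illustration of the exponential forces named above.
# All constants are illustrative assumptions, not measured values.

def capacity(base, years, doubling_period=2.0):
    """Capacity that doubles every `doubling_period` years (Moore/Kryder-style growth)."""
    return base * 2 ** (years / doubling_period)

def network_value(users):
    """Network value proportional to the square of the user count (Metcalfe-style)."""
    return users ** 2

# Starting from a base of 1.0, a 2-year doubling period yields 32x in a decade:
print(capacity(1.0, years=10))                       # 32.0
# Doubling the user base quadruples the network's value:
print(network_value(200) / network_value(100))       # 4.0
```

The point of the sketch is the asymmetry: a laggard facing compounding curves like these falls behind multiplicatively, not additively, which is what the next paragraph's "no way of catching up" refers to.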
Those laws mean that change happens so fast that, if you miss the boat, there will be no way of catching up. The cost of sitting on the sidelines and not embracing insurtech could mean the death of your business.

See also: 10 Reasons to Innovate — NOW!

We hope you enjoy these insights, and we look forward to collaborating with you as we create a new insurance future. The next article in the series, “Trend #8: Simple ‘Grow or Go,’” will showcase how decisions of the last decade will prove sub-optimal as the dust settles in insurtech and how degrees of freedom will be key.

Sam Evans

Sam Evans is founder and general partner of Eos Venture Partners. Evans founded Eos in 2016. Prior to that, he was head of KPMG’s Global Deal Advisory Business for Insurance. He has lived in Sydney, Hong Kong, Zurich and London, working with the world’s largest insurers and reinsurers.

The Dawn of Digital Reinsurance

A new reinsurance facility powered by transparent distributed ledger technology is diffusing the impact of adverse events.

The physical science of matter and motion provides valuable insight for the effective digital management of risk. In this first installment of the DelphX Innovation Series, we describe how a new reinsurance facility powered by transparent distributed ledger technology is employing digital parallels to physical science to diffuse the impact of adverse events.

Background

The physics of diffusion enables the force of a speeding bullet to be absorbed and rendered nearly harmless by the energy-distributing property of a graphene lattice incorporated within a bulletproof vest. That diffusion results from the efficient random-walk distribution of the bullet’s force among the graphene fibers – conveying its energy down a gradient from cells of greater concentration to those of lesser concentration. That property of absorption also causes fluid collected in a sponge to be efficiently distributed among its cavities in proportion to the relative size and fluid concentration of each cavity. Correspondingly, fluid contained in a saturated sponge placed in a vacuum will be released from each cavity in proportion to its relative size and concentration – with those containing higher concentrations sourcing respectively higher amounts of the outflowing fluid.

Science of Digital Reinsurance

A digital corollary to that balanced distribution process has been integrated into the patent-pending technology of a new risk-pooling reinsurance facility, styled “Quantem” to reflect its risk/collateral-minimizing utility. The facility will be operated by a major global reinsurer to optimally diffuse the impact of loss among risk holders – rendering even a material event nearly harmless to any individual holder. Ceded risks will be distributed within a transparent digital ledger that allocates each risk among all cedents in proportion to the net current size and concentration of risk ceded by each.
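The pro-rata diffusion described above can be sketched as a toy calculation. The cedent names and exposure figures are hypothetical, and this is an illustration of proportional allocation in general, not Quantem's actual ledger protocol.

```python
# Toy sketch of loss diffusion across a risk pool: each cedent absorbs a
# share of a loss in proportion to its ceded exposure, so no single holder
# bears the full impact. All names and figures are hypothetical.

def diffuse_loss(exposures, loss):
    """Allocate `loss` across cedents pro rata to their ceded exposure."""
    total = sum(exposures.values())
    return {name: loss * amount / total for name, amount in exposures.items()}

pool = {"cedent_a": 50.0, "cedent_b": 30.0, "cedent_c": 20.0}  # ceded exposure
print(diffuse_loss(pool, loss=10.0))
# {'cedent_a': 5.0, 'cedent_b': 3.0, 'cedent_c': 2.0}
```

A 10-unit loss that would be material to any one holder lands on each cedent only in proportion to its stake, mirroring the sponge analogy: larger, more concentrated cavities absorb (and release) proportionately more.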
While all cedents will have full viewing access to all elements of the ledger, their identities will never be disclosed. Operation of the Quantem ledger will be perpetual and open-ended, with the level of concentration of each new risk being determined at origination by the anonymous interaction of competing participants in the SEC-regulated DelphX Alternative Trading System (ATS) market. Demand in that market will be sourced from participants seeking to cost-effectively transfer (or speculate on) risk, and its supply will be sourced from participants seeking to assume referenced risks in return for their continuing receipt of a negotiated annual premium/spread.

The aggregate size of the ledger will dynamically increase as new risks are added and decrease as existing risks expire due to their maturity or settlement. As each new risk is added, its size and premium/spread (risk concentration) will determine its positioning along the ledger’s concentration gradient – incrementally adjusting the proportionate quota-share exposure of each other risk.

See also: The Need to Automate Reinsurance Programs

Credit Market Solution

The Quantem facility will be initially deployed in the global credit market to provide participants a low-cost and more efficient alternative to single-name credit-default-swap contracts. Quantem will diffuse the impact of adverse credit events and provide a regulated security-based solution to the dwindling derivative-based CDS market. To accommodate that efficient transfer of credit risks and the supporting cash flows among investors, Quantem will commoditize those risks/flows within a new form of digital default compensation receipt (DCR) securities that provide:
  • Fixed negotiated spreads and maturities;
  • MTM-collateralization by cash-equivalent assets held in trust by a highly rated custodian bank;
  • Lump-sum compensation payments to holders upon occurrence of a qualifying credit event involving the referenced corporate, municipal, sovereign, structured or other security; and
  • Anonymous negotiation, origination and trading within the transparent DelphX ATS market.
To source the collateral required and minimize the cost of DCRs, Quantem will also commoditize and reinsure the related DCR risks through the sale of digital collateralized reference obligation (CRO) securities that provide:
  • Fixed negotiated coupons and maturities;
  • Full collateralization by cash-equivalent assets held in trust by a highly rated custodian bank;
  • Deeply discounted purchase prices reflecting the lower risks resulting from Quantem’s reinsurance facility; and
  • Anonymous negotiation, origination and trading within the transparent DelphX ATS market.
Note: All CRO sale proceeds are available to Quantem solely for use in funding the collateral requirements and compensation of DCR holders.

The risk-mitigating utility of Quantem results in DCR spreads well below the cost of comparable CDS protection and low purchase prices for CROs. The enduring benefit of that lower cost of purchase is economically evidenced in the considerable post-claim yields payable to CRO investors. For example, an assumed annual loss ratio of 4.0% (which is more than twice the aggregate mean default rate of all U.S. corporate bonds since 1981) would produce the following post-claim CRO yields:

Market-Based Underwriting

The fixed risk concentration of each new DCR added to the ledger is determined at origination by the clearing premium/spread resulting from the competitive interaction of participants in the transparent DelphX market. That risk concentration thus reflects the market’s then-current equilibrium of supply and demand for protection relating to the risk of the subject CUSIP/ID. That transparent interaction among symmetrically informed market participants facilitates the efficient market-based underwriting and selection of new risks – avoiding adverse selection and subjective/uninformed assessments of risk concentration.

While the current DCR pricing for each referenced CUSIP/ID will increase and decrease on the DelphX market, the ledger’s design leads the aggregate behavior of pooled DCRs to gradually converge toward a normal (Gaussian) distribution. As the market’s current risk assessment of each CUSIP/ID increases and decreases, the MTM collateral requirements of holders of the related DCRs will correspondingly increase and decrease in response to those changing market prices. Consistent with the law of large numbers, however, as the risks of some DCRs are increasing, others will be decreasing – resulting in an increasingly predictable mean exposure within the ledger.
As exhibited by the historical behavior of participants in the single-name CDS market, demand for DCR protection (and speculation) for a given CUSIP/ID is expected to increase in proportion to participants' collective assessment of the likelihood of a loss involving that security. If the risk assessment increases, the pricing and volume of DCR purchases for the subject issue will correspondingly increase. As those new, freshly priced DCRs are ceded, their higher price/risk concentration will cause the aggregate concentration of risk for the subject CUSIP/ID in the ledger to correspondingly increase. Thus the collateral sourced by those higher risks will proportionately increase the ledger’s aggregate collateral available for MTM adjustments and minimize the impact of a related loss on all other DCR risks.

See also: Transparent Reinsurance for Health

Market-Based Adjudication

Quantem will also employ its diffusion protocol to distribute the cost of claims among risk holders based on the net size and concentration of each holder’s ceded risk at the time of adjudication of each claim. That adjudication process is transparently accomplished through anonymous single-price auctions conducted within DelphX. Upon the reporting of a credit event meeting the definition and conditions specified in the DCR documentation, a single-price auction is scheduled within DelphX to facilitate the sale of the collective offerings of the referenced CUSIP/ID by its holders. The clearing price of that auction is then subtracted from the par value of the referenced security, with the remainder determining the compensation payable to holders of DCR(s) referencing the sold issue.

Next Series Installment: Digital Risk Speculation

Larry Fondren

Larry Fondren is a veteran of the insurance and securities industries, where he has worked to develop and promote fair and electronic markets. He has served in capacities ranging from agent to senior officer and shareholder of domestic and international insurance and reinsurance companies.

Lemonade’s Crazy Market Share

We have 27% share among newcomers to insurance! You don’t need clairvoyance to see the predictive power of that metric.

It’s the craziest thing: In the State of New York, Lemonade appears to have overtaken Allstate, GEICO, Liberty Mutual, State Farm and the others in what is probably the single most critical market share metric of all. But I’m getting ahead of myself.

Our story starts a few months back, when a few digits in a tedious insurance report woke me with a jolt: “723,030.” Why the drama? 723,030 was the number of New Yorkers with renters insurance, and Lemonade had sold way more than 7,230 renters policies to New Yorkers. The upshot: We captured more than 1% market share in just a few months.

That seemed crazy. In homeowners insurance in the U.S., a 1.6% market share makes you a top 10 insurance company. And this exclusive club has been at it, on average, for 104 years. Lemonade launched in September.

See also: Lemonade Reports: ‘Our First 100 Days’

I went to my shelf, pulled my copy of "Microtrends" and highlighted its punchline:
“It takes only 1% of people making a dedicated choice — contrary to the mainstream’s choice — to create a movement that can change the world.” (xiv)
Then It Got Crazier

No sooner had we come back down to Earth than a new study suggested that our "movement" was on the move. This survey, dated April 2017, updated Lemonade’s NY market share to a crazier 4.2% (E: +2.1/-1.4). Note that while our market share numbers are from dependable sources (reports by regulators, surveys by Google), differing methodologies and timeframes make a conclusive number hard to pin down. That’s just fine by us. For one, we’re growing fast, making any precise number passé by the time it’s computed. For another, "overall market share" — whatever the number — misses the craziest part.

The Craziest Part

Most New Yorkers got their insurance policy before Lemonade existed. That means that "overall market share" pits our few months of sales against sales made by legacy carriers in the decades before we launched. Which raises the question: What’s our market share among New Yorkers who entered the market since we did? What’s our share of brand new policies?

Looks Like We’re Number One

It’s totally crazy but also totally logical. Given that about 90% of the market bought their policy before we launched, it stands to reason that our "brand new" market share will be about 10x our "overall" market share. Logic is nice, of course, but it’d be better if there were some empirical evidence to back it up. There is. A second survey broke down market share based on when people first bought insurance and found that Lemonade’s market share among first-time buyers is more than 27%!

27% share among newcomers to insurance! You don’t need clairvoyance to see the predictive power of that metric. Nothing foretells tomorrow’s "overall" market share like today’s "brand new" market share. Note that the margin of error in the survey is wide (+12.6/-9.8), so our true "brand new" market share could be as little as 18%. Again, I’m not spending any time narrowing the range. Pick any point within the margin of error, and the thrust of the story is unchanged: It’s crazy.
Crazy Is the New Normal

Lemonade is growing exponentially, and today’s subscriber base is more than 2X what it was when those surveys ran 10 weeks ago. In fact, new bookings have doubled every 10 weeks since launch and show no sign of letting up. But exponential growth isn’t the craziest part. The craziest part is that, even if that acceleration stopped, even if we just maintained the status quo from April, within a few years our overall market share would automatically climb to match our "brand new" market share. That’s what "brand new" market share means; and that’s why it’s probably the single most critical metric of all. Today’s crazy is tomorrow’s normal.
See also: Lemonade: From Local to Everywhere

I know: We’re still tiny, and incumbents won’t stand idly by as we coast from #1 in "brand new" to #1 nationwide. But that’s the trajectory we’re on. And with a nod to Newton’s first law, we’ll keep moving along that trajectory unless stopped by an external force. Game on.

Daniel Schreiber

Daniel Schreiber is CEO and co-founder at Lemonade, a licensed insurance carrier offering homeowners and renters insurance powered by artificial intelligence and behavioral economics. By replacing brokers and bureaucracy with bots and machine learning, Lemonade promises zero paperwork and instant everything.

How Tech Created a New Industrial Model

With a connected device for every acre of inhabitable land, we are starting to remake design, manufacturing, sales. Really, everything.


With little fanfare, something amazing happened: Wherever you go, you are close to an unimaginable amount of computing power. Tech writers use the line “this changes everything” too much, so let’s just say that it’s hard to say what this won’t change.

It happened fast. According to Cisco Systems, in 2016 there were 16.3 billion connections to the internet around the globe. That number, a near doubling in just four years, works out to 650 connections for every square mile of Earth’s inhabitable land, or roughly one every acre, everywhere. Cisco figures the connections will grow another 60% by 2020.
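Cisco's density figures can be sanity-checked with quick arithmetic. The inhabitable land area below is an assumed round number implied by the article's "650 per square mile" claim, not a figure the article states.

```python
# Sanity check on the connection-density figures above. The inhabitable
# land area is a rough assumption used for illustration.

CONNECTIONS_2016 = 16.3e9       # Cisco's 2016 global internet connections
INHABITABLE_SQ_MILES = 25.0e6   # assumed habitable land area of Earth
ACRES_PER_SQ_MILE = 640

per_sq_mile = CONNECTIONS_2016 / INHABITABLE_SQ_MILES
per_acre = per_sq_mile / ACRES_PER_SQ_MILE
print(round(per_sq_mile))        # ~652 connections per square mile
print(round(per_acre, 2))        # ~1.02, i.e. roughly one per acre

# Cisco's projected 60% growth by 2020:
print(round(CONNECTIONS_2016 * 1.6 / 1e9, 1))   # ~26.1 billion connections
```

The "one connection per acre" framing falls straight out of the 640-acres-per-square-mile conversion, which is why the two figures in the paragraph agree.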

Instead of touching a relatively simple computer, a connected smartphone, laptop, car or sensor in some way touches a big cloud computing system. These include Amazon Web Services, Microsoft Azure and my employer, Google (which I joined from the New York Times earlier this year to write about cloud computing).

Over the decade since they started coming online, these big public clouds have moved from selling storage, network and computing at commodity prices to also offering higher-value applications. They host artificial intelligence software for companies that could never build their own and enable large-scale software development and management systems, such as Docker and Kubernetes. From anywhere, it’s also possible to reach and maintain the software on millions of devices at once.

For consumers, the new model isn’t too visible. They see an app update or a real-time map that shows traffic congestion based on reports from other phones. They might see a change in the way a thermostat heats a house, or a new layout on an auto dashboard. The new model doesn’t upend life.

For companies, though, there is an entirely new information loop, gathering and analyzing data and deploying its learning at increasing scale and sophistication.

Sometimes the information flows in one direction, from a sensor in the Internet of Things. More often, there is an interactive exchange: Connected devices at the edge of the system send information upstream, where it is merged in clouds with more data and analyzed. The results may be used for over-the-air software upgrades that substantially change the edge device. The process repeats, with businesses adjusting based on insights.

See also: ‘Core in the Cloud’ Reaches Tipping Point  

This cloud-based loop amounts to a new industrial model, according to Andrew McAfee, a professor at M.I.T. and, with Erik Brynjolfsson, the coauthor of “Machine, Platform, Crowd,” a new book on the rise of artificial intelligence. AI is an increasingly important part of the analysis. Seeing the dynamic as simply more computers in the world, McAfee says, is making the same kind of mistake that industrialists made with the first electric motors.

“They thought an electric engine was more efficient but basically like a steam engine,” he says. “Then they put smaller engines around and created conveyor belts, overhead cranes — they rethought what a factory was about, what the new routines were. Eventually, it didn’t matter what other strengths you had, you couldn’t compete if you didn’t figure that out.”

The new model is already changing how new companies operate. Startups like Snap, Spotify or Uber create business models that assume high levels of connectivity, data ingestion and analysis — a combination of tools at hand from a single source, rather than discrete functions. They assume their product will change rapidly in look, feel and function, based on new data.

The same dynamic is happening in industrial businesses that previously didn’t need lots of software.

Take Carbon, a Redwood City, Calif., maker of industrial 3D printers. More than 100 of its cloud-connected products are with customers, making resin-based items for sneakers, helmets and cloud computing parts, among other things.

Rather than sell machines, Carbon offers them by subscription. That way, it can observe what all of its machines are doing under different uses, derive conclusions from all of them on a continuous basis and upgrade the printers with monthly software downloads. A screen in the company’s front lobby shows total consumption of resins being collected on AWS, the basis for Carbon’s collective learning.

“The same way Google gets information to make searches better, we get millions of data points a day from what our machines are doing,” says Joe DeSimone, Carbon’s founder and CEO. “We can see what one industry does with the machine and share that with another.”

One recent improvement involved changing the mix of oxygen in a Carbon printer’s manufacturing chamber, which improved drying time by 20%. Building sneakers for Adidas, Carbon designed and manufactured 50 prototype shoes in less time than it previously took to produce half a dozen test models. It now manufactures novel designs that were previously only theoretical.

The cloud-based business dynamic raises a number of novel questions. If using a product is now also a form of programming a producer’s system, should a customer’s avid data contributions be rewarded?

For Wall Street, which is the more interesting number: the revenue from sales of a product, or how much data the company derives from the product a month later?

Which matters more to a company, a data point about someone’s location, or its context with things like time and surroundings? Which is better: more data everywhere, or high-quality and reliable information on just a few things?

Moreover, products are now designed to create not just a type of experience but a type of data-gathering interaction. A Tesla’s door handles emerge as you approach it carrying a key. An iPhone or a Pixel phone comes out of its box fully charged. Google’s search page is a box awaiting your query. In every case, the object invites its owner to interact immediately, so it can begin to gather data and personalize itself. “Design for interaction” may become a new specialization.

 The cloud-based industrial model puts information-seeking responsive software closer to the center of general business processes. In this regard, the tradition of creating workflows is likely to change again.

See also: Strategist’s Guide to Artificial Intelligence  

A traditional organizational chart resembled a factory, assembling tasks into higher functions. Twenty-five years ago, client-server networks enabled easier information sharing, eliminating layers of middle management and encouraging open-plan offices. As naming data domains and rapidly interacting with new insights move to the center of corporate life, new management theories will doubtless arise as well.

“Clouds already interpenetrate everything,” says Tim O’Reilly, a noted technology publisher and author. “We’ll take for granted computation all around us, and our things talking with us. There is a coming generation of the workforce that is going to learn how we apply it.”


Quentin Hardy


Quentin Hardy is the head of editorial at Google Cloud, writing about the ways that cloud computing technology, and by extension the advent of computer intelligence at every point on the planet, is reshaping society.

It's Time to Accelerate Digital Change

Companies have started, but addressing narrowly defined problems or one specific part of the business has delivered limited value.

For global insurers, digital transformation and disruptive innovation have gone from vague futuristic concepts to immediate action items on senior leaders’ strategic agendas. New competitive threats, continuing cost pressures, aging technology, increasing regulatory requirements and generally lackluster financial performance are among the forces that demand significant change and entirely new business models. Other external developments — the steady progress toward driverless cars, the rapid emergence of the Internet of Things (IoT) and profound demographic shifts — are placing further pressure on insurers. A common fear is that new market entrants will do to insurance what Uber has done to ride hailing, Amazon has done to retail and robo advisers are doing to investment and wealth management.

Yes, "digital transformation" has become an overused term beloved by industry analysts, consultants and pundits in the business press. Yes, it can mean different things to different companies. However, nearly every insurer on the planet — no matter its size, structure or particular circumstances — should undertake digital transformation immediately, because of ever-rising consumer expectations and the insurance sector’s lagging position in embracing digital.

The good news is that many early adopters and fast followers have already demonstrated the potential to generate value by embedding digital capabilities deeply and directly into their business models. However, even successful pilot programs have been of limited scope; by addressing narrowly defined problems or one specific part of the business, they have delivered limited value. Formidable cultural barriers also remain: most insurers are simply not accustomed or equipped to move at the speed of digital. Similarly, few, if any, insurers have the talent or workforce they need to thrive in the industry’s next era.
Because the value proposition for digital transformation programs reaches every dimension of the business, it can drive breakthrough performance both internally (through increased efficiency and process automation) and externally (through increased speed to market and richer consumer and agent experiences). Therefore, insurers must move boldly to devise enterprise-scale digital strategies (even if they are composed of many linked functional processes and applications) and “industrialize” their digital capabilities — that is, deploy them at scale across the business. This paper will explore a range of specific use cases that can produce the breakthrough performance gains and ROI insurers need.

From core transformation to digital transformation

Recognizing the need to innovate and the limitations of existing technology, many insurers undertook core transformation programs. These investments were meant to help insurers set foot in the digital age, yet they represented only a first step, or foundation, so insurers could use basic digital communications, paperless documents, online data entry, mobile apps and the like. These were necessary steps, as the latest EY insurance consumer research shows that more than 80% of customers are willing to use digital and remote contact channels (including web chat, email, mobile apps, video or phone) in place of interacting with insurers via agents or brokers.

More advanced technologies, which can enable major efficiency gains and cost improvements for basic service tasks, also require stronger and more flexible core systems. Chatbot technology, for instance, can deliver considerable value in stand-alone deployments (i.e., without being fully integrated with core claims platforms). However, the full ROI cannot be achieved without integration. For many insurers, core transformation programs are still underway, even as insurers recognize a need to do more.
Linking digital transformation programs to core transformation can help insurers use resources more effectively and strengthen the business case. Waiting for core transformation programs to be completed before taking up digital transformation would likely result in many missed performance improvement and innovation opportunities, as well as higher implementation costs.

One key challenge is the industry’s lack of standardized methodologies and metrics to assess digital maturity. Without clear visibility, insurance leaders will have a difficult time knowing where to prioritize investments or recognizing the most compelling parts of the business case for digital transformation. But, because digital transformation is a long journey, most insurers are best served by a phased or progressive approach. This is not to suggest that culturally risk-averse insurers be even more cautious. Rather, it is to acknowledge that complete digital transformation cannot be managed in one go; there are simply too many contingencies, dependencies and risks that must be accounted for.

See also: The Key to Digital Innovation Success

Insurers must be focused and bold within their progressive approach to digital transformation, as it is the way to generate quick wins and create near-term value that can be invested in the next steps. Each step along the digital maturity curve enables future gains. Rather than waiting to be disrupted, truly digital insurers move boldly, testing and learning in pursuit of innovation, redesigning operations, engaging customers in new ways and seeking out new partners.

Digital transformation across the insurance value chain: a path to maturity and value creation

Digital transformation delivers tangible and intangible value across the insurance value chain, with specific benefits in six key areas. It’s important to emphasize speed and agility as essential attributes of the digital insurer.
Even the most innovative firms must move quickly if they are to fully capitalize on their innovations — a concept that applies across the entire value chain. The idea is to launch microservices faster and embrace modernized technology where possible. For instance, deploying cloud infrastructures will enable some parts of the business to scale up and scale down faster, without disrupting other parts of the business with “big dig” implementations. The dependencies and limitations of legacy technology are also worth reiterating. Insurers that can integrate process innovations and new tools with existing systems — and do so efficiently and without introducing operational risk — will gain a sustainable competitive advantage. The following digital transformation scorecards reflect how the benefits apply to different technologies and initiatives.

Omni-channel

Today’s consumers are naturally omni-channel, researching products online, recommending and talking about them with friends and contacts on social media and then buying them via mobile apps or at brick-and-mortar retail locations. Basically, they want a wide range of options — text, email, web chat, phone and sometimes in-person. A better omni-channel environment may also enable insurers to place new products in front of potential customers sooner and more directly than in the past.

Insurers must look beyond merely supporting multiple channels and find the means to allow customers to move seamlessly between channels, or even within channels (such as when they move from chatting with a bot to chatting with a human agent). It is difficult to overstate how challenging it is to create the capabilities (both technological and organizational) to recognize customers and what they are seeking to do, without forcing them to re-enter their passwords or repeat their questions. There are many other subtleties to master, including context.
For example, a customer trying to connect via social media to voice concerns is not likely to respond well to a default ad or up-sell offering. Omni-channel is increasingly a baseline capability that insurers must establish to achieve digital maturity.

Big data analytics

The application of advanced analytical techniques to large and ever-expanding data sets is also foundational for digital insurers. For instance, predictive analytics can identify suitable products for customers in particular regions and demographic cohorts, going far beyond the rudimentary cross-selling and up-selling approaches used by many insurers. Big data analytics also holds the key to creating personalized user experiences. Analytics that “listen” to customer inputs and recognize patterns can identify opportunities for new products that can be launched quickly to seize market openings. Deep analysis of the customer base may make clear which distribution channels (including individual agents and brokers) are the best fit for certain types of leads, leading to increased sales productivity.

The back-office value proposition for big data analytics can also be built on superior recognition of fraudulent claims, which are estimated at around 10% of all submitted claims, with an impact of approximately $40 billion in the U.S. alone. Reducing that number is an example of how digital transformation efforts can be self-funding. Plus, the analytics capabilities established in anti-fraud units can be extended into other areas of the business.

Big data is also reshaping the risk and compliance space in important ways. As insurers move toward more precise risk evaluations (including the use of data from social channels), they must also be cognizant of shifting regulations regarding data security and consumer privacy. It won’t be easy ground to navigate.

Internet of Things (IoT)

The onset of smart homes gives insurers a unique opportunity to adopt more advanced and effective risk mitigation techniques.
For instance, intelligent sensors can monitor the flow of water running through pipes to protect against losses caused by a broken water pipe. Similar technology can be used to monitor for fire or flood conditions or break-ins at both private homes and commercial properties. The IoT clearly illustrates the new competitive fronts and partnership opportunities for insurers; leading technology and consumer electronics providers have a head start in engaging consumers via smart appliances and thermostats, and consumers may not wish to share the same or additional data with their insurers. Insurers may also be confronted by the data capture and management challenges related to IoT and other connected devices.

Telematics

Sometimes grouped with IoT, data from sensors and telematics devices have applications across the full range of insurance lines:
  • Real-time driver behavior data for automotive insurance
  • Smart appliances — including thermostats and security alarms — within homeowners insurance
  • Fitness trackers for life and health insurance
  • Warehouse monitors and fleet management in commercial insurance
The data streams from these devices are invaluable for more precise underwriting and more responsive claims management, as well as product innovation. Telematics data provides the foundation for usage-based insurance (UBI), which is sometimes called “pay-as-you-drive” or “pay-as-you-live.” Premium pricing could be based on actual usage and driving habits, with discounts linked to miles driven, slow or moderate speeds and safe braking patterns, for instance. Consider, too, how in-vehicle devices enable a fully automated claims process:
  • Telematics data registers an automobile accident and automatically triggers a first notice of loss (FNOL) entry.
  • Claims information is updated through text-based interactions with drivers or fleet managers.
  • Claimants could be offered the opportunity to close claims in 60 minutes or less.
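The usage-based pricing idea described above can be made concrete with a short sketch. Everything here is hypothetical: the weights, thresholds and discount cap are invented for illustration and are not drawn from any actual UBI program, which would rest on far more granular actuarial models.

```python
def ubi_discount(miles_driven, avg_speed_mph, hard_brakes_per_100mi):
    """Toy usage-based-insurance discount: low mileage, moderate speeds
    and smooth braking each earn a share of the discount.
    All weights and thresholds are hypothetical."""
    discount = 0.0
    if miles_driven < 8000:          # low annual mileage
        discount += 0.10
    if avg_speed_mph < 65:           # slow-to-moderate average speeds
        discount += 0.05
    if hard_brakes_per_100mi < 2:    # safe braking pattern
        discount += 0.10
    return round(min(discount, 0.25), 2)   # cap the total discount

def priced_premium(base_premium, telematics):
    """Apply the telematics-derived discount to a base premium."""
    return round(base_premium * (1 - ubi_discount(**telematics)), 2)

safe_driver = {"miles_driven": 6500, "avg_speed_mph": 58,
               "hard_brakes_per_100mi": 1.2}
print(priced_premium(1000.0, safe_driver))  # prints 750.0
```

The point of the sketch is only that telematics turns pricing into a function of observed behavior rather than of cohort averages; a real carrier would fit these factors statistically and refresh them as new driving data streams in.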
Such data could also be used to combat claims fraud, through analysis of the links between the severity of the medical condition and the impact of the accident. Some insurers are already realizing the benefits of safe-driving discounts and more effective fraud prevention. These telematics-driven processes will likely become standard operating procedure for all insurers in the near future.

Voice biometrics and analysis

Audio and voice data may be the most unstructured data of all, but it, too, offers considerable potential value to those insurers that can learn to harness it. A first step is to use voice biometrics to identify customers when they call into contact centers, saving customers the inconvenience of entering policy numbers and passwords, information that may not be readily at hand. Other insurers seeking to better understand their customers may convert analog voice data from call center interactions into digital formats that can be scanned and analyzed to identify customer emotions and adjust service delivery or renewal and cross-selling offers accordingly. Manual quality control typically reviews less than 1% of recordings, which is insufficient; through automation, every recording can be assessed to identify improvement areas.

See also: 4 Rules for Digital Transformation

Drones and satellites

Early-adopting insurers are already using drones and satellites to handle critical tasks in underwriting and claims. In commercial insurance, for instance, drones can conduct site inspections, capturing thermal imagery of facilities or work sites. Their reviews can be as specific as looking for roof cracks, old or damaged boilers and other physical plant defects that can pose claims risks. Within homeowners lines, satellites can capture data to analyze roofs, chimneys and surrounding terrain, so insurers can determine which homeowners they want to underwrite, as well as calculate competitive and profitable premiums.
When linked to digital communications tools, drone and satellite data can even trigger notifications to customers of new price options or policy adjustments. Within claims, drones and satellites can handle many tasks previously handled by human adjusters across all lines of business. Such remote assessments can reduce claims processing time by a considerable degree; the method is particularly effective after floods, fires and natural disasters, where direct assessment is not possible. While many transformation programs that use drones and satellites remain in the experimental stages due to operational challenges, it is possible that they can improve the efficiency and accuracy of underwriting and claims information gathering by 40%.

Blockchain

Blockchain provides a foundation for entirely new business models and product offerings, such as peer-to-peer insurance, thanks to its ability to provide virtual assistance for quoting, claims handling and other tasks. It also provides a new level of information transparency, accuracy and currency, with easier access for all parties and stakeholders in an insurance contract. With higher levels of autonomy and attribution, blockchain’s architectural properties provide a strong digital foundation for mobile-to-mobile transactions; swifter, secure payment models; improved data transparency; and reduced risk of duplication and improved exposure management.

Insurance companies are interested in converting selected policies from an existing book to a peer-to-peer market. A blockchain network is developed as a mechanism for integrating this peer-to-peer market with a distributed transaction ledger, transparent auditability and “smart” executable policies. E-aggregators are another emerging business model likely to gain traction, because they appeal to both insurers and customers.
Insurers can offer better pricing due to reduced commissions compared with a traditional agent-based distribution model, while customers gain freedom to compare different policies based on better information. Of course, e-aggregators (whether fully independent or built through an existing technology platform) will require a sophisticated and robust digital platform for gathering information from different insurance companies to present it to consumers in the context of a clear, intuitive experience. It is also important for insurance companies to transfer information to e-aggregators rapidly; otherwise, there is the risk they will miss out on sales opportunities. This is why blockchain is the right technology for connecting e-aggregators and insurers. To see the full report from EY, click here.

David Connolly


David Connolly is a global insurance digital leader at EY. He leads the EY global insurance digital practice. EY has defined a multitude of technology-enabled business offerings that help insurers quickly launch digital solutions to remain competitive. Connolly is based in Silicon Valley, California.

3 Technology Trends Worth Watching

Many insurers think 20% of their business could be soon lost to insurtech startups, so staying ahead of technology trends is vital.

At a time when many insurers believe that 20% of their business could soon be lost to insurtech startups, and when roughly one-third of insurance industry CIOs said that, if given an extra $5 million to spend, they would spend it on big data or increased data collection, understanding technology trends is critical to gaining an edge. So, let’s look at three of the emerging technologies affecting the insurance sector.

1. Sensors and other data-tracking technologies

In the past, insurance companies and actuaries based their pricing on aggregated data from large numbers of customers. Today, innovations in internet-connected devices such as wearables, auto devices and smart homes are giving insurance companies meaningful data that is specific to individual policyholders. For example, Progressive Insurance created Snapshot, a device a policyholder can install in his or her car that allows Progressive to monitor certain data about the customer's driving habits and to adjust pricing accordingly. Progressive claims to have distributed $600 million in discounts to its policyholders, largely because of data from Snapshot.

See also: 10 Trends at Heart of Insurtech Revolution

Snapshot is just one example of how sensors and data-tracking technology can generate savings for policyholders while, at the same time, making insurers more efficient. As this technology continues to gain adoption, many more sensors will be available to monitor policyholder data on health, autos, homes and more.

2. Drones

Drone technology is a rapidly growing niche in the insurance industry, with some predicting it will reach a yearly value of $6.8 billion in the coming years. This growing interest in drone technology was a driving force behind a recent panel discussion on drones at the Contractor Connection conference in St. Louis. WeGoLook’s COO, Kenneth Knoll, participated in this panel, which was attended by more than 3,000 industry professionals.
Knoll noted that drone technology applies to a wide range of insurance services: roof inspections, underwriting, disaster relief, crop inspections and much more. Consider an order recently received by WeGoLook requesting a scene inspection at a commercial location where an injury occurred. Compared with photos taken from the ground, aerial imagery captured by one of WeGoLook’s licensed drone operators offered the insurer client a much more effective representation of the scene in question.

3. Paperless solutions

Evolving technology also makes it possible for insurers to onboard new clients, handle claims and send notifications in a completely paperless manner. The increased digitization of insurance solutions has the potential to dramatically improve the speed and efficiency with which insurance companies operate. For example, Lemonade, an insurtech company, allows clients to sign up for policies and file claims in less than three minutes, using only a mobile device. Mobile is the new paper: 97% of millennials use smartphones. Carriers that can best cater to paperless, mobile solutions will gain a strong competitive advantage.

See also: The Story Behind the Lemonade Hype

Final Thoughts

Some have argued that we are currently experiencing a fourth industrial revolution powered, in part, by the developments noted above. Sensors, drones and paperless solutions are just a few of the technologies driving this revolution. Carriers must make these types of innovations a priority because they are fundamentally changing the expectations of clients. It’s time for all insurance professionals to acknowledge and embrace this digital transformation.

Robin Roberson


Robin Roberson is the managing director of North America for Claim Central, a pioneer in claims fulfillment technology with an open two-sided ecosystem. As the former CEO and co-founder of WeGoLook, she grew the business to more than 45,000 global independent contractors.

Complexity Theory Offers Insights (Part 1)

The conceptual framework best suited to understanding our networked world is complexity science. It shows how insurance must evolve.

In the first of this series of four segments, we will look at the current state of the risk markets and the insurance industry; the emerging peer-to-peer (P2P) segment of the risk markets; how blockchain technology is enabling a new taxonomy in the risk markets; and what changes may occur as a result of these new technologies and methods.

The spirit of this series comes from the open source movement in the software industry, whose philosophy depends on the transparent and voluntary collaboration of all interested parties. While this work has been kept fairly close to the vest for the past few years, I have taken meetings with two Fortune 500 insurance companies' strategy and venture teams, both of which asked for a proof of concept, as well as with a handful of other large international insurance companies and one of the big four accounting firms. At the other end of the spectrum, I have spoken with other founders of P2P insurance startups around the world, and I have participated in the communities surrounding blockchain technology. This handful of folks has already enjoyed early access to these concepts, and my motivation with this series is to achieve a more level playing field for all parties interested in the future of the risk markets. There are links at the bottom of this article to join the conversation via a LinkedIn group and get access to the whole series.

To begin, let's take a look at the current state of the risk markets. It is important to distinguish between the drivers of economic systems and the impact they have on business models in the industrial age vs. the information age.

See also: Should We Take This Risk?

Hardware and technology were key drivers throughout the industrial age, which saw a growing batch of new technologies, from cars and planes, to computers and smartphones, to industrial robots. Industrial-age business models were almost always “extractionary” in nature.
The business model engages with some market, and it profits by keeping some portion of the market's value: it extracts value from the market. The strategies of the industrial age were:
  • Standardization — interchangeable parts
  • Centralization — big factories, vertical integration, economies of scale
  • Consolidation — an indication that an industry is about to experience a phase change
In the information age, business models almost always embody some creation of a “network effect.” When the business model engages with a market, the individual actors all benefit as more actors engage with the business model. The value creation is usually tied to the network's graph, and it will grow exponentially as the network's density grows: the model creates value for the market rather than extracting value from it. The strategies and efficiency-drivers in the information age are:
  • Cheap connections — enabling multiple paths through the network's graph
  • Low transaction cost — in terms of time, effort and money
  • Lateral scaling — not vertical structures, which will be flattened out (“top down” increases network fragility)
  • Increase in node diversity — and in the ways each node can connect
All of these drivers lead to increasing network density and flow. Things are moving away from large, brittle, centralized organizational structures and toward “distributed,” P2P, “crowd” or “sharing economy” types of organizational structures. Moving away from command-and-control organizational structures is almost impossible for organizations that profit from efficiency gains derived from a centralized effort. It is this attribute of their business model that necessitates startups and new business models coming in and bringing improvements to the market, challenging incumbent economic and business models.

The information age is all about networks (not technology) and building graphs that create positive network effects. The conceptual framework best suited to understanding networks, and the networked world we now live in, is complexity science. The study of complex adaptive systems has grown out of its roots in the 1940s and has proliferated since the 1990s with the explosion of computer networks and social networks.

When looking at complex systems, we start by looking at the system’s graph. To get an idea of what a graph is, let’s look at a few examples of “graph companies.”
  • Facebook built the “social graph” of acquaintances; it did not create acquaintances.
  • Linkedin built the “professional graph” of coworkers and colleagues; it did not create coworkers and colleagues.
  • Google built the “link graph” for topics searched; it did not create back links for the topics searched.
Notice that, in each of these cases, the company built and documented the connections between the things, or nodes, in the network; it did not create the nodes themselves. Those already existed.

To start looking at the risk markets, we must first understand what is being connected or transferred between the nodes (a.k.a. the users). It should be little surprise that, in the risk markets, it is risk that is being transferred between nodes, as when a user transfers risk to an insurance company. In terms of risk graphing, there are currently two dominant graphs, and a third is emerging. Let’s take a look at the graphs that make up the risk markets and the insurance industry.
  1. Insurance — is the “hub and spoke” graph.
  2. Reinsurance — is the decentralized graph connecting risk hubs.
  3. P2P Coverage — will be formalized in a distributed graph. (This is the one that obviously does not yet exist formally; informally, you see people calling parents and friends, or using GoFundMe, their church, their office or other community organizations, to spread risk out laterally.)
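These three graph shapes can be sketched in a few lines of code. The node names and edge sets below are invented solely to illustrate the topology of each graph; they do not model any real market.

```python
# Three toy risk-transfer graphs as adjacency sets (undirected).
# All node names are hypothetical; only the shape of each graph matters.

# 1. Insurance: hub-and-spoke -- every policyholder connects to one carrier.
insurance = {"carrier": {"p1", "p2", "p3", "p4"},
             "p1": {"carrier"}, "p2": {"carrier"},
             "p3": {"carrier"}, "p4": {"carrier"}}

# 2. Reinsurance: decentralized -- risk hubs (carriers) connect to one another.
reinsurance = {"carrierA": {"carrierB", "carrierC"},
               "carrierB": {"carrierA", "carrierC"},
               "carrierC": {"carrierA", "carrierB"}}

# 3. P2P coverage: distributed -- individuals connect laterally to each other.
p2p = {"p1": {"p2", "p3", "p4"}, "p2": {"p1", "p3", "p4"},
       "p3": {"p1", "p2", "p4"}, "p4": {"p1", "p2", "p3"}}

def edge_count(graph):
    """Each undirected edge shows up twice in an adjacency listing."""
    return sum(len(neighbors) for neighbors in graph.values()) // 2

print(edge_count(insurance), edge_count(reinsurance), edge_count(p2p))  # prints 4 3 6
```

Note what the shapes imply: deleting the single "carrier" node disconnects every policyholder in the hub-and-spoke graph, while the distributed P2P mesh stays connected if any one node is removed. That asymmetry is the topological version of centralized efficiency vs. distributed resilience.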
In today’s risk markets, insurance companies act as centralized hubs where risk is transferred to and carried through time. The reinsurance industry graph enables second-degree connections between insurance companies, creating a decentralized graph. In the current industry's combined graph structure, or stack, only these two graphs formally exist. While an insurance company’s ledgers remain a hub where risk is transferred to and carried through time, reinsurance enables those risk hubs to network together, achieving a higher degree of overall system resilience.

See also: Are Portfolios Taking Too Much Risk?

The P2P distributed graph currently exists via informal social methods. Stack all three graphs, and you can observe how total risk is addressed across all three graph types. Each has its strengths and weaknesses, which lead to its existing in its proper place within the risk markets. The fact that insurance as a financial service gets more expensive per $1,000 of coverage as coverage approaches the first dollar of loss means that there is a boundary where insurance's weaknesses outweigh its strengths.

My expectation is that much of the risk currently carried on the hub-and-spoke insurance graph will accrue to the P2P distributed graph, because of improved capital efficiency on small losses via a trend of increasing deductibles. This may lead to some of the risk currently carried on the reinsurance decentralized graph being challenged by centralized insurance. The proportion of total risk — or “market share” — that each graph carries will shift in this phase change.

When people say insurance is dropping the ball, they are expressing a misunderstanding, or poor expectation-setting, about how much of total risk the first two graphs should be absorbing. Users are unhappy that they end up resorting to informal P2P methods to fully cover risk.
To increase the resilience of society’s risk management systems and fill the gaps left by the insurance and reinsurance graphs, we need the third risk distribution graph: a distributed P2P system. Society needs a distributed system that enables the transfer of risk laterally from individual to individual via formalized methods. This P2P service must be able to carry uninsurable risk exposures, such as deductibles, or niche risk exposures that insurance is not well-suited to cover. Much of this activity already occurs today and, in fact, has been occurring since the dawn of civilization. KarmaCoverage.com is designed to formalize these informal methods and enable end users to benefit from the financial leverage created by the system’s network effect on their savings.

When observing a system through the complexity paradigm, another key measure is the system’s balance of resilience vs. efficiency. Resilience and efficiency sit at opposite ends of a spectrum. A system that is 100% resilient will exhibit an excess of redundancy and wasted resources, while a system that is 100% efficient will exhibit an extreme brittleness that lends itself to collapse. When we look at the real world and natural ecosystems, we find that systems tend to self-organize toward a balance of roughly 67% resilient and 33% efficient. Here is a video for more on this optimum balance.

Industrial-age ideas have driven economics as a field of study to over-optimize for efficiency, but economics has in recent years begun to challenge this notion as the field expands into behavioral economics, game theory and complexity economics, all of which shift the focus away from solely optimizing for efficiency and toward more sustainable and resilient systems. In the risk markets, optimizing for resilience should have obvious benefits. Now, let’s take a look at how this applies practically to the risk markets, by looking at those three industry graphs.
Centralized network structures are highly efficient. This is why a user can pay only $1,000 per year for home insurance and, when her home burns down, get several hundred thousand dollars to rebuild. From the user’s point of view, the leverage she achieved via the insurance policy was highly efficient. However, like yin and yang, centralized systems have an inherent weakness: if the single central node in the network (the insurance company) is removed, the entire system collapses. It is this high risk of system collapse that necessitates so much regulation.

In the risk markets, we can observe two continuing efforts to reduce the risk of an insurance system collapse: a high degree of regulation, and the existence of reinsurance markets. The reinsurance markets function as a decentralized graph in the risk markets, and their core purpose is to connect the centralized insurance companies in a manner that ensures their inherent brittleness does not materialize into a “too big to fail” type of event. Reinsurance achieves this increase in resilience by insuring insurance companies on a global scale. If a hurricane or tsunami hits a few regional carriers of risk, those carriers can turn to their reinsurance for coverage of the catastrophic loss. Reinsurance companies are functionally transferring the risk of that region’s catastrophic loss event to carriers in other regions of the globe.

By stacking the two systems' graphs (insurance and reinsurance), the risk markets have improved their ability to transfer risk across society, gaining overall system resilience while still retaining a desired amount of efficiency. Observations of nature reveal what appears to be a natural progression of networks growing in density of connections. It therefore makes sense that the reinsurance industry came into existence after the insurance industry, boosting the risk markets' overall density of connections.
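The leverage in the home-insurance example above is simple arithmetic. A minimal sketch, using the article's $1,000 premium and a hypothetical $300,000 rebuild cost as a stand-in for "several hundred thousand dollars":

```python
premium = 1_000    # annual home insurance premium, from the example above
payout = 300_000   # hypothetical rebuild cost after a total loss

# Leverage: dollars of coverage delivered per dollar of premium paid.
leverage = payout / premium
print(f"{leverage:.0f}x")  # 300x
```

This is the efficiency the centralized graph buys: each premium dollar commands hundreds of dollars of coverage, precisely because risk is pooled through a single hub.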
Along the same line of thought, we would expect the risk markets to continue increasing in density of connections, from centralized to decentralized and further toward distributed. A distributed network in the risk markets will materialize as some form of financial P2P, "crowd” or “sharing economy” coverage service.

A network's density is defined by the number of connections between its nodes: more connections mean a higher density. A distributed network, for example, has a higher density of connections than a centralized network. However, a higher density of connections requires more intense management effort, and there is a limit to how much complexity a centralized management team can successfully organize and control.

See also: 5 Steps to Profitable Risk Taking

When a network’s connections outgrow centralized management’s capacity to control them, the network will begin to self-organize, exhibiting distributed managerial methods. Through this self-organization, a new graph structure of the network’s connections begins to emerge. As this process unfolds, an entirely new macro system structure emerges that shows little resemblance to the system’s prior state, much as a new species emerges through evolution. What results is a macro phase change (aka “disruption”) that does not require any new resource inputs, only a reorganization of existing resources. For example, the macro state of water can go through a phase change and become ice. The micro parts that make up water and ice are the same; the macro state, however, has undergone a phase change, and the nature of the connections between the micro parts has been reorganized.

In his book “Why Information Grows: The Evolution of Order from Atoms to Economies,” MIT’s Cesar Hidalgo explains that, as time marches forward, the amount of information we carry with us increases, and that information ultimately requires a higher density of connections as it grows.
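The density comparison between a centralized and a distributed graph can be made concrete with a small sketch. This is an illustration of the standard graph-density definition, not anything specific to the risk markets:

```python
def hub_and_spoke_edges(n):
    """Centralized graph: one hub connected to each of the other n - 1 nodes."""
    return n - 1

def fully_connected_edges(n):
    """Distributed (fully connected) graph: every node linked to every other."""
    return n * (n - 1) // 2

def density(edges, n):
    """Fraction of all possible connections that are actually present."""
    return edges / (n * (n - 1) // 2)

n = 10
print(density(hub_and_spoke_edges(n), n))    # 9 of 45 possible edges -> 0.2
print(density(fully_connected_edges(n), n))  # 45 of 45 possible edges -> 1.0
```

Note how the gap widens as the network grows: the hub-and-spoke edge count grows linearly with n, while the fully connected count grows quadratically, which is why denser graphs quickly outrun any central manager's capacity to control them.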
This can be understood at the level of an individual, who grows wiser with experience over time. However, as the saying goes, “The more you know, the more you know you don’t know.” In the history of human systems, we have seen families create tribes, tribes create societies, and societies organize firms to achieve cross-society economic work. We are now at the point of needing these firms to create networks of firms that can handle increased complexity and coordination. It is this network of firms that will be achieved via distributed methods, because no individual firm will ever agree to let another single firm be the centralized controller of the whole network, nor could a single firm do so.

In the next segment of this series, we will look more closely at the distributed graph that will become formalized, creating a P2P system in the risk markets. I have started a LinkedIn group for discussion on blockchain, complexity and P2P insurance; feel free to join here: https://www.linkedin.com/groups/8478617 If you are interested in exploring working with KarmaCoverage, please feel free to reach out to me.

Ron Ginn


Ron Ginn is a financial engineer who has focused on “peer-to-peer insurance” since 2013 and who sees blockchain as the enabling technology for scalable trust networks.

The Next Step in Underwriting

Lenders draw data on individuals from all three credit bureaus. Why don't insurers do the same with the three sources of hazard data?

When a person applies for a mortgage in the U.S., credit reports are pulled from all three bureaus: Equifax, Experian and TransUnion. Why? Because a single bureau does not provide the whole story. When you’re lending hundreds of thousands or millions of dollars, it makes sense to find out as much as you can about the people borrowing the money. The lender wants the whole story. When you’re underwriting a property, doesn’t it make the same sense to get more than one perspective on its risk exposure?

Everyone in the natural hazard risk exposure business collects different data, models that data differently, projects that data in different ways and scores the information uniquely. While most companies start with similar base data, how that data gets treated from there varies greatly. When it comes to hazard data, there are also three primary providers: HazardHub, CoreLogic and Verisk. Each company has its own team of hazard scientists and its own way of answering whatever risk questions underwriting and actuarial teams may have. While there are similarities in the answers provided, there are also enough differences, usually on properties with questionable risk exposure, that it makes sense to mitigate your risk by looking at multiple answers. Like the credit bureaus, each company provides a good picture of risk exposure, but, when you combine the data, you get as complete a picture as possible.

See also: Next Generation of Underwriting Is Here

Looking at risk data is becoming more commonplace for insurers. However, if you are looking at a single source of data, it is much more difficult to use hazard risk data to limit your risk and gain competitive advantage. Advances in technology (including HazardHub’s incredibly robust APIs) make it easier than ever to incorporate multi-sourced hazard data into your manual and automated underwriting processes. As an insurer, your risk is enormous.
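One simple way to combine multi-sourced hazard data is to take the most conservative answer across providers. A minimal sketch: the provider names are real, but the response shapes, letter grades and scores below are invented for illustration and do not reflect any provider's actual API:

```python
# Hypothetical per-peril letter grades returned by three hazard-data
# providers for a single property (A = lowest risk, F = highest).
provider_scores = {
    "HazardHub": {"flood": "C", "wildfire": "A"},
    "CoreLogic": {"flood": "B", "wildfire": "A"},
    "Verisk":    {"flood": "D", "wildfire": "B"},
}

RANK = {"A": 0, "B": 1, "C": 2, "D": 3, "F": 4}

def composite(scores_by_provider, peril):
    """Most conservative (worst) grade for a peril across all providers."""
    grades = [scores[peril] for scores in scores_by_provider.values()]
    return max(grades, key=RANK.get)

print(composite(provider_scores, "flood"))     # disagreement -> worst grade wins
print(composite(provider_scores, "wildfire"))
```

Taking the worst grade is only one possible rule; an underwriter might instead average ranks, weight providers by historical accuracy, or simply flag any property where the sources disagree for manual review.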
Using hazard data -- especially multi-sourced hazard data -- provides you with a significantly more robust risk picture than a single source. At HazardHub, we believe in the power of hazard information and the benefits of multi-sourcing. Through the end of July, we’ll append our hazard data onto a file of your choice absolutely free, to let you see for yourself the value of adding HazardHub data to your underwriting efforts. For more information, please contact us.

John Siegman


John Siegman is the co-founder of Hazard Hub, a property risk data company that was acquired by Guidewire in mid-2021. He is now a senior executive at Guidewire helping to lead the direction of the HazardHub solution and guiding P&C insurance clients in innovating their data integration into critical processes.