
Complexity Theory Offers Insights (Part 1)

The conceptual framework best suited to understanding our networked world is complexity science. It shows how insurance must evolve.

In the first of this series of four segments, we will look at the current state of the risk markets and the insurance industry; the emerging peer-to-peer (P2P) segment of the risk markets; how blockchain technology is enabling a new taxonomy in the risk markets; and what changes may occur as a result of these new technologies and methods.

The inspiration for this series comes from the open source movement in the software industry. Key to the open source philosophy is the transparent and voluntary collaboration of all interested parties. While this work has been kept fairly close to the vest for the past few years, I have taken meetings with the strategy and venture teams of two Fortune 500 insurance companies, both of which asked for a proof of concept, as well as with a handful of other large international insurance companies and one of the big four accounting firms. At the other end of the spectrum, I have also spoken with other founders of P2P insurance startups around the world, and I have participated in the communities surrounding blockchain technology. This handful of folks has already enjoyed early access to these concepts, and my motivation with this series is to achieve a more level playing field for all parties interested in the future of the risk markets. There are links at the bottom of this article to join the conversation via a LinkedIn group and get access to the whole series.

To begin, let's take a look at the current state of the risk markets. It is important to distinguish between the drivers of economic systems and the impact they have on business models in the industrial age vs. the information age.

See also: Should We Take This Risk?

Hardware and technology were key drivers throughout the industrial age, which saw a growing batch of new technologies — from cars and planes, to computers and smartphones, to industrial robots. Industrial-age business models were almost always “extractionary” in nature.
The business model engages with some market, and it profits by keeping some portion of the market's value: it extracts value from the market.

The strategies of the industrial age were:
  • Standardization — interchangeable parts
  • Centralization — big factories, vertical integration, economies of scale
  • Consolidation — an indication that an industry is about to experience a phase change
In the information age, business models almost always embody some creation of “network effect.” When the business model engages with a market, the individual actors all benefit as more actors engage with the business model. The value creation is usually tied to a network's graph, and it will grow exponentially as the network's density grows. These models create value for the market rather than extracting value from it.

The strategies and efficiency-drivers in the information age are:
  • Cheap connections — enabling multiple paths through the network's graph
  • Low transaction cost — in terms of time, effort and money
  • Lateral scaling — not vertical structures, which will be flattened out (“top down” increases network fragility)
  • Increase in node diversity — and in the ways each node can connect
All of these drivers lead to increasing network density and flow. Things are moving away from large, brittle, centralized organizational structures and toward “distributed,” P2P, “crowd” or “sharing economy” types of organizational structures. Moving away from command-and-control organizational structures is almost impossible for organizations that profit from efficiency gains derived from a centralized effort. It is this attribute of their business model that necessitates startups and new business models coming in and bringing improvements to the market — challenging incumbent economic and business models.

The information age is all about networks (not technology) and building graphs that create positive network effects. The conceptual framework best suited to understanding networks and the networked world we now live in is complexity science. The study of complex adaptive systems has grown out of its roots in the 1940s and has proliferated since the 1990s with the explosion of computer networks and social networks.

Here is an introduction: When looking at complex systems, we start by looking at the system’s graph. To get an idea of what a graph is, let’s look at a few examples of “graph companies.”
  • Facebook built the “social graph” of acquaintances; it did not create acquaintances.
  • Linkedin built the “professional graph” of coworkers and colleagues; it did not create coworkers and colleagues.
  • Google built the “link graph” for topics searched; it did not create back links for the topics searched.
Notice that, in each of these cases, the company built and documented the connections between the things or nodes in the network and did not create the things or nodes themselves. Those already existed. To start looking at the risk markets, we must first understand what is being connected or transferred between the nodes (a.k.a. the users). It should be of little surprise that, in the risk markets, it is risk that is being transferred between nodes, like a user transferring risk to an insurance company. In terms of risk graphing, there are currently two dominant graphs. A third is emerging. Let’s take a look at the graphs that make up the risk markets and the insurance industry.
  1. Insurance — the “hub and spoke” graph.
  2. Reinsurance — the decentralized graph connecting risk hubs.
  3. P2P Coverage — the distributed graph, which will be formalized. (This is the one that obviously does not exist formally; informally, though, you see people calling parents and friends and using GoFundMe, their church, their office and other community organizations to spread risk out laterally.)
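These three graph shapes can be made concrete with a small sketch. The following Python toy is my illustration, not part of the article or any product code; the node names ("insurer", "user0" and so on) are hypothetical. Each function builds one shape as a set of undirected edges:

```python
# Toy sketch of the three risk-graph shapes as sets of undirected edges.
# Node names ("insurer", "user0", ...) are illustrative only.

def insurance_graph(users):
    """Hub and spoke: every policyholder connects only to one insurer hub."""
    return {frozenset({"insurer", u}) for u in users}

def reinsurance_graph(insurers):
    """Decentralized: insurer hubs networked to one another."""
    return {frozenset({a, b}) for i, a in enumerate(insurers)
            for b in insurers[i + 1:]}

def p2p_graph(peers):
    """Distributed: participants connect laterally, peer to peer."""
    return {frozenset({a, b}) for i, a in enumerate(peers)
            for b in peers[i + 1:]}

users = [f"user{i}" for i in range(5)]
print(len(insurance_graph(users)))   # 5 links, all passing through one hub
print(len(p2p_graph(users)))         # 10 lateral links among the same 5 people
```

Note the structural difference: removing the hub destroys every connection in the first graph, while the peer-to-peer graph survives the loss of any single node. That is the brittleness-vs.-resilience trade-off this series keeps returning to.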
In today’s risk markets, insurance companies act as centralized hubs where risk is transferred to and carried through time. The reinsurance industry graph enables second-degree connections between insurance companies, creating a decentralized graph. In the current industry's combined graph structure, or stack, only these two graphs formally exist. While an insurance company’s ledgers remain a hub where risk is transferred to and carried through time, reinsurance enables those risk hubs to network together, achieving a higher degree of overall system resilience.

See also: Are Portfolios Taking Too Much Risk?

The P2P distributed graph currently exists via informal social methods. Stack all three graphs, and you can observe how total risk is addressed across all three graph types. Each has its strengths and weaknesses, which leads to its existing in its proper place within the risk markets. The fact that insurance gets more expensive per $1,000 of coverage as coverage approaches the first dollar of loss means that, as a financial service, there is a boundary where insurance's weaknesses will outweigh its strengths.

My expectation is that much of the risk currently being carried on the hub-and-spoke insurance graph will accrue to the P2P distributed graph because of improved capital efficiency on small losses via a trend of increasing deductibles. This may lead to some of the risk currently carried on the reinsurance decentralized graph being challenged by centralized insurance. The proportion of total risk — or “market share” — that each graph carries will shift in this phase change. When people say insurance is dropping the ball, they are expressing that there is a misunderstanding or poor expectation-setting about how much of total risk the first two graphs should be absorbing. Users are unhappy that they end up resorting to informal P2P methods to fully cover risk.
To increase the resilience of society’s risk management systems and fill the gaps left by the insurance and reinsurance graphs, we need the third risk distribution graph: a distributed P2P system. Society needs a distributed system that enables the transfer of risk laterally from individual to individual via formalized methods. This P2P service must be able to carry uninsurable risk exposures, such as deductibles, or niche risk exposures that insurance is not well-suited to cover. Much of this activity already occurs today and, in fact, has been occurring since the dawn of civilization. KarmaCoverage.com is designed to formalize these informal methods and enable end users to benefit from financial leverage created by the system’s network effect on their savings.

When observing a system through the complexity paradigm, another key measure to observe is a system’s level of resilience vs. efficiency. Resilience and efficiency sit on opposite sides of a spectrum. A system that is 100% resilient will exhibit an excess of redundancy and wasted resources, while a system that is 100% efficient will exhibit an extreme brittleness that lends itself to system collapse. When we look at the real world and natural ecosystems as an example, we find that systems tend to self-organize toward a balance of roughly 67% resilient and 33% efficient. Here is a video for more on this optimum balance.

Industrial-age ideas have driven economics as a field of study to over-optimize for efficiency, but economics has, in recent years, begun to challenge this notion as the field expands into behavioral economics, game theory and complexity economics — all of which shift the focus away from solely optimizing for efficiency and toward optimizing for more sustainable and resilient systems. In the risk markets, optimizing for resilience should have obvious benefits. Now, let’s take a look at how this applies practically to the risk markets, by looking at those three industry graphs.
Centralized network structures are highly efficient. This is why a user can pay only $1,000 per year for home insurance and, when her home burns down, get several hundred thousand dollars to rebuild. From the user’s point of view, the amount of leverage she was able to achieve via the insurance policy was highly efficient. However, like yin and yang, centralized systems have an inherent weakness: If the single central node in the network (the insurance company) is removed, the entire system will collapse. It is this high risk of system collapse that necessitates so much regulation.

In the risk markets, we can observe two continuing efforts to reduce the risk of an insurance system collapse: a high degree of regulation and the existence of reinsurance markets. The reinsurance markets function as a decentralized graph in the risk markets, and their core purpose is to connect the centralized insurance companies in a manner that ensures their inherent brittleness does not materialize as a “too big to fail” type of event. Reinsurance achieves this increase in resilience by insuring insurance companies on a global scale. If a hurricane or tsunami hits a few regional carriers of risk, those carriers can turn to their reinsurance for coverage on the catastrophic loss. Reinsurance companies are functionally transferring the risk of that region’s catastrophic loss event to insurance carriers in other regions of the globe.

By stacking the two systems’ graphs (insurance and reinsurance), the risk markets' ability to successfully transfer risk across society has improved overall system resilience while still retaining a desired amount of efficiency. Observations of nature reveal what appears to be a natural progression of networks that grow in density of connections. Therefore, it makes sense that the reinsurance industry came into existence after the insurance industry, boosting the risk markets' overall density of connections.
Along the same line of thought, we would expect to see the risk markets continue to increase in density of connections, from centralized to decentralized and further toward distributed. A distributed network in the risk markets will materialize as some form of financial P2P, “crowd” or “sharing economy” coverage service. A network's density is defined by the number of connections between the nodes: More connections between nodes mean the network has a higher density. For example, a distributed network has a higher density of connections than a centralized network. However, a higher density of connections requires more intense management effort, and there is a limit to how much complexity a centralized management team can successfully organize and control.

See also: 5 Steps to Profitable Risk Taking

When a network’s connections outgrow centralized management’s capacity to control them, the network will begin to self-organize or exhibit distributed managerial methods. Through this self-organization, a new graph structure of the network’s connections will begin to emerge. As this process unfolds, an entirely new macro system structure will emerge that shows little resemblance to the system’s prior state, much like a new species through evolution. What emerges is a macro phase change (aka “disruption”) that does not necessitate any new resource inputs, only a reorganization of the resources. For example, the macro state of water can go through a phase change and become ice. The micro parts that make up water and ice are the same. The macro state, however, has undergone a phase change, and the nature of the connections between the micro parts will have been reorganized.

In his book “Why Information Grows: The Evolution of Order from Atoms to Economies,” MIT’s Cesar Hidalgo explains that, as time marches forward, the amount of information we carry with us increases. That information ultimately requires a higher density of connections as it grows.
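The two quantitative claims here — density as a share of possible connections, and connections outgrowing centralized control — follow from the standard formula for an undirected network. A quick sketch (generic math, nothing specific to the risk markets):

```python
# Density and growth of pairwise connections in an undirected network.

def possible_edges(n):
    # n nodes can form at most n*(n-1)/2 undirected connections
    return n * (n - 1) // 2

def density(edge_count, n):
    # share of possible connections actually present
    return edge_count / possible_edges(n)

# Hub and spoke with 1 hub + 9 spokes has only 9 of 45 possible edges:
print(density(9, 10))                 # 0.2 -> sparse, centralized
# A fully distributed network of the same 10 nodes can have all 45:
print(density(45, 10))                # 1.0 -> maximal density
# Possible connections grow quadratically, outpacing any central manager:
print(possible_edges(100), possible_edges(1000))   # 4950 499500
```

The quadratic growth is the point: multiply the nodes by 10 and the possible connections grow by roughly 100, which is why centralized control eventually gives way to self-organization.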
This can be understood at the level of an individual who grows wiser with experiences over time. However, as the saying goes, “The more you know, the more you know you don’t know.” In the history of human systems, we have observed the need for families to create a tribe, for tribes to create a society and for societies to organize firms that achieve cross-society economic work. We are now at the point of needing these firms to create a network of firms that can handle increased complexity and coordination. It is this network of firms that will be achieved via distributed methods, because no individual firm will ever agree to let another single firm be the centralized controller of the whole network — nor could a single firm do so.

In the next segment of this series, we will look more closely at the distributed graph that will become formalized, creating a P2P system in the risk markets. I have started a LinkedIn group for discussion on blockchain, complexity and P2P insurance. Feel free to join here: https://www.linkedin.com/groups/8478617 If you are interested in exploring working with KarmaCoverage, please feel free to reach out to me.

Ron Ginn

Ron Ginn is a financial engineer who has focused on “peer-to-peer insurance” since 2013 and who sees blockchain as the enabling technology for scalable trust networks.

The Next Step in Underwriting

Lenders draw data on individuals from all three credit bureaus. Why don't insurers do the same with the three sources of hazard data?

When a person applies for a mortgage in the U.S., credit reports are pulled from all three bureaus -- Equifax, Experian and TransUnion. Why? Because a single bureau does not provide the whole story. When you’re lending hundreds of thousands or millions of dollars, it makes sense to find out as much as you can about the people borrowing the money. The lender wants the whole story. When you’re underwriting the property, doesn’t it make sense to get more than one perspective on its risk exposure?

Everyone in the natural hazard risk exposure business collects different data, models that data differently, projects that data in different ways and scores the information uniquely. While most companies start with similar base data, how it gets treated from there varies greatly. When it comes to hazard data, there are also three primary providers: HazardHub, CoreLogic and Verisk. Each company has its team of hazard scientists and its own way of providing an answer to whatever risk underwriting and actuarial could be concerned with. While there are similarities in the answers provided, there are also enough differences -- usually in properties with questionable risk exposure -- that it makes sense to mitigate your risk by looking at multiple answers. Like the credit bureaus, each company provides a good picture of risk exposure, but, when you combine the data, you get as complete a picture as possible.

See also: Next Generation of Underwriting Is Here

Looking at risk data is becoming more commonplace for insurers. However, if you are looking at a single source of data, it is much more difficult to use hazard risk data to limit your risk and provide competitive advantage. Advances in technology (including HazardHub’s incredibly robust APIs) make it easier than ever to incorporate multi-sourced hazard data into your manual and automated underwriting processes. As an insurer, your risk is enormous.
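To make the multi-sourcing idea concrete, here is a hypothetical sketch of combining three providers' scores for a single property. The provider names, the 0-100 scale and the merge rules are my assumptions for illustration, not any vendor's actual API:

```python
# Hypothetical sketch: merge per-provider hazard scores for one property.
# Scale assumed: 0 = lowest risk, 100 = highest. Names are made up.

def combined_view(scores):
    """Combine provider scores into one underwriting view.
    Disagreement between providers is itself a useful signal."""
    values = list(scores.values())
    return {
        "worst_case": max(values),            # conservative underwriting view
        "average": sum(values) / len(values), # blended view
        "spread": max(values) - min(values),  # big spread -> refer to a human
    }

view = combined_view({"provider_a": 30, "provider_b": 35, "provider_c": 80})
print(view["worst_case"], view["spread"])   # 80 50
```

In this toy example, two providers score the property as low risk while the third flags it as high risk; the spread of 50 points is exactly the "questionable risk exposure" case where looking at multiple answers pays off.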
Using hazard data -- especially multi-sourced hazard data -- provides you with a significantly more robust risk picture than a single source. At HazardHub, we believe in the power of hazard information and the benefits of multi-sourcing. Through the end of July, we’ll append our hazard data onto a file of your choice absolutely free, to let you see for yourself the value of adding HazardHub data to your underwriting efforts. For more information, please contact us.

John Siegman

John Siegman is the co-founder of Hazard Hub, a property risk data company that was acquired by Guidewire in mid-2021. He is now a senior executive at Guidewire helping to lead the direction of the HazardHub solution and guiding P&C insurance clients in innovating their data integration into critical processes.

Key Trends in Innovation (Part 6)

The ability to combine innovation with a model that improves the way individuals perceive and interact with insurance is critical.

This article is the fifth in a series on key forces shaping the insurance industry. Parts One, Two, Three and Four/Five can be found here, here, here and here.

Trend #6: Delivering on the customer promise is the key

The ability to dynamically innovate (new risk pools, new segments, new channels) and deliver on the customer promise will become the most important competitive advantage as known risks continue to get commoditized and move to the direct channels. One of the key forces driving the growth of insurtech is the current lack of engagement and, in some instances, the lack of trust between consumers and the industry. In our view, the ability to combine innovation with a model that improves the way individuals perceive and interact with insurance is critical in driving value creation.

Claims is at the heart of the customer promise

At the heart of the customer promise is the claims process. In many ways, it’s surprising that innovation in this area has been relatively modest given its economic importance (representing 60% to 70% of the overall cost base) and its importance in driving customer satisfaction. Our analysis has shown that the difference in renewal rates between a poor and a positive claims experience is as high as 50%.

See also: Can Insurance Innovate?

In most cases, claims acceptance rates are very high (often more than 90%). Despite this, individuals are skeptical. This is driven, in part, by the approach taken by the industry. As a minor example, often while you wait for your call to a claims center to be answered, you are reminded of the consequences of making a fraudulent claim. The insurance company seems to imply the most likely scenario is a fraudulent one. The interaction is off to a bad start before it’s even properly begun. We have positioned claims innovation as one of our four key pillars, with a solution built around a customer-managed claims platform called RightIndem.
RightIndem allows an inefficient analog process to be converted into a digital one, resulting in significant cost savings for the insurer — both in claims handling costs and cost of claim. More importantly, though, the platform significantly enhances customer satisfaction by placing the customer at the heart of the process and by being easy, mobile-enabled, transparent and quick. (As an example, total loss motor claims were settled in a matter of days rather than the 21-day average that exists in the U.K.)

New models, new channels, new risk pools

Customer engagement is another important area of innovation, with several new approaches being tested. It’s important that insurance becomes more relevant and tailored to individuals' needs and circumstances. As this happens, insurance moves from being a “sell” decision to a “buy” decision. Our earlier article on just-in-time insurance explored some of these trends in more detail. Innovation is also allowing new ways to interact with customers, and we see potential in solutions that enable insurance at point of sale or point of demand. In addition, new risk pools that allow niche or tailored solutions make insurance more relevant to individuals.

Finally, there are a number of models that are looking to change the nature of the customer promise. Insure A Thing (IAT), for example, has turned the traditional insurance process on its head. Rather than pay an upfront premium, customers are placed in affinity groups where the cost of claims is shared among members, with a safety net provided by an insurance carrier. All parts of the value chain are aligned; IAT only earns fees when it pays claims. Innovation, if insurers embrace it, will allow them to fundamentally change the customer dynamic and to create a new value proposition that is truly appreciated and valued by the customer.
See also: InsurTech: Golden Opportunity to Innovate   We hope you enjoy these insights, and we look forward to collaborating with you as we create a new insurance future. Next article in the series: Trend #7: Internal innovation, incubation and maturing of capabilities will no longer be the optimal option; dynamic innovation will require aggressive external partnerships and acquisitions.

Sam Evans

Sam Evans is founder and general partner of Eos Venture Partners. Evans founded Eos in 2016. Prior to that, he was head of KPMG’s Global Deal Advisory Business for Insurance. He has lived in Sydney, Hong Kong, Zurich and London, working with the world’s largest insurers and reinsurers.

Forget Big Data; You Need Fast Data

You need to be able to apply big data analytics in near-real or even real time while engaging with a customer or another computer.

In 1989, Queen released a very successful single called “I Want It All.” The opening repeats the song title twice, then changes subtly to “and I want it now!” This could be a battle cry for today’s fast-moving society. We’ve all come to expect a rapid response to our requests for service, and we’ve become impatient with those who can’t deliver. We even watch kettles heat up and wonder why they take so long to boil, and we stand and complain about queue lengths.

Whereas consumers might take some comfort (or the opposite) in knowing that most companies they deal with hold vast amounts of data about them, all of this data is historic and, actually, very little is used productively. Yet we are increasingly engaged in real-time conversations with companies, either via a mobile app, our PCs or the good old-fashioned telephone, providing real-time data about a need or a problem. So why aren’t companies, by and large, capturing and acting on that data in real time while they are interacting with their customers? The simple explanation is that acting on data captured in real time is beyond the means of most of the systems built by these companies, and it’s not a trivial matter to change, given that this inevitably means tinkering with legacy systems.

See also: Producing Data’s Motion Pictures

But there is a solution in sight, and it’s called "fast data." Fast data is the application of big data analytics to smaller data sets in near-real or real time to solve a problem or create business value while engaging with a customer or another computer. Fast data is not a new idea, but embracing it is about to become very important.

A Fast Data Architecture

What high-level requirements must a fast data architecture satisfy? They form a triad:
  1. Reliable data ingestion.
  2. Flexible storage and query options.
  3. Sophisticated analytics tools.
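As a toy illustration of the triad — ingest events as they arrive, keep a small queryable state, apply analytics immediately — here is a minimal Python sketch. The event shape, window size and threshold are my assumptions, not the architecture from any vendor's chart:

```python
# Minimal fast-data sketch: act on a small rolling window of events as
# they arrive, instead of batch-querying the full history afterward.

from collections import deque

class FastDataWindow:
    def __init__(self, size=5, alert_threshold=-100):
        self.window = deque(maxlen=size)   # only the most recent events
        self.alert_threshold = alert_threshold

    def ingest(self, amount):
        """Called per transaction; returns an alert in real time if the
        recent net flow suggests the customer needs help."""
        self.window.append(amount)         # 1. reliable ingestion (toy)
        net = sum(self.window)             # 2. query the small state
        if net < self.alert_threshold:     # 3. analytics at ingest time
            return f"alert: net flow {net} over last {len(self.window)} events"
        return None

stream = FastDataWindow()
for tx in (20, -30, -40, -60, 15):
    alert = stream.ingest(tx)
    if alert:
        print(alert)
```

The point of the sketch is the shape, not the rule: the decision is made inside the ingest path, while the customer interaction is still live, rather than in an overnight batch job.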
The components that meet these requirements must also be reactive, meaning they scale up and down with demand; are resilient against the failures that are inevitable in large distributed systems (we don’t want any failures on autonomous cars!); always respond to service requests, even if failures limit the ability to deliver services; and are driven by messages or events from the world around them. The chart below shows an emerging architecture that can meet these requirements. The good news is that you can graft such an architecture on top of legacy systems, which is exactly what ING has been doing.

Unlocking valuable intelligence

Back in the halcyon days, banks were very close to their customers. They knew customers intimately and treated them personally. With the proliferation of customers, products and channels, though, this intimacy has been lost. ING wanted to recapture the "golden era" with a global strategy to make the bank more customer-focused, "mobile first" and altogether more helpful.

A typical bank these days captures and processes billions of customer requests, instructions and transactions. In doing so, banks capture and store vast amounts of customer data – but, and here’s the startling truth, few (if any) of the major banks use this data effectively for the benefit of their customers. ING appointed a manager of fast data, Bas Geerdink, to address this problem. His broad international remit is to create a truly customer-friendly, omni-channel experience. To kick-start this process, he turned his attention to ING's vast but disparate data stores, as he was convinced they could unlock valuable intelligence. Historical data can often reveal customer behaviors and trends that are crucial to predictive analytics. For example, past data can be used to plot future pressure points on personal finances – e.g., key payment events can be anticipated and mitigated with predictive analytics. However, mining this data presents major challenges.
Most banks are hampered by disparate and disconnected legacy applications that cannot operate in real time. Confronted with this problem, ING made some fundamental decisions:
  1. Create a single, secure data lake.
  2. Employ a variety of open source technologies (along the lines of those shown in the chart above). These technologies were used to build the over-arching notifications platform to enable data to be captured and acted on in real time.
  3. Work with the legacy application teams to ensure that critical events (during a customer’s "moment of truth") are notified to this fast data platform.
  4. Trigger two vital platform responses: (a) instantly contact the customer to establish whether help is urgently needed (for example, to complete a rapid loan application); (b) run predictive analytics to decide whether the customer needs to be alerted.
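Step 4(b) can be sketched as a toy rule: project the balance across scheduled payments and alert on the first predicted shortfall. The names, event shapes and figures below are illustrative assumptions, not ING's implementation:

```python
# Hypothetical sketch of a predictive "pressure point" check: a critical
# event arrives from a legacy system, and the platform decides whether
# the customer should be alerted. All names and rules are assumptions.

def predict_pressure_point(balance, scheduled_payments):
    """Walk upcoming (day, amount) payments in date order; return the
    first day on which the projected balance would go negative."""
    for day, amount in sorted(scheduled_payments):
        balance -= amount
        if balance < 0:
            return day
    return None

def handle_event(balance, scheduled_payments):
    day = predict_pressure_point(balance, scheduled_payments)
    if day is not None:
        return f"notify customer: projected shortfall on day {day}"
    return "no action"

print(handle_event(500, [(3, 200), (10, 250), (20, 300)]))
# -> notify customer: projected shortfall on day 20
```

The decision of whether (and when) to actually send the notification would sit on top of a rule like this, which is where the customer-sensitivity considerations discussed below come in.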
The future role of banks

Partly in response to the Open Banking directive, the bank is now opening up its data to third parties that have been authorized by customers to process certain transactions on their behalf (e.g., paying bills). This is a fascinating development with potentially far-reaching implications. It raises a question about the future role of banks. For example, would the rise of nimble, tech-driven third parties reduce banks to mere processing utilities? ING is determined not to be marginalized, which is why it has invested in this fast data platform and is building real-time predictive apps – both on its own and with third parties (such as Yolt). It is a bold and very radical strategy – and, not surprisingly, it raises some searching questions.

Hearing this story made me wonder what types of customer would most welcome this kind of service, and whether there was any risk of alienating less technology-literate customers. The bank doesn’t yet have definitive answers to these questions. However, ING is adamant that all technology-driven initiatives must have universal appeal, and that is why ING is introducing change on a very gradual, phased basis.

See also: When Big Data Can Define Pricing (Part 2)

In the first instance, ING is testing these services on employees of the bank and then on beta test groups of (external) customers. To date, feedback has been extremely positive, and this has encouraged the bank to keep investing. However, Bas emphasizes the need to appreciate customer sensitivities and preferences. For example, there is a fine line between providing a valued service and becoming intrusive – that is why the bank specifically considers factors such as the best, most receptive time of day to make interventions (if at all). Fraud detection is another intriguing development where fast data is having a significant impact. At the moment, traditional fraud detection systems often lack finesse.
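One reason fraud alerts lack finesse is base rates: when fraud is rare, even a fairly accurate detector flags mostly legitimate transactions. A quick sketch makes this concrete (the rates below are illustrative assumptions, not ING figures):

```python
# Base-rate sketch: of all transactions a fraud system flags, what
# fraction are actually legitimate (i.e., false positives)?

def false_positive_share(fraud_rate, sensitivity, false_alarm_rate):
    flagged_fraud = fraud_rate * sensitivity            # true positives
    flagged_legit = (1 - fraud_rate) * false_alarm_rate # false positives
    return flagged_legit / (flagged_fraud + flagged_legit)

# Assume 0.1% of transactions are fraudulent, the system catches 95% of
# fraud, and it wrongly flags 1% of legitimate transactions:
share = false_positive_share(0.001, 0.95, 0.01)
print(round(share, 3))   # ~0.913 -> roughly 9 in 10 alerts are false
```

Under these assumed rates, about 91% of alerts are false positives even though the detector itself looks accurate, which is consistent with the kind of figure quoted below; better real-time data improves the inputs rather than repealing the arithmetic.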
When a customer attempts to use a credit card, it can trigger a false positive 90% of the time (or even more). This can be inconvenient both for the bank and especially for the customer (although a false positive is not always perceived in a negative way – it shows the bank is checking money flows). ING is hopeful that its fast data platform will radically reduce the level of false positives as well as the level of fraud.

Other applications of fast data

I’m aware that Capital One has deployed a fast data service and is now able to authorize a car loan in seconds – instant on-the-line confirmation that massively improves the customer experience. Yet I’ve also heard of instances where data is anything but fast! Take the Lloyd's insurance market. Currently, some full risk assessments for specialist insurance are completed two weeks after prices have been quoted – quite clearly, this is a risk too far!

We can also see applications in places like the police and military, who often have to capture and act upon a variety of data sources, in real time, in often hazardous and difficult circumstances. Fast data analytics could be used, for example, to predict when supplies of ammunition will run out and to trigger immediate air drops to front-line troops. The opportunities to change lives with fast data are enormous. Luckily, it’s becoming easier and easier to achieve. The time to start is now.

Robert Baldock

Robert Baldock has been conceiving and delivering innovative solutions to major institutions for all of his 40 working years. He is a serial entrepreneur in the IT field. Today, he is the managing director of Clustre, an innovation broker.

The Big Lesson From Amazon-Whole Foods

As grocers just saw, the incursion by Amazon is the new nature of disruption: Disruptive competition comes out of nowhere.

I doubt that Google and Microsoft ever worried about the prospect that a book retailer, Amazon, would come to lead one of their highest-growth markets: cloud services. And I doubt that Apple ever feared that Amazon’s Alexa would eat Apple’s Siri for lunch. For that matter, the taxi industry couldn’t have imagined that a Silicon Valley startup would be its greatest threat, and AT&T and Verizon surely didn’t imagine that a social media company, Facebook, could become a dominant player in mobile telecommunications. But this is the new nature of disruption: Disruptive competition comes out of nowhere. The incumbents aren’t ready for this and, as a result, the vast majority of today’s leading companies will likely become toast in a decade or less.

Note the march of Amazon. First it was bookstores, publishing and distribution, then cleaning supplies, electronics and assorted home goods. Now, Amazon is set to dominate all forms of retail as well as cloud services, electronic gadgetry and small-business lending. And the proposed acquisition of Whole Foods sees Amazon breaking down the barriers between the digital and physical realms.

See also: Huge Opportunity in Today’s Uncertainty

This is the type of disruption we will see in almost every industry over the next decade, as technologies advance and converge and turn the incumbents into toast. We have experienced the advances in our computing devices, with smartphones having greater computing power than yesterday’s supercomputers. Now, every technology with a computing base is advancing on an exponential curve — including sensors, artificial intelligence, robotics, synthetic biology and 3-D printing. And when technologies converge, they allow industries to encroach on one another. Uber became a threat to the transportation industry by taking advantage of the advances in smartphones, GPS sensors and networks. Airbnb did the same to hotels by using these advancing technologies to connect people with lodging.
Netflix’s ability to use internet connections put Blockbuster out of business. Facebook’s WhatsApp and Microsoft’s Skype helped decimate the costs of texting and roaming, causing an estimated $386 billion loss to telecommunications companies from 2012 to 2018. Similarly, having proven the viability of electric vehicles, Tesla is building batteries and solar technologies that could shake up the global energy industry.

Now, tech companies are building sensor devices that monitor health. With artificial intelligence, these will be able to provide better analysis of medical data than doctors can. Apple’s ResearchKit is gathering so much clinical-trial data that it could eventually upend the pharmaceutical industry by correlating the effectiveness and side effects of the medications we take.

As well, Google, Facebook, SpaceX and OneWeb are in a race to provide Wi-Fi internet access everywhere through drones, microsatellites and balloons. At first, they will use the telecom companies to provide their services; then they will turn the telecom companies into toast. The motivation of the technology industry is, after all, to have everyone online all the time. The industry's business models are to monetize data rather than to charge cell, data or access fees. They will also end up disrupting electronic entertainment—and every other industry that deals with information.

The disruptions don’t happen within an industry, as business executives have been taught by gurus such as Clayton Christensen, author of the management bible “The Innovator’s Dilemma”; rather, the disruptions come from where you would least expect them. Christensen postulated that companies tend to ignore the markets most susceptible to disruptive innovations because these markets usually have very tight profit margins or are too small, leading competitors to start by providing lower-end products and then scale them up, or to go for niches in a market that the incumbent is ignoring.
But the competition no longer comes from the lower end of a market; it comes from other, completely different industries. The problem for incumbents, the market leaders, is that they aren’t ready for this disruption and are often in denial. Because they have succeeded in the past, companies believe that they can succeed in the future, that old business models can support new products.

Large companies are usually organized into divisions and functional silos, each with its own product development, sales, marketing, customer support and finance functions. Each division acts from self-interest and focuses on its own success; within a fortress that protects its ideas, it has its own leadership and culture. And employees focus on the problems of their own divisions or departments, not on those of the company. Too often, the divisions of a company consider their competitors to be the company’s other divisions; they can’t envisage new industries or see the threat from other industries.

This is why the majority of today’s leading companies are likely to go the way of Blockbuster, Motorola, Sears and Kodak, which were at the top of their game until their markets were disrupted, sending them toward oblivion.

See also: How to Respond to Industry Disruption

Companies now have to be on a war footing. They need to learn about technology advances and see themselves as a technology startup in Silicon Valley would see them: as a juicy target for disruption. They have to realize that the threat may arise in any industry, with any new technology. Companies need all hands on board, with all divisions working together and employing bold new thinking to find ways to reinvent themselves and defend themselves from the onslaught of new competition. The choice that leaders face is to disrupt themselves—or to be disrupted.

Vivek Wadhwa

Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.

Has Insurtech Jumped the Shark?

If we haven't reached peak-hype yet, then we surely can't be that far off. But the need to change is still very real.

On Sept. 20, 1977, Happy Days broadcast its season five premiere. The central characters visited Los Angeles and, having had his bravery questioned, Fonzie took to the water (still wearing his leather jacket, of course) on water skis. And jumped over a shark. Even at the time, the scene was immediately seen for what it was: a credulity-stretching ratings ploy that revealed the network's desperation to win back an audience for a show that had run out of ideas. Over the years, the concept of "jumping the shark" evolved into an idiom to describe that moment when any idea, brand, design or franchise demonstrably loses its way.

Could it be applied to the insurtech industry today, I wonder? Certainly, the numbers seem to be pointing in the wrong direction. Insurtech investment was down 35% in 2016 vs. 2015, from $2.6 billion to $1.9 billion, according to CB Insights. The trend accelerated in the first quarter of 2017, with insurtech funding down 64% vs. 2016 to $283 million. The market's collective pulse can hardly be said to be racing.

For those of us who lived through the dot-com boom (and bust), there is also a depressingly familiar echo between how corporates reacted to the emergence of the internet then and what is happening now. Hardly a day goes by, it seems, without yet another corporate incubator or venture fund being announced or newly minted chief digital officer (whatever they are) being appointed. And while the (often dumb) money continues to pour in to ever more outlandishly named startups, the media is falling over itself to write the incumbents’ obituaries and crown their sneaker-wearing young pretenders. If we haven't reached peak hype yet, then we surely can't be that far off.

Of course, we shouldn't ignore the insurance industry's ability to remain resolutely analog in a digital world, insulated from reality thanks to the formidable barriers to disruption that are regulation, brand, customer base and balance sheet.
I am reliably told that two trucks a day still leave Lloyd's for an offsite document storage facility, loaded to the gunwales with paper, while another comes back the other way…

Dig a little deeper, however, and a different picture emerges. The 2015 numbers were arguably distorted by two huge one-off investments (totaling $1.4 billion) in Zenefits and Zhong-An. Ignore those, and the underlying growth story remains compelling, with insurtech investments between 2010 and 2016 growing at a CAGR of more than 50%.

See also: FinTech: Epicenter of Disruption (Part 1)

Importantly, insurtech, for so long fintech's poor relation, is closing the investment gap. Analysis by CB Insights shows that the ratio of total fintech to insurtech investments has more than halved, from 9.1 to 1 in 2014 to 4.5 to 1 in 2016, as investors wake up to the opportunities on offer. Also encouraging is where that insurtech investment is being made. While 67% of insurtech investments between 2012 and 2016 were in the U.S., that proportion shrank to 47% in the first quarter of 2017. A swallow does not a summer make, but other data suggests that this is consistent with a growing diversification of insurtech investment away from the U.S. to other insurance markets, in particular Europe.

Of course, investment is only one window on the insurtech story. And if there is a surprise, it is perhaps that the numbers are not much, much larger, given the size of the industry, the opportunities on offer for new entrants and the stakes at play for the incumbents. There is some confusion, however, as to exactly what the nature of the insurtech opportunity is, particularly on the P&C side of the industry, which is arguably where the most immediate focus should be. Some talk of the potential for robotics to drive operational efficiency, particularly in the claims process. This may well be true, but to my mind isn't really insurtech.
This is just the insurance industry waking up to the potential of process automation. Most other parts of the financial services industry got there at least 10 years ago.

Others talk of the impact of driverless cars and how this will slash motor premiums, as vehicles become inherently less prone to crash and the liability burden shifts to software manufacturers. Or how 3-D printing will decimate the trade indemnity market as products are printed locally rather than shipped internationally. This may well be true, but isn't insurtech. This is simply the impact of new technologies on different parts of the global insurance premium pool.

Some talk of the rise of cyber risk and drones and how this will create new categories of risk. Again, this may well be true, but isn't insurtech. This is just the emergence of new classes of risk that the market will assess, price and refine over time, as it has always done.

To understand what the P&C insurtech opportunity truly represents, you have to strip the insurance industry back to its fundamentals. On this basis, insurance, at its core, could be said to be simply the flow of money and data. Money to pay premiums and pay claims. Data to price risk and analyze claims. Accept this, and the beating heart of the insurtech opportunity lies in three main areas: distribution, underwriting and claims.
  1. Distribution, in terms of i) using technology to identify, attract and convert clients far more effectively than before and ii) delivering a far better customer experience that more closely matches expectations of convenience, access and transparency formed through people's interactions with leading online brands and services. Look at the rise of peer-to-peer insurance platforms such as Lemonade or Guevara, for example, and the emergence of products based on actual usage rather than an annual policy, such as those sold by Trov and Metromile. The change in distribution will be marked by an increasing shift from insurance being viewed as a grudge purchase to being truly optional.
  2. Underwriting in terms of a revolution in the way that data is used to accurately price risk at the individual level, using not just historic information but a continuous stream of data that enables live pricing based on actual risk and usage. Gone are the days when a risk might be underwritten based on five data points and a couple of tickets to Wimbledon. An MGA I met the other day is using more than 1,000 data points in its rating engine, sourced for free through public information, to make tens of thousands of individual underwriting decisions in milliseconds.
  3. Claims in terms of using technology to deliver significant efficiencies in how quickly claims are handled and resolved and through the application of advanced analytics to reduce fraud. And of course if you underwrite better, you will in any case have fewer claims!
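The underwriting shift described in point 2, pricing from a continuous stream of many data points, can be sketched in miniature. Everything below (feature names, weights, numbers) is invented for illustration; a real rating engine would be actuarially calibrated on far more data:

```python
# Hypothetical sketch of a live rating engine: many input signals, each
# with a weight, combined into a risk multiplier applied to a base
# premium. All feature names, weights and values are invented.

BASE_PREMIUM = 500.0

# weight > 0 raises the price, weight < 0 lowers it
WEIGHTS = {
    "years_claim_free": -0.03,
    "annual_mileage_thousands": 0.02,
    "night_driving_share": 0.40,
    "hard_braking_per_100km": 0.05,
}

def live_price(signals):
    """Combine weighted signals into a multiplier on the base premium.
    The multiplier is floored at 0.5 so discounts stay bounded."""
    multiplier = 1.0
    for name, value in signals.items():
        multiplier += WEIGHTS.get(name, 0.0) * value
    return round(BASE_PREMIUM * max(multiplier, 0.5), 2)

careful = {"years_claim_free": 8, "annual_mileage_thousands": 6,
           "night_driving_share": 0.05, "hard_braking_per_100km": 1}
risky = {"years_claim_free": 0, "annual_mileage_thousands": 25,
         "night_driving_share": 0.40, "hard_braking_per_100km": 9}

print(live_price(careful))  # discounted vs. the base premium
print(live_price(risky))    # loaded vs. the base premium
```

The point of the sketch is only the shape of the calculation: each new telematics or IoT signal is one more weighted term, so repricing on a fresh data stream is a cheap arithmetic pass that can indeed run in milliseconds.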
The interesting thing is how little the industry still appears to have shifted its ways of working in reaction/anticipation, outside of the well-publicized activities of some of the larger players such as Axa, Munich Re, Allianz and MassMutual.

There are many potential explanations for this corporate heel-dragging: leadership teams who are on the wrong side of the digital divide and who therefore simply don't get it, a lack of organizational agility, a fear of upsetting existing distribution channels/cannibalization or upsetting staff, the difficulty of running a traditional model alongside a new one, network dependency (i.e. you can only go at the speed of the slowest), a lack of investment capital and the uncertain ROI of any technology investment. After all, why invest in a speculative digital strategy when you can hire a couple of extra brokers and almost guarantee a few hundred thousand of extra commission?

Further, those that are taking action are arguably placing a disproportionate amount of effort into leveraging technology to improve their internal efficiency and reduce costs. The problem is that, while easier and more tangible for them to tackle, internal operations represent only about 10% of the average insurer's cost base. Compare that with 20% to 40% for distribution and 40% to 60% for claims, and companies appear to be fishing in the wrong pond.

This observation hasn't been lost on the PE/venture industry, or on the more progressive corporate venture funds. But even here, a disproportionate amount of investment capital appears to have gone toward distribution alone. McKinsey's Panorama Insurtech Database suggests that 17% of P&C startups are focused on distribution, vs. 10% on pricing and only 7% on claims. At one level this is understandable: Data is boring and incremental, while distribution is higher-profile, easier to target and quicker to monetize.
But, by the same token, this means that more and more players are trying to disrupt the same, increasingly crowded part of the value chain, in slightly different ways. They can’t, and won’t, all survive. A few big winners will no doubt emerge from this feeding frenzy. However, the vast majority of today's media darlings will fail or find themselves overtaken (or perhaps taken over, if they are lucky) by incumbents informed (at their and others' expense) by the success or otherwise of all these live "pilots" and armed with deeper pockets, a balance sheet, a trusted brand and actual customers. Students of history will again see a clear parallel to the winners and losers of the dot-com years. Anyone else remember clickmango.com and its pink inflatable boardroom?

What conclusions to draw from all this? Well, firstly, that the real, lasting disruptive opportunity for the P&C insurance industry is far more likely to be within underwriting and claims. It may be unsexy, but that is where the bulk of the cost is and where it is hardest to get at. It is also, therefore, where there is a real opportunity for new entrants to build something of meaningful, differentiated and sustainable value, rather than on the distribution side, where the barriers to entry are far lower and it is far easier for incumbents to simply adapt their offerings and compete away efficiency savings in the form of price reductions.

Secondly, if the above is true, then whoever has the most data wins. This favors incumbents, in particular scale players (on both the broking and the insurer side) or medium-sized players willing to work collaboratively with others to pool resources and know-how and access third-party services. But it also creates opportunities for players (hardware (e.g. telematics), software or consulting) that are supplying, enriching or analyzing data on a partnership basis with players who would not otherwise have the resources to go it alone.
Those unable or unwilling to be part of these collaborative networks will fail or have to sell out.

Thirdly, as a consequence of the above, the real risk to incumbents is perhaps less from disruptive new entrants (which in many cases will be more interested in partnering with incumbents than in eating their lunch) than from their traditional competitors stealing a decisive march on them. In today's kinetic world, being a fast follower may no longer be good enough. Hence the logic of all these corporate incubators and venture funds (as long as they are investing in the right things, of course).

Fourthly, much of what we read about in the media in terms of the incubator/venture activities of the major players should be seen for what it is: noise that more than likely is designed to conceal their real focus, investing in machine learning and advanced analytics that promise to utterly transform their ability to accurately price and distribute risk. This smoke screen is hardly surprising, given the potentially seismic implications for their existing broker relationships and staff, not to mention their customers. Indeed, the potential consequences for those no longer able to secure coverage, or only able to do so at rates far beyond what they pay today, are serious and will surely trigger regulation to avoid vast societal imbalances.

Finally, partly as a result of the above as well as related technological innovation, the one thing that is absolutely certain is that the size of the overall insurance revenue pool will shrink significantly. Driverless cars and sensor/IoT technology mean that there will simply be far fewer losses than before. McKinsey's base-case scenario sees a 30% drop in global motor premiums, and as much as a 70% fall under some conditions. What's more, better data doesn't just lead to better underwriting but also enables better risk prevention and avoidance. Wearable tech, for example, drives healthier living; telematics, safer driving.
This promises to drive further consolidation across every part of the value chain toward truly value-added players, shift fundamentally the role of the broker (and arguably the insurer, too) toward risk consultancy and trigger the rise of a range of complementary services, platforms and product offerings to fuel and profit from this trend.

See also: Insurtech Is Ignoring 2/3 of Opportunity

For an industry already reeling under the combined impact of increasing regulation, an unrelentingly soft rating cycle, over-capacity, terminally low interest rates and vicious competition, the rise of insurtech could perhaps not come at a worse time. And yet, ironically, it is precisely this combination of factors that means the industry has finally come round to the realization that, if it doesn’t change soon, the world will change around it. Or, as the Chinese philosopher Lao Tzu said, "If you do not change direction, you may end up where you are heading."

Global insurance premiums stood at around $3.6 trillion last year, according to Swiss Re. This is a huge, unreformed global market crying out for change. Forget jumping the shark. Now is the time to grab the insurtech bull by the horns. And hold on if you can.

Major Opportunities in Microinsurance

Microinsurance in developing countries is not just a reduced-cost coverage for poor people: It’s an innovative way of selling insurance.

Microinsurance in developing countries is not just a reduced-cost coverage for poor people: It’s an innovative way of selling insurance in a customer-centric approach… and the insurtech wave has a big role to play! Microinsurance already covers around 135 million people, which represents around 5% of the entire market potential, with an average of 10% annual growth. The risks covered by such solutions are the typical ones of the traditional insurance market: life insurance, health insurance, accidental death and disability and property insurance.

Developing countries have economies that are generally based on farming and agriculture, and they can’t cover all the needs of a growing population exclusively with the goods they produce. Approximately 70% of the world’s 7 billion people live in poverty. In such a context, there is significant demand for a certain range of insurance products, from health and life, agricultural and property insurance to catastrophe cover. The potential market for insurance in developing countries is estimated to be between 1.5 billion and 3 billion policies.

See also: Big New Role for Microinsurance

Microinsurance presents a different type of business potential in comparison with the microfinance and microcredit movements. Microinsurance is not just a reduced-cost, specific-risk insurance coverage for people in developing countries. It is an innovative way of selling insurance that is aligned with customer expectations while covering a specific need, at the right moment, at the right price, in a customer-centric approach. This type of insurance could help close the protection gap in both developed countries and underdeveloped ones. Microfinance, instead, can be defined as "a world in which as many poor and near-poor households as possible have permanent access to an appropriate range of high-quality financial services, including not just credit but also savings, insurance and fund transfers."
Microcredit (generally considered to have originated with the Grameen Bank, founded in Bangladesh in 1983) means providing credit services to those with low income. It is an extension of very small loans to impoverished borrowers who typically lack collateral, steady employment and a verifiable credit history. Provided that people with low income are offered the right products, means and knowledge, they will become effective consumers of financial services. The MicroInsurance Centre estimates that in the next 10 years or so, the microinsurance market could grow to 1 billion policyholders.

An important concept is that insurance demand should not be taken for granted. This is because of the often negative connotation insurance is given in the developing world, which stops it from reaching more people. The market needs an innovative approach based on customer education and incentives. Insurance benefits have to be clear in the mind of potential customers and, for that to be achieved, trust has to be built. This can be done through new and engaging approaches, like plots in TV and radio programs or even literacy campaigns. To create demand, other types of incentives can be used: tax exemption, subsidies or compulsory cover.

For microinsurance to function in a developing country, the products and the processes have to be simple, and the premiums need to be low. A change of mindset is needed from insurers, alongside a more efficient administration strategy and distribution channel. The key questions that insurers have to pose to themselves are: How do you sell insurance to someone who has never had to deal with such a concept before? How do you generate revenue from a policy whose premium is just a few dollars per year? These questions show the essential challenges of microinsurance that insurers need to tackle in a quick and cheap manner to provide cover for people who have little money.
New solutions for developing countries are starting to emerge on the market; for example, in some parts of Asia, pre-pay cards provide insurance cover for flood damage. Insurers will have to find the right business model and partners when approaching such markets and consider less common mechanisms for controlling moral hazard, adverse selection and fraud. For example, proxy underwriting, group policies and waiting periods mitigate adverse selection. At first, investing in microinsurance might seem a bit reckless, but the returns do exist: reputational gains in the short term, knowledge in the medium term and growth in the long term.

If microinsurance does indeed start to grow to its true potential by entering developing economies, then some critical areas need more thought: from product innovation and technological solutions that are adapted to low-income markets, to choosing the right partners to work with (NGOs, community-based organizations, international reinsurers and so on) and understanding which risk factors will affect the region in the future (for example, economic development, climate change or population growth trends).

See also: 5 Innovations in Microinsurance

The direction in which technology is heading indicates that developing countries will fast-forward straight to mobile, skipping desktop computers, which are less feasible as communication tools. Already, more than half of the world's population is using a mobile phone, and almost 25% is using the internet regularly, as fewer and fewer people use fixed telephone lines. Mobiles are the dominant means of communication, even in the Third World, with smartphone ownership and internet usage on the rise. According to a survey by Pew Research Center, in the last two years there has been a significant increase in the number of people from developing nations who say they use the internet and own a smartphone.
Moreover, in nearly every country, millennials are much more likely to be internet and smartphone users compared with those over age 35. This phenomenon is a characteristic of both advanced and emerging economies. In spite of these trends, less than 5% of people with low income have access to insurance or to covers that they actually need, which makes underdeveloped countries an ideal market to explore.

Andrea Silvello

Andrea Silvello has more than 10 years of experience at international consulting firms such as BCG and Bain. Since 2016, Silvello has been the co-founder and CEO of Neosurance, an insurance startup that acts as a virtual insurance agent selling micro policies.

Do We Need Robots in the Kitchen?


Although I believe in the capabilities of technology as much as anyone, breathless articles sometimes set me off. I will now rant about one, because I think these articles should be a warning about how even smart people can get sucked in by the possibilities of digital technology of the sort that is currently turning insurance on its head. (Yes, if I'm honest, I also want to vent a little.)

The article that made my head explode (most recently) described how great it would be to live in a connected home where you would wake up to the smell of bacon that had automatically started cooking on your stove just minutes before your alarm went off. Sounds great, right? Who doesn't love the smell of bacon in the morning (or the afternoon or evening)? Everything is better with bacon.

But think for a moment. Who put the bacon in the skillet? You did, unless there's a robot involved here that the article didn't describe. When did you put the bacon in the skillet? The night before. Do you really want to eat bacon that has been sitting out all night? I don't, no matter how good it smells.

This lack of thinking through an issue from beginning to end is not an isolated event. The bacon idea is actually just a variant of the hoary notion that, on the way home from work, we'll turn on our microwaves remotely and start cooking our dinner (which has been sitting, unrefrigerated, in the microwave all day). People have been touting the idea of internet toasters and refrigerators for many years, even though the toaster has no conceivable use and the refrigerator actually sits in the middle of a complex issue that isn't solved just by connecting it to the internet—no, I don't want the refrigerator ordering milk for me simply because I've run out, and I certainly don't want it managing my whole shopping list. 

The lack of thorough thinking isn't new. It has been going on at least since I started covering the world of technology for the Wall Street Journal in 1986. And the thinking infects even people and companies that should know much better. In April 1988, I wrote an article on the front page of the second section that described how even some very savvy companies made their products worse through digital technology. BMW added electronics to some top-line cars that required a 40-minute video to explain; just the section on locking and unlocking the car required three minutes. Buick so confused drivers that some who tried to turn down the radio wound up turning off the air conditioning. When some of the geekiest of the geeks in Silicon Valley—including the CEO of Sun Microsystems and a future CEO of Microsoft—went bowling, they couldn't figure out how to use the digital scoring system.

I haven't quite given up hope. But I'm close, given the persistence of the thinking that it's good to do things digitally just because it's possible to do them digitally. 

I thought I should at least call the issue to your attention. We're smarter about so many things than we were in 1988. Let's get smarter, too, about how digital technology fits (and doesn't fit) in end-to-end solutions.

Rant over. Thanks for hearing me out.   

Cheers,

Paul Carroll,
Editor-in-Chief 


Paul Carroll

Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

How to Reinvent P&C Pricing

The race is on to find the next insurance credit score—and the winners (if there are winners) will gain a pricing (and underwriting) edge.

The answer to P&C pricing may lie in the insurance credit score: basically, a set of algorithms applied to data from credit reports that provides guidance for pricing and underwriting personal lines insurance. Although the score has been a source of political and regulatory controversy over the years, the use of insurance credit scores is now widespread. Much of the controversy has been over possible disparate impacts on various societal groups. But a root of the controversy has been the non-intuitive relationship between a given person’s use or misuse of credit on the one hand, and that person’s probability of incurring insured losses on the other hand. The correlation just doesn’t seem to make much sense. But statistically there are correlations, which in general have passed regulatory review.

See also: Credit Reports Are Just the Beginning

Insurance credit score controversies are now ancient history (i.e., they were settled before most millennials graduated from high school). But suddenly something interesting is happening. The race is on to find the next insurance credit score—and the winners (if there are winners) will gain a pricing (and underwriting) edge. There are only two requirements to enter this race.
  1. You have to forget about all the kinds of data and information that insurers have been using to price and underwrite risks.
  2. You have to use your digital imagination to find some new data and models that provide the same or better lift as the old data and models that you have just thrown out the window. (Lift is the increase in the ability of a new pricing model to distinguish between good and bad risks when compared with an existing pricing model.)
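As a toy illustration of lift, here is a hedged sketch (all policies, losses and scores are invented) that compares how well two rating models separate good risks from bad ones, using the spread in actual losses between the riskiest and safest quintiles of policies as ranked by each model:

```python
# A toy illustration of "lift": how much better a new pricing model
# separates good risks from bad ones than the old one does. All
# policies, losses and scores below are invented for illustration.

def quintile_spread(predicted, actual):
    """Rank policies by predicted risk, then return the difference in
    mean actual losses between the riskiest and safest quintiles.
    A wider spread means the model separates risks better."""
    order = sorted(range(len(predicted)), key=lambda i: predicted[i])
    n = len(order) // 5
    safest = [actual[i] for i in order[:n]]
    riskiest = [actual[i] for i in order[-n:]]
    return sum(riskiest) / len(riskiest) - sum(safest) / len(safest)

# Hypothetical book of 10 policies: actual incurred losses, plus risk
# scores from an old rating model and a new, data-richer one.
actual_losses = [100, 120, 90, 400, 110, 950, 105, 130, 980, 95]
old_scores = [0.2, 0.3, 0.1, 0.4, 0.9, 0.6, 0.3, 0.4, 0.3, 0.2]
new_scores = [0.1, 0.2, 0.1, 0.8, 0.2, 0.9, 0.2, 0.3, 0.95, 0.1]

old_spread = quintile_spread(old_scores, actual_losses)
new_spread = quintile_spread(new_scores, actual_losses)
print(old_spread, new_spread)
```

In this made-up book, the new model places both large-loss policies in its riskiest quintile, while the old model misranks one of them as safe and flags a harmless policy as risky, so the new model's quintile spread (870 vs. 435) is wider. That improvement in risk separation is the lift.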
So what kind of new data might a digital imagination look at?
  • For personal auto, connected cars will provide a rich data set to mine. How about whether a car is serviced at the manufacturer’s suggested intervals (correlated with whether the car is serviced by a dealer or by an independent repair shop)? Or the use of a mobile phone while the car is in motion (correlated with time of day, precipitation and whether satellite radio is also playing)? Or use of headlights during daylight hours (correlated with the frequency of manually shifting gears in a vehicle with an automatic transmission)?
  • For homeowners insurance, connected homes could supply all types of new data. For example, whether Alexa (or similar device) controls the home’s HVAC systems, correlated with setting security alarms before 11 p.m. Or, electricity and gas consumption, correlated with use of video streaming services on weeknights. Or, the number and type of connected appliances, correlated with the number of functioning smoke, carbon monoxide and moisture detectors.
  • For commercial liability insurance, telematics and IoT will be the key data sources. Does a business with 10 or more commercial vehicles use both fleet management and telematics solutions? What mobile payment options are offered (correlated with dynamic pricing capabilities)? What is the business’s use of social media and messaging apps (correlated with the degree of supply chain digitization)?
See also: Why Credit Monitoring Isn’t Enough

Of course, obtaining a lot of this data will require permission from policyholders—and even with permission these methods may raise social or political issues. But premium discount and loss control incentives for telematics programs have proven effective. And for better or worse, Scott McNealy got it right in 1999 when he was asked about privacy and said, “Nope, you don’t have any.”

Donald Light


Donald Light is a director in Celent’s North America property/casualty insurance practice. His coverage areas include: technology and business strategy, transformative technologies, core systems and insurance technology M&A due diligence.

Healthcare Needs a Data Checkup

This haste to complete implementation of electronic health records has led to a deficiency in data protection and security measures.

As the healthcare industry continues to digitize, data protection technology has not kept pace. Unfortunately for industry participants, healthcare has become a top target for state-sponsored and free-agent hackers. In fact, a study released by Michigan State University in April 2017 found that healthcare providers reported 1,225 of the 1,798 data breaches in the U.S. from 2009 to 2016. Why has the healthcare industry become such a target? And what can healthcare providers do to protect their organizations and the thousands of patients they serve?

One primary reason for the target on healthcare’s figurative back is the rapid implementation of electronic health records (EHRs). From 2009 to 2014, adoption of EHRs rose from less than 10% to 97%. This haste to complete implementation has left a deficiency in data protection and security measures within EHRs. Additionally, with more and more providers leveraging mobile devices and turning to data driven by the Internet of Things, attackers have a plethora of new entry points to private and sensitive data.

See also: Data Security Critical as IoT Multiplies

A quick scan of the Identity Theft Resource Center’s 2016 Data Breach Report shows that lost workplace laptops and stolen company-issued cell phones are frequently listed as causes of data breaches. Given the growing use of workplace devices in the healthcare industry, as well as the corresponding danger of transmitting information between a central data center and end-user devices, it is crucial that data be protected from the moment it is created. Further, healthcare providers must ensure employees are aware that their devices could be compromised when the connection to the data center is lost.

Mobile devices make it harder to protect data

For example, an attacker could access data while employees are traveling between medical centers and the connection is lost, then sell the retrieved information or hold it for ransom.
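One way to limit this kind of exposure is protection that travels with the data itself, for example field-level pseudonymization, so that a record intercepted in transit reveals no usable identifiers. The sketch below is minimal and illustrative only: the key, field names and record layout are hypothetical, and a real deployment would keep the key in an HSM or managed key service and pair tokenization with authenticated encryption of the full payload.

```python
import hmac
import hashlib

# Hypothetical secret key; in production this would live in an HSM or key service.
TOKEN_KEY = b"demo-key-not-for-production"

def tokenize(identifier: str) -> str:
    """Replace a patient identifier with a keyed, irreversible token.

    Because the token is keyed (HMAC) rather than a plain hash, an attacker
    without TOKEN_KEY cannot brute-force low-entropy fields such as an MRN.
    """
    return hmac.new(TOKEN_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

record = {"mrn": "12345678", "diagnosis_code": "E11.9"}
safe_record = {"mrn": tokenize(record["mrn"]),
               "diagnosis_code": record["diagnosis_code"]}
print(safe_record)  # the MRN is now a stable token, useless without the key
```

Because the same identifier always maps to the same token, tokenized records can still be linked and analyzed downstream, while a stolen copy yields nothing an attacker can sell or ransom.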
As such, data should be protected whether it is at rest or in transit, and in both connected and disconnected environments. To protect themselves from the vulnerabilities that lead to data breaches, cyber attacks and ransomware, healthcare organizations must revisit their security strategy. That strategy should be comprehensive, flexible and capable of mitigating the impact of a breach at multiple levels of the enterprise via layered security solutions. Layered security allows for incremental defense, ultimately protecting what is most vital to the business: its data. If other security countermeasures are defeated, data-centric protection, which goes beyond traditional encryption, will be vital as the last line of defense. For this reason, organizations must use data protection that travels with their data, rendering the data useless to the attacker should it be compromised.

Training, technology part of treatment

Data security is a challenge that will not fade away but rather grow in importance. As technology continues to advance, attackers and other entities involved in data theft will have just as many tools as the healthcare providers endeavoring to protect valuable and private information.

See also: Aggressive Regulation on Data Breaches

Healthcare organizations must accept that their data will become a target and that these threats could originate from nontraditional sources, such as IoT devices and other innovations. Leaders must act now to protect their business, patients and other stakeholders.

This article originally appeared on ThirdCertainty. It was written by Ermis Sfakiyanudis.

Byron Acohido


Byron Acohido is a business journalist who has been writing about cybersecurity and privacy since 2004, and currently blogs at LastWatchdog.com.