
What Really Matters in Customer Experience

Companies that excel at customer experience recognize that they’re in the business of shaping memories, not just experiences.

No matter how hard you try to improve your company’s customer experience, the reality is that your customers won’t remember much of it. That’s because our brains aren’t wired like a video camera, recording every second of every experience. Rather, what we remember are a series of snapshots. And those snapshots aren’t taken at random. The camera shutter opens to capture the peaks and valleys in the experience – the really high points and the really low points. Most everything else, all the parts of the experience that are just “meh,” fades into the background and disappears from our memory. So, our recollections are less “streaming video” and more “still photograph.”

But what does this have to do with the customer experience? Well, creating a great customer experience is a lot about shaping memories. For a business to derive strategic and economic advantage from its customer experience, people need to remember it positively. When a friend or colleague asks you – “What do you think of [Company X]?” – your response is grounded in your recollection of the experience, which is different from the experience itself.

That’s because your assessment of the experience, the basis for repurchase and referral behavior, won’t be derived from some meticulous calculation of the ratio between pleasantness and unpleasantness. Rather, you’ll be making that judgment based solely on the snapshots that your memory has taken from the encounter. This is why companies that excel at customer experience recognize that they’re in the business of shaping memories, not just experiences.
They capitalize on cognitive science to influence what people will remember, strategically creating “peaks” in the experience that will outnumber and outweigh the “valleys.” Their success in this regard is why customers recall the experience so positively, even if every portion of it wasn’t “delightful.” (Disney World’s customers spend a lot of time waiting in line at the park, but when they return home from their vacation, it’s not the lines they remember — it’s the attractions.)

There are a variety of strategies that great companies use to positively influence customer memories, but they all essentially involve creating more and higher peaks, as well as fewer and shallower valleys.

See also: Who Controls Your Customer Experience?

Great companies also recognize that it’s all right if there are parts of the customer experience that are just average (as long as they don’t involve interactions that are vital to customers). What’s more important is to make certain there are at least some parts of the experience that will generate those positive, memorable “peak” snapshots. Conversely, one must address aspects of the experience that may be leaving customers with memorable (but negative) “valley” snapshots. (Note that those valleys don’t necessarily need to be turned into peaks, but they at least need to be moved closer to “sea level.”)

As you work to differentiate your company in the marketplace, keep an eye out for those peaks and valleys. They’re the features of your customer experience landscape that will shape people’s perceptions and, ultimately, their brand loyalty. And that’s something worth remembering.

You can find the original published here on WaterRemarks.

Jon Picoult


Jon Picoult is the founder of Watermark Consulting, a customer experience advisory firm specializing in the financial services industry. Picoult has worked with thousands of executives, helping some of the world's foremost brands capitalize on the power of loyalty – both in the marketplace and in the workplace.

How to Avoid Failed Catastrophe Models

A customized model that is fit-for-purpose one day can soon become obsolete if not updated for changing business practices and real-world data.

Since commercial catastrophe models were introduced in the 1980s, they have become an integral part of the global (re)insurance industry. Underwriters depend on them to price risk, management uses them to set business strategies and rating agencies and regulators consider them in their analyses. Yet new scientific discoveries and claims insights regularly reshape our view of risk, and a customized model that is fit-for-purpose one day might quickly become obsolete if it is not updated for changing business practices and advances in our understanding of natural and man-made events.

Despite the sophisticated nature of each new generation of models, new events sometimes expose previously hidden attributes of a particular peril or region. In 2005, Hurricane Katrina caused economic and insured losses in New Orleans far greater than expected because models did not consider the possibility of the city’s levees failing. In 2011, the existence of a previously unknown fault beneath Christchurch and the fact that the city sits on an alluvial plain of damp soil created unexpected liquefaction in the New Zealand earthquake. And in 2012, Superstorm Sandy exposed the vulnerability of underground garages and electrical infrastructure in New York City to storm surge, a secondary peril in wind models that did not consider the placement of these risks in pre-Sandy event sets.

Such surprises affect the bottom lines of (re)insurers, who price risk largely based on the losses and volatility suggested by the thousands of simulated events analyzed by a model. However, there is a silver lining for (re)insurers. These events advance modeling capabilities by improving our understanding of the peril’s physics and damage potential.
Users can then often incorporate such advances themselves, along with new technologies and best practices for model management, to keep their company’s view of risk current – even if the vendor has not yet released its own updated version – and validate enterprise risk management decisions to important stakeholders.

See also: Catastrophe Models Allow Breakthroughs

When creating a resilient internal modeling strategy, (re)insurers must weigh cost, data security, ease of use and dependability. Complementing a core commercial model with in-house data and analytics and standard formulas from regulators, and reconciling any material differences in hazard assumptions or modeled losses, can help companies of all sizes manage resources. Additionally, the work protects sensitive information, allows access to the latest technology and support networks and mitigates the impact of a crisis to vital assets – all while developing a unique risk profile.

To the extent resources allow, (re)insurers should analyze several macro- and micro-level considerations when evaluating the merits of a given platform. On the macro level, unless a company’s underwriting and claims data dominated the vendor’s development methodology, customization is almost always desirable, especially at the bottom of the loss curve, where there is more claims data. If a large insurer with robust exposure and claims data is heavily involved in the vendor’s product development, the model’s vulnerability assumptions and loss payout and development patterns will likely mirror those of the company itself, so less customization is necessary.

Either way, users should validate modeled losses against historical claims from both their own company and industry perspectives, taking care to adjust for inflation, exposure changes or non-modeled perils, to confirm the reasonability of return periods in portfolio and industry occurrence and aggregate exceedance-probability curves.
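The occurrence and aggregate exceedance-probability curves described above are straightforward to compute from a simulated event catalogue. Below is a minimal sketch in Python with NumPy; the event catalogue, historical losses and on-leveling factors are all hypothetical stand-ins for a real vendor model's output and an insurer's actual claims data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 10,000-year simulated event catalogue: a year id and a
# loss for each event. Real catalogues come from the vendor model.
n_years = 10_000
events_per_year = rng.poisson(2.0, n_years)
year_ids = np.repeat(np.arange(n_years), events_per_year)
losses = rng.lognormal(mean=15.0, sigma=1.5, size=year_ids.size)

# Occurrence basis: largest single-event loss in each simulated year.
# Aggregate basis: sum of all event losses in each simulated year.
occ = np.zeros(n_years)
agg = np.zeros(n_years)
np.maximum.at(occ, year_ids, losses)
np.add.at(agg, year_ids, losses)

def return_period_loss(annual_losses, rp):
    """Loss exceeded on average once every `rp` years (empirical quantile)."""
    return np.quantile(annual_losses, 1.0 - 1.0 / rp)

for rp in (10, 50, 100, 250):
    print(f"{rp:>4}-yr  occurrence: {return_period_loss(occ, rp):,.0f}  "
          f"aggregate: {return_period_loss(agg, rp):,.0f}")

# Validation step from the article: trend historical claims to today's
# exposure and price level before reading them off the modeled curve.
historical = np.array([4.1e6, 2.8e6, 9.5e6])      # hypothetical gross losses
on_level_factors = np.array([1.45, 1.30, 1.10])   # hypothetical trend factors
on_level = historical * on_level_factors
exceed_prob = 1.0 - (occ[None, :] <= on_level[:, None]).mean(axis=1)
implied_rp = 1.0 / exceed_prob
print("implied occurrence return periods:", np.round(implied_rp, 1))
```

The occurrence curve takes each year's largest event loss while the aggregate curve sums all of them, so the aggregate curve always sits at or above the occurrence curve. Comparing the return periods implied by trended historical losses against these curves is the reasonability check the article recommends; materially inconsistent implied return periods suggest the model needs customization.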
Without this important step, insurers may find their modeled loss curves differ materially from observed historical results, as illustrated below.

A micro-level review of model assumptions and shortcomings can further narrow the odds of a “shock” loss. As such, it is critical to precisely identify risks’ physical locations and characteristics, as loss estimates may vary widely within a short distance – especially for flood, where elevation is an important factor. When a model’s geocoding engine or a national address database cannot assign a location, there are several disaggregation methodologies available, but each produces different loss estimates. European companies will need to be particularly careful regarding data quality and integrity as the new General Data Protection Regulation, which may mean less specific location data is collected, takes effect.

Equally important is a risk’s physical characteristics, as a model will estimate a range of possibilities without this information. If the assumption regarding year of construction, for example, differs materially from the insurer’s actual distribution, modeled losses for risks with unknown construction years may be under- or overestimated. The exhibit below illustrates the difference between an insurer’s actual data and a model’s assumed year-of-construction distribution based on regional census data in Portugal. In this case, the model assumes an older distribution than the actual data shows, so losses on risks with unknown construction years may be overstated.

There is also no database of agreed property, contents or business interruption valuations, so if a model’s assumed valuations are under- or overstated, the damage function may be inflated or diminished to balance to historical industry losses.

See also: How to Vastly Improve Catastrophe Modeling

Finally, companies must also adjust “off-the-shelf” models for missing components.
Examples include overlooked exposures, like a detached garage; new underwriting guidelines, policy wordings or regulations; or the treatment of sub-perils, such as a tsunami resulting from an earthquake. Loss adjustment difficulties are also not always adequately addressed in models. Loss leakage – such as when adjusters cannot separate covered wind loss from excluded storm surge loss – can inflate results, and complex events can drive higher labor and material costs or unusual delays. Users must also consider the cascading impact of failed risk mitigation measures, such as the malfunction of cooling generators in the Fukushima nuclear power plant after the Tohoku earthquake.

If an insurer performs regular, macro-level analyses of its model, validating estimated losses against historical experience and new views of risk, while also supplementing missing or inadequate micro-level components appropriately, it can construct a more resilient modeling strategy that minimizes the possibility of model failure and maximizes opportunities for profitable growth.

The views expressed herein are solely those of the author and do not reflect the views of Guy Carpenter & Company, LLC, its officers, managers, or employees. You can find the article originally published on Brink.

Imelda Powers


Imelda Powers has advised worldwide (re)insurers on catastrophe exposure management for more than 20 years. Her work includes model development, statistical simulation and insurance-linked securities.

The Future of the Agency Channel

Agents who give personalized advice and advocacy when needed represent the great upside and the future of the agency channel.

In today’s insurance marketplace, agencies face heavy competition from digital insurance channels and direct marketers like GEICO and Progressive. So what does the future look like for the thousands of carrier and independent agents – proponents of human engagement – who realize that all the digital insurance channels in the world can’t replace the human connection?

Independent and carrier agents can enhance and build on their own strengths to compete head-on with the rise of competing insurance channels. Agents who give personalized advice and advocacy when needed represent the great upside and the future of the agency channel.

Insurance is a security blanket. People want to know that they will be covered appropriately in their time of need, and that an advocate will be there to support them when things don’t go quite as planned. Certainly people want to know a live human being can be there when their basement floods, but being a trusted adviser relies on really knowing the policyholder – being in the life of that person with quality, frequency and continuity.

The challenge for the agency channel is building a velocity of contact with current and prospective policyholders in an industry that has arguably the highest-touch, highest-volume requirement for interactions by its sales professionals. When agents fill the role of trusted adviser, the result is higher retention, cross-sell and referral business. This is evidenced by the research of those who study the insurance industry.

See also: Reinventing Sales: Shifting Channels

Bain & Co.’s research shows that the agency/agent connection is uniquely positioned to earn customer loyalty, and that a loyal insurance customer – measured by Bain’s Net Promoter Score – delivers a whopping seven times the lifetime value of a low-loyalty customer and three times the value of a neutral customer.
And loyal customers reward their agents by buying 25% more insurance at higher prices, staying with and consolidating their insurance with one provider and even referring friends and family. But we are not out of the woods yet! Ernst & Young’s Global Customer Survey found that 86% of insurance consumers are “not very” satisfied with communications from their provider. A whopping 44% report remembering zero communications from their insurance provider in the last 18 months.

So what does all this mean for agents? The most important task for the agency channel is to focus on what it does best: offering peace of mind to customers, even over the values of price and convenience that are offered by direct carriers and other emerging digital channels.

To earn customer loyalty, drive growth and attract new customers, agents are adopting and mastering newer technology that can provide continuous engagement — connecting to people on email, text, phone and social media — the new ways consumers shop for insurance today. In this way, agents are partnering with technology to manage leads and organize marketing programs that guide consumers through an elevated, sequential customer journey aimed at building the relationships that future insurance policyholders value most.

Again, research is ahead of this curve. A recent Accenture poll of top insurance executives, the “Future Insurance Workforce” survey, found that artificial intelligence is here to stay and will create workplace opportunities that help agents work more efficiently to drive growth and attract new customers. In fact, the only economically feasible way to scale agency-policyholder relationship-building today is through the connecting technologies that consumers now use and expect of their vendors.
Savvy agents know their customers’ values well – and are in a strong position to deliver original content through technology that best expresses the value of the agency in ways that are most meaningful to each customer. Contemporary insurance marketing automation solutions – integrated with agency management systems that maintain volume and feature sequential and automated practices – will make insurance agents more valuable in today’s market.

See also: Global Trend Map No. 9: Distribution

Technology Tips to Compete Head-on With Digital Channels and Engage Customers

When it comes to marketing insurance, the agency connection coming from a trusted adviser remains invaluable to policyholders who must choose between that adviser and a faceless organization that relies on advertising. An agency equipped with appropriate technologies elevates the message to a much higher level. It grabs consumers and keeps them coming back for years to come.
  • Use marketing acceleration programs that apply artificial intelligence and machine learning to create repeatable patterns of activity. These patterns will inform workflows that enable smarter marketing and more personalized, predictive customer experiences, leading to better sales outcomes.
  • Use technology tools to help meet Telephone Consumer Protection Act (TCPA) guidelines; everyone will need to be compliant or face stiff fines for wrongfully filled-out forms and other violations.
  • Use technology tools to help comply with the new General Data Protection Regulation (GDPR), which took effect in Europe and promises to take on more importance in the U.S. in light of recent Facebook privacy issues.

Sam Fleming


Sam Fleming is vice president of product marketing with Imprezzio. A true visionary and leader, Fleming’s enthusiasm and passion for discovering new applications of technology fuels a creativity that translates to incredible business solutions.

Why 'Modern' Is No Longer Enough

A core system that was top of the line in 2013 may be showing its age. Systems now need to be “digitally native.”

Insurers, I have some good news and some bad news. The good news: Insurers have made tremendous progress in core modernization, purchasing and implementing new core systems and beginning to adapt their businesses to take full advantage of those systems’ capabilities. This is genuine cause for celebration – insurers that have made or are making these efforts are advancing their companies and our industry in general.

As insurers were engaged in these core modernization efforts, though, the personal lines market and technology itself have kept moving forward. A core system that was top of the line in 2013 may be showing its age unless it has been continually upgraded to serve the capabilities needed in 2018 and beyond. This may not be the most welcome news for those still thinking about core systems with an average lifespan of 10 years or more, but this is our new reality.

This is especially true for personal lines insurers, which are typically the first to catch the core modernization wave. They have also tended to be the leaders in adopting new computing capabilities. That was true for mainframe systems, client-server architectures and web-based applications, and it is true now as trends like microservices and serverless computing come to the fore. This is not technology for technology’s sake – insurers need to be able to handle a rapidly growing volume of transactions, including multi-threaded calls.

See also: 2018’s Top Projects in Personal Lines  

Further pressure comes from the insurtech startups active in the personal lines market. The original, widely known insurtech startups in P&C insurance were focused on personal lines. As the insurtech movement has matured, startups’ focus has widened to commercial lines and workers’ comp, as well as crossing product lines. However, startups have been active in personal lines the longest, and those insurtechs that have thrived have gained market experience and are beginning to focus on organizational maturity. That means incumbent personal lines insurers face insurtech counterparts that are relatively robust and mature compared with the commercial lines insurtechs I discussed in my earlier blog.

A key characteristic of insurtechs is that they are digitally native companies: They are natively fluent with enormous quantities of data and digital interactions, and their technology is geared toward both.

Core systems that can be described as “digitally native” have an edge in the digital market going into the future. Even though digital has been a crucial focus area for years, the insurance industry is still learning what a truly digital business entails – and what technology is needed to support it. Insurtech startups have given the insurance industry new examples of how to operate in the digital world.

Few core systems are built with these digitally native characteristics, but the core systems marketplace is beginning to adapt. Continued evolution toward open APIs and new data sources will give insurers the opportunity to interoperate with new distribution channels and directly with the customer.

Whether you are a large insurer that is trying to support new digital brands and new product models (on-demand, telematics and others dependent on high amounts of data) or a small regional insurer trying to power consumer service portals, the key question is data availability and digital connectivity with the consumer and agent.

So, for insurers asking themselves: “Do we really need to think about modernizing our modern core systems?” the better question may be this: “Are your modern core systems digitally native?”


Karen Furtado


Karen Furtado, a partner at SMA, is a recognized industry expert in the core systems space. Given her exceptional knowledge of policy administration, rating, billing and claims, insurers seek her unparalleled knowledge in mapping solutions to business requirements and IT needs.

Health Care Innovation: As Easy as ABC?


The Amazon/Berkshire Hathaway/JPMorganChase health care partnership known as ABC announced last week that it had hired as CEO the well-known innovator, author and surgeon Atul Gawande. Count me a fan.

While some note that he has never run an organization anywhere close to the size that will be needed to coordinate the health care of the roughly one million people covered in the partnership, I believe he brings just the right qualities to a job that could create great change in U.S. health care in five to 10 years, slashing costs while improving care. 

He has shown an ability to look at problems through beginner's eyes, even when he has been immersed in the issue for years. His 2009 article in The New Yorker comparing care in two Texas towns showed just how destructive the profit-maximizing culture of his industry has become.

Gawande has consistently shown the sort of empathy that often gets lost when care collides with dollars and cents. His latest book, "Being Mortal," on how life draws to a close, is remarkable. (Here is an interview that provides a look into his thinking: https://onbeing.org/programs/atul-gawande-what-matters-in-the-end-oct2017/)

He has also seen both how important and how hard it is to implement innovation. He devised a checklist for surgeons that is now used in hospitals around the world and that has greatly reduced post-operative infections. But surgeons, as dedicated as they are, bristle when told what to do, especially via something as rudimentary as a checklist, so progress hasn't exactly moved in a straight line.

Even if Gawande succeeds—a big if, given the size and complexity of the issue—you have to figure that progress will take years. First, ABC will have to figure out how to take better care of the million souls in its care. Then the problem really gets hard, as ABC will have to figure out how to roll out its solutions into remote parts of the country, to small businesses, to people older than employees at the ABC companies and so on, or others will have to figure out how to pull ideas out of ABC and implement them more broadly.

A million people is a lot, but it isn't 330 million. And the antibodies working against Gawande will be stronger than any he's ever seen. People will do a lot to protect the going-on $4 trillion spent on health care in the U.S. each year.

Still, I can't think of a better person to tackle the issue.

Have a great week.

Paul Carroll
Editor-in-Chief



Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

The Industry Needs an Intervention

Traditional operating levers for executing strategy no longer work. Structures built for the 20th century don't work in 21st-century conditions.

Leaders in the insurance industry, like many other industry executives, are seeking routes to profitable growth amid unprecedented economic, financial and regulatory change. No longer can companies pursue top-line growth for its own sake without adverse consequences or rely on cost cuts alone to boost margins. Today, companies must strike a strategic balance that will sustain profit growth and shareholder returns over the long term.

This is no easy trick, as tectonic forces unsettle the insurance industry — which is accustomed to measuring the pace of change in decades, not years or quarters. A business-as-usual approach falters in the face of quickly shifting customer needs, rising capital requirements, new regulatory burdens, low interest rates, disruptive technology, and new competitors. Many companies aren’t getting the results they need from textbook moves such as fine-tuning marketing programs, updating products, enhancing customer-service systems or beefing up information technology. That’s because traditional operating levers for executing strategy simply weren’t designed for the challenges confronting insurers today.

Strategic success now requires something more: a structural response. A company can’t adapt to 21st-century conditions without modernizing its 20th-century structures. The key is for companies to realize that strategy equals structure. Strategy — the big and important ways that a company chooses to compete — must naturally and intrinsically weave in key operating model dimensions, including legal entity, tax positioning, capital deployment, organization and governance. Finally, once strategy and structure are wed, companies must recognize the role of culture in making new structures work, and use their cultural strengths to promote the changes and ensure that they have staying power. Here’s how:

Responding to the Pressures

Rapid evolutionary change has rendered time-honored organizational structures ineffectual or obsolete in many cases.
Before attempting to execute new strategies, insurance companies need to reevaluate every dimension of their operating model.

Structural inadequacies take many forms. Some companies lack the scale needed to generate profitable growth under new capital requirements. Others with siloed, hierarchical organizations lack the flexibility to respond quickly to market shifts. Poor technological capabilities often hamstring old-line insurers facing new digitally oriented rivals. And tax reform and regulation loom as potential threats to profitability in certain business lines.

See also: Why Is Insurance Industry So Small?

In our work with insurers, we at Strategy&, PwC’s strategy consulting business, have seen certain common responses to these pressures. Their responses divide these companies into three groups:
  • The first group of companies have anticipated the effects of marketplace trends and made appropriate structural adjustments, clearing the way to profitable growth. For example, life insurer MetLife avoided costly regulatory mandates by selling its broker-dealer distribution network to MassMutual and spinning off its Brighthouse retail operations. Others, including Manulife and Sun Life, have made substantial acquisitions to consolidate scale positions.
  • The second group of companies have recognized the need for structural change, but have yet to carry it out. With plans made, or under discussion, these companies are waiting opportunistically for the right deal to come along.
  • A third group of companies, however, have hunkered down behind existing structures, making only minor tweaks and hoping to emerge from the storm without too much damage. For some, this is a rational choice because of constraints that leave them with little or no maneuvering room. In other cases, action is impeded by a company culture that reflexively rejects certain options.
Companies in the first two groups are giving themselves a chance to win. But the response of companies in the third group smacks of self-delusion in an age when strategy equals structure.

Time for Real Change

Without a doubt, many insurers work diligently and continually to improve their businesses across dimensions. They gather insights into consumer needs and behaviors, nurture unique capabilities to differentiate themselves from competitors, modernize products, update distribution strategies and embrace digitization in all its forms. These are all sound approaches, but they’re inadequate for addressing the unknowns facing insurers today. The belief that they will suffice assumes a certain stability in underlying economic and market conditions that hasn’t been seen since the financial collapse nearly a decade ago.

Forces unleashed by that crash and its aftermath undermined the pillars of many insurance business models. We’ve seen years of only modest growth, with property/casualty insurers expanding at a 3% pace, and life insurers barely exceeding 1%. The long stretch of sluggish global growth has put pressure on revenues and forced insurers to compete harder on price. Near-0% interest rates that have prevailed since the Great Recession are squeezing profit margins, especially in life insurance. On the regulatory front, tougher accounting rules are driving up costs while heavier capital requirements weigh down balance sheets and dilute returns.

Compounding these challenges are the potentially destabilizing effects of tax reform on earnings and growth. Taxes may actually rise for some insurers, an outcome that could force them to raise prices or find other ways to protect shareholder returns. In many cases, the benefits of falling tax rates may be diminished by the loss of deductions for affiliate premiums, limits on deductibility of life reserves, accelerated earnings recognition and a slowdown of deferred acquisition cost deductions.
Competitive dynamics are shifting, too, as expanding “pure play” asset managers such as Vanguard and Fidelity block growth avenues for insurers. Established companies and some new entrants are innovating and experimenting with disruptive distribution models. Others, including private equity firms, are looking to bend the cost curve through aggressive acquisition and sourcing strategies.

To be sure, some long-term trends could benefit certain insurers, or at least improve their risk profile. Longer life spans and the shift of responsibility for retirement funding to individuals may drive demand for annuities and other retirement products. However, many companies are as unprepared to capitalize on these opportunities as they are to meet long-term challenges.

Often the problem comes down to scale. Some insurers lack the resources to build new distribution platforms and customer service capabilities in growing markets such as asset management, group insurance, ancillary benefits and retirement plans. Although offering an individual product may be relatively easy for new market entrants, the difficulty and cost of establishing such platforms creates a desire for scale and increases pressure on smaller competitors.

Sometimes, the issue isn’t scale but a failure to respond quickly enough as conditions change. Buying habits are changing as consumers — particularly the younger cohorts — make more purchases online. Yet our research indicates that people still want some personal assistance with larger and more-complex transactions. It takes investment and experimentation to find and refine the right business model for new marketplace realities. But some companies haven’t built the necessary assets and capabilities or adjusted to evolving distribution patterns and consumer behaviors.

The proper response to each challenge and opportunity will be different for every company, depending on its unique characteristics and circumstances.
In virtually every case, the right solution will involve structural change.

Joining Strategy and Structure

As companies recognize that traditional approaches to annual planning, project funding and technology architecture may be hindering innovation and real-time responses to changing market conditions, many are rethinking and redesigning their core processes to facilitate change. Recent transactions in the sector show the range of structural options for companies that want to advance strategic goals in a changing marketplace. Below are some examples.

Exiting businesses. Sometimes, the best choice is to move out of harm’s way; companies can preserve margins by exiting businesses targeted for higher capital requirements or costly new accounting standards. MetLife’s Brighthouse spin-off bolstered its case for relief from designation as a “systemically important financial institution,” and the associated capital requirements. Exiting U.S. retail life insurance markets also enabled MetLife to focus on faster-growing businesses that are less vulnerable to rock-bottom interest rates. The Hartford recently announced the sale of Talcott Resolution to a group of investors, completing its exit from the life and annuity business.

Partnerships and acquisitions. When scale is an issue, the solution may lie outside the company or in new structural approaches. Some insurers form partnerships to expand distribution, diversify product portfolios or bolster capabilities. Companies also adjust their scale and capital structures through mergers, acquisitions and divestitures. Sun Life paid $975 million in 2016 for Assurant’s employee benefits business, filling gaps in its product portfolio and gaining scale to compete with larger rivals. MassMutual’s purchase of MetLife’s broker-dealer network in 2016 enlarged the MassMutual brokerage force by 70% and freed MetLife to pursue new distribution channels.

Expanding into new lines and geographies.
New product lines offer another path to faster growth or fatter profit margins. Several insurers have moved into expanding markets with lower capital requirements, such as asset management. Voya, Sun Life and MassMutual have acquired or established third-party asset management units to capitalize on investment expertise they developed managing internal portfolios. The Hartford recently agreed to acquire Aetna’s U.S. group life and disability business, deepening and enhancing its group benefits distribution capabilities and accelerating digital technology plans. We also see companies establishing technology-focused subsidiaries such as Reinsurance Group of America’s (RGA’s) RGAx and AIG’s Blackboard.

Cutting costs. Some companies have moved aggressively to improve their cost structure. Insurers seeking greater financial flexibility have divested assets that require significant capital reserves. Aegon unleashed $700 million in capital by selling blocks of run-off annuity business to Wilton Re in 2017. An insurer that offloads its defined-benefit plan to another via pension-risk transfer frees up capital and eliminates continuing pension funding requirements. Other cost-saving moves focus on workforce expenses. In addition to rightsizing staff, such measures include relocating workers to low-cost areas or jurisdictions offering significant tax incentives. Prudential and Manulife slashed expenses by establishing overseas operating centers that take advantage of labor cost arbitrage, create global economies of scale and reduce taxes.

See also: Key Findings on the Insurance Industry

Transformation and Culture

Once companies have launched ambitious structural initiatives, they don’t always recognize the role of culture in making the new structures work. But this is a mistake.
Culture is a pattern of behaviors, norms and mind-sets that have grown up around existing organizational structures; the two (culture and structure) are tightly linked, and you can’t change one without affecting the other. No culture is all good or all bad. But certain cultural traits are more relevant to structural change than others. Cultural attributes affect a company’s ability to make necessary changes. A company that is consensus-driven and focused on preventing problems before they arise may be indecisive and slow to act. These traits may cause it to wait too long and miss the optimal moment for a structural transformation. Other companies, by contrast, have a tradition of quickly seizing opportunities. When this trait is supported by other important characteristics — more single points of accountability, strong leadership and an aligned senior management team — it can foster the rapid decision making essential to structural change. Culture also comes into play after executives decide to initiate structural change. Most employees have strong emotional connections to the culture — this source of pride, along with a clear and inspiring vision of the future, can motivate them to line up behind the change and can inspire collaboration across organizational boundaries to drive the transformation. Leaders at all levels can generate momentum by signaling the desired cultural shifts and embodying the new behaviors needed to execute structural change. A new structure without a corresponding evolution of culture amounts to little more than a redesigned organization chart. Culture makes or breaks the new structure, influencing factors as diverse as resource allocation, governance and the ability to follow through on a vow to “change how work gets done.” It’s not uncommon for a company to expend tremendous effort and resources on a complete structural overhaul, only to see incompatible cultural norms thwart its strategic execution. 
For example, a new, streamlined operating model intended to accelerate decision making and foster cross-functional collaboration won’t take root in a culture that exalts hierarchy and encourages employees to focus on narrow functional priorities. Culture also influences a company’s willingness to make the deep structural changes in time to avert a crisis. Those who wait until market conditions have undermined their operating model put themselves at a disadvantage. Nevertheless, few companies attempt structural change in “peacetime.” Absent a crisis, cultural expectations often limit directors to a narrow role monitoring indicators such as growth and profitability, while management concentrates on achieving specific strategic objectives. Under this traditional allocation of responsibilities, emerging structural issues may not get enough attention. Successful companies, by contrast, continually reassess their structure in light of evolving market conditions. They understand that organizational structures aren’t permanent fixtures, but strategic choices to be reconsidered as circumstances and objectives change.

Capitalizing on Changes

Amid the confusion of today’s insurance industry, one thing is clear: Business as usual won’t deliver sustained, profitable growth. As powerful forces reshape markets, conventional tools for executing strategy are losing their effectiveness. Today’s challenges are not operational, but structural. Many insurers lack the scale, capabilities or efficiency to compete effectively as competition intensifies, regulatory burdens increase and financial pressures rise. Winning companies are meeting structural challenges with structural solutions. Approaches vary from company to company. Some add scale or enhance capabilities, whereas others streamline cost structures or exit lagging business lines. With the right cultural support, these structural responses position a company to capitalize on industry changes that are confounding competitors.
You can find the article originally published on strategy+business. This article was written by Bruce Brodie, Rutger von Post and Michael Mariani.

Bruce Brodie

Bruce Brodie is a managing director for PwC's insurance advisory practice focusing on insurance operations and IT strategy, new IT operating models and IT functional transformation. Brodie has 30 years of experience in the industry and has held a number of leadership positions in the insurance and consulting world.

How a Digital Platform Smooths Operations

The sheer volume of repetitive, rule-driven work sets insurers apart from many other industries -- and creates big opportunities.

In a 2009 interview with Insurance Journal, Juan Andrade of The Hartford ranked “improving operational efficiency” third on a list of essential priorities for P&C insurers, below both customer retention and a systematic sales approach. This ranking made sense 10 years ago. At that time, Andrade’s top two priorities were customer connections and insurance sales, but digital means of providing either one had not fully developed. Today, however, all three of these top priorities can be addressed through a digital platform — and placing operational efficiency first on the list has the power to boost the other two. Digital operations management “is not only about technology,” says Eddy Lek at Schneider Electric. “It requires a holistic approach to transform operations; implementing changes to the existing business and operations models and training employees to effectively operate with new tools – [e]mpowering the workforce to leverage technology for greater efficiency.” Here, we look at how a digital platform improves operational efficiencies for P&C insurers. We also discuss how insurers can identify the top challenges they face and ask the right questions to ensure they implement digital tools that address those challenges effectively.

The Digital Future and Its Challenges for Insurers

Property and casualty insurers have seen stormy weather in the past few years, literally and figuratively. The need to respond to claims from the 2017 hurricane season, decreasing auto coverage purchases combined with rising claim costs and other factors have resulted in losses across the board, according to a Deloitte report. Customer needs and demands are changing, as well, as Insurance Journal’s Michael Kasdin notes. For instance, younger adults drive less, reducing demand for auto insurance policies and increasing interest in newer, more adaptable tools like pay-per-mile auto insurance.
Gig economy work like driving for Uber or Lyft or listing rentals with Airbnb has changed needs in auto and home insurance, as well.

See also: Digital Playbooks for Insurers (Part 4)

According to Kasdin, insurtech is poised to address many of these problems. Yet concerns about cybersecurity and anticipating the “right” place to invest in digital platforms and similar tools continue to stall many insurers, as Nate Anderson, Pascal Roth and Pierre-Henri Boutot described in a Bain & Co. brief. Insurtech stands out as a way to address sinking premiums, rising claims and the retention of a customer base whose expectations of insurers are shifting rapidly. Improving operational efficiency via digital platforms can improve P&C insurers’ ability to address all three threats simultaneously.

Digital Tools for Operational Efficiency

A recent Audit and Risk Committee Forum survey by PwC found that 44% of insurance leaders surveyed believe that “most existing insurers will not survive, at least in their current form.” And one of the biggest causes of their demise will be operational inefficiency. Currently, operational inefficiencies in P&C insurance are commonly found in “repetitive, business rule-driven work,” according to a February 2018 PwC white paper. While other inefficiencies exist, the sheer volume of repetitive, rule-driven work sets insurers apart from many other industries. For decades, such work has demanded human intervention because no machinery existed to ensure that the rules were followed and that the task was done correctly each time. Today, however, machine learning, AI and similar tools make it possible for insurance companies to automate much of this work for increased efficiency. “It’s always important to realize that 55% to 60% of all the cost within any given agency is going to be personnel cost,” Andrade told Insurance Journal in 2009.
“The key here is making sure that your people, your employees are being as productive as they can.” Digital platforms offer new ways to ensure employee productivity. In an automated world, insurance companies can reevaluate the contributions each agent and employee makes based on the value added to the process, providing a powerful new way to determine and eliminate inefficiencies.

How Efficiency and Customer Retention Meet on a Digital Platform

The February 2018 PwC report noted that when it comes to insurtech, most P&C insurers are still thinking in an “outward”-facing mode. They’re embracing digital platforms primarily for the platforms’ ability to connect them with customers who increasingly demand easy digital communication, online purchasing and consistent points of contact. Meanwhile, Ben Kerschberg at Forbes identifies three “pillars of change” for digital platforms: customer service, operational processes and business processes. In other words, digital platforms do have the power to improve operational efficiencies in customer service — but customer service is only one of three pillars of opportunity. Insurers that focus only there miss the other two opportunities: greatly improving operational and business efficiency, as well. Five years ago, big data was big news. Today, it’s a given in most businesses. The ability to analyze vast amounts of data to spot meaningful trends and changes can revolutionize risk analysis and operational efficiency in insurance, but insurers must first have the digital platform necessary to capture and analyze data. Customers are willing to provide more data to get seamless digital service. They’re also willing to pay more for service on a strong, integrated digital platform — up to 21% more to get it, according to Ameyo’s Shaista Haque. A digital platform also makes it easier for insurers to streamline service, not only to customers but also within the organization itself.
For instance, when products are developed in a streamlined digital environment, much of the inefficiency caused by in-person meetings, incompatible or un-editable digital documents, checking details or numbers by hand and other prolongations of the product development cycle can be minimized or eliminated. This increased internal efficiency improves the ability to provide customers with products that meet their changing needs on a timetable that encourages customers to adopt them.

See also: Digital Insurance 2.0: Benefits

Questions to Ask When Seeking the Right Digital Tools

Insurtech developments appear almost daily, which can leave insurance leadership feeling overwhelmed. What are the best tools for the particular challenges you face? How can you identify top inefficiencies, and how will you know you’re choosing the right digital platform capabilities to address them? Deb Miller, director of market development for business process solutions at OpenText, identifies four operational efficiency optimization strategies that are being employed by an increasing number of insurance companies:
  • Improving operational efficiency by driving for leverage across silos
  • Scaling to address demand for specific products and across a broader geographical range
  • Expanding distribution channels while improving or maintaining excellent customer service
  • Automating case management tasks to reduce time to resolve in claims, as well as reducing paper and other resource waste
Knowing which strategies to prioritize, however, means knowing where your particular organization’s inefficiency pain points lie. A McKinsey & Co. white paper recommends that managers seeking to improve operational efficiency ask questions like:
  • How are we delivering value to the customer? How do we do so efficiently?
  • How do we work? What are some better ways to perform that work?
  • How do we connect goals, strategy and meaningful purpose? How do we communicate these to our teams and to our customers?
  • How are we enabling people to lead at their fullest potential?
Questions like these can help insurers find inefficiencies. The answers can also help digital platform providers identify which tools will be most effective for a particular insurer.

Tom Hammond

Tom Hammond is the chief strategy officer at Confie. He was previously the president of U.S. operations at Bolt Solutions. 

Why Risk Management Is a Leadership Issue

Ten or 15 years ago, no companies had a chief risk officer. Risk was barely mentioned. Today, risk has to be on every board's agenda.

From product scandals to data breaches to natural disasters, companies are dealing with constant risk. But how they prepare for those risks can make the difference between riding the roughest wave — or drowning in it. The field of risk management, once an afterthought for many companies, is getting renewed attention with a new book by two Wharton professors who want to help business leaders think more deeply about worst-case scenarios. Michael Useem, management professor and director of the Center for Leadership and Change Management, and Howard Kunreuther, professor of operations, information and decisions as well as co-director of the Risk Management and Decision Processes Center, recently spoke with the Knowledge@Wharton show on SiriusXM channel 111 about their book, Mastering Catastrophic Risk: How Companies Are Coping with Disruption. An edited transcript of the conversation follows. Knowledge@Wharton: How did the two of you come to collaborate on this book? Useem: If you think about the two terms that Howard has referenced, risk and leadership, they go together in this case. Often, we think of those as something separate. Risk — we’ve got to be analytical and disciplined, and it’s often technical. Leadership — it’s all about having a vision and setting a strategy. But we concluded, after talking with quite a few people and companies’ directors, executives and senior managers that the time has come for the conjoining of these two terms. Many companies now are self-conscious about appraising risk, measuring risk, managing risk and ensuring the company is ready to lead through a tough moment the risk has caused. Knowledge@Wharton: Is this a recognition that has developed recently, compared with the executive mindset of the 1950s, ’60s, and ’70s? Useem: Yes. I think what really got us going on the book in terms of the timing is exactly what you’ve referenced. Ten or 15 years ago, no companies had a chief risk officer. Risk was barely mentioned. 
The term “enterprise risk management” (ERM) was not even around. But if you look at any trend line out there, what do people worry about when they get together at watering holes for senior management? Risk now is on the agenda just about everywhere, for good reason: Because the risk that companies have faced in recent years has gone up. The catastrophic downside of big risk also has increased. More risk, more downside, more people are paying attention. Kunreuther: One of the really interesting issues associated with the study and our interviews with senior management is that, before 9/11, there was very little emphasis by the firms on low-probability events — the black swan events. Starting with 9/11 and continuing through to today, these issues now have become more important, and black swans are now much more common than before. As a result, firms are paying attention. When we interviewed people, they were very clear with us that now that the events have occurred, they are putting it high on the agenda. As Mike has indicated, the boards and all of senior management are now paying attention to it, so it’s a big, big change. Knowledge@Wharton: Certainly, 9/11 was an impactful event on the country, but it was followed a few years later by the Great Recession. How did that change the view of risk? Useem: We raised the question in these in-depth interviews with people inside the company, whether on the board or in the management suite, and they consistently said that four events became a wake-up call or an alarm bell. First, 9/11 got us thinking about the unthinkable. A couple of hurricanes came through, including Sandy, which was a huge event. The recession or the near-depression back in 2008, 2009. Who thought that the Dow was going to lose 500 points in a day? Who thought Lehman was going to go under? But it all happened. 
And finally, the events in 2011 in Japan with the enormous tsunami after a 9.0 earthquake that left probably 25,000 people dead and set a fire in a nuclear plant. Even if you were a company that was not touched, just look at the four points on a graph. The costs are high. Many companies are impacted. Everybody thought, let’s get on with enterprise risk management. Let’s make it an art. See also: How to Improve ‘Model Risk Management’   Knowledge@Wharton: How have business leaders changed their thinking about risk management because of those four events? Kunreuther:  Leaders are now saying, “We have to put risk on the agenda. We have to think about our risk appetite,” which they hadn’t thought about before. “We have to think about our risk tolerance.” Financial institutions played that role, and they were very clear about that right after the 2008-2009 debacle. They had to ask themselves very explicitly that question. But I think this is now much broader than that. Leaders have recognized that they also have to think longer-term. This is one of the issues. We have a framework that we’ve developed in the book that tries to combine some of the work that has come out of the literature that Daniel Kahneman has pioneered on thinking fast and slow — by indicating that intuitive thinking is the mindset that we often have. Thinking myopically. Thinking optimistically. Not wanting to change from the status quo. Leaders have now recognized that they have got to put on the table more deliberative thinking and think more long-term. That is a change, and they tie that together with risk. One of our contributions, with respect to the book, is to try to put together a framework that really resonates with the leaders and the key people in the organization so that they can respond in a way that makes sense. Useem: We asked a lot of people who are in the boardroom, if they go back 15 years, was risk, cyber risk or catastrophic risk in board deliberations? The answer typically was no. 
Ask the same people about today, and they say, “Of course.” We watched with horror what has happened with some of the cyber disasters at Target and elsewhere, and no board worth its pay is these days unconcerned about risk. Now, you’ve got to be careful. The board works with management, sets the vision, does not micromanage. But what boards are increasingly doing is saying to management, “Let’s see what your risk tolerance is. Let’s see what your risk appetite is. Let’s see what measures you already have in place. Nobody wants to think about the unthinkable, but let’s think about it.” Knowledge@Wharton: The fake accounts scandal at Wells Fargo and the emissions controversy at Volkswagen are two recent examples of risk that you document in the book. Can you talk about that? Useem: We don’t mean to pick on any company, and we don’t mean to extol the virtues of any company. But we can learn from all. Howard and I took a look at the events at Wells Fargo, which were extremely instructive. No. 1, the company put in very tough performance measures. They told employees, you’ve got to get results, otherwise you’re not going to be here in 12 months. But there was not a recognition that very tough performance indicators without guardrails against excess of performance was a toxic mix. We’ve seen what happened to Wells Fargo. They’ve paid billions in fines. The Federal Reserve has a stricture right now that Wells Fargo cannot accept one more dollar in assets until it can prove to the Fed that it has good risk measures in place. We also document in the book the events with Volkswagen, which had the so-called defeat devices intended to report if a VW vehicle was brought in for an inspection, that the emissions were meeting U.S. standards. In fact, the software just simply was fooling the person looking at the dials. That, apparently, went all the way up to the top. We’ll see what’s finally resolved there. 
Wells Fargo and Volkswagen took enormous hits in terms of reputation, brand, stock price and beyond. We also document a bit the BP problems in the Gulf…. They’re instructive. Kunreuther: We didn’t interview anyone with respect to Volkswagen, but we did have public information, and it’s included in the book. The reason that we felt it was so important is that VW felt that this was a low-probability event that they would be detected, and they put it below their threshold level of concern. They emphasized the optimistic part of this, which was to say, “Let’s see what we can do as a way of really improving our bottom line.” What we do in the book is give a checklist to people, to companies and to individuals. We see it as a broad-based set of checklists on how they can do a better job of dealing with that. What we really say is: Pay attention to these low-probability events. If you think not only in terms of next year but over the next 10 years, what you can see as a very low-probability event would actually be quite high over a period of time. If you begin to think long-term, which is what firms want to do, you pay attention to that. Knowledge@Wharton: There’s such an economic impact on the company when these issues can’t be resolved quickly. Toyota, for example, has been dealing with its airbag problem for several years. Kunreuther: You tie the issue of getting companies and directors to pay attention to the low probability, and then you say to them, “Construct a worst-case scenario.” Put on the table what could happen if it turns out you were discovered, or if there is an incident that occurs, or an accident, as Mike was saying on the BP side. What’s going to happen to the company? What will happen to its reputation, its survival, its bottom line? 
Our feeling is that, if you can begin to get people to think about the appetite and tolerance in the context of these low probabilities that could be quite high, then I think you have an opportunity for companies to pay attention. And they’re doing that, as Mike and I have found out in our interviews. Knowledge@Wharton: What about when the disaster is a natural phenomenon, such as the volcanoes in Hawaii and Guatemala? Companies have to be prepared, but they can’t control what happens. Useem: As we’ve watched the events unfold in Hawaii and Guatemala, it’s a great warning to us all that the impact of natural disasters worldwide is on the rise. There’s just no other way to describe it except a graph that’s going up, partly because people are living closer now to some of the places that historically are seismic. Hurricanes are possibly being intensified by global warming. There are more people along the Florida coast. All that being said, natural disasters are obviously in a much bigger class of disasters. [Since] we wrote this book for people to be able to think through their own catastrophic risk management, we offered [examples] from the experience of other large companies, mainly in the U.S. We have a couple of German companies that we focused on: Deutsche Bank, Lufthansa and so on. We suggest that the vigilant manager, the watchful director, ought to be mindful of 10 separate points. One is, be alert to near-misses. What we mean by that is, “There but for the grace of God go I.” If I’m an energy producer, watch what happened to BP in the Gulf. Let’s learn from what they went through. The A-case for me is Morgan Stanley, which had been in the South Tower of the World Trade Center when 9/11 hit. Because of the events eight years earlier — in 1993, a bomb had gone off in the basement of the World Trade Center — the risk officer at Morgan Stanley said, “Who knows what else might happen? 
That was a near-miss.” Rick Rescorla, [vice president for corporate security,] insisted that Morgan Stanley every year practice a massive drill of evacuating the tower. When 9/11 occurred, the North Tower was hit first. Morgan Stanley is in the South Tower. Rescorla said, “Let’s get out of here,” and he managed to evacuate almost all 4,000 people. He was one individual who did not get out. He went back in to check. He is a hero for Morgan Stanley and many other people, but the bigger point taken from that is: Learn from the world around us, because these developments are intensifying. The threats are bigger. The downside is more costly. See also: 3 Challenges in Risk Management   Kunreuther: Near-misses are important in any aspect. But the other point that I think is important for today is another part of the checklist: Appreciate global connectedness and interdependencies. That point really became clear with Fukushima and with the Thailand floods. We asked each company what was the most adverse event that they faced? They had the complete freedom to say anything they wanted. The death of a CEO could have been one. Kidnapping was another. But as Mike indicated earlier, Fukushima was a critical one, and so were the Thailand floods. These were companies in the S&P 500, but they were concerned about how they were getting their parts, so supply chains were very important. They recognized after Fukushima that they were relying on a single supply chain that they couldn’t rely on for a time. Knowledge@Wharton: How can a company prepare for the unexpected death of a CEO? Useem: From looking at the companies that are pretty far into it, all we’re calling for is getting those risks figured out, then having in place a set of steps to anticipate. It’s like insurance. The best insurance is the one that never pays off because the disaster has not happened. The best risk management system is the one that’s not invoked. 
In the book, we get into the events surrounding a fatal Lufthansa crash. Within minutes, they were in action. Within minutes, they had called the chancellor of Germany. Within minutes, they had people heading to the scene, not because that’s what they do but because they had thought about the unimaginable, and they had in place a system to react quickly. You have to deal with an enormous amount of uncertainty when disaster strikes. Premise No. 1: Be ready to act. Premise No. 2: Be ready to work with enormous uncertainty, but don’t let that pull you back from the task ahead.

Howard Kunreuther

Howard C. Kunreuther is professor of decision sciences and business and public policy at the Wharton School, and co-director of the Wharton Risk Management and Decision Processes Center.

How Robotics Will Transform Claims

Robotic process automation (RPA) lets insurers handle high-volume and complex data actions at exponentially greater speed than in the past.

Across the insurance industry, claims organizations have made significant progress in modernizing their core processing systems in the last several years. Typically, the objectives of these programs are to increase speed, improve accuracy and reduce risks in all phases of claims handling. Given that claims interactions are “moments of truth” in customer relationships, insurers have good reason to ensure that the experience for policyholders is smooth and satisfying at every step of the process. No matter where insurers are on this continuum, robotic process automation (RPA) can help them achieve their business objectives while leveraging existing technology and boosting returns on previous and current transformation investments. In seeking the best path forward, claims leaders will want to consider:
  • Why robotics is well-suited for use in claims and how it complements other enabling technologies
  • Key components of the business case and value proposition
  • High-priority opportunities and common use cases for deploying RPA
  • How to apply the principles and techniques used by successful early adopters in developing their own implementation approach
Why RPA? Why now? RPA involves the use of virtual workers, or software robots, to perform business tasks similar to human users. The main appeal for insurers is the ability to handle high-volume and complex data actions at exponentially greater speed than in the past. RPA is also notably flexible, which makes it both business-enabling and IT-friendly. It can be deployed alone or with other technologies across the claims value chain. For example, robotics can:
  • Automate discrete tasks or activities
  • Work in concert with other systems on transaction processing, data manipulation, communication and response triggering
  • Facilitate straight-through or “no-touch” processing, working alongside analytics tool sets and other cognitive technologies, such as machine learning and natural language processing
The cost of entry for RPA in terms of financial commitment and deployment requirements is low, compared with other technologies. There is no disruptive “rip and replace” with RPA; proofs of concept are straightforward to launch, which helps IT and business leaders get past their “not another technology” reluctance. And many benefits can be unlocked without large-scale process re-engineering. See also: Insurtech Presents Major Opportunities   More than just overhauling the most routine administrative tasks, robotics creates capacity and expands the art of the possible in claims. While many assume robots simply replace human resources, RPA can – and should – be viewed as an enabler and a win-win for insurers and their workers. RPA ROI: building the business case A significant number of insurers have already implemented robotics, though few have done so at scale. ROI cycles for RPA can usually be measured in months rather than years. Most early adopters start with multiple functional “pilots” or proofs of concept that are completed in as little as 30 to 60 days. Broader, first-generation programs may take six to 12 months. Increased capacity and focus on high-value work: Robotics can free knowledge workers from the burden of routine reporting, documentation and maintenance tasks. Instead, they can focus on areas where they can provide the most value, such as managing exceptions and dealing with high-risk and complex claims. A common approach is to use RPA to support straight-through processing for claims under a certain dollar threshold. RPA may also be used to handle basic data entry tasks for claims of any amount. Industry research has found that turnaround times for these types of claims may be reduced by as much as 75%–85%, with 50%–70% of repetitive tasks effectively eliminated.
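The threshold-based routing described above can be sketched in a few lines. The dollar threshold, claim fields and routing labels below are illustrative assumptions, not figures from the article.

```python
# Sketch of threshold-based claim routing for straight-through processing.
# The $2,500 threshold and claim fields are illustrative assumptions.

STP_THRESHOLD = 2_500  # claims at or below this amount skip manual review

def route_claim(claim: dict) -> str:
    """Return the processing path for a claim.

    Claims flagged for review or above the threshold go to an adjuster;
    everything else is handled straight through by the software robot.
    """
    if claim.get("flagged_for_review"):
        return "adjuster"
    if claim["amount"] <= STP_THRESHOLD:
        return "straight_through"
    return "adjuster"

claims = [
    {"id": "C1", "amount": 800, "flagged_for_review": False},
    {"id": "C2", "amount": 12_000, "flagged_for_review": False},
    {"id": "C3", "amount": 400, "flagged_for_review": True},
]
routes = {c["id"]: route_claim(c) for c in claims}
```

In practice, the routing rule would sit between intake and the core claims system, so only the exceptions ever reach a human queue.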
Higher quality and accuracy: For the vast majority of routine claims, robots can increase accuracy and reduce errors, whether related to sophisticated fraud or simple “fat-fingering.” Indeed, robots are uniquely qualified to assist quality assurance (QA) staff, given their ability to scan large quantities of data and transactions almost instantaneously. For example, RPA can help identify potentially fraudulent claims by flagging data outliers. Further, in the realm of compliance, RPA helps strengthen and streamline adherence to standard audit, risk, privacy and security policies and protocols. Increased scalability: RPA is a natural solution for insurers that need to add temporary capacity to deal with seasonal spikes in claims activity or after catastrophes. The virtual workforce can scale to peak loads without overtime and establish 24/7 processing. For example, RPA enables insurers to increase new loss intake capacity without a corresponding increase in first notification of loss (FNOL) processing staff. The easy scalability also makes RPA a highly useful tool for insurers exploring shared services models for claims. Higher customer satisfaction: In identifying processes that can be automated, leaders should also look for opportunities to enrich the customer experience. Speed, accuracy, transparency and level of service matter most to claimants. RPA helps on all those fronts by allowing claims professionals to focus on the “art” of claims adjusting and customer experience, as opposed to the transactional aspects. RPA can also accelerate innovation programs in customer engagement and experience. Business rules can be configured directly into the robotics to align with customer expectations for personalization and timely communications.
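Flagging data outliers, as described above, can be as simple as a statistical screen over the full claim population. A minimal sketch, assuming a z-score test and sample amounts invented for illustration:

```python
# Sketch of outlier-based fraud flagging over a full claim population.
# The 3-sigma cutoff and sample amounts are illustrative assumptions.
from statistics import mean, pstdev

def flag_outliers(amounts, cutoff=3.0):
    """Return indices of claim amounts more than `cutoff` standard
    deviations from the population mean."""
    mu, sigma = mean(amounts), pstdev(amounts)
    if sigma == 0:
        return []
    return [i for i, a in enumerate(amounts) if abs(a - mu) / sigma > cutoff]

# 99 routine claims plus one extreme value a robot would surface for QA review.
amounts = [1_000.0] * 99 + [250_000.0]
suspects = flag_outliers(amounts)
```

A production screen would segment by line of business and loss type before computing the statistics, but the principle is the same: the robot scans everything, and humans investigate only what it flags.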
Strategic data usage: The quality gains and capacity improvements from RPA enable claims teams to shift from simply processing data to exploiting it for more accurate and timely reporting and insight generation. In this sense, RPA can actually be an empowering force, rather than a discouraging threat, to a claims workforce. RPA in action: where to start the journey The use of robots and automation can take many forms in claims, including both customer-facing and back-office functions and tasks. The following represent the most common and promising use cases across the industry:
    1. Streamlining vendor applications and estimating: Most current estimating processes require adjusters or others to rekey data from one form or system to another. Robotics along with enabling technology such as optical character recognition (OCR) can eliminate that duplicate effort by bridging the gap between claims systems, vendor apps and third-party estimating systems.
    2. Capturing and managing claimant data: RPA can be on the receiving end of claims submissions, especially those that typically include photos from customers. Robots can ensure the right information ends up in the right systems and attached to the right claims. As such, they ensure human representatives have the information they need to move claims forward and respond to customer inquiries. Customers who prefer self-service also benefit when submitted information is more readily accessible.
    3. Streamlining, automating and enhancing communications: Claimant communication remains a largely manual undertaking, requiring adjusters or other claims staff to initiate and, in some cases, monitor the process. RPA can help operationalize smart rules so the right letter (e.g., one required to be sent 30 days after a loss is reported) reaches the right claimant at the right time through the right channel. For instance, robots can pull data from claims submission forms and pre-populate letters that are typically housed in other systems and map distribution to customer preferences.
    4. Scanning, indexing and converting forms and data: RPA has proven especially proficient at pulling data from standard fields on medical bills, from claimant name and address to provider information and coding details. Standard in name only, these forms are a common source of errors. Similarly, RPA can transfer and convert data from older claims systems that may be used by individual product lines or regions to newer enterprise systems.
    5. Validating payments: Conventional wisdom holds that 3-5% of claims payments are inaccurate, though no one knows for sure, given the difficulty and expense in auditing all claims. The key is robots’ ability to quickly and cost-effectively run QA on entire populations of forms and payments, rather than just a small sample. For example, rather than auditors discovering a $5,000 payment on a $500 settlement months after a customer has cashed the check, robots can flag the disparity beforehand. Further, they can help deliver the information and intelligence so that human analysts can investigate anomalies proactively.
    6. Customer-facing enhancements: RPA can alleviate the need for time-consuming and costly adjuster input by supporting customer-friendly apps for capturing photos of fender-bender car accidents and submitting all claims submission forms with just a few taps and swipes. Chatbots, another automation tool easily integrated with RPA, are already handling many routine communications tasks, including notifications of settlements and customer inquiries into claim status.
    7. Integrating other enabling technologies: RPA will become more prevalent, especially as claims groups adopt other enabling technologies. For instance, AI-powered bots will likely handle the inputs from drones conducting standard property inspections or surveying damage after catastrophic storms. Integrating RPA with machine learning and natural language processing (NLP) can enable the initiation of new claims and issue first notice of loss (FNOL) communications by scanning and analyzing unstructured communications, including emails from agents or even voice interactions. Robots will also be used widely in the real-time review of social media streams to assess claims severity and reduce fraud. RPA will receive and route advanced telematics data (including video imagery) that will be instantaneously captured during automobile accidents and downloaded from the cloud, automatically triggering an FNOL entry.
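The smart communication rules in item 3 above amount to date-driven triggers. A minimal sketch, in which the rule set, field names and channel values are illustrative assumptions:

```python
# Sketch of a date-driven communication rule: send the required letter
# 30 days after a loss is reported, through the claimant's preferred
# channel. The rule set and field names are illustrative assumptions.
from datetime import date, timedelta

RULES = [
    # (days after loss report, letter template id)
    (30, "status_update_letter"),
]

def due_communications(claim: dict, today: date) -> list:
    """Return (template, channel) pairs that are due today for a claim."""
    due = []
    for days, template in RULES:
        if claim["loss_reported"] + timedelta(days=days) == today:
            due.append((template, claim["preferred_channel"]))
    return due

claim = {"id": "C7", "loss_reported": date(2018, 1, 2), "preferred_channel": "email"}
out = due_communications(claim, date(2018, 2, 1))
```

A robot running this check daily over the open-claim inventory would pre-populate each due letter and hand it to the distribution channel without adjuster involvement.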
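The payment validation in item 5 above is, at its core, a comparison run over every payment rather than a sample. A minimal sketch, with record fields and tolerance as illustrative assumptions:

```python
# Sketch of full-population payment QA: flag any payment that does not
# match its settlement amount before the check goes out.
# Record fields and the matching tolerance are illustrative assumptions.

def validate_payments(records, tolerance=0.01):
    """Return claim ids whose payment differs from the settled amount."""
    return [r["claim_id"] for r in records
            if abs(r["payment"] - r["settlement"]) > tolerance]

records = [
    {"claim_id": "A", "settlement": 500.00, "payment": 500.00},
    {"claim_id": "B", "settlement": 500.00, "payment": 5_000.00},  # the $5,000-on-$500 case
    {"claim_id": "C", "settlement": 1_200.00, "payment": 1_200.00},
]
mismatches = validate_payments(records)
```

Because the check is cheap, it can run on 100% of payments pre-disbursement, which is what turns the audit from a retrospective sample into a preventive control.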
Suggested approach and lessons learned: following the leaders Significant numbers of insurers are already using RPA in their claims organizations. In designing the business case for robotics, claims leaders should seek an incremental approach, adopting more ambitious use cases once they have built momentum and demonstrated results through initial, targeted deployments. With RPA, there is no need to do too much too fast, which should appeal to insurance executives seeking to minimize risk and disruption in their adoption of enabling technologies. Further, an incremental approach can help organizations overcome natural wariness about RPA’s workforce impacts. See also: Robots and AI—It’s Just the Beginning   The following lessons learned come from early adopters: Target the opportunities: In developing a business case and tangible ROI model, specific tactical questions can lead to the right strategy as well as clarify the highest priorities for near-term automation. Finding answers may require a robust assessment of current capabilities and the completion of a cost-benefit analysis, given that the candidates for automation may number into the dozens. Engage IT early and often: To ensure a smooth implementation and integration with other systems, there are many important infrastructure, governance and security questions to address. IT leaders reluctant to deploy another technology in the claims “stack” should consider how RPA can support strategic platform upgrades and those mandated by regulatory change. Most RPA tools are product- and platform-agnostic and work with existing IT architecture. Find the right partner: External vendors and suppliers – including insurtechs, consultants and systems integrators – will be part of the solution, so it’s important to choose wisely.
Beyond technical expertise, look for firms with deep operational claims knowledge, including a clear understanding of how automation affects the customer experience. Don’t overlook the organizational factors: As with other “digital” initiatives, claims leaders must invest time and resources in education and, if necessary, evangelization regarding the use of RPA. The delicate matter of robots taking over jobs should be addressed, most likely in the context of the need to reskill claims workers, as their roles evolve to become more analytical and more focused on customer needs and the most complex claims. The bottom line: RPA is critical to the evolving claims process The time for adopting robotics in claims has come, due primarily to the compelling business case and the imperative for claims leaders to enhance performance and contribute more value to the business. Robotics can serve as a foundation for true, end-to-end automation when integrated with other advanced technologies, such as OCR, chatbots, machine learning and NLP. Indeed, as multiple early adopters have made clear, RPA is ready to help claims organizations advance and enhance outcomes in the digital era through increased automation, higher productivity and increased capacity and strategic focus for claims professionals. RPA is among the top enabling technologies insurers should consider adopting in claims, as well as in other parts of the organization, due to:
  • Low cost
  • The path to ROI
  • Manageable deployment requirements
  • Flexible use cases
For the full report on which this article is based, click here.

Rob Dietz

Rob Dietz is a principal in the advisory services practice of Ernst & Young LLP. He has more than 20 years of experience in property and casualty (P&C) insurance.

How Insurance and Blockchain Fit

Blockchain can accelerate insurance transformation and steer the industry toward digital collaboration and interoperability.

From better risk visibility and faster claims processing to collectively fighting fraud, blockchain can provide comprehensive benefits across the insurance value chain. Blockchain implementation can enormously accelerate insurance transformation and steer the industry toward digital collaboration and interoperability. Permissioned blockchains deployed in insurance consortia yield comprehensive industry benefits across the value chain in three categories: (1) preventing fraud, (2) championing interoperability in multi-party processes and (3) facilitating consumer trust and ease of auditing through data transparency and immutability. Introduction Insurance is a multitrillion-dollar industry, but the workflow of brokering trust, insuring parties and reinsuring risk items remains an expensive, slow and fraud-prone process. Although the digital age has inevitably brought technological innovations, the centuries-old insurance industry still seems to be drowning in paperwork and redundant manual procedures. Add the collaboration required from a multitude of parties to execute certain industry tasks – enforcing policies, processing claims, underwriting contract items or drawing up contracts – and the insurance process remains far from transparent, coordinated or secure. Each new party engaged in a particular insurance transaction — be it insurer, reinsurer, broker, consumer or vendor — adds a compounding set of paperwork and potential for fraud, cyber attack, lost data, misinterpretation and human error. Challenges arise in verifying this data without breaching trust, so auditing is widely used to ensure consistency and accuracy. Even so, trust is at an all-time low, according to a recent Edelman industry poll. The current insurance industry landscape in a snapshot:
  • The insurance industry is widely known to be slow in adopting technology and is behind digitally.
  • Legacy systems have perpetuated a closed-off insurance information environment with data silos and resulting operational inefficiencies. These knowledge gaps between insurance stakeholders are exploitable.
  • In terms of fraud and fraud prevention spending, the numbers are unfortunately astronomical. In addition, human error also finds its place wherever manual entry and paperwork is involved.
The insurance industry epitomizes a blockchain use case. Adoption of blockchain as a standard system of industry transaction can improve collaboration between market participants and streamline market operations — freeing billions of dollars in capital otherwise spent on auditing and administrative costs, lost to fraud or frozen in collateral as a result of low risk visibility. A blockchain is a permanent and immutable ledger of transactional records distributed across a network of participants in a decentralized manner. This network can be unknown and completely decentralized (e.g., Bitcoin) or known with permissioned access (a consortium). Blockchain’s system of hashing a new transaction by cryptographically tying its metadata to previous transactions gives the ledger its immutable nature — the entire history of transactions is transparent, available and indelible. Blockchain’s mechanism of arriving at consensus with no central authority allows for the decentralization of data — no central party can control or manipulate information. This is attractive for applications that interact with sensitive data; because there is no central authority, DDoS (distributed denial of service) attacks are futile. Blockchain is typically well-suited for environments where transactional records must be time-stamped, immutable, trustworthy, shared and readily available. These characteristics make blockchain desirable across the industry spectrum as:
  • A trusted repository of accurate, transparent and updated data with comprehensive read/write access controls
  • An effective measure against fraud, data manipulation and human data input error
  • A champion of interoperability between data systems, thus an enabler of more efficient collaborative processes
  • A facilitator of trust between parties that may have competing interests, different incentives or separate data compliance standards; a mechanism for cross-boundary and cross-industry collaboration on workflows; an eliminator of the need for intermediaries as a trusted central authority
  • An efficient provider of quick and accurate auditing
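The immutability mechanism described above, in which each new record is cryptographically tied to its predecessors, can be sketched in a few lines. This is a toy illustration, not a real consensus protocol; the block fields are illustrative assumptions.

```python
# Minimal sketch of blockchain-style hash chaining: each block's hash
# covers its payload plus the previous block's hash, so altering any
# earlier record invalidates everything after it.
import hashlib
import json

def block_hash(payload: dict, prev_hash: str) -> str:
    data = json.dumps({"payload": payload, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(data.encode()).hexdigest()

def build_chain(payloads):
    chain, prev = [], "0" * 64  # genesis placeholder
    for p in payloads:
        h = block_hash(p, prev)
        chain.append({"payload": p, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["payload"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

chain = build_chain([{"claim": "C1", "amount": 800}, {"claim": "C2", "amount": 950}])
assert verify(chain)
chain[0]["payload"]["amount"] = 8_000  # tamper with an early record
tampered_ok = verify(chain)           # verification now fails
```

Real blockchains add consensus, signatures and Merkle structures on top, but the tamper-evidence property that makes auditing cheap comes from exactly this chaining.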
Within the context of insurance, these features, not found in traditional databases, have great potential to empower operational efficiency, trusted collaboration, transparency and fraud prevention. As a result, blockchain can help insurers and other insurance stakeholders reduce overhead spending, decrease margins and regain consumer trust. Blockchain can drive the insurance industry’s shift toward digitizing processes, encouraging cross-industry collaboration for visibility and compliance and collectively fighting fraud. Paired with additional emerging technologies such as IoT and smart sensors, blockchain can facilitate increased automation in capturing and acting on claims data, analyzing risk more thoroughly and streamlining payment processing. Let’s dive into some areas of impact. Reinsurance & Underwriting: Streamline Reinsurance and Underwriting Times In reinsurance, each risk in a contract requires individual underwriting — and in many cases, insurers engage with multiple reinsurance parties to secure the best negotiation for each contract item. Each institution has its own data system and standards — and these differences in process can lead to discrepancies in interpretation of the contract. As a result, reinsurance and insurance institutions must constantly reconcile their books to ensure consistent interpretation of each individual claim. In sum, the complexity of different data systems and the consequent wrangling among multiple third parties to secure reinsurance for individual risks leaves the reinsurance process slow, expensive and subject to misinterpretation. Blockchain technology should be leveraged in the reinsurance process to increase interoperability. With a shared digital ledger, the discrepancies in data formats, processes and standards that currently plague the industry fall away.
A permissioned blockchain ledger can be used to streamline communication, the flow of information and data sharing between insurers and reinsurers, serving as an available and trusted repository of contract information. The process becomes faster, more efficient and less risky as data related to loss records, asset ownership or transaction histories is recorded on a blockchain that is trusted to be authentic and up to date. Access to this information can be heavily permissioned, with granular access controls and exhaustive rules governing read and write capabilities per user. Reinsurers can query a blockchain to retrieve updated, real-time and trusted information rather than rely on a centralized insurance institution to report on data relevant to contract items (e.g., losses or transfers of ownership). This can massively expedite underwriting times. The risk transfer process is delicate: Insurers need to ensure they are appropriately rebalancing capital exposures against specific risks and be confident and calculated in offloading their contracts. The newfound visibility from participating in a permissioned blockchain ledger provides confidence and flexibility in moving capital to other areas of the business, as well as more accurate and expedited risk assessment. If blockchain is leveraged to provide more visibility into risk information, reinsurers can more accurately and confidently take on calculated risk. Fraud Detection & Prevention: The total cost of insurance fraud (excluding health insurance) is estimated to be more than $40 billion per year. That means insurance fraud costs the average U.S. family between $400 and $700 per year in the form of increased premiums. The lack of interoperability within the insurance industry doesn’t just kill efficiency — it also hinders progress toward the digital collaboration required to identify patterns, trends and known actors in preventing fraud.
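One concrete check this kind of cross-carrier collaboration enables is catching a second claim for the same loss, no matter which insurer receives it. A minimal sketch, in which the shared ledger is stood in for by a set and the fingerprint fields are illustrative assumptions:

```python
# Sketch of consortium-wide duplicate-claim detection: insurers publish a
# fingerprint of each loss to a shared ledger, so a second claim for the
# same loss is caught regardless of which carrier receives it.
# The fingerprint fields are illustrative assumptions.
import hashlib

def loss_fingerprint(claim: dict) -> str:
    key = f'{claim["policyholder_id"]}|{claim["loss_date"]}|{claim["asset_id"]}'
    return hashlib.sha256(key.encode()).hexdigest()

ledger = set()  # stands in for the shared, append-only consortium ledger

def submit(claim: dict) -> str:
    fp = loss_fingerprint(claim)
    if fp in ledger:
        return "duplicate"
    ledger.add(fp)
    return "accepted"

first = submit({"policyholder_id": "P9", "loss_date": "2018-03-01",
                "asset_id": "VIN123", "insurer": "Carrier A"})
second = submit({"policyholder_id": "P9", "loss_date": "2018-03-01",
                 "asset_id": "VIN123", "insurer": "Carrier B"})
```

Publishing a hash rather than the raw loss details lets carriers detect the collision without exposing policyholder data to competitors.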
These gaps in visibility leave consistent vulnerabilities to fraudulent activity, where brokers can pocket premiums, individuals can file multiple claims for the same loss or capital can illegally move offshore. The centralization of data within the four walls of each institution leaves little room for the industry to collectively fight these common insurance crimes. Blockchain implementation could support this needed coordination while also providing granular access controls to ensure data security. As an immutable ledger decentralized among all parties in the insurance process, blockchain closes the paperwork gaps and bridges the data silos, leaving fewer areas open to exploitation. Blockchain within a consortium of insurance entities could facilitate the sharing of fraudulent claims for heightened visibility into known actors and better preparedness. Blockchain provides validation and verification on an unalterable ledger, which can be leveraged to identify duplicate transactions, repeat actions by suspicious parties, fraudulent movement of funds across borders and more. Pairing this technology with machine learning would make an excellent fraud detection strategy. Blockchain can also be used as a ledger to track ownership of assets through digital certificates and then be queried to validate their authenticity, ownership and provenance. This can reduce counterfeiting while also improving the efficiency of the entire claims management process. As a shared, transparent and decentralized ledger, blockchain will inevitably discourage future attempts at fraud, as the opportunity for exploitation is smaller and the potential for detection greater. Less fraud means lower costs, which means cheaper premiums for consumers. A win-win situation, indeed. Claims Processing: Improve Claims Processing for Property and Casualty Insurance Processing a claim in today’s insurance environment is a complex, multi-party task.
To evaluate and process an insurance claim, insurers, regulators and third parties (like a private healthcare institution or an auto repair shop) need to coordinate and arrive at consensus across a host of data points. For example, a car accident between two drivers necessitates a loss assessment that assembles information from an asset database, weather statistics, credit reports, inspection providers, authority reports and other sources. Each driver’s insurance company likely collects and analyzes this data in an entirely different system and process. Because each entity has its own data standards and processing techniques, the claims process typically involves significant manual data re-entry and duplication across the value chain. This not only multiplies needless redundancies and inefficiencies but also widens the opportunity for human error and even fraud. A distributed ledger can be used by insurers and third parties to digitally access and update data relevant to claims for a faster, more secure and less error-prone claims management process. Blockchain facilitates the interoperability needed for this level of collaboration without the associated risk of DDoS attacks or falsified transactions. This level of visibility is not only advantageous for institutional efficiency and accuracy but also helps consumers trust in the fairness of the claims process. Paired with streaming data sources, such as sensors, mobile phones or IoT technologies, blockchain can also help significantly streamline claims submission, reduce settlement time and reduce loss adjuster costs. Returning to the auto wreck example above, an IoT sensor in one of the cars involved could automatically initiate a claim with the necessary reference data. A smart contract could automate coverage confirmation and consequent settlement payouts with programmable code — with essentially no human intervention along the entire payment process.
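The coverage-confirmation-and-payout logic such a smart contract would encode can be sketched as an ordinary function; on a real chain it would run as contract code against ledger state. The perils, limits and field names below are illustrative assumptions.

```python
# Sketch of a smart-contract-style payout rule: once sensor data on the
# ledger confirms a covered event, settlement is computed and released
# with no manual step. Thresholds and field names are illustrative.

def settle(event: dict, policy: dict):
    """Return (status, payout) for a sensor-reported loss event."""
    if not policy["active"] or event["type"] not in policy["covered_perils"]:
        return "rejected", 0.0
    payout = max(0.0, min(event["estimated_loss"], policy["limit"]) - policy["deductible"])
    return "paid", payout

policy = {"active": True, "covered_perils": {"collision"},
          "limit": 10_000.0, "deductible": 500.0}
event = {"type": "collision", "estimated_loss": 3_200.0}  # e.g. from an in-car IoT sensor
status, payout = settle(event, policy)
```

The value of putting this on a blockchain is not the arithmetic, which is trivial, but that both parties can verify the same deterministic rule was applied to the same recorded inputs.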
While digital contracts like this already exist, the benefits of a blockchain-powered smart contract lie in its transparency and credibility. Auditing & Trust: Immutability for Efficient Auditing and Trust Auditors evaluate scores of ledgers — both online and offline — to reconcile reports and data spanning multiple locations and years. Needless to say, the process of ensuring consistency and reliability in transactions and generating a compliance certification is lengthy and complicated. Digital signatures, sequences of events and the actors in a particular transaction can be easily and efficiently audited if those events are recorded on a blockchain ledger. Institutions need simply add access for an auditing party to their relevant permissioned blockchains. Blockchain immutability and finality guarantee the integrity of the entire transaction history, all in one place. Companies like Docsmore have announced pilot programs for recording signatures on a blockchain. Identity Management: Increase the security and shareability of identity information With recent, massive data breaches at some of the largest institutions over the past few years, improving the security of personal data — and thus customer trust — should be a forefront initiative for insurance institutions. Manual data entry — often repeated — should be replaced with a better, decentralized system with no single point of failure. Blockchain is a perfect tool for sharing identity information while ensuring the privacy of consumers. Specifically, KYC (Know Your Customer) and AML (Anti-Money Laundering) laws require institutions onboarding new clients to go through expensive and comprehensive steps to ensure compliance. This is traditionally accomplished internally, with multiple ledgers resulting in multiple certified identity versions across the entire insurance network.
However, blockchain technology could provide a secure, distributed ledger for network participants to engage in cross-institutional client verification for KYC/AML compliance. In addition, a simple query of the blockchain can reproduce an immutable history of identity data, making regular compliance checkups and monitoring for changes an easy and inexpensive process. Query permissions could be set in place to ensure that consumer privacy is protected and that access to information is appropriately handled. The distributed nature of the blockchain ledger is also attractive for storing sensitive data — like identity information — because it limits the viability of DDoS attacks. This standardization in identity management would require collaboration not only from the insurance industry, but also from governments, tax authorities, bureaus, banks and other financial corporations. However, the savings for all would be well worth the coordination. Asset Management: Tracking Assets Along a Supply Chain As demonstrated comprehensively in our previous blog post, insurance fraud can be prevented when assets along a supply chain are verifiably tracked with blockchain finality. Auditing becomes a breeze, and risk provenance can be proven for better estimates, faster claims processing and a reduction in fraudulent underwriting. See also: Blockchain: the Next Big Wave?   Where FlureeDB fits in As an enabler of consortium blockchains, FlureeDB can provide a single source of truth for harmonized insurance data to be stored, queried and transacted with blockchain characteristics. Data-Centric — Most blockchains operate on the “business logic” tier, where enterprises still need to push data and metadata related to blockchain transactions to a static, centralized legacy system. FlureeDB brings blockchain to the data tier — allowing an entire database to be distributed across its network. Network participants can query at will and know they have the full data set.

Modern Database Characteristics for Enterprises —FlureeDB is first and foremost a powerful database with familiar, SQL-like syntax. Any development team would be able to set up a blockchain database without having to learn a complex set of new skills. With modern database characteristics like ACID compliance, a RESTful API and a graph-style query structure, FlureeDB is optimized to meet traditional enterprise requirements.

Granular Permission Logic for Access Control —Because insurance information is stored in a decentralized manner as one record, granular and highly functional access/permission models are essential to protecting data security. FlureeDB uniquely builds permission information (both read and write) directly into application data at the most granular of levels. This simple and flexible approach to data accessibility lends itself perfectly to blockchain environments — where a distributed ledger is shared across third parties in a network. Companies using FlureeDB can even hand a customer or vendor a direct line of access to the database without needing to use multiple API endpoints — queries only return the information for which a particular user has explicit read access. Blockchain Immutability —FlureeDB builds every transaction into a block within an immutable, append-only blockchain. This allows for massive auditing savings. Holding a complete and indelible history of transactions also enables institutions to throw highly advanced analytical queries to return increased visibility into practices like fraud prevention measures, internal compliance validation checks or risk assessments. Time Travel —“Time travel” is enabled by the blockchain’s immutable history: Queries can be issued at any point in time, empowering an application to reproduce any instance of the database with no extra development effort. This capability strongly reduces waste in development time and allows for apps to “rewind” to any database state with ease. Composite Consensus —With varying relationships and diverse data, insurers need to partition information to be read by only the appropriate parties. FlureeDB allows data to be segmented onto multiple databases — both publicly and privately held — but join together to query as one set from an application point of view. 
This means a singular application dealing with insurers, reinsurers, third parties and consumers can keep private information out of sight but still leverage blockchain without having to manage multiple integrations. Conclusion Blockchain technology, its believers, its vendors and its growth in adoption won’t wipe out the $40 billion-plus in annual fraud, nor will it “fix” the insurance industry in one fell swoop. Such silver-bullet claims are overzealous. But blockchain does have unique characteristics that should be part of the discussion on industry transformation. Blockchain by itself won’t disrupt anything; it must be leveraged by, and collaborated on within, the insurance industry, along with its secondary players and technological partners. Brokers shouldn’t be paralyzed by blockchain’s potential to disintermediate their industry but should rather embrace and harness its value to drive costs down and remain competitive. The few entities that take the bold step toward early adoption will be rewarded with consumer trust, lower costs and larger market share. Now is the time for industry leaders to drive a sweeping transformational agenda with digital collaboration as the key theme and blockchain as the key mechanism.