
Picking Winners Among 2017 Innovators to Watch



With tens of thousands of early-stage tech startups focused on transforming risk and insurance, it’s understandable if many in the insurance industry struggle to identify potential partners to drive their innovation efforts.

We try to make that less of a struggle.

The core of ITL’s mission is to work with entrepreneurs and incumbent insurers to produce systematic innovation based on technology. Our Innovator’s Edge platform helps insurance industry members define their innovation growth strategies, discover opportunity and execute solutions that scale, while drawing on relevant startups from around the world.

Innovator’s Edge is, quite simply, a growth platform for early-stage companies as well as for industry incumbents, regardless of size.

To help with the vetting, we select six Innovators to Watch each month from the growing population of early-stage companies that take the extra step of completing our Market Maturity Review, or MMR. Starting in 2017, we have published a monthly list of companies with a compelling solution, business model and team. The MMR comprises five modules of information designed to provide a potential insurance partner or investor with additional detail on a company and its solution, beyond the sort of data you can get from those who just track startup funding.

In 2017, we recognized 54 companies as Innovators to Watch. As we prepare to recognize more companies in the year ahead, we looked back at those companies and are excited to report that the information from the MMRs, combined with algorithms running on IE and with our innovation system, leads to an ability to select scalable winners at a ratio that would defy credibility if we published the number.

With full recognition that 54 is a small data sample, we still found some of the data revealing:

As a group, 40 of the Innovators to Watch have generated $436 million in total funding to date, for an average of $10.9 million. (Fourteen of the 54 companies reported a range of funding, as NDAs limited their ability to disclose details.) Since the announcement, Vertafore paid an undisclosed amount to buy one of the companies, RiskMatch, which provides analytics, portfolio management and placement intelligence.

The largest number, nearly 30%, were founded in 2016, followed by 24% founded in 2012. The newest was founded in 2017, and the oldest was founded in 2005.

[Chart: year founded]

TechCrunch collaborated with Crunchbase in 2015, then again in 2017, to complete studies of gender diversity among startups in Silicon Valley. In both studies, approximately 17% of all startups included women among their founders. Insurance industry statistics mirror that number: Women hold less than 10% of "C-level" roles and an estimated 17% of the next tier of executive positions. By contrast, 48% of our Innovators to Watch from 2017 include female founders.

The companies are located around the world, though nearly 75% are based in the United States.

[Chart: countries where companies are based]

The 2017 Innovators to Watch focus on a wide swath of the insurance value chain. Slightly more than half—52%—provide solutions to broadly benefit insurance companies, from better data and analytics to risk information and enhanced process capabilities. The next largest areas of focus were agents and brokers, 15%, and healthcare/health insurance, 13%.

[Chart: areas of focus]

The list of innovative technologies deployed by these companies is long, but analytics, big data and business intelligence are the most frequently cited.

[Chart: technologies deployed]

We followed up with the Innovators to Watch to learn what the recognition meant for them.

Of the honorees, 87% used the recognition in their marketing and promotional efforts, including on social media and their web sites, in email signatures and in news sent to current and prospective customers.

Also, 81% said being named an Innovator to Watch contributed to a positive outcome in their business development, with honorees noting it raised their credibility, added visibility and drove traffic to their web sites.

[Chart: benefits reported by Innovators to Watch honorees]

"Innovators Edge is a leading publication in the insurtech ecosystem. Being named one of the Innovators to Watch gave us a huge credibility boost," one founder said.

"Being identified and published as one of ITL’s 6 Innovators to Watch helped us gain exposure, be taken more seriously with our future technologies and strengthen our relationship with global reinsurers," another said.

We are glad that the recognition program helped. We hope that it is similarly useful to other early-stage tech companies that are wondering how to boost their visibility and relevance to potential insurance industry partners; being part of Innovator’s Edge and completing the MMR is a good start.

For those insurance industry incumbents struggling to find meaningful innovators and insurtechs, Innovators to Watch is our chance to share some of the more interesting and innovative startups, ones we believe hold great promise for driving exponential growth. The global community of early-stage companies knows all too well that even the best ideas never stand the test of markets, and of adoption, without a solid team focused on execution, and we'll help where we can.

Stay tuned in 2018 for more Innovators to Watch.

View the complete list of all 54 companies recognized as Innovators to Watch in 2017.

Cheers,

Paul Carroll,
Editor in Chief


Paul Carroll


Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

Taking Care of Small-Medium Business

Succeeding in the SMB market requires understanding the unique characteristics of segments and developing appropriate strategies for each.

The 1973 Bachman-Turner Overdrive hit song, "Taking Care of Business," talks about employees getting annoyed and becoming self-employed, something that is happening 45 years later in the new gig economy. The growth of new small and medium businesses and the fight for talent is creating challenges and opportunities for insurers. And just like the song … in today’s rapidly changing marketplace with new products and competitors, insurers must take care of SMB businesses to grow, let alone survive.

Change is being forced on insurers, whether they like it or not. A new insurance paradigm is being crafted, regardless of whether incumbent insurers choose, or are able, to compete in a new digital era … Digital Insurance 2.0. Uniquely, SMBs are at the forefront of this digital shift and at the center of business creation, business transformation and growth in the economy. Representing the vast majority of all U.S. businesses, SMBs promise huge market potential for insurers to provide coverage for both traditional and new, emerging risks. However, the business models and products from Insurance 1.0, present during the last 30-plus years and built around the Silent and Baby Boomer generation business owners, will not work in a Digital Insurance 2.0 era that is driven by Millennial and Gen Z business owners. Succeeding in the SMB market requires an understanding of the unique characteristics of individual segments and developing appropriate strategies for each.

New Expectations and Behaviors Set a New Competitive Bar

To help insurers capture these opportunities and adapt to Digital Insurance 2.0, Majesco recently conducted a new primary research study. The study built on last year’s research, The Rise of the Small-Medium Business Insurance Customer, which revealed increasingly higher participation rates in digital trends and technologies as well as interest in considering new insurance-related digital capabilities, new products and new business models. In this year’s research, we dived deeper into these new products, business models and capabilities to assess interest in them and their potential to accelerate the shift to Digital Insurance 2.0 offerings — offerings that could challenge traditional Insurance 1.0 business models. We decomposed a range of new products and business models into individual attributes and gauged reactions to them across both business size and owner generation. The results provide insight on the competitive bar set by these emerging new innovations and competitors.

So, what did we find? The new research report, Insights for Growth Strategies: The New SMB Insurance Customer, underscores an acceleration in digital-driven SMB behaviors and expectations, as well as strong openness to considering buying and switching to new products, services, capabilities and business models that reflect Digital Insurance 2.0. In analyzing the data, we found that the differences across the business size and owner generation segments have vital implications for insurers on why and how they must shift to engage and provide innovative, relevant products to an increasingly digitally oriented SMB market.

Interestingly, between the 2016 and 2017 Majesco surveys, we found an acceleration of as much as 20% in digital behaviors and the use of new technologies, particularly for Gen X and Boomers. Gen Z and millennials also maintained their digital leadership position, with 80%-90% across all company sizes engaging in at least one trend or technology, and often in multiple trends or technologies. These technology-driven behaviors signal an acceleration across all business sizes and generations in reshaping insurance to Digital Insurance 2.0. Increasingly, these behaviors and technologies are embedded in new products and business models introduced by new competitors, which will attract new customers and encourage existing customers to switch.

See also: How Small Insurers Can Grow

The top trends and technologies across all the business sizes and generational groups included:
  • hired a freelancer/independent contractor for a limited period,
  • used cloud-based subscription fee software (e.g. Microsoft Office),
  • paid for something with ApplePay/Samsung Pay,
  • used smart devices within the office/building,
  • worked as a freelancer/contractor,
  • and used an app to monitor the business office or building for security or equipment.
Most striking, the relationships between generation and company size accentuates two significant gaps; a growing generational gap and a widening gap between Insurance 1.0 and Digital Insurance 2.0, whereby participation is greater for the larger and younger segments, and decreases for the smaller and older segments. This sets the stage for customers shifting to fresh, innovative products and business models introduced by new competitive players. New Innovations and Competition to Take Care of SMB Businesses The digital revolution is rewriting the rules of business and, with it, redesigning organizational and business model structures. Insurers are faced with a predictive dilemma: Among all of the new models and products emerging in the insurance market, which ones will gain traction in the market? How fast? And for whom? Based on our research, a number of top-rated attributes with overwhelmingly strong appeal offer immediate options for insurers to test and learn in the market. For example, reducing costs and risks through value-added services, quote and buy and social networking options are areas where insurers can immediately begin to innovate. There are numerous additional possibilities for tailoring combinations of attributes to meet unique generation and company size segment preferences that offer innovative opportunities. In particular, younger generation business owners from the two larger business sizes are the most interested in innovations for insurance and therefore more likely to consider new products and competitors. Many of the new competitors are using different combinations of the 30 attributes we assessed as building blocks to innovate their products and business models. We tested SMB reactions to some of these new business model concepts launched in the market the last several years. The results highlight strong interest in these new business models, particularly among the younger generations and larger SMB segments. Initial reactions to the business models generally showed positive ratings of 50% or higher. But when adding the neutral “swing groups,” interest significantly rose to more than 80%. This suggests a strong openness to consider these new models. The next question is, will they purchase? The answer is, “Yes, they will.” Gen Z and Millennials, in particular, indicate positive “likelihood to purchase” ratings of 50%.  But, when adding the “swing groups,” purchase potential jumps to 70%-90% across most of the segments, emphasizing how rapidly swing groups will likely shift the market. While these new models’ long-term viability is yet to be determined, the growing interest and likelihood to purchase suggests they have significant potential to capture the market opportunity, particularly the next generation of SMB business owners. Generational Transition of Leadership … It’s a Matter of Time and Experience SMB’s market potential is significant: U.S. Census data show that those with fewer than 10 employees represent nearly 80% of all businesses in the U.S. And while distribution economics dictate that the optimal way to reach them is through digital channels, with direct-to-business models via self-service, the research results indicate a “one size fits all” approach will not work. Two powerful forces are compelling insurers to make the transition to meet the needs of this important market: time and experience. First, it’s just a matter of time before traditional operating models will no longer work based on a combination of new factors. 
By 2020, more than 60% of small businesses in the U.S. will be owned by millennials and Gen Xers — two groups that greatly prefer digital engagement. The 2016 Upwork and the Freelancers Union’s annual survey, Freelancing in America, estimated that 35% of the U.S. workforce is made up of freelancers and independent contractors, the basis of the new gig economy. Furthermore, globally, millennials appear to be more active in the startup space. Dubbed “millennipreneurs,” they are starting more companies per person, managing bigger staffs and targeting higher profits than their baby boomer predecessors did. And finally, a 2016 survey of U.S. young adults 15-32 years old, Gen Z and millennials, showed that 55% expressed interest in starting their own business or non-profit someday. Second, it’s about experience. Digital experience. As the transition of SMB businesses from the baby boomers and Gen X to the next generation of Gen Z and millennials continues to accelerate, digital experience matters more and more. The older generations have extensive experience with Insurance 1.0, but despite their increasing use of new trends and technologies, they lag in digital experience and comfort in embracing Digital Insurance 2.0. In contrast, Gen Z and millennial SMB owners are overwhelmingly embracing and expecting Digital Insurance 2.0 models. The question is … are you ready? Bridging the Business Gap of Insurance 1.0 to Digital Insurance 2.0 to Take Care of SMB Business While Insurance 1.0 preferences are firmly in place with the smallest businesses and older-generation business owners, insurance companies must rapidly adapt and innovate to retain them as the businesses move to a younger generation of leaders. Adding fuel to the shift, the growth in the gig economy and SMBs’ rising Digital Insurance 2.0 preferences are creating a significant business gap in products and business models that insurers need to bridge, rapidly. How to proceed? Insurers can use these findings to strategize, prioritize and develop unique business plans to capture these diverse SMB market segments. With the pronounced differences in patterns across generations and business sizes, different market and product strategies are necessary. To facilitate this thinking, we developed SMB segment playbooks that highlight the attributes (the “ingredients”) that constitute the “ideal” insurance offerings (“the innovations”) for each segment (the “recipe model”).  But regardless of segment, insurers must rapidly move to a new generation of digital insurance platforms that personalize and maximize the customer journey with deeper engagement, enable process digitization, use digital data-driven insights, adapt to rapid market changes or opportunities and enable rapid rollout of new products and capabilities … a Digital Insurance 2.0 platform. See also: Secret Sauce for New Business Models?   It is a new age of insurance — a digital age.  And it’s all about taking care of business … small-medium businesses. For those willing to bridge the business gap from Insurance 1.0 to Digital Insurance 2.0 … join the chorus with the new generation of SMB customers! And we be taking care of business (every day) Taking care of business (every way) We’ve been taking care of business (it’s all mine) Taking care of business and working overtime.

Denise Garth


Denise Garth is senior vice president, strategic marketing, responsible for leading marketing, industry relations and innovation in support of Majesco's client-centric strategy.

Healthcare Data: The Art and the Science

There is a basic framework required for a data quality initiative, plus some lesser-understood processes that need to be in place to succeed.

Medicine is often considered part science and part art. There is a huge amount of content to master, but there is an equal amount of technique regarding diagnosis and delivery of service. To optimally succeed, care providers need to master both components. The same can be said for the problem of processing healthcare data in bulk. In spite of the existence of many standards and protocols regarding healthcare data, the problem of translating and consolidating data across many sources of information in a reliable and repeatable way is a tremendous challenge. At the heart of this challenge is recognizing when quality has been compromised. The successful implementation of a data quality program within an organization, similar to medicine, also combines a science with an art form. Here, we will run through the basic framework that is essential to data quality initiative and then provide some of the lesser-understood processes that need to be in place in order to succeed. The science of implementing a data quality program is relatively straightforward. There is field-level validation, which ensures that strings, dates, numbers and lists of valid values are in good form. There is cross-field validation and cross-record validation, which checks the integrity of the expected relationships to be found within the data. There is also profiling, which considers historical changes in the distribution and volume of data and determines significance. Establishing a framework to embed this level of quality checks and associated reporting is a major effort, but it is also clearly an essential part of any successful implementation involving healthcare data. Data profiling and historical trending are also essential tools in the science of data-quality management. As we go further down the path of conforming and translating our healthcare data, there are inferences to be made. There is provider and member matching based on algorithms, categorizations and mappings that are logic-based, and then there are the actual analytical results and insights generated from the data for application consumption. See also: Big Data? How About Quality Data?   Whether your downstream application is analytical, workflow, audit, outreach-based or something else, you will want to profile and perform historical trending of the final result of your load processes. There are so many data dependencies between and among fields and data sets that it is nearly impossible for you to anticipate them all. A small change in the relationship between, say, the place of service and the specialty of the service provider can alter your end-state results in surprising and unexpected ways. This is the science of data-quality management. It is quite difficult to establish full coverage – nearly impossible - and that is where "art" comes into play. If we do a good job and implement a solid framework and reporting around data quality, we immediately find that there is too much information. We are flooded with endless sets of exceptions and variations. The imperative of all of this activity is to answer the question, “Are our results valid?” Odd as it may seem, there is some likelihood that key teams or resident SMEs will decide not to use all that exception data because it is hard to find the relevant exceptions from the irrelevant. This is a more common outcome than one might think. How do we figure out which checks are the important ones? Simple cases are easy to understand. 
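To ground the "science" just described, here is a minimal, illustrative sketch in Python of the field-level and cross-field checks the article describes. The record fields, formats and rules below are invented for the example and are not drawn from any particular standard or production framework.

```python
from datetime import date

# Illustrative claim record; field names, formats and rules are invented for this sketch.
record = {
    "member_id": "M12345",
    "phone": "555-01-99",                 # malformed on purpose
    "service_date": date(2017, 11, 3),
    "paid_date": date(2017, 10, 1),       # inconsistent with service_date on purpose
    "billed_amount": 250.00,
    "place_of_service": "office",
}

def field_level_checks(rec):
    """Field-level validation: formats, ranges and valid-value lists."""
    errors = []
    if not str(rec.get("member_id", "")).startswith("M"):
        errors.append("member_id not in expected format")
    if len(rec.get("phone", "").replace("-", "")) != 10:
        errors.append("phone number is not 10 digits")
    if rec.get("billed_amount", 0) < 0:
        errors.append("billed_amount is negative")
    if rec.get("place_of_service") not in {"office", "inpatient", "outpatient"}:
        errors.append("place_of_service not in valid-value list")
    return errors

def cross_field_checks(rec):
    """Cross-field validation: expected relationships between fields."""
    errors = []
    if rec["paid_date"] < rec["service_date"]:
        errors.append("paid_date precedes service_date")
    return errors

for issue in field_level_checks(record) + cross_field_checks(record):
    print("DATA QUALITY EXCEPTION:", issue)
```

Even this toy record trips two exceptions, which is the point of the next question: which of these checks actually matter for your system?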
If the system doesn’t do outbound calls, then maybe phone number validation is not very important. If there is no e-mail generation or letter generation, maybe these data components are not so critical. In many organizations, the final quality verification is done by inspection, reviewing reports and UI screens. Inspecting the final product is not a bad thing and is prudent in most environments, but clearly, unless there is some automated validation of the overall results, such organizations are bound to learn of their data problems from their customers. This is not quite the outcome we want. The point is that many data-quality implementations are centered primarily on the data as it comes in, and less on the outcomes produced. Back to the science. The overall intake process can be broken down into three phases: staging, model generation and insight generation. We can think of our data-quality analysis as post-processes to these three phases. Post-staging, we look at the domain (field)-level quality; post-model generation, we look at relationships, key generation, new and orphaned entities. Post-insight generation, we check our results to see if they are correct, consistent and in line with prior historical results. If the ingestion process takes many hours, days or weeks, we will not want to wait until the entire process has completed to find out that results don’t look good. The cost of re-running processes is a major consideration. Missing a deadline due to the need to re-run is a major setback. The art of data quality management is figuring out how separate the noise from the essential information. Instead of showing all test results from all of the validations, we need to learn how to minimize the set of tests made while maximizing the chances of seeing meaningful anomalies. Just as an effective physician would not subject patients to countless tests that may or may not be relevant to a particular condition, an effective data-quality program should not present endless test results that may or may not be relevant to the critical question regarding new data introduced to the system. Is it good enough to continue, or is there a problem? We need to construct a minimum number of views into the data that represents a critical set and is a determinant of data quality. This minimum reporting set is not static, but changes as the product changes. The key is to focus on insights, results and, generally, the outputs of your system. The critical function of your system determines the critical set. Validation should be based on the configuration of your customer. Data that is received and processed but not actively used should not be validated along with data that is used. There is also a need for customer-specific validation in many cases. You will want controls by product and by customer. The mechanics of adding new validation checks should be easy and the framework should scale to accommodate large numbers of validations. The priority of each verification should be considered carefully. Too many critical checks and you miss clues that are buried in data. Too few and you miss clues because they don’t stand out. See also: 4 Ways to Keep Data Quality High   Profiling your own validation data is also a key. You should know, historically, how many of each type of errors you typically encounter and flag statistically significant variation just as you would when you detect variations in essential data elements and entities.  Architecture is important. 
You will want the ability to profile and report on anything, which implies the capability should be developed in a common, centralized way rather than with different approaches for different areas you want to profile. Embedding critical validations as early in the ingestion process as possible is essential. It is often possible to provide validations that emulate downstream processing. The quality team should have incentives to pursue these types of checks on a continuing basis; they are not obvious and are never complete, but they are part of any healthy data-quality initiative. A continuous improvement program should be in place to monitor and tune the overall process. Unless the system is static, codes change, dependencies change and data inputs change. There will be challenges, and every gap exposed late in the process is an opportunity to improve.

This post has glossed over a large amount of material, and I have oversimplified much to convey some of the not-so-obvious learnings of the craft. Quality is a big topic, and organizations should treat it as such. Getting true value is indeed an art, as it is easy to invest and not get the intended return. This is not a project with a beginning and an end but a continuing process. Just as with the practice of medicine, there is a lot to learn in the science of constructing the proper machinery, but there is an art to establishing the policies and priorities that deliver success.
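As a companion to the sketch above, here is an equally simplified illustration (again Python, with made-up counts) of the historical profiling of exception volumes the post recommends: rather than reporting every exception, flag only statistically significant departures from the error rates you normally see, so reviewers get signal rather than noise.

```python
from statistics import mean, stdev

# Hypothetical daily counts of one exception type observed over previous loads,
# followed by today's count from the latest ingestion run.
history = [12, 9, 14, 11, 10, 13, 12, 8, 11, 10]
todays_count = 42

mu, sigma = mean(history), stdev(history)
z_score = (todays_count - mu) / sigma

# Flag only statistically significant variation from the historical norm.
if abs(z_score) > 3:
    print(f"ALERT: exception count {todays_count} deviates from history "
          f"(mean {mu:.1f}, sd {sigma:.1f}, z = {z_score:.1f})")
else:
    print("Exception volume within historical norms")
```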

Ben Steverman


Ben Steverman is the chief technology officer at SCIO Health Analytics. He has been a leader in software development and architecture for more than 30 years and brings breadth and depth of experience designing large-scale server architectures.

Global Risks in 2018: What Lies Ahead?

The environment has become a major issue, and global risks are now highly interdependent, creating the prospect of cascading problems.

What are the biggest risks that individuals, businesses and governments face in the year ahead, and beyond? According to the 2018 Global Risks Report, published by the World Economic Forum, the environment, cyber security and geopolitics are the areas drawing the most concern. The World Economic Forum — which just held its annual meeting in Davos, Switzerland — develops the Global Risks Report in collaboration with Wharton’s Risk Management and Decision Processes Center. “The big message that came out of this report is the tremendous importance of the environment” as an area to watch, said Howard Kunreuther, Wharton professor of operations, information and decisions, and co-director of the Risk Management and Decision Processes Center. “It’s not that that wasn’t [a concern] earlier, but it certainly didn’t have as high a profile.” The other big takeaway from this year’s report is how various types of risks are interdependent, which has implications for preparation and mitigation, Kunreuther said. “You begin to see clear arrows that go from climate change to food security, to natural disasters, to droughts and to a set of things that can happen.” To be sure, cyber security, data fraud and theft don’t necessarily link immediately to something like natural disasters. But if one of those events leads to some larger, overall instability, the outcomes could be worse than expected. “[Risk interdepenency] is a critical aspect that risk managers need to think about on a global scale. One thing can lead to other things and have a cascading effect,” said Jeffrey Czajkowski, managing director of the Risk Management and Decision Processes Center. “It’s critical for people to get their heads around it and start to think about how to better manage these risks.” See also: Global Trend Map No. 7: Internet of Things   The report looks at 30 different risks among five major categories – economic, environmental, geopolitical, social and technological. It draws on surveys of risk experts across the globe. ‘Pushing Our Planet to the Brink’ Heading the list of the risks in 2018 are environmental, cyber security and geopolitical risks. “We have been pushing our planet to the brink, and the damage is becoming increasingly clear,” the report states. “Biodiversity is being lost at mass-extinction rates, agricultural systems are under strain and pollution of the air and sea has become an increasingly pressing threat to human health.” On cyber security, the report says, “Attacks against businesses have almost doubled in five years, and incidents that would once have been considered extraordinary are becoming more and more commonplace.” On the geopolitical front, “rules-based approaches have been fraying,” the report says. “Re-establishing the state as the primary locus of power and legitimacy has become an increasingly attractive strategy for many countries, but one that leaves many smaller states squeezed as the geopolitical sands shift.” This year’s report introduces three new sections — Future Shocks, Hindsight and Risk Reassessment — in an attempt to provide “a new lens through which to view the increasingly complex world of global risks.” The Hindsight section, for example, revisits past reports “to gauge risk-mitigation efforts and highlight lingering risks that might warrant increased attention.” The 2017 Global Risks Report listed “economic inequality, societal polarization and intensifying environmental dangers” as the top three trends that will shape global developments over the next decade. 
Among the greatest risks that the report focuses on are geopolitical ones such as tensions between North Korea and South Korea, Kunreuther said. The World Economic Forum brings world leaders to the table where such issues could be discussed, he added. “We want to take down a lot of the blinders here and make sure the agendas are such that we can think out of the box.”

Czajkowski pointed out that this year’s report also highlights the need to invest in resiliency. “There’s a big push to make communities, nations and individuals more resilient to a lot of these different risks. But the big question is: How do you pay for that? Where are you going to get the financing to [promote] resiliency?” The report is taking a first step in trying to understand those issues, he said.

‘It Won’t Happen to Me’

Bias plays a big role in how potential risks are evaluated, and the report focuses on that aspect, as well. “You need to take a long-term view of a lot of these risks and how you’re going to deal with them,” Czajkowski said. “Oftentimes, people, organizations or governments have problems thinking with a long-term view because of short-term incentives or the short-term decisions they’re dealing with.”

“It tends [to be the case] that only after an event happens do people pay attention,” Kunreuther said, adding that the report identifies this problem as “availability bias.” “[The report] is suggesting that organizations and individuals pay better attention beforehand. We can’t think of a more important message to highlight.” For example, he says, people may hear that there is a one in 100 chance that a major hurricane may strike their area in a given year. That may seem like a low probability to many. But, “if you’re living in the same house for 25 years, there’s a greater than one in five chance of having something like this happen.” Kunreuther explained how such messaging could lead to some planning. “Can you take steps when you don’t think [a disaster] is going to happen?” he asked. “The issue of black swans gets brought up all the time – ‘It’s such a low probability event that we’re not going to think about it.’”

See also: Global Trend Map No. 1: Industry Challenges

Czajkowski offered an example of how such biases play out. The Risk Center typically classifies natural disasters as “low-probability, high-impact events” for an individual or a community. But on a global scale, such incidents become a high probability, he noted. “It is going to happen — but where that’s going to be is a different question. That is where this cognitive bias comes into play.”

Kunreuther highlighted the critical role of local officials in taking the longer-term view. “The Global Risks Report is really trying to overcome the ‘NIMTOF’ acronym – ‘not in my term of office,’” he said. “[We need] to get people to think there’s a longer term than just getting reelected, and that they have to think about putting money into [areas like] infrastructure.”

You can find the article originally published here.
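Kunreuther's hurricane arithmetic can be checked directly: with a one-in-100 annual chance and, for illustration, independence across years, the probability of at least one such event over a 25-year stay is about 22%, which is indeed greater than one in five. A quick sketch:

```python
# Probability of at least one 1-in-100-per-year hurricane over a 25-year horizon,
# assuming (for illustration) independent years.
annual_prob = 1 / 100
years = 25

prob_at_least_one = 1 - (1 - annual_prob) ** years
print(f"{prob_at_least_one:.1%}")  # about 22.2% -- greater than one in five
```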

Howard Kunreuther


Howard C. Kunreuther is professor of decision sciences and business and public policy at the Wharton School, and co-director of the Wharton Risk Management and Decision Processes Center.

Cognitive Dissonance and the CRO

Chief risk officers need to be able to accommodate two opposing views: the traditional approach, plus some important new tools.

Could F. Scott Fitzgerald have had chief risk officers (CROs) in mind when he wrote, “The test of a first rate intelligence is the ability to hold two opposed views in the mind at the same time and still retain the ability to function”? Probably not. But, based on recent discussions with some leading insurance CROs and my own experience in the industry, there are a surprising number of circumstances where a CRO needs to accommodate two opposing views. Exploring these circumstances can shed some interesting light on how the CRO role has evolved over the last several years and where it may be heading. CROs’ early focus was on the development and implementation of economic capital and a concerted effort to meet enhanced regulatory expectations. It is now more nuanced. Economic capital: A rule that needs to be followed and a model that needs to be questioned The development and utilization of economic capital (EC) is a good starting point to explore the CRO’s cognitive dissonance. Economic capital is a powerful and indispensable concept; arguably the most powerful weapon in the CRO’s arsenal. It allows insurers to quantify many of their most important risks in precise monetary terms that can be translated into precise actions. Like, “add this much to the product price to accommodate its risks” or “buy this asset not that asset because it has a better risk-adjusted return.” For economic capital to do its work, it needs to be a rule that is followed. From its most comprehensive manifestation – the expected level of capital that the insurer should hold – to the tolerances and limits that inform pricing decisions and individual asset transactions, insurers need to build economic capital values into their decision-making fabric. At the same time, the CRO recognizes that the economic capital values are model output. They depend on a lot of assumptions. And the underlying methodology, that risk is best quantified as the upper bound of a high confidence interval such as 99% or 99.5%, is only one of many meaningful options. The CRO should develop insight into how other assumptions and methodologies would affect business decision making. Furthermore, risk managers also need to employ completely different tools, like stress testing. And these could lead to new and conflicting insights that the CRO needs to reconcile with economic capital’s definitive outcomes. The dissonance engendered by economic capital presents a particular challenge for CROs with long experience in insurance ERM. More than any other development, economic capital was the progenitor of enterprise risk management (ERM). Before economic capital, ERM consisted primarily of risk lists and heat maps. Economic capital provided a solid foundation to decision making, particularly related to credit and market risks in the period leading up to and during the last recession. But, as the industry evolves, and credit and market risk taking has stabilized and often declined, new risk and new ways of managing risk need more attention. CROs who grew up with economic capital as the defining feature of their job may need to exert special effort to champion non-EC tools’ decision making potential. See also: Insurance CROs: Shifting to Offense   As Isaiah Berlin noted in "The Fox and the Hedgehog," “A fox knows many things but a hedgehog one important thing.” Considering the importance of EC in the emergence of ERM, it is reasonable to think of the risk function as a very quant-oriented one. 
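To give a feel for the confidence-interval methodology described above, here is a simplified, illustrative sketch (Python, simulated numbers, not any insurer's actual capital model). It approximates economic capital as the 99.5th-percentile loss of a modeled distribution less the expected loss, and contrasts that with pricing a single named stress scenario; both the lognormal parameters and the stress factor are assumptions made up for the example.

```python
import random

random.seed(42)

# Simulate a one-year aggregate loss distribution (lognormal, purely illustrative).
losses = sorted(random.lognormvariate(3.0, 0.8) for _ in range(100_000))

expected_loss = sum(losses) / len(losses)
var_995 = losses[int(0.995 * len(losses)) - 1]   # 99.5th-percentile loss

economic_capital = var_995 - expected_loss       # capital held against "unexpected" loss
print(f"Expected loss:          {expected_loss:10.1f}")
print(f"99.5th-percentile loss: {var_995:10.1f}")
print(f"Economic capital:       {economic_capital:10.1f}")

# A stress test is a different lens: price one named scenario directly
# (here, an assumed "losses come in 60% worse than expected" shock) and compare.
stress_loss = expected_loss * 1.6
print(f"Stress-scenario loss:   {stress_loss:10.1f}")
```

Swapping in a different confidence level, distribution or methodology is a one-line change in a sketch like this, which is exactly why the assumptions behind the "rule" deserve ongoing scrutiny.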
Calculating EC is a complex undertaking requiring a high level of mathematical and financial acumen. Certainly it is a great example of “one important thing." However, other, equally important aspects of the CRO role need a much broader vision. In keeping an eye out for emerging sources of risk and new challenges, it would be good to know “many things.” We have noticed that successful operational risk management efforts feature a multifaceted mindset when helping businesses recognize and manage these risks. Contrast this with model risk management where a more singled-minded focus is required. Even within the narrow world of some traditional risk thinking, taking a broader view could yield innovative and profitable outcomes. For example, mortality and longevity risk is almost universally viewed one way: from a retrospective experience perspective, with mortality rates varying by age and gender. Risk values are generated by shocking these rates; upwards for mortality (representing the impact of a pandemic) and downward for longevity (representing significant medical advances in treating deadly diseases). But broader, informed thinking by someone or a group could find an alternative, likely one that looks at underlying fundamentals and uses advanced analytics to develop better and more actionable insight. As ERM continues to develop, both hedgehogs and foxes are necessary. And the CRO needs to be able to effectively communicate with and manage both. Putting a price on priceless information In a business that is all about taking risk, most senior management teams certainly would rank good information about risk as essential to the effective management of their business. To call this information “priceless” would not be an exaggeration. The last recession put great pressure on regulators and, through them, on insurance companies to quickly upgrade their risk capabilities. For many regulators, the cost of achieving these upgrades was much less of a concern than thoroughness and completeness. Both of these forces, business need and regulatory pressure, put significant demands on the risk function. Faced with these demands, it has been fairly easy to put programs and people in place that address acute needs without being unduly constrained by program price. However, the absence of price constraints has obvious negative implications. Any business has limited resources. And, for much of the insurance industry, the trends in customer demands and purchase/service platforms is away from high-margin options. Furthermore, the lack of spending discipline can easily lead to maintaining a status quo that overspends on some areas and ignores others. As priceless as good risk information can be, some is more valuable than others, and some can be produced with the same value but at a lower cost. Implications: Where is ERM heading and how can CROs prepare? The CRO’s role has evolved significantly over the last several years. CROs’ early focus was on the development and implementation of EC and a concerted effort to meet enhanced regulatory expectations. See also: Major Opportunities in Microinsurance   The trend now is more nuanced. CROS are trying to address more qualitative risks and incorporate a business-centric focus. With this in mind, we offer some suggestions:
  1. CROs would do well to take stock of their current ERM program inventory. What are the approximate costs of different programs? Are they meeting objectives and are those objectives still as important as when the programs were initially established? Is there an overlap? For example, does stress testing address only the same risks EC already covers effectively, and if so, would it make sense to deploy resources in a different way?
  2. In taking stock of current benefits, ERM efforts that enhance shareholder value should be receiving high priority. Considerations focused on pricing and new business challenges present a good opportunity to use risk knowledge to add value, not just conserve it.
  3. Lastly, consider if reshaping emphasis across the program portfolio requires some ERM team members to alter their orientation, e.g. behave more like “foxes.” Or, if there’s a need, consider adding new team members with the required skills and mindset.
As ERM continues to develop, both hedgehogs and foxes are necessary.

Henry Essert


Henry Essert serves as managing director at PWC in New York. He spent the bulk of his career working for Marsh & McLennan. He served as the managing director from 1988-2000 and as president and CEO, MMC Enterprise Risk Consulting, from 2000-2003. Essert also has experience working with Ernst & Young, as well as MetLife.

Making China and India Great Again?

Short-sighted immigration policies in the U.S. are driving talented engineers and entrepreneurs back home to China and India.

“Thank you for what you are doing for America; your successes have put India in very positive light and shown us what is possible in India,” Atal Bihari Vajpayee said to me in a one-on-one meeting during his visit to the White House in September 2000. He added that he would love to see Indian-American entrepreneurs return home to help build India’s nascent technology industry. Bill Clinton and George W. Bush granted him his wish with their flawed immigration policies. The U.S. admitted hundreds of thousands of foreign students and engineers on temporary visas but did not have the fortitude to expand the numbers of green cards. The result was that the waiting time for permanent resident visas began to exceed 10 years for Indian and Chinese immigrants. Some began returning home. Now, Donald Trump, with his constant tirades against immigrants, particularly from what he calls “s***hole countries,” is giving many countries the greatest gift of all: causing the trickle of returning talent to become a flood. For India, the timing could not be better. With hundreds of millions of people now gaining access to the internet through inexpensive smartphones, India is about to experience a technology boom that will transform the country itself. And with the influx of capital and talent, it will be able to challenge Silicon Valley—just as China is doing. This is the irony of America’s rising nativism and protectionism. When I met Prime Minister Vajpayee, I was the CEO of a technology startup in North Carolina. Later, I became an academic and started researching why Silicon Valley was the most innovative place on this planet. I learned that it was diversity and openness that gave Silicon Valley its global advantage; foreign-born people were dominating its entrepreneurial ecosystem and fueling innovation and job growth. My research teams at Duke, the University of California at Berkeley, New York University and Harvard documented that, between 1995 and 2005, immigrants founded 52% of Silicon Valley’s technology companies. The founders came from almost every nation in the world: Australia to Zimbabwe. Immigrants also contributed to the majority of patents filed by leading U.S. companies in that period: 72% of the total at Qualcomm, 65% at Merck, 64% at General Electric and 60% at Cisco Systems. Surprisingly, 40% of the international patent applications filed by the U.S. government also had foreign-national authors. Indians have achieved the most extraordinary success in Silicon Valley. They have founded more startups than the next four immigrant groups, from Britain, China, Taiwan and Japan, combined. Despite making up only 6% of the Valley’s population and 1% of the nation's, Indians founded 16% of Silicon Valley startups and contributed to 14% of U.S. global patents. At the same time, I also realized that protectionist demands by nativists were causing American political leaders to advocate immigration policies that were (and are) choking U.S. innovation and economic growth. The government would constantly expand the number of H1-B visas in response to the demands of businesses but never the number of green cards, which were limited to 140,000 for the so-called key employment categories. The result? The queues kept increasing. I estimate that today there are around 1.5 million skilled workers and their families stuck in immigration limbo, and that more than a third of these are Indians. Meanwhile, I have witnessed a rapid change in the aspirations among international students. 
The norm would be for students from China and India to stay in the U.S. permanently because there were hardly any opportunities back home. This changed. My engineering students began to seek short-term employment in the U.S. to gain experience after they graduated, but their ultimate goal was to return home to their families and friends. Human resource directors of companies in India and China increasingly reported that they were flooded with resumés from U.S. graduates. For students, the prospect of returning home and working for a hot company such as Baidu, Alibaba, Paytm or Flipkart is far more enticing than working for an American company. You cannot blame them, especially given that delays in visa processing will lock them into a menial position for at least a decade during the most productive parts of their careers. This has been an incredible boon for China. One measure of the globalization of innovation is the number of technology startups with post-money valuations of $1 billion or higher. These companies are commonly called “unicorns.” As recently as 2000, nearly all of these were in the U.S.; countries such as China and India could only dream of being home to a Google, Amazon or Facebook. Now, according to South China Morning Post, China has 98 unicorns, which is 39% of the world’s 252 unicorns. In comparison, America has 106, or 42%, and India has 10 unicorns, or 4%. An analysis by the National Foundation for American Policy revealed that 51% of the unicorns in the U.S. have at least one immigrant founder. It is clear how shortsighted the U.S. government has been. With the clouds of nativism circling the White House, things will only get worse. America’s share of successful technology startups will continue to shrink, and Silicon Valley will see competition like never before. America’s loss is India’s and China's gain.

Vivek Wadhwa


Vivek Wadhwa is a fellow at Arthur and Toni Rembe Rock Center for Corporate Governance, Stanford University; director of research at the Center for Entrepreneurship and Research Commercialization at the Pratt School of Engineering, Duke University; and distinguished fellow at Singularity University.

Strategies to Combat Barriers to Insurtech

The industry may be having a technological revolution, but a Valen report suggests that barriers to insurtech adoption and engagement remain.

The insurance industry may be in the middle of a technological revolution, but Valen’s recent 2018 Outlook Report: An Industry Divided suggests that barriers to insurtech adoption and engagement remain. In the survey, 79% of insurers believe new functionality and features will make their teams more efficient in the long run, aligning with much of the optimism around insurtech acceleration. EY reinforced this notion in its Fintech Adoption Index 2017, explaining that the surge of adoption is positioned to hit the mainstream, driven largely by consumer demand. In fact, customer expectations for technologically advanced solutions from insurance providers have surpassed those for financial planning or investment platform providers. This reflects positively on the industry for fostering innovation and making smart, data-driven decisions. It should also encourage more insurtech investment, but Valen’s study finds there are many hurdles to overcome.

See also: Insurtech Is Ignoring 2/3 of Opportunity

Here are four steps insurers should consider, once they have decided to modernize a workflow or process through insurtech, to get beyond these barriers and create a replicable approach to successful rollouts:

Identifying the goal

One of the biggest reasons insurtech solutions fail is the wrong approach to thinking about technology. Insurers should begin with the end goal in mind, identifying the actual business needs they are looking to address before considering which technologies are available to them. This step will help to secure buy-in from employees and streamline the rest of the process for putting new technologies in place. For leaders at insurance companies, examples might be "lowering claims," "identifying and limiting the number of high-risk policies in a book of business" or "improving client retention."

Getting the team on board

Once a business need is identified, it’s important to secure buy-in from the teams that will be most affected by the new technology. For example, when data analytics first began to permeate the insurance landscape, many underwriters resisted the change. The sentiment of “machines taking people’s jobs” was prevalent, but the reality is the opposite: Predictive analytics enables underwriters to have a larger impact on an insurer’s overall business. The challenge doesn’t end with underwriters. Some 55% of front-line employees are observed to be somewhat or highly resistant to new technologies. Major barriers can include a lack of proof of value, understanding of the new functionality, leadership in rolling out the innovation, time for training sessions and user acceptance. These can culminate in resistance to adoption from employees and, ultimately, derail deployments. By taking each of these barriers into consideration, insurers can ease onboarding and create strategies to simplify the learning curve and ensure adoption of the new functionality.

Implementation

Once a new technology’s value is clearly established and the need is confirmed by key stakeholders, the implementation process must be carefully planned. Insurers should choose a leader to oversee the rollout process and to serve as the point of contact for all questions and concerns.

See also: 10 Trends at Heart of Insurtech Revolution

With regard to training, this leader should create and execute a rollout plan that fosters user acceptance by making the training sessions fun and interactive. This can be achieved through a carefully crafted presentation, a charismatic training leader or incentives such as gift cards, free lunches and other perks.

Addressing objections

Even once implementation is complete, there may still be objections and hesitant employees throughout the organization. Overcoming this is a matter of communication: highlighting the successes of the technology implementation. Naturally, the ability to communicate success goes back to identifying a specific business goal. With each success, it becomes easier to implement other new technologies. This is where insurers can really thrive from an insurtech perspective, creating a continuing cycle of new technological additions that increase efficiency and drive profitability.

Dax Craig


Dax Craig is the co-founder, president and CEO of Valen Analytics. Based in Denver, Valen is a provider of proprietary data, analytics and predictive modeling to help all insurance carriers manage and drive underwriting profitability.

Blockchain Transforms Customer Experience

Carriers will be able to provide customers with far more accurate quotes, creating more trust and eventually enabling more self-service.

Bitcoin’s unprecedented 2017 surge dominated the year-end financial news and introduced its revolutionary supporting technology into mainstream finance and technology discussions around the world. For many, this recent news cycle was their introduction to blockchain, the distributed ledger technology that enables the existence of cryptocurrencies; essentially, blockchain is a digitized, decentralized public record of transactions stored across a peer-to-peer network–a distributed database that maintains a continuously updated record of ownership and value. I’ll be honest, I didn’t get it at first–not in the sense that I didn’t think it would have a profound impact on the way we live, work and exchange value with each other globally; I mean I fundamentally did not understand it. Over the past two years, I’ve read a lot of definitions about what blockchain is and isn’t, my favorite coming from Steve Nutall via Fifth Quadrant’s CX Spotlight: Think of this digital ledger like a Google Doc. The traditional model of collaborating on a document was to use the "track changes" feature. You'd send me something; I'd open it, modify it, save it and send back to you. But on Google Docs, we can work on the same doc simultaneously, with each change being recorded instantly. Similarly, on the blockchain, we simultaneously see and update the ledger, of which – despite being distributed among many people - only a single version exists. Via this global, shared and trusted network, we can transfer and validate data, value, documents at scale without lengthy processing times, expensive processing fees or intermediaries. The benefits of this digital ledger technology are trust, transparency, speed and efficiency. See also: Insurtech in 2018: Beyond Blockchain   So what? I don’t have Bitcoin. While blockchain was invented to enable the best-known and most valuable cryptocurrency, most believe that blockchain’s potential use far outpaces Bitcoin itself, or any other cryptocurrency. Aegon, Allianz, Munich Re, Swiss Re and Zurich have launched the Blockchain Insurance Industry Initiative, B3i, which aims to explore the potential of distributed ledger technologies. The Institutes have established the RiskBlock industry consortium. Many other carriers and insurtech startups are following suit with their own explorations of how blockchain can transform their businesses. Among insurers, there is a belief that underwriting and claims processing are the strongest immediate applications. But blockchain will also have a significant impact on customer experience in insurance. Security and Privacy Insurance customer data has always been a high-value target for hackers, as it combines personal financial and health data. Insurance providers have sought to prevent data vulnerabilities with strict security-compliance measures, though outpacing the threat of cyber criminals is a difficult race of constant vigilance. Committing budget and human capital to staying on the forefront of cyber security trends is crucial to securing customer data, and, along with these efforts, the blockchain’s inherent structure can further protect the security and privacy of insured parties. A central component of blockchain technology is private key encryption, which for this article we will broadly reference as cryptography. Cryptography is used to create a secure digital identity reference between customer and carrier. 
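To make the ledger mechanics described above concrete, here is a toy hash-chain sketch (illustrative Python, not any production blockchain or cryptocurrency): each block commits to the previous block's hash, so altering an earlier record breaks the later links and the tampering is immediately detectable. The policy payloads are invented for the example.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, including the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"payload": payload, "prev_hash": prev_hash}
    block["hash"] = block_hash({"payload": payload, "prev_hash": prev_hash})
    chain.append(block)

def is_valid(chain):
    """A chain is valid only if every stored hash still matches its contents."""
    for i, block in enumerate(chain):
        expected = block_hash({"payload": block["payload"], "prev_hash": block["prev_hash"]})
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, {"policy": "P-001", "event": "quote issued"})
add_block(ledger, {"policy": "P-001", "event": "claim paid", "amount": 1200})

print(is_valid(ledger))                       # True
ledger[0]["payload"]["event"] = "tampered"    # try to rewrite history
print(is_valid(ledger))                       # False: tampering is detectable
```

Real deployments add consensus across many parties and digital signatures on top of this, but the tamper-evidence idea is the same.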
Additionally, blockchain-based security is predicated on distributing the evidence among many parties, which makes it impossible to manipulate data without being detected. The combination of private key encryption and decentralized storage enables increased security and protection of data and identity, further reducing the barriers along the complete customer journey. Trust and Transparency As we’ve discussed elsewhere, trust is a major barrier to online insurance purchases. A full 80% of people do NOT believe online insurance quotes are accurate. Running Comprehensive Loss Underwriting Exchange (CLUE) reports can be expensive for carriers, and, without them, customer and carrier are often working with incomplete or inaccurate information that undermines the integrity of their quote. Without knowing how information is being used or why it’s being requested, customers lose trust quickly, and they need to speak with an agent to complete their transaction. The free flow of information enabled by the blockchain and its cryptography will engender a higher degree of trust and transparency on both sides, with clear benefits to both customers and carriers. Equipped with accurate customer data, carriers will be able to provide customers with far more accurate quotes that will in turn create more trust with their customer and eventually enable more customer self service. Once customers start trusting online quotes, they will soon stop picking up the phone to complete their transactions with an agent. We’ve also learned from our research that the more transparent insurance carriers are about why and how they will be using customer data, the more comfortable customers are sharing their data and completing transactions online. Because blockchain can eliminate the need for third parties to validate data, it can facilitate more direct communication between customer and carrier throughout the quoting process, giving insurers more opportunities for transparency. If an entire claims transaction, including payment, is supported on a carrier’s blockchain, financial intermediaries are removed and the carrier has complete control over the customer relationship. On the carrier side, blockchain will lead to a reduction in insurance fraud. When blocks are built, the data shared is immutable, reducing fraud and the need for third-party intermediaries. As reported by Bernard Marr in Forbes, an estimated 5% to 10% of all insurance claims are fraudulent. An insurer's ability to record transactions on a blockchain throughout a policy’s lifecycle, from quoting and binding to claims and other servicing, enables an immutable and auditable record of activity. By maintaining the integrity of any transaction’s history, blockchain technology can minimize counterfeiting, double booking and document or contract alterations. See also: Collaborating for a Better Blockchain   While fraud reduction provides an obvious benefit to carriers, it also benefits customers, who, without insurance carriers having to account for potential fraud in their pricing, can expect to save on insurance premiums. Speed and Efficiency The second half of improving quotes and claims processing lies in speed and efficiency. 
Speed and Efficiency

The second half of improving quotes and claims processing lies in speed and efficiency. With current claims processing bogged down by an inefficient exchange of information, excessive use of middlemen, fragmented data sources and an overly manual review process, the blockchain promises to streamline claims processing, reducing its overhead and risk and dramatically improving its speed. The blockchain does this through the use of smart contracts.

Smart contracts are self-executing contracts with the terms of the agreement between buyer and seller written directly into lines of code. Smart contracts allow trackable and irreversible transactions without a third party. They can also communicate with other smart contracts, creating a chain of efficiencies with extremely fast resolution times, converting paper-based processes that may have taken weeks or months into a matter of seconds.

Mark Bloom, global CTO at Aegon, outlined one such use case tackled by the aforementioned B3i blockchain coalition in CIO Applications: As the first project, the group decided to focus on reinsurance, as that part of the insurance value chain is currently dominated by paper contracts…in the current situation, the focus is on a paper document that is then translated to digital metrics that are entered into a computer system. Distributed ledger technologies such as blockchain allow us to start with the key metrics in a digital shared ledger and turn that into a legal document…This greatly reduces the work, associated risks and need for further manual reconciliations. These digital contracts are smart because they contain operational logic and to some extent can execute themselves, which further increases efficiency and reduces operational risk…In terms of time, it could mean that the processing of all relevant data, such as premium and/or claims payments between insurer and reinsurer, can be a matter of seconds instead of the months that the traditional paper process can take.

Furthermore, smart contracts can automate the process of engaging repair and assistance providers (such as towing or autobody professionals) to fulfill claims, enable an automatic protocol for human consultation on complex and unique risks and provide automatic payment and an immutable, transparent proof of claim settlement.

We are only now beginning to see insurers invest in and adopt blockchain technology, and in this article we’ve only touched on a few of the many transformative possibilities for improving the industry’s customer experience. I personally believe the blockchain will revolutionize the way customers interact with all financial and federal institutions in ways we haven’t yet contemplated, to the same extent that the global adoption of the internet has fundamentally changed the way we interact with each other. I would recommend that every carrier deeply investigate the promising possibilities of building with the blockchain and leverage the power of this disruptor to improve their products and services and the way they do business as a whole.

The original article appeared here.
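To ground the smart-contract idea discussed above, here is a conceptual sketch of "terms written into code" that execute themselves: a simple parametric travel-delay policy that pays automatically once a trusted data feed reports a qualifying delay. This is plain Python for illustration; a real smart contract would be deployed on a blockchain platform, and the product, thresholds and field names here are hypothetical rather than anything B3i or Aegon has described.

```python
# Conceptual sketch of a "smart contract" as self-executing code: a parametric
# travel-delay policy that pays out automatically once an agreed data feed
# records a qualifying delay. Plain Python for illustration; field names,
# thresholds and amounts are hypothetical.
from dataclasses import dataclass, field


@dataclass
class DelayPayoutContract:
    policyholder: str
    flight: str
    delay_threshold_minutes: int = 120
    payout_amount: float = 300.0
    settled: bool = False
    ledger: list = field(default_factory=list)   # stand-in for on-chain history

    def record_flight_data(self, flight: str, delay_minutes: int) -> None:
        """Called with data from an agreed oracle; executes the terms automatically."""
        self.ledger.append({"flight": flight, "delay_minutes": delay_minutes})
        if (not self.settled
                and flight == self.flight
                and delay_minutes >= self.delay_threshold_minutes):
            self.settled = True
            self.ledger.append(
                {"payout_to": self.policyholder, "amount": self.payout_amount}
            )


contract = DelayPayoutContract(policyholder="A. Smith", flight="XY123")
contract.record_flight_data("XY123", delay_minutes=45)    # below threshold: no payout
contract.record_flight_data("XY123", delay_minutes=180)   # terms met: pays automatically
print(contract.settled)     # True
print(contract.ledger[-1])  # the automatic payout record
```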

Tim Angiolillo

Tim Angiolillo is a strategy lead at Cake & Arrow, a customer experience agency providing end-to-end digital products and services that help insurance companies redefine their customer experience.

How to Fight Growing Risk of Wildfire

The California fires show that wildfire can threaten even urban environments and highlight the need for new thinking.

The U.S. West Coast regularly confronts the risk of conflagration once a wildfire is sparked. An estimated 3.6 million residential properties in California are situated within wildland-urban interface (WUI) areas, with more than one million of those residences highly exposed to wildfire events, according to a 2010 federal study. However, the Tubbs fire, which combined with other California wildfires to produce some of the largest global reinsurance recoveries of 2017, spread into the Coffey Park neighborhood, which sits outside WUI areas. This disaster shows that even urban environments across the state are vulnerable to wildfire and highlights the importance of continually mitigating wildfire exposure to protect people, homes and businesses. But what can be learned from this wildfire to enhance disaster planning and communications?

Mitigation approaches range from applying generational lessons from prior wildfires, melded with scientific and technological gains, to adherence to public policy at the state, county and community levels. These strategies can unravel, however, without preventive measures taken by individuals at home or work, such as expanding defensible space at the property and in the surrounding environment and installing more effective fire-resistant materials (roofing, siding, fencing, decks, etc.). All these measures are underpinned by the research and resources available through agencies and organizations such as CALFIRE, FEMA, IBHS and Firewise.

While existing preventive measures did save lives and mitigate damage within Sonoma County, conditions and the speed of a widely spreading event combined to overwhelm emergency services and even many fortified structures, which were eventually swept away by a wildfire that often flowed faster than water running downhill. Humidity was extremely low, the Diablo winds were fierce and swirling once engaged with the wildfire, and embers the size of footballs were carried more than a mile away. The wind-driven conflagration leapt over a major highway into Coffey Park, an urban community on flat terrain woven with tightly aligned residences, often connected by fences, common shrubs and a canopy of overhanging trees. Enabled by such a common urban setting, the wildfire burned through Coffey Park like a fast-burning fuse; the scope and speed can be seen in the Berkeley Fire Department's video posted on the KTVU website. Evacuation was the highest priority for first responders, who also established a perimeter, aided by wider firebreaks on multiple sides, to eventually contain the wildfire flaring within Coffey Park. But the wildfire ravaged Coffey Park, leaving behind nearly two dozen victims and the ruin of roughly 1,300 properties.

See also: Time to Mandate Flood Insurance?

The insurance industry moved swiftly and visibly, operating within areas under the authorities' control, to honor its obligations: immediately providing additional living expense funding, including for evacuation, and, when possible, settling property claims. However, the rebuilding campaign and P&C claims settlement process for the Tubbs wildfire, in an area with an extremely low vacancy rate before the disaster, will take many years to complete.

Lessons learned:
  1. From the public policy perspective, governmental entities and citizens will revisit evacuation planning, especially in light of officials' decision to issue an e-mail alert rather than a widely dispersed emergency cell phone alert, out of concern that a widespread alarm would hamper emergency efforts. When the wildfire expanded beyond worst expectations, however, the unintended consequence of that decision was a frenzied situation in Coffey Park, with citizens making a critical difference by racing from door to door to awaken neighbors, while staff at facilities in Santa Rosa resorted to using personal vehicles to clear patients from a hospital and residents from a senior care center.
  2. A "back to the future" siren system, as a basic supplemental measure, might merit consideration.
  3. The Public Utilities Commission may need to revisit how electrical power is supplied to mountainous areas where, during violent winds, swaying electrical lines or fallen transformers and poles often result in arcing wires that spark fires that spread easily.
  4. Because saving lives is, most understandably, the highest priority for emergency services, more insurers may consider contracting with private firefighting operations, which have a superior ability to traverse steeply sloped, winding roads, to reduce the chances of property damage. These basic actions could involve, for example, clearing vegetation and combustible fencing with chain saws, accessing water sources with pumps, covering vents, clearing gutters and applying fire-retardant foam spray to structures.
  5. Underwriting teams commonly use software tools to address wildfire exposure as part of the selection and pricing processes, and they review output from portfolio models to monitor aggregate exposure. However, while steadily improving and evolving through lessons learned, these tools offer imperfect guidance. Wildfire models expect burning along the wildland-urban interface with only minor encroachment into other areas, and those expectations have largely been met during the 30-plus years of the model era, with the notable exceptions of Oakland Hills (1991) and Coffey Park (2017). In view of the urban tragedy that struck Coffey Park, insurers may choose to (re)explore "model miss" options that could influence spread-of-risk and reinsurance design strategies (a simple illustrative sketch of such a stress follows this list).
  6. The industry will benefit from supporting community awareness campaigns that inspire customers to be vigilant about wildfire exposure, especially because another Coffey Park-type event could recur in California or happen elsewhere under similar conditions.
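As a purely illustrative footnote to point 5, the kind of "model miss" exploration described there can be sketched as a simple portfolio stress: aggregate total insured value by risk zone, then compare a modeled view that assumes no urban burning with a stressed view that allows a Coffey Park-style incursion into urban areas. The portfolio, zones and damage ratios below are invented for illustration and are not any vendor's catastrophe model.

```python
# Illustrative sketch only: aggregate total insured value (TIV) by wildfire-risk
# zone for a small, made-up portfolio, then apply a simple "model miss" stress
# that assumes some loss in zones a standard wildfire model would treat as low
# risk (as happened in Coffey Park). Zones, policies and factors are hypothetical.
from collections import defaultdict

policies = [
    {"policy": "HO-001", "zone": "WUI-high", "tiv": 900_000},
    {"policy": "HO-002", "zone": "WUI-high", "tiv": 750_000},
    {"policy": "HO-003", "zone": "WUI-moderate", "tiv": 600_000},
    {"policy": "HO-004", "zone": "urban", "tiv": 500_000},
    {"policy": "HO-005", "zone": "urban", "tiv": 450_000},
]

# Assumed damage ratios: the "modeled" view ignores urban exposure; the
# "model miss" stress assumes an incursion into urban areas.
modeled_damage = {"WUI-high": 0.20, "WUI-moderate": 0.05, "urban": 0.00}
stressed_damage = {"WUI-high": 0.20, "WUI-moderate": 0.05, "urban": 0.10}

tiv_by_zone = defaultdict(float)
for p in policies:
    tiv_by_zone[p["zone"]] += p["tiv"]

modeled_loss = sum(tiv_by_zone[z] * r for z, r in modeled_damage.items())
stressed_loss = sum(tiv_by_zone[z] * r for z, r in stressed_damage.items())

print({z: f"${v:,.0f}" for z, v in tiv_by_zone.items()})
print(f"Modeled loss: ${modeled_loss:,.0f}")
print(f"'Model miss' stress loss: ${stressed_loss:,.0f}")
```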
See also: 2018 Workers’ Comp Issues to Watch

Following the series of U.S. catastrophes, Aon visited Houston for Hurricane Harvey, Puerto Rico for Hurricane Maria, Florida for Hurricane Irma and Northern California for the wildfires. The team surveyed the damage and assessed how each event evolved to affect both people and properties, with the goal of enhancing catastrophe models and identifying lessons for the future. I was joined by Dan Dick, Steve Bowen, Steve Jakubowski, Jeff Jones and Weston Vosburgh.

Agents, Brokers Are Dead? Not So Fast!

The independent distribution channel is wrangling its own destiny in a new direction using data, business intelligence and analytics.

For the past several years, a significant number of individuals and new-breed companies have emphatically stated that independent agents and brokers are dead. They don’t serve the needs of today’s consumers. They are behind the times from a technology standpoint. They are difficult to deal with. And a good number of insurtech startups are staking their future success on displacing agents and brokers.

At SMA, we do not believe this. We believe that independent agents and brokers have a critical and, in many cases, unmatched role in the insurance ecosystem. It is one thing to make this statement; it is another to support it. SMA recently conducted a study in this area. The resulting report, Data, Business Intelligence, and Analytics in Insurance – Agent View, surfaced compelling details showing that the independent distribution channel is taking action to wrangle its own destiny in a new direction using data, business intelligence and analytics.

See also: 5 Predictions for Agents in 2018

In prior blogs, I have asserted that data and analytics is actually one word: dataandanalytics! Insurers are on a fast-paced track to bring more and more data and analytics into their organizations. In fact, SMA 2017 survey results reveal that data and analytics was the No. 2 strategic initiative for insurers of all sizes, only three percentage points behind the No. 1 initiative, customer experience. For many readers, this might be a case of “OK, tell me something I don’t already know.” However, many probably believe that data, business intelligence and analytics initiatives are the domain of insurers only, and that agents and brokers are not focused on these types of initiatives. SMA survey results disprove this: 79% of agents are investing in BI for reporting, 44% in dashboards and scorecards and 40% in analytics tools. CRM technology is a top technology for agents of all sizes. Large agents, those with more than $10 million in premium, are additionally focused on customer segmentation, customer lifetime value, campaign analysis and channel performance. This sounds suspiciously like an insurance company!

We are in a world of data-driven decision making, and this isn’t going to change. Many agents and brokers totally understand this and are responding. However, not all agents are on board. SMA survey results indicate that smaller agents are trailing in some initiatives. The real hitch is that the pace of change in the insurance industry is increasing exponentially. To compete successfully, agents and brokers need to move more quickly to grow their data and analytics competencies and technology adoption. Insurers can be true partners in these endeavors by facilitating the integration of data and analytics technology with agency-based systems and ensuring that data can be exchanged seamlessly.

See also: Global Trend Map No. 5: Analytics and AI

There is no doubt that the insurance industry is in the midst of unprecedented transformation. Agents and brokers who are bringing data, business intelligence and analytics tools into their organizations to gain insights about the changing risk landscape and customer preferences and needs are well on their way to maintaining their vital role in the industry.
I, for one, am looking forward to the day when no one is talking about the death of the independent agent and broker but rather about the high worth they bring to customers who value personal advice, risk management and protection! For information on the research report, Data, Business Intelligence, and Analytics in Insurance – Agent View, click here.
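As a purely illustrative aside, the kind of analytics the report describes agents adopting, customer segmentation and customer lifetime value among them, can start from something as simple as the sketch below: segment a book of business by premium band and estimate a naive lifetime value from retention. The figures, thresholds and field names are invented and are not drawn from the SMA study.

```python
# Illustrative sketch only: simple book-of-business analytics of the kind the
# report describes agents adopting -- premium-band segmentation and a naive
# customer lifetime value estimate from retention. All values are hypothetical.
customers = [
    {"name": "Acme LLC", "annual_premium": 18_000, "retention": 0.92},
    {"name": "B. Jones", "annual_premium": 2_400, "retention": 0.85},
    {"name": "C. Wu", "annual_premium": 1_100, "retention": 0.78},
    {"name": "Delta Co", "annual_premium": 9_500, "retention": 0.90},
]


def segment(premium: float) -> str:
    """Toy premium-band segmentation."""
    if premium >= 10_000:
        return "commercial-large"
    if premium >= 5_000:
        return "commercial-small"
    return "personal"


def naive_clv(annual_premium: float, retention: float, margin: float = 0.15) -> float:
    """Expected lifetime margin, assuming a constant annual retention rate."""
    expected_years = 1 / (1 - retention)   # geometric-series expectation
    return annual_premium * margin * expected_years


for c in customers:
    print(c["name"], segment(c["annual_premium"]),
          f"CLV ~ ${naive_clv(c['annual_premium'], c['retention']):,.0f}")
```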

Karen Pauli

Karen Pauli is a former principal at SMA. She has comprehensive knowledge about how technology can drive improved results, innovation and transformation. She has worked with insurers and technology providers to reimagine processes and procedures to change business outcomes and support evolving business models.