
True Value of Net Promoter Score

Net Promoter Score has its fair share of critics, but they often overlook one of the metric’s greatest benefits: its ability to define corporate culture.


“Net Promoter Score” sure has its share of detractors these days. But those critiquing the measure are overlooking one of its greatest (if not widely discussed) benefits.

Net Promoter Score (NPS for short) was conceived by Fred Reichheld (a Bain & Co. consultant) and introduced to the world in 2003 via his seminal Harvard Business Review article, “The One Number You Need To Grow.”

NPS was heralded by Reichheld and his colleagues as the quintessential metric for gauging customer loyalty across many industries. It was a simple measure yet demonstrated a strong correlation with repeat purchases and referrals and, consequently, business growth.

In recent years, Net Promoter’s popularity has surged, becoming one of the most widely used customer experience measures, used by small businesses and billion-dollar corporations alike.

The “likelihood to recommend” question that’s at the heart of Net Promoter is a now ubiquitous query in customer surveys, and one with which almost all consumers are familiar (even if they’ve never heard of NPS).

Net Promoter’s nomenclature – its “Promoter/Passive/Detractor” shorthand for characterizing customer loyalty levels – has become standard vocabulary in the halls of many organizations, not to mention annual reports and earnings presentations.

With greater adoption, however, has come greater scrutiny of Net Promoter. This was perhaps best illustrated by a decidedly mixed review of the measure in a recent Wall Street Journal article (“The Dubious Management Fad Sweeping Corporate America”).

See also: Is ‘Net Promoter’ Really ‘Not Promoter’?  

No performance metric is perfect, Net Promoter included. Many critiques of the measure, however, target weaknesses that relate less to the metric itself and more to how organizations have chosen to (incorrectly) implement it.

[If you’re interested in Net Promoter implementation, read this post for 10 Tips to help ensure success.]

But what gets lost in the maelstrom of Net Promoter critiques is the business philosophy Reichheld has long cited as the inspiration behind the metric: the idea that excellence in business comes from “enriching the lives we touch,” be it customers, colleagues, employees or any other stakeholder.

The structure of the Net Promoter scale, and the methodology used to calculate the NPS score, perfectly reflect that philosophy. The goal is not to satisfy those with whom you interact (“Passives” in Net Promoter nomenclature). The goal is to impress them – to create a “Promoter” by delivering an interaction so intensely positive that it all but guarantees people will want to come back for more (and tell others about the experience).
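That structure maps directly onto the arithmetic of the score. On the standard 0-10 “likelihood to recommend” scale, 9s and 10s are Promoters, 7s and 8s are Passives, and 0 through 6 are Detractors; NPS is simply the percentage of Promoters minus the percentage of Detractors. A minimal sketch of the calculation:

```python
def nps(ratings):
    """Compute Net Promoter Score from 0-10 'likelihood to recommend' ratings.

    Promoters rate 9-10, Passives 7-8, Detractors 0-6.
    NPS = %Promoters - %Detractors, so it ranges from -100 to +100.
    Passives count toward the total but add nothing to the score:
    merely satisfying customers doesn't move NPS; only impressing them does.
    """
    if not ratings:
        raise ValueError("no ratings provided")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example: 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 6, 3]))  # -> 30.0
```

Note how the formula itself encodes the philosophy: a sea of Passives yields a score of zero, no matter how “satisfied” they are.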

How exactly does one do that? How does one foster such a positive reaction that cultivates intense loyalty?

You guessed it – by enriching the lives of the people with whom you interact. By shaping every interaction, inside or outside the workplace, so people feel better after they’ve encountered you, as compared with before.

This is the true value of properly implemented Net Promoter programs (and one that so many critics – and even some adopters – of NPS overlook). It’s the behavioral guidance that the measure provides. It’s the picture it paints of what “right” looks like. It’s the motivation it delivers to go the extra mile.

The sheer power of that aspect of Net Promoter became clear to me years ago, thanks to a personal lesson delivered by none other than Fred Reichheld himself.

It was 2008, and I was preparing to launch what would become Watermark Consulting, the customer experience (CX) advisory firm I lead today. In an effort to better understand the market for CX consulting services (and whether there was a place for a new entrant), I did what any good entrepreneur does – I networked. I reached out to key people in the industry, seeking their advice and counsel, hoping to learn from those who had trodden the path before me.

Most of the people I reached out to never responded (giving me a master class in e-Snubbing). Among the few who did reply was the most renowned luminary whom I had the audacity to contact: Fred Reichheld.

I had divined Reichheld’s Bain & Co. e-mail address, as any good sleuth would, and within 24 hours of my sending him a message, up came his response in my in-box. He hadn’t delegated the reply to someone else; it was clear he had personally written it. He appreciated my inquiry and wrote a couple of paragraphs with suggestions for me – advice that was genuinely helpful.

Here was this celebrity in the study of customer loyalty, a man famous around the world for his thought leadership, and yet he took the time to personally and thoughtfully respond to a message from me – a nobody.

Why on earth would he choose to do that (especially considering I was shunned by so many CX experts who were far less eminent than Reichheld)? It’s because he was walking the Net Promoter talk, trying to enrich the lives of everyone with whom he interacted.

From that day on, I became a “Promoter” of Fred Reichheld – a raving fan, if you will. I’ve never met the man, and I’ve never communicated with him since. But what stands out in my memory is the simple kindness that he demonstrated in responding to my inquiry and sharing some helpful advice. The interaction was, in a word, enriching.

See also: How Fine Print Ruins Customer Experience  

As Reichheld, the father of NPS, demonstrated so convincingly to me, this is the true value of Net Promoter, and it’s something that gets overshadowed by the endless debates over the accuracy, relevance and predictive power of the measure.

Net Promoter is about orienting an entire organization (and individual behaviors) around the noblest of purposes: to enrich the lives of the people around you.

Who can possibly find fault with that?

You can find this article originally published here.


Jon Picoult

Jon Picoult is the founder of Watermark Consulting, a customer experience advisory firm specializing in the financial services industry. Picoult has worked with thousands of executives, helping some of the world's foremost brands capitalize on the power of loyalty -- both in the marketplace and in the workplace.

Selling the Urgency of Life Insurance

To ensure that people have all the life insurance they need requires insurers to better express the urgency of financial safety.

The most essential things are not always the ones people have, or know they need to buy.

Life insurance is one such thing not enough people have, given that the lives and livelihoods of many depend on the security that insurers can provide. 

To provide for the survivors, to care for a man’s widow and his orphan, is not an act of charity but a declaration of independence; that the living will have the liberty to protect themselves from poverty; that they will have the means to live without fear of eviction or exile; that they will have the freedom to pursue happiness.

To make these promises a reality—to ensure that people have all the insurance they need—requires insurers to better express the urgency of financial safety. 

According to David Albanese of Ameraquest Financial Group:

“Insurers need to remind people about the safety life insurance offers. Whether they issue reminders for the second or third time, or for the first time in a long time, what they tell people must be clear and compelling. Anything short of that standard is a loss for everyone.”

As a scientist, I can speak to Albanese’s point about clarity of communication. I do speak to it, in my own way, whenever I talk to nonscientists about biology or chemistry: I speak to persuade as well as to inform, so I can get people to join my efforts or support my work.

Before they send a reminder to current or potential clients, insurers need to remind themselves of the importance of clarity of speech.

If people do not know why they need life insurance, if they do not comprehend the value of comprehensive coverage, if they do not know what they should know, then insurers have a duty to explain themselves.

See also: Pricing Right in Life Insurance  

Insurers have a duty to educate us about life insurance. That duty starts with a campaign that has a clear message and a consistent theme, so there is no confusion among those who see or hear the message, so the right people—those who need life insurance—get the point and spread the word, so people may buy all the life insurance they need.

This campaign must include traditional media and social media, because people receive messages through multiple outlets. We send and receive messages by email, voicemail, text, video and chat. 

The conversations we have, the news we share, the comments we post and the posts we publish—all of these things have the power to influence how we act.

If life insurance is to be a topic of conversation, if we are to talk about this subject among our friends and family, if we are to do more than talk, then insurers must campaign to earn our trust.

Transparency is a good way to earn that trust.

By speaking without ambiguity or the slightest uncertainty, insurers can improve the world by proving to consumers that life insurance is a necessity.

2020 Outlook for U.S., Americas

Lack of trust in institutions and declining policy sales are forcing insurers to redefine their value propositions to stay relevant.

The past decade’s low growth rates, lack of trust in institutions and declining policy sales are forcing insurers to redefine their value propositions to stay relevant for new generations of consumers. Optimizing costs while investing in the right technologies and talent are also top agenda items. The willingness to take bold action will separate the leaders from the laggards. It will also enable some insurers to convert significant opportunity today into significant value tomorrow.

This unique mix of risk and opportunity is at the heart of the annual EY US and Americas Insurance Outlook. The report represents EY’s perspectives on the issues shaping the US and Americas insurance industry in the near term (next 3-5 years).

A complex environment and challenging fundamentals

The insurance industry is still feeling the effects of a low-growth decade. Economic inequality coupled with lack of trust in institutions is driving more lawsuits, larger jury awards and broader definitions of corporate negligence. For insurers, that translates to more claims, higher loss ratios and the need to raise premiums. It’s not surprising, then, that the number of policies sold has fallen.

Rising expectations for better customer experiences

Consumers expect intuitive, personalized experiences. But many insurers are still playing catch-up compared with digital leaders. Innovative firms will develop full customer lifecycle journeys by incorporating better data and richer insights and applying lessons learned from the most successful tech companies.

Shifting demographics

Though populations are not changing in the Americas as dramatically as in other parts of the globe, insurers are still susceptible to large-scale socioeconomic change. Mass retirements are looming. And insurers can’t take for granted that younger generations will automatically purchase conventional insurance products – particularly as they delay traditional milestones like marriage and home ownership.

See also: Innovation — or Just Innovative Thinking?  

Persistent barriers to growth

Low interest rates remain a big challenge, especially for life insurers. Flat productivity, low inflation and low savings propensity are also dragging down the industry’s prospects. New value propositions, such as those related to financial wellness, and a shift toward fee-based products are two ways insurers should respond.

A looming recession

The current fear of recession and lack of overall macroeconomic confidence threaten the recent run of successful results. A slowdown will affect life insurers as ROI dries up and consumer saving falls. Non-life insurers will be hit as government and private spending drops, affecting trade, consumption and overall economic activity.

Scarce talent

Both life and non-life insurers need more “digital people” – that is, those who know how to use advanced technology. Forward-looking executives recognize that the right talent and skills are necessary to generate strong returns on investments in technology and transformation.

See also: Insurtech 2020: Trends That Offer Growth  

How insurers should move forward

Insurers have understandably focused on upgrading technology in response to continuing margin pressures. But technology is just one variable in the equation for successful long-term change.

A more holistic approach incorporates talent and cultural factors, as well as the emphasis on product innovation and new business models. Tomorrow’s market leaders will be technology-enabled, data-driven and operationally efficient – but also people-powered and purpose-led, with strong cultures that are adaptive, engaged and capable of rapid change.

You can read the full report here.


Ed Majkowski

Ed Majkowski is EY’s insurance sector leader for the Americas and is responsible for EY’s consulting businesses, markets and clients in this region.

Predictions for AI Adoption in 2020

For one, transfer learning – in which knowledge gained by a model trained on one task is reused to improve learning on another – will expand its foothold.

AI-based technologies reached a new level of adoption in 2019 as businesses learned more about what exactly AI could do for them. 2020 promises to be even more exciting, with AI systems continuing to mature and companies extending usage and applications to address highly specialized needs. In the year ahead, organizations will be empowered to allocate resources even more wisely while achieving greater efficiency.

Here are my top predictions for AI in 2020:

AI Adopted More as an Assistant Than a Replacement

There has been some cross-industry concern that, as AI continues to improve, its resulting applications will take over human jobs and displace workers. Certainly, AI is being leveraged in new and interesting ways, but, rather than replace the human workforce with machines, AI-based technologies instead will become humans’ assistant.

AI and machine learning can analyze thousands of data points in seconds to yield insights that humans never could achieve alone. These insights will be used to make human decision-making easier and alleviate workers’ most mundane, time-consuming tasks so that they can concentrate on higher-order problems that don’t fit neatly into algorithms. Look for AI-based technologies to be applied strategically this year to help employees become more efficient and valuable in their roles.

Transfer Learning Becomes More Prevalent

Transfer learning, in which knowledge gained by a model trained on one task is reused to improve learning on another, will become a more widely used technique in 2020. To date, it has been leveraged primarily in image processing, but we will see it applied to areas like text mining, with continuing improvement.

The benefit of transfer learning is that a wider range of industries will be able to use AI to create highly specific applications based on small data. As less data is required, organizations can create state-of-the-art solutions that are faster, more accurate and better tailored to their specific needs.
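As a rough illustration of the idea (not any particular product's API), the sketch below uses a frozen "pretrained" feature extractor – a random projection standing in for an encoder trained on a large corpus – and fits only a small task-specific head on a handful of labeled examples, the "small data" regime transfer learning targets:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a network pretrained on a large dataset: a frozen
# feature extractor whose weights are never updated for the new task.
W_pretrained = rng.normal(size=(20, 8))

def extract_features(x):
    """Frozen 'pretrained' layer: project raw inputs into feature space."""
    return np.tanh(x @ W_pretrained)

# A small labeled dataset for the new task.
# Toy labels: is the sum of the input's components positive?
X = rng.normal(size=(30, 20))
y = (X.sum(axis=1) > 0).astype(float)

# Train only the small task-specific head (logistic regression fit by
# gradient descent); the pretrained extractor stays fixed.
feats = extract_features(X)
w, b = np.zeros(8), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid
    w -= 0.5 * feats.T @ (p - y) / len(y)
    b -= 0.5 * np.mean(p - y)

preds = (feats @ w + b) > 0
print("training accuracy:", np.mean(preds == y))
```

In practice the frozen layer would be a real pretrained image or text encoder; the point is that only the small head needs data and compute for the new, specialized task.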

The Cloud of the Black Box Continues to Lift

For a long time, AI has suffered from a lack of transparency. With machines developing more self-learning capabilities, developers might not know exactly why a machine learning system arrived at certain conclusions. When processes are hidden, behaviors can give pause to users who wonder if they should trust data generated by such a system. To combat this problem, more interpretable models are coming to the forefront.

In 2020, the differences among data explainability, traceability and determinism will become better understood in AI, as will the question of which is needed in which circumstances. As computing elements make complex predictions more understandable, solutions can be created that help explain those predictions. By removing the mystery of the black box, organizations can refine or expand queries to deliver more valuable information.
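One common route to that interpretability – shown here as a generic sketch, not any specific vendor's approach – is to probe a black-box model and fit a transparent surrogate to its predictions, then read an explanation off the surrogate's coefficients (the intuition behind LIME-style methods):

```python
import numpy as np

rng = np.random.default_rng(1)

def black_box(X):
    """An opaque model whose reasoning we can't inspect directly
    (a hidden formula stands in for a trained neural network)."""
    return 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * np.sin(X[:, 2])

# Probe the black box on sampled inputs...
X = rng.normal(size=(500, 3))
y = black_box(X)

# ...then fit a transparent linear surrogate to its outputs. The
# surrogate's coefficients give an interpretable, approximate account
# of how each input feature drives the black box's predictions.
design = np.c_[X, np.ones(len(X))]
coef, *_ = np.linalg.lstsq(design, y, rcond=None)

for name, c in zip(["feature_0", "feature_1", "feature_2", "intercept"], coef):
    print(f"{name}: {c:+.2f}")
```

The surrogate correctly recovers that feature_0 pushes predictions up about three times as strongly as feature_2 does at all, which is exactly the kind of account a user deciding whether to trust the system needs.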

See also: Untapped Potential of Artificial Intelligence  

Demand Will Rise for AI as a Service

Traditionally, machine learning models have not been straightforward to deploy for data scientists and engineers. This will change this year as AI is delivered more like a service. AI models will be executed in cheaper, easier ways in the cloud.

This is a significant development on multiple fronts. With serverless deployment in the cloud, a machine learning model no longer ties up the computing resources of an always-on server; it consumes resources only when invoked. That efficiency alone will make AI as a service more popular. Moving AI to the cloud also improves the delivery model: instead of shipping a heavyweight installed solution, providers can create and share an API.
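As a minimal sketch of that delivery model (all names, weights and fields here are invented for illustration), a model can be exposed as a single JSON-in, JSON-out function of the kind a cloud platform could invoke as a serverless endpoint:

```python
import json
import math

# Stand-in for a trained model: in a serverless setting, weights are
# loaded once per invocation context instead of living on a dedicated,
# always-on server.
MODEL_WEIGHTS = {"age": 0.03, "prior_claims": 0.40, "bias": -1.2}

def predict(features: dict) -> float:
    """Toy risk score: logistic function of a linear model."""
    z = MODEL_WEIGHTS["bias"] + sum(
        MODEL_WEIGHTS[k] * features.get(k, 0.0) for k in ("age", "prior_claims")
    )
    return 1.0 / (1.0 + math.exp(-z))

def handler(request_body: str) -> str:
    """Serverless-style entry point: JSON in, JSON out.

    This function is the entire shared 'API' -- nothing heavyweight to
    install, just something a platform can invoke and bill per call.
    """
    features = json.loads(request_body)
    return json.dumps({"risk_score": round(predict(features), 4)})

print(handler('{"age": 40, "prior_claims": 2}'))
```

The design choice worth noting: because the interface is just a string-to-string function, the same handler can sit behind an HTTP gateway, a message queue or a batch job without modification.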

These are just some ideas of where AI could go soon. AI and machine learning are advancing at a rapid pace, and companies are both eager and nervous to pull the trigger on new solutions. But the current momentum behind AI will continue to drive innovation, and organizations will evolve as they reap the benefits of machine learning systems.

As first published in Data Science Central.


Ji Li

Ji Li, Ph.D., data science director at Clara Analytics, has leadership responsibility for organizing and directing the Clara data science team in building optimized machine learning solutions, creating artificial intelligence applications and driving innovation.

Are You Innovating, or Chasing the Leader?

Strategic planning seldom yields bold changes. Capital allocated to each business unit from one year to the next is nearly identical.

It’s the relay runner’s nightmare: You just can't seem to catch up. Maybe you're in the lead, but you can't shake the person on your shoulder. How do you get ahead and stay ahead?

In insurance, whether you're looking over your shoulder or trying to catch up, you need to know as much as possible about the market and the competition. That’s why Majesco helps insurers assess their technology positioning with our Strategic Priorities surveys. Here are some highlights from this year's report:

Competitive Position — Recognizing Leaders, Followers and Laggards

Too often, strategic planning does not yield the bold changes needed, because insurers do not move rapidly from knowing to doing and into a leading position. This year’s research shows an ever-widening gap that is defining a new era of leaders.

The June 2019 McKinsey article, “How to win in insurance: Climbing the power curve,” emphasizes the gap between leaders and followers or laggards. McKinsey’s research shows that the capital allocated to each business unit from one year to the next is nearly identical – companies are not reallocating capital to make bold changes for the future.

Capital shifts indicate priority shifts. They also point to investment strategies. This is consistent with the growing Knowing-Doing Gap emerging in the industry, highlighted by our Strategic Priorities research over the last five years, a gap that is putting some companies at risk given the pace of change and limited resources. Investments aren’t necessarily being made where they are most needed. Many insurers still aren’t recognizing that investments today may result in long-term reductions in the need for technology investments due to platform efficiencies. 

See also: Insurance Innovation’s Growth Challenge  

Taking decisive action around strategy is crucial, particularly with the pace of change and rapidly evolving competitive landscape. As the McKinsey article points out, strategy is about playing the odds, increasing the amount of “doing,” even if some plans fail, to ensure overall success. Insurers must focus on both optimizing today’s business and boldly creating tomorrow’s business – a two-speed strategy.

Strategic Priorities Report Highlights

This year’s research highlights how leaders have replaced legacy, expanded their channels, introduced products and business models and produced higher growth. Even more important, they see greater growth over the next three years. 

If your organization isn’t currently in the leadership position, you CAN catch up. If you know where leaders' investments have been paying off, you have a guide for transformation, optimization, innovation and growth.

Here are some key insights from this year’s report:

  • When we asked insurers about the state of their business (growth, systems, products, models and channels), last year was challenging for laggards, which had a 41% gap to leaders, and for followers, which had a 15% gap.
  • Leaders are laser-focused on both speed of operations and on speed of innovation. This is reflected in their work on legacy replacement, channel expansion, new products and new business models; followers and laggards are primarily concerned with speed of operations.
  • Leaders have replaced legacy core systems at a rate 75% higher than laggards and 20% higher than followers – putting leaders at a clear advantage.
  • Leaders are creating new products and business models nearly 55% faster than laggards and 20% faster than followers – enabling leaders to capture market share and revenue more quickly.
  • Leaders are expanding channels at a staggering rate – 19% higher than followers and 88% higher than laggards – widening leaders’ market reach and their ability to acquire and retain customers and revenue.
  • Over the next three years, laggards and followers will drop even further behind leaders.

Bold moves to optimize today’s business and create the future business substantially increase an insurer’s potential for success.  Leaders are blazing trails with new business models, channel expansion, new products and core system replacement while followers are attempting to do a few things and laggards are primarily watching.

Platforms, Production and Products

One of the most fascinating portions of the Strategic Priorities report is what Majesco found regarding platform planning, development and use. We wanted to understand where insurers were in their core system transformation, defining four answers within two simple concepts: Platform and Non-Platform.

  • Platform was defined as cloud-enabled, API and SaaS-based solutions or next-gen that were cloud-native, API and microservices solutions.
  • Non-Platform was defined as old, monolithic, legacy and modern on-premise solutions.

Consider these two related findings.

  • A staggering 60% of P&C, L&A and group insurers are operating with Non-Platform core solutions, limiting their capacity for innovation, speed and agility.
  • Insurers introducing new products and services are far more likely to use Platform-based solutions – in the range of 60%-70%, a complete flip from the existing business – reflecting the growing focus on greenfields and startups.

So, many insurers are missing out on the agile product development processes that platform-based, rather than traditional, core systems make possible.

Response to Regulatory and Rating Agency Developments

This year, Majesco added an area of focus to cover insurer responses to the rapid advancements in the regulatory arena during 2019. The adoption of the AM Best innovation rating and the introduction of sandboxes by state regulators to test new products in a more rapid, managed manner are certain to have a growing impact on insurers' innovation timelines. The question we asked was, “How actively is your company responding to these recent regulatory developments?”

For the most part, we found a lack of understanding and planning around these highly important changes within certain market segments. For example, L&A and group insurers lag significantly behind P&C and multi-line insurers in preparation for the AM Best innovation rating, with a gap of nearly 30%. Multi-line insurers outpace both P&C and L&A and group insurers by upward of 35% in the use of sandboxes. For more insight, check out the replay of the webinar “The Future of Insurance and Regulation: Optimization, Growth and Innovation,” which features individuals from AM Best, the Ohio Department of Insurance and a former first deputy commissioner for Iowa. Their insights highlight the growing need for insurers to be innovating.

See also: How to Innovate With an Agency Partner  

In Summary

Insurers must gain clarity on how to succeed in the future of insurance, which is coming faster than most realize. Insurers must lay the groundwork of a new digital insurance business model that embraces customer, technology and market boundary changes with vision, energy and speed.

How do your strategies align with what leaders are doing? What specific steps can you take to improve your odds of success? How can you rapidly move from knowing to doing?

Your answers will determine your readiness in a new decade and the future of insurance.


Denise Garth

Denise Garth is senior vice president, strategic marketing, responsible for leading marketing, industry relations and innovation in support of Majesco's client-centric strategy.

What Happens When an Industry Goes Digital


Now that innovation in insurance seems to have found a rhythm, many may feel that the pace is manageable and that digital disruption perhaps won't produce the sort of drama imagined a few years ago. So, now seems like a good time to trot out some thoughts from a talk I give occasionally on what happens when an industry goes digital—because that moment still awaits the insurance industry and because the change will be profound.

The example I use of an industry going digital is photography because it's, well, the most visual. Let's start there, then see what the photography experience might say about insurance's future.

A common misconception about digital innovation is that it happens suddenly. In fact, it follows the sort of track described by a character in Hemingway's "The Sun Also Rises," who, when asked how he went bankrupt, said, "Two ways: gradually, then suddenly."

What we think of as photography began in the mid-1820s—nearly two centuries ago—when a French inventor built a camera and took a picture of the rooftops at his estate in Burgundy.

Innovation took a steady path for the next 150 years. Big, clunky cameras became available for professional photographers by the 1850s, and Mathew Brady made photography famous with his images of the Civil War. George Eastman came along (experimenting by baking chemicals into various film substrates in his mother's kitchen) and by the late 1880s had simplified the technology enough that amateurs could use cameras. His company, Eastman Kodak, soon produced the first film that could be used in a roll, rather than in hefty plates, and came out with the small, wildly popular Brownie in 1900. Kodak came to dominate photography, and there were huge profits even in this "gradually" phase as the mass market of amateurs began to experience what came to be known as a "Kodak moment."

Digital technology came into existence with the invention of the transistor in 1947 and soon found its way into use with images. Bing Crosby, of all people, financed a lab that produced a digital sensor in 1951 that could record televised video, and the TV industry nurtured the technology. 

Cameras made the leap to digital in 1975, when an engineer at Kodak—yes, Kodak—developed a sensor that captured still images. Kodak set the invention aside, because it threatened the industry's sales of film, chemicals, paper and cameras, and Kodak had 85% to 90% of the U.S. market. Kodak more or less got away with stifling the digitalization of photography for more than two decades.

Then the "suddenly" moment happened.

In 1997, another French-born inventor, Philippe Kahn, was in the hospital in Santa Cruz, CA, because his wife was getting ready to deliver their daughter. He had a digital camera and intended to upload photos of the newborn to his computer, then email them to friends and family. But Philippe (whom I've known for going on 35 years, because he founded an early PC software company) is not a patient man. He also had a cellphone, and he wondered why he couldn't just send the photos via the phone from the hospital. So, he did.

He rigged up a way to attach his camera to his phone and sent the image – a photo of his newborn daughter – out over the cellular network.

In that moment, after many decades of "gradually," the photography industry became truly digital, and photography was stripped down to its bare essentials. Those turned out to be merely a lens, a storage mechanism and a means for viewing an image.

Despite the legacy of the industry and Kodak's best efforts to protect its profits, there was no need for film, chemicals or paper, nor for all those one-hour kiosks that processed film and produced prints. There wasn't even a need for a separate camera, because the lens and some related transistors and software could be embedded in a phone (or almost anything else) at a cost of a dollar or two. 

When people write about the digitalization of photography, they tend to focus on Kodak, which stumbled into bankruptcy in 2012, and, more generally, on the destruction that ensued. Fair enough. I've written extensively on the topic myself. But there's a flip side to the destruction: Once an industry has been pared down to its simplest, digital parts, those parts can be reassembled in any number of new ways and can be melded with other capabilities to form previously unimaginable products and services. In the case of photography, digitalization created far more value than it destroyed—it's just that the new value was captured by Facebook, Google, Instagram and a host of other companies that figured out what to do with images once they were freed from their physical constraints. 

With insurance, there are only three essentials: a contract, a yes/no mechanism that determines whether a payment is made, and capital. Just as with photography, the structures currently built around those essential parts don't have any particular claim on a fully digital future. There don't have to be agents and brokers selling those contracts. Those contracts don't have to be priced by underwriters and actuaries. The yes/no decision doesn't have to involve adjusters or any other aspect of today's claims process. Capital doesn't have to be put up by insurers or even reinsurers; it could come straight from capital markets or even from novel forms of pooling by individuals. 
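Those three essentials are concrete enough to sketch in code. The sketch below – with hypothetical names, and a parametric flight-delay policy as the example – reduces insurance to exactly the three parts named above: a contract object, a yes/no trigger and a capital pool:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CapitalPool:
    """Capital backing the promise -- an insurer's balance sheet,
    reinsurance, capital markets or a novel pooling of individuals."""
    balance: float

    def pay(self, amount: float) -> float:
        paid = min(amount, self.balance)
        self.balance -= paid
        return paid

@dataclass
class Contract:
    """A policy reduced to the three digital essentials: the contract
    (this object), a yes/no mechanism (the trigger predicate) and
    capital (the pool that funds the payout)."""
    payout: float
    trigger: Callable[[dict], bool]  # the yes/no decision on an event
    pool: CapitalPool

    def settle(self, event: dict) -> float:
        return self.pool.pay(self.payout) if self.trigger(event) else 0.0

# Hypothetical example: a parametric flight-delay policy whose yes/no
# decision is purely data-driven -- no adjuster in the loop.
pool = CapitalPool(balance=10_000.0)
policy = Contract(
    payout=200.0,
    trigger=lambda e: e.get("delay_minutes", 0) >= 120,
    pool=pool,
)
print(policy.settle({"delay_minutes": 150}))  # trigger fires: prints 200.0
print(policy.settle({"delay_minutes": 30}))   # no trigger: prints 0.0
```

Nothing in this skeleton requires an agent, an underwriter or a claims adjuster; those roles attach to the essentials, and in a fully digital model they can be replaced, recombined or automated.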

You can argue that all the current players will continue to exist, just in modified form, but don't limit yourself. Focus on those three core elements—the contract, the yes/no mechanism and the capital—and think in two directions. First, how can you deliver those three elements as cost-effectively as possible to customers? Going digital lets you reinvent your cost structure. Second, what new business models can you imagine once insurance takes fully digital form, a la what Facebook et al. did with digital images? A new model could mean layering services, such as advice on prevention, on top of digital insurance, or it could mean embedding insurance in previously separate products—including home insurance with the home, auto insurance with the auto purchase, life insurance with wealth-management products or with the purchase of a business or building, etc. 

The key is to take advantage of that "gradually" phase, for however long it lasts, and position yourself for a future based on those three core elements. Because "suddenly" is still out there.

Cheers,

Paul Carroll

Editor-in-Chief

P.S. For those of you who somehow aren't now following every race of Mikaela Shiffrin, the skier I focused on last week because of how competitors are monitoring and trying to copy her every move, here is an update on last weekend that underscores my point about the need to benchmark against others and not against yourself.

Shiffrin has dominated the technical events, especially the slalom, with its quick, precise turns, as much as or more than any skier in history, but she had faltered recently, "only" finishing second and third in the last two slaloms on the World Cup circuit. Perhaps all the camera crews following her every move in training were helping competitors uncover her secrets. But, even as competitors come after her, she's moving into new territory, increasingly training for and competing in the 75-mph, hair-on-fire, soaring-100-yards-through-the-air-off-jumps speed events. And, over the weekend, she won two speed events, a downhill and a super-G. Lindsey Vonn is known as an all-around skier because, over a 17-year career, she won six events outside her specialty; Shiffrin won two outside hers in a single weekend.

The competition is always adapting....


Paul Carroll


Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

The Cloud Concept That Many Miss

Even as many start to move to the cloud, they miss a key cloud concept: How do you conduct maintenance once you've moved?

Ten years ago, the idea of moving to the cloud was just a vision for insurance companies. Five years ago, the cloud concept became a trend. Today, it’s a necessity. Increasing numbers of insurers realize that the cloud enables organizational agility and digital transformation, two key factors in outstanding customer experience. 

The countless articles and blogs about the cloud seem focused on migration and innovative concepts, such as DevOps and Big Data. But not enough attention is given to a key cloud concept: maintenance. 

Let’s face it, maintenance isn’t sexy, even though most insurance companies spend 10 times more on maintenance than on the transition. What’s more, maintenance contracts with suppliers dictate long-term relationships that last long after the transition has been completed and the go-live euphoria has dissipated. 

This is why it’s critical for insurance companies to choose their software vendor carefully. The functionality of the software product, as well as the delivery capabilities of the integrators, are, of course, important. But you should also examine the quality of service for the business as usual (BAU) period that starts once the software is up and running on the cloud. What are the criteria that should guide the insurer’s CIO when choosing a software vendor providing cloud-based solutions? 

The first thing you should check is whether your supplier has an end-to-end operating model. In many cases, software development, integration and maintenance are provided by three separate organizations. Multi-functional vendors (also known as one-stop-shop vendors, or OSS vendors) will take full responsibility for the solution. An OSS vendor leverages its multidisciplinary expertise in IT, database, security and applications to provide end-to-end coverage for any incident that might occur.

The second thing you should look at is the service level agreement (SLA). The SLA must include tangible service level targets and penalties for breaches. Make sure it addresses all metrics relevant to your business, such as availability, performance, incident response and resolution time, customer service window and service continuity (recovery point objective, or RPO, and recovery time objective, or RTO). 

See also: The Cloud’s Vital Role in Digital Revolution  

It’s important to understand all the service entitlements that represent the extent or frequency of service actions, which could, for example, include an annual disaster recovery drill, a weekly performance audit, quarterly database improvements and running the nightly batches. Also crucial is a monthly report that includes the actual performance against the service level targets and a list of corrective measures taken to eliminate or minimize any underachievement.  
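The monthly comparison of actual performance against service level targets can be sketched in a few lines of code. This is a minimal illustration only; the metric names and target values below are hypothetical assumptions, not terms from any real contract:

```python
# Minimal sketch of an SLA compliance check. Metric names and
# targets are illustrative assumptions, not real contract terms.
# "higher_is_better" records which direction counts as meeting the target.
SLA_TARGETS = {
    "availability_pct":      {"target": 99.9, "higher_is_better": True},
    "incident_response_min": {"target": 15,   "higher_is_better": False},
    "resolution_time_hours": {"target": 8,    "higher_is_better": False},
    "rpo_minutes":           {"target": 60,   "higher_is_better": False},
    "rto_minutes":           {"target": 240,  "higher_is_better": False},
}

def monthly_report(measured):
    """Compare measured values against targets and flag breaches."""
    report = {}
    for metric, rule in SLA_TARGETS.items():
        value = measured[metric]
        if rule["higher_is_better"]:
            met = value >= rule["target"]
        else:
            met = value <= rule["target"]
        report[metric] = {"measured": value, "target": rule["target"], "met": met}
    return report

# Example month: one metric (incident response) misses its target.
breaches = [m for m, r in monthly_report({
    "availability_pct": 99.95,
    "incident_response_min": 22,   # slower than the 15-minute target
    "resolution_time_hours": 6,
    "rpo_minutes": 45,
    "rto_minutes": 180,
}).items() if not r["met"]]
print(breaches)  # ['incident_response_min']
```

A report like this, produced monthly, gives the insurer a factual basis for invoking the SLA's penalty clauses rather than relying on the supplier's own summary.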

The next thing you should consider is multi-cloud expertise and experience. Many insurance companies have hybrid IT architecture, with some of their electronic assets available on-premise and some on the cloud. To remain as flexible as possible, it is essential to choose a supplier with multi-cloud expertise that can cope with a hybrid architecture or a switch of cloud service provider (CSP).

Lastly, the insurance industry is a highly regulated sector, so naturally you should choose a vendor that complies with all the relevant regulatory requirements related to information security and privacy applicable in your country, such as ISO 27001 or GDPR. One way to ensure that your supplier is a professional organization is to examine how the supplier follows industry best practices of IT service management. You could verify this by asking for copies of the documentation of their processes, or by checking if the support managers are all ITIL-certified.

Choosing the right software vendor is not easy, especially because the choice establishes a long-term relationship between the two parties. Make sure you don’t focus too much on the wedding – the cloud transition – because it is more important to think about what will happen once you return from the honeymoon.


Ronen Ram


Ronen Ram is the head of Sapiens' managed services unit. He has more than two decades of experience delivering managed services to Fortune 500 companies.

3 Keys to Better Collaboration

More demanding customers, the rise of insurtech and data privacy issues require that insurers move toward better collaboration.

The insurance industry is facing a flurry of challenges as consumer needs and expectations evolve with a new age. Although collaboration isn’t always common practice in the industry, it’s time that teams learn to work together.

Collaboration Is Key

Insurance companies are beset by three major challenges that demand collaboration.

First, consumers themselves are emerging as a disruptive force. They have an on-demand mindset and expect more from their financial service providers. But the insurance gap continues to grow in the U.S., creating new challenges for consumers and the insurers who serve them.

Next, the introduction of insurtech has shifted the nature of insurance itself. Insurtech has evolved to meet the needs of consumers — meaning more tech, more data and more focus on customers — and has received $10 billion in investments over the past five years. Insurance companies that want to adapt to the market should prepare to join forces with new entrants.

Lastly, data security and privacy regulations are top priorities as the digital age progresses. Yet it's more difficult than ever for companies to avoid regulatory and legal risks because the nature of compliance in the industry shifts almost daily. The General Data Protection Regulation in Europe, for example, inspired a similar privacy act in California. The California act took effect on Jan. 1, 2020, requiring organizations to make changes if they want to stay compliant and keep customers happy.

With market disruptions coming from all directions, insurance organizations must reimagine the industry. Collaboration will be crucial to evolving and meeting these challenges.

Barriers to Collaboration

It seems like collaboration would be a given in such a complex and people-driven industry, but it isn't. In addition to being a highly regulated industry, insurance has practices that have been in place for years — and collaboration hasn't always been a priority.

Legacy tech systems, for one, still reign supreme in the insurance industry and are notoriously sluggish at reacting to real-time needs. These systems weren't built for today's expectations of immediacy. As a result, they make it difficult for employees to collaborate while leading to less efficient and effective work processes.

Aside from legacy technology, many teams are stuck in top-down structures. These organizational structures stifle cross-department communication and make teamwork a chore. Even goals and incentives are siloed in these structures, so employees often lack a collective sense of purpose.

Teams need a more modern, dynamic way of working if they want true collaboration. They need to be able to adapt quickly, make decisions based on shared knowledge and transcend departmental barriers so their companies can remain competitive.

See also: Model for Collaboration and Convergence  

Collaboration Starts at the Top

Leaders have a duty to position collaboration as a key tenet of success in their organizations. It can be an overwhelming task, especially if your team has been working within legacy systems and structures since its foundation. Leaders who embrace collaboration will see their companies thrive, though.

Above all, it’s important for leaders to approach collaboration intentionally. Give it time and attention. Acknowledge its importance, and model the way forward for employees by lowering the barriers between management and teams. Leaders can then put strategies in place that inspire people to come together in the spirit of collective innovation, while better positioning their organizations for success in today's challenging market.

Now that you have leadership's involvement, here are three other must-haves for a collaborative environment:

1. A set of common goals.

If your departments are going to start talking and working more with each other, they’re going to need a uniting factor. A company vision crafted with a set of common goals helps people unite with purpose. Without this sense of purpose, employees can become confused, misguided and stressed. With common goals, though, they can focus on the big picture: providing the best service to customers.

2. A team-based structure.

As mentioned, legacy systems and structures can inhibit progress. When there’s so much change in the market, your team needs to be nimble and dynamic to adapt. A team-based structure can eliminate company silos that impede collaboration. As a result, communication will be more free-flowing and effective — leading to a more collaborative environment.

3. A shared incentive system.

Once you set common goals and a more open structure, you can fuel teamwork by introducing shared incentives. Use your goals and create incentives to reach them. That way, employees must actively collaborate to succeed. Shared incentives will align every team member — top to bottom — toward goals that reinforce the company's vision. That said, some people in teams contribute less work than they would individually. To overcome this, you'll also want to incorporate an individual component.

See also: 5 Ways to Build Team Capacity to Think  

The insurance industry must evolve if it wants to keep up with the growing number of market changes. That means it’s time for a clear-out. Rid your team of its limiting legacy systems and break down its silos. Only then will you discover what collaboration can do for your company — and your consumers.


Ann Dieleman


Ann Dieleman is the executive director of PIMA. She is an active member of the insurtech community and has 20-plus years of executive leadership working with startups, small businesses and the Fortune 100.

5 Questions That Thwart Ransomware

Ransomware attacks are surging, especially for small businesses that use a managed service provider (MSP) for their IT needs.

This past summer was something of a perfect storm for small businesses, which weathered an increase in ransomware attacks that, in many cases, started with an IT vendor or managed service provider (MSP).

Ransomware incidents reported to our company were up 37% in the third quarter when compared with the first three months of the year, and 24% were confirmed to be caused by a vendor or MSP.

Those statistics are bad news for small businesses that manage their IT resources with the help of an MSP, and worse news for small businesses that outsource their entire IT operation to an MSP – everything from building the network and managing applications to servicing any and all IT requests.

In fact, in the first nine months of last year, 63% of all the ransomware incidents reported to our breach response unit came from small businesses, many of which rely on an MSP. Why is that figure so high? MSPs make ripe targets for ransomware attacks.

They have to balance, on the one hand, a need for speed and convenience when it comes to being able to respond to clients and, on the other hand, the need to have the right security controls in place. Too often, speed and convenience win out over security controls.

For example, in many cases, MSPs have reused credentials across clients so that MSP employees can service multiple clients more quickly. Similarly, MSPs might not enable multi-factor authentication (MFA) on the remote access point they use to pivot to client environments.

See also: How Municipalities Avoid Ransomware  

In many incidents in the third quarter, attackers exploited the remote management application that connects the MSP to the client. The same MSP user account would log into multiple client environments and install ransomware. If the MSP had set up individual user accounts for each of its clients, the exploitation of a single set of credentials would more likely have enabled unauthorized access to only a single client's environment, diminishing the risk to the MSP's other clients.
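The difference in exposure is easy to quantify with a toy calculation. The client count below is hypothetical; the point is simply that a shared credential makes the blast radius of one compromise equal to the entire client base:

```python
# Toy illustration of credential "blast radius": how many client
# environments one stolen credential exposes. Numbers are hypothetical.

def exposed_clients(total_clients, shared_credentials):
    """With a credential reused across clients, one compromise exposes
    every client; with per-client credentials, it exposes only one."""
    return total_clients if shared_credentials else 1

msp_clients = 50
print(exposed_clients(msp_clients, shared_credentials=True))   # 50
print(exposed_clients(msp_clients, shared_credentials=False))  # 1
```

Per-client accounts (ideally with MFA on the remote access point) cost the MSP some convenience but cap the damage of any single credential theft at one environment.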

Further, an MSP user account often has to have full administrative access to assist with regular IT functions, so, when credentials were compromised, the attackers had full administrative access to clients’ environments.

So, why the increase in MSP ransomware attacks this summer? According to Bill Siegel, CEO and co-founder of ransomware response platform Coveware, hackers have found a way to magnify the attacks on MSPs. Specifically, developers of Sodinokibi ransomware are now using techniques employed originally by GandCrab ransomware to make the attacks on MSPs more profitable.

These MSP ransomware attacks over the summer exposed incident response challenges. For small businesses that completely rely on outsourced IT, a massive ransomware attack across clients draws on the MSP’s resources and inevitably leaves many businesses in the dark. Small business owners without a technical background struggle to understand and assist the external legal and forensics vendors who are hired to help them respond to the attack.

The response is further complicated when the MSP itself is also infected with ransomware. Where an attack group knows it has hit an MSP, and infected downstream clients, the group may refuse to negotiate with the end clients and instead only respond to the MSP to increase ransom demands. This tactic can also leave clients with little to no control over their data and software recovery.

For all of these reasons, we urge small businesses to ask the following important questions when vetting a potential MSP:

  1. Is there a security program in place, including periodic risk assessments to identify areas for improvement?
  2. Is there continuing security awareness training across the organization?
  3. Is there an SSAE 18 SOC 2 Type II report, or a similar report, available to customers, attesting to the security control environment?
  4. If access to personally identifiable information or protected health information is necessary, how is this protected at the vendor (e.g. encryption, secure remote connections, restricted access, logging and monitoring)?
  5. Are security and availability requirements enforced in master service agreement contracts (e.g. sensitive data protection, up-time guarantee/service level agreements, security incident reporting/coordination, regulatory compliance requirements)?
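The five questions above can be tracked as a simple checklist during vendor vetting. The sketch below is one possible way to do that; the field names and the strict pass rule (a vendor must answer yes to everything) are my own illustrative choices, not an industry standard:

```python
# Sketch of a vendor-vetting checklist based on the five questions above.
# Field names and the all-or-nothing pass rule are illustrative assumptions.

VETTING_QUESTIONS = [
    "security_program_with_risk_assessments",
    "ongoing_security_awareness_training",
    "soc2_type2_or_equivalent_report",
    "pii_phi_protection_controls",
    "security_and_availability_in_msa",
]

def vet_msp(answers):
    """Return (passes, gaps): a vendor passes only if every
    question is answered 'yes'; gaps lists unmet questions."""
    gaps = [q for q in VETTING_QUESTIONS if not answers.get(q, False)]
    return (len(gaps) == 0, gaps)

ok, gaps = vet_msp({
    "security_program_with_risk_assessments": True,
    "ongoing_security_awareness_training": True,
    "soc2_type2_or_equivalent_report": False,  # no independent attestation
    "pii_phi_protection_controls": True,
    "security_and_availability_in_msa": True,
})
print(ok, gaps)  # False ['soc2_type2_or_equivalent_report']
```

Even an informal record like this makes it clear which gaps remain open, and gives the small business something concrete to negotiate into the master service agreement before signing.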

Our third-quarter statistics clearly show that small businesses and MSPs are big targets for hackers. It is absolutely critical that small businesses are working hand-in-hand with all their IT vendors to prevent ransomware attacks from happening in the first place.

What Robots Mean for Workers' Comp

While there are downsides, robots will take over the mindless, thankless (and dangerous) jobs and likely lead to a safer workplace.

History provides interesting insights into the debate around automation and employment. In 1632, King Charles I of England banned the casting of buckets, for fear that allowing it would ruin the livelihood of the craftsmen who were making buckets the old-fashioned way. In 1811, the Luddites in England started a movement in which they smashed machines that they viewed as threats to employment. Such episodes have occurred with increasing frequency since the industrial revolution began. Not coincidentally, per capita income in the world doubled every 6,000 years prior to the revolution and every 50 years afterward.

According to a Pew study, 52% of Americans think that much of our work can be done by robots, but only 38% believe robots could replace the type of work they themselves do. Additionally, 76% of Americans believe that robots would increase the inequality between the rich and poor.

But standing in the way of change, when viewed through the lens of history, has rarely worked. The key is to focus on the dislocated individuals and provide training to make sure that they can move into new positions. Historically, new positions tend to be more highly compensated, fueling an upward cycle.

It is clear that the pace of change and automation is increasing. In January, the parent company of Giant, Martin’s and Stop & Shop said it would introduce 500 robots to its supermarkets this year. Sure enough, if you Google “Marty the Robot” – a large, grey cone with a bright smile and “googly” eyes – you will find out that he is hard at work at 40 Stop & Shop stores in New Jersey, finding and reporting spills in the aisles and calling for a mop.

It will be interesting to see which retailers follow suit. Walmart has given robots a thumbs up. Target? A thumbs down.

The pros and cons of automation are widely written about. The pros: eliminating mindless tasks, saving money on employee costs, having a safer working environment. The cons: reducing human contact with the customer, eliminating jobs for people who need them and decreasing flexibility in the workplace, as automated tasks occur at programmed times.

See also: What’s Beyond Robotic Process Automation  

As providers of workers’ comp insurance, we are watching the rise of automation in the workplace closely. One of the ways we do this is to analyze actual claims that are submitted by our insureds, which are most often small to medium-size businesses. In the restaurant sector, we analyzed over 84,000 claims, and in the retail area we looked at more than 20,000.

One area where we are convinced automation could help reduce worker injuries is in coffee shops. Workers who operate espresso machines eight hours a day are reporting repetitive motion injuries akin to “tennis elbow.” In fact, so-called “Barista Wrist” is now a recognized medical condition. Our study of workers’ comp claims in the restaurant industry found that cafés had more lost time due to injuries than any other restaurant type. And the injuries requiring the most days to return to work in cafés – 366 days – were wrist injuries.

In a parallel from the retail sector, workers in hair salons are reporting hand, wrist and arm injuries from drying hair with a blow dryer, setting the stage for a new condition that could be called the “Brazilian Blow Out Arm.” Perhaps an innovative, automated “Robot Blow Out” could eliminate these repetitive motion injuries.

Among the most dangerous and expensive injuries in our retail analysis (which includes some wholesale) were those sustained by workers engaged in the preparation of meat, poultry and fish, which involves cutting hazards from sharp tools and machinery. The average paid claim for a worker who sustains a cut ranges from $4,200 to $7,800, depending on whether it was caused by a non-powered tool, by a powered tool or by being caught in or between machinery.

But once again, repetitive motion injuries in meat, fish and poultry preparation are by far the most expensive at $16,200 for the average paid claim. Clearly, this is an area where more automation would be helpful.
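For a rough sense of scale, the figures above work out as follows (a back-of-the-envelope comparison only, using the averages cited):

```python
# Back-of-the-envelope comparison of the average paid claims cited above.
cut_claim_range = (4_200, 7_800)   # cuts: non-powered tool up to machinery
repetitive_motion_claim = 16_200   # repetitive motion in meat/fish/poultry prep

low_ratio = repetitive_motion_claim / cut_claim_range[1]   # vs costliest cuts
high_ratio = repetitive_motion_claim / cut_claim_range[0]  # vs cheapest cuts
print(f"{low_ratio:.1f}x to {high_ratio:.1f}x")  # roughly 2.1x to 3.9x
```

In other words, the average repetitive motion claim costs roughly two to four times as much as the average cut claim, which is why automation of the repetitive tasks looks like the bigger opportunity.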

See also: How Robotics Will Transform Claims  

All of this gets us back to our original thesis: History has shown that automation is a net positive for workers and, over time, leads people into higher-paying jobs. Yes, jobs are eliminated, for sure. With machines come risks and injuries; that's undeniable. But it is also clear that robots will take over the mindless, thankless (and dangerous) jobs and likely lead to a workplace that is safer overall.

With all that said, there is one robot that I don’t want to see, and that’s “Matt, the Workers’ Comp Insurance Executive.”


Matt Zender


Matt Zender is senior vice president of workers’ compensation strategy at AmTrust, with more than 25 years in workers’ compensation insurance.