
An Agent, Underwriter and Adjuster Walk Into a Bar...


After witnessing the Cal football team's disheartening loss to Oregon State on Saturday, I spent an hour having a beer (okay, two) at Pappy's in the heart of Berkeley, decompressing... and thinking about what the busy sports bar had to say about insurance.

I didn't intend to ponder insurance, but I guess watching SMU complete its smackdown of Temple on the TV screens didn't exactly absorb all my attention, and my somber mood probably made me more likely to notice two glitches that reminded me of inefficiencies that still plague insurance, despite all the investments in technology and attempts to reinvent processes from the viewpoint of the customer.

The big glitch occurred when, at the height of the post-game frenzy, there was a shift change behind the bar. One bartender kept serving drinks, but all the others just focused on closing out tabs and ignored all the customers stacked up several deep, trying to catch the attention of someone—anyone—behind the bar to place an order. 

During the shift change, the bar-backs came out to clean up and replenish supplies, which was surely necessary but which amplified the frustration among customers. These assistants can't serve drinks, but how is a customer supposed to tell the difference between a bartender and a bar-back? From the standpoint of the customers jostling for attention, the bar looked like it had twice as many bartenders as it had had minutes before, yet almost nobody could get a drink. 

If the shift change had come just half an hour later, it could have gone smoothly, because the crowd had thinned out so much by then. But the change came hard on 4, not 4:30. Pappy's management likely didn't notice how many people walked out the door in frustration after giving up on getting a drink, or realize how many will decide next time that they should try one of the many bars just down the street. After all, the place did a booming business among Cal fans drowning their sorrows. But a tiny change in process would have led to more business Saturday and more repeat business from happy customers on future weekends.  

Yes, delivering an insurance policy is far more complicated than sliding a Sierra Nevada across the bar, but I'd bet that, off the top of your head, you can think of several tweaks in insurance processes that would remove frustrations for customers. We're making progress as an industry, but, given all the paper we still shuffle around, insurance remains a target-rich environment for those trying to kill inefficiency, and many tweaks are as simple as moving a shift change back a half-hour.

My second observation reinforces the first. The Pappy's bar was a machine when it came to churning out pints and pitchers of beer and at least three drinks—Moscow Mules, Jack and Cokes and Margaritas. (Man, people drank a lot of Moscow Mules, so many that Pappy's ran out of copper mugs.) But woe unto you if you ordered something out of the ordinary, like a Manhattan cocktail. The bartenders were pros and were running around so fast they were sweating, but an unusual order threw them completely out of rhythm. 

Most insurance policies necessarily involve a fair amount of complexity, but not every piece of a policy or every process needs to be a one-off, and the more we can turn into the equivalent of a Jack and Coke, the more efficient we can be.

Have a great week. And Go, Bears.

Paul Carroll
Editor-in-Chief


Paul Carroll


Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

Measuring Success in Workers’ Comp

Traditional metrics, such as the number of cases per adjuster, may be losing their importance. Metrics should evolve.

The adage says, "What gets measured gets managed." What does that mean for workers’ compensation? What can we measure to truly improve outcomes for everyone in the workers’ compensation system, from injured workers, to employers, to insurers and others? Are we measuring the most important metrics, or are we as an industry spending too much time measuring less relevant issues and excluding those that can have a real impact? How can we drive optimal outcomes by adjusting what we measure and the analytics we use to measure success? Finally, what should we measure as the industry moves forward?

There is certainly no shortage of data in the workers’ compensation system, but what we do with it, how we measure it and, most importantly, how we apply it are key. Four prominent industry thought leaders joined us to discuss this important issue during our most recent Out Front Ideas webinar, which also served as the opening keynote session for the 74th annual Workers’ Compensation Educational Conference in Orlando:

  • Anna Hui, director of the Missouri Department of Labor and Industrial Relations
  • David North, president and CEO of Sedgwick
  • David Stills, vice president of global risk management for Walmart
  • Joan Vincenz, managing director of risk management for United Airlines

Our speakers did not necessarily agree on which specific metrics are the most important to measure. But they were unanimous on what they considered the most important element of any program – treating injured workers well and getting them back to health and work as quickly as possible. Anything measured should be considered a tool to accomplish that in the most caring and efficient ways possible.

Process Versus Outcome Measurements

It is easy to get absorbed in our processes and to overlook the most important thing – helping the injured worker. The definition of "success" may differ for an adjuster, the employer and the medical provider. Collectively, we need to be creating measurements that will ensure a focus on providing the best outcome for the injured worker. Defining success and deciding what to measure is not a black-and-white issue in workers’ compensation. As one speaker said, “It’s not as if we are counting widgets; these are individuals, all with different issues and backgrounds.” Processes and outcomes must go hand in hand. While optimal outcomes are the goal, the processes used to get there matter, and each deserves to be measured to a certain extent. For example, measuring the timeliness of first reports of injury is valuable if it helps a large number of injured workers.

Commonly Used Metrics

Three-point contact, adjuster caseloads and the speed of communicating with the injured worker all are metrics that are, or have been, extensively measured. But are these still relevant? Does success for these measurements lead to better outcomes, or should they evolve? There is no definitive answer except that measurements are important if they ultimately lead to better outcomes. For example, two of our speakers disagreed on the importance of the adjuster orally communicating with the injured worker within the first 24 hours.
One believed it is absolutely critical to the employee’s experience and the outcome of the injury to speak with the adjuster quickly, because the contact shows that the employer cares. Also, injured employees have a tendency to say things that they might not write in an email or text, and they are more likely to remember the events of the incident closer to the time it happened. However, another panelist viewed 24-hour contact as an “age-old requirement” that does not fit with the adoption of mobile technology. In addition to possible logistical problems, such as being unable to contact the injured worker as he is getting medical attention, there is little, if any, information to share with him at that early point.

Allowing metrics to evolve with advancing technology is important, the speakers explained. For example, the first communication with the injured worker (whenever it occurs) may be a very different conversation than it was just a short time ago. Some employers can now view video footage within hours of a claim being reported. That means the first conversation with the injured worker does not need to focus on discovering exactly what happened, because that is already known.

See also: Social Determinants of Workforce Health

The number of cases per adjuster is another metric that may be losing its importance. Despite some calls for an industry standard, many feel it is less relevant than it was several years ago. Technology has made medical-only claims much easier to handle. Also, factors such as the skill level of the adjuster and the complexity of each claim should dictate the number of cases each adjuster handles. Measuring savings from managed care may also be less significant than it once was. Savings from bill review, viewed as a 12-month rolling average, do not show how well an organization is doing the way a comparison over time or against another benchmark does.
Some employers are investing heavily in onsite clinics to make sure injured workers get quality medical care as soon as possible. One panelist said the most important metrics to measure are those that focus on these core factors:

  • The speed with which the claim goes through the system.
  • The quality of the medical care.
  • Efforts around return to work.
  • Keeping a balance on the metrics measured.

Using Metrics Appropriately

A criticism of the workers’ compensation system is the inability to make certain measurements meaningful – collecting all sorts of data but not using that information to improve. One panelist pointed out that, while the industry extensively measures first report of injury, it does not measure or adequately discuss the small percentage of disputed claims that actually go to trial. They noted that the number of settlements has gone up dramatically, while the number of awards is decreasing. While that may indicate improvements in the workers’ compensation system, the information is not being used to improve processes.

According to the panelists, sharing data is one of the biggest opportunities to improve the system. Safety information and workers’ compensation data are being gathered in the same building but not shared among the departments collecting them. Breaking down the silos within the regulatory side of the industry has also been a major focus, as this can result in more aggregate information available to all stakeholders. Regulators are also providing more data to the industry, which can allow employers and injured workers to explore trends and see injuries by county, injury type and industry. Understanding what is being measured, and relating it back to how the system functions, can improve performance and outcomes.

Finally, the panel explained that maintaining actuarial predictability is very important for employers. If you are making significant changes to your program, be sure to discuss these with the actuaries so they can adjust their modeling. Employers should meet with actuaries frequently to monitor trends that are affecting their program. It is important to show actuaries how all the metrics work together to see the whole story, rather than looking at any one metric in isolation.
See also: Bridging Health and Productivity at Work

The Future

The industry will need to learn to trust the metrics more, the speakers said. Sometimes the data that is captured through predictive analytics and other new tools contradicts what we think is correct. But it is time for stakeholders to rely on the technology, rather than on what their guts are telling them. At the same time, it is crucial to make sure the customer is happy, regardless of what the measurements show. Measurements do not matter if the injured worker has a bad experience.

There is no single measure that will guarantee success. The metrics can lead to positive outcomes as long as they are viewed as one factor in the overall injury management program and organizations are willing to evolve and change with technology and the experiences of injured workers. At the end of the day, how well we care for injured workers is the most important thing.

You can view the complete presentation at https://www.outfrontideas.com/archives/.


Kimberly George


Kimberly George is a senior vice president and senior healthcare adviser at Sedgwick, where she explores and works to improve Sedgwick’s understanding of how healthcare reform affects its business models and product and service offerings.

Why Hasn't Cyber Security Advanced?

The global approach to cybersecurity has remained the same for decades: Respond and recover. New technology can prevent attacks.

Despite major advancements in technology, the global approach to cybersecurity has remained the same for decades: Respond and recover. The same vulnerabilities are repeatedly exploited in similar ways, and this trend shows no signs of slowing, because current security tools do not actually address the roots of attacks. Even new artificial intelligence and machine learning techniques have mostly been aimed at improving response efficiency – as though there is nothing we can do but prepare for the worst and recover as quickly as possible. To slow the frequency of dangerous and costly cyberattacks, companies should be shifting their efforts toward disrupting adversaries.

In May 2000, one unwitting user on a computer in the Philippines used Microsoft Outlook to open an email message with the subject line “ILOVEYOU” and an attached file named “LOVE-LETTER-FOR-YOU.TXT.vbs.” The attachment contained malicious code – a worm that moved through the user’s computer, overwriting random files and sending an email with a copy of itself to every address in the user’s Windows Address Book. Just 10 days later, the ILOVEYOU worm had spread around the world, infecting over 50 million computers. Far from spreading love, the worm caused more damage than any previous cybersecurity incident: an estimated $5.5 billion to $8.7 billion worldwide.

See also: Quest for Reliable Cyber Security

Seventeen years later, Microsoft announced a vulnerability in SMB, a resource-sharing protocol in widespread use across versions of its ubiquitous Windows operating system. The company quickly released a patch to fix the vulnerability, but countless systems had not been patched by the time the WannaCry ransomware attack began two months later. WannaCry eventually affected more than 200,000 computers across 150 countries and caused estimated damages ranging from hundreds of millions to billions of dollars.
Despite significant changes to the state of technology and the internet between 2000 and 2017, these two cyberattacks were very similar. Both propagated by using relatively simple vulnerabilities in Microsoft operating systems, and both were successful because of a lack of proactive cybersecurity. Not only do the same kinds of attacks continue to work; responses to cyberattacks have not changed much, either. Despite all our advances in technology, cybersecurity is generally still focused on response and recovery: Identify the infected computers, take them offline, rebuild or replace them, file a data breach disclosure... rinse and repeat.

Is it impossible to defend against these vulnerabilities? Are hackers just too smart? Not necessarily. The response-and-recovery posture treats cyberattacks like natural disasters – inevitable, unstoppable and caused by forces outside our control. Meanwhile, adversaries continue to enjoy the fruits of their illicit labors. Popular methods of attack remain consistently successful because cybersecurity has failed to evolve to a posture of true prevention.

A needed evolution

The continued success of these cyberattack methods (and others) hinges on exploiting a limited number of key vulnerabilities that are commonly known to good and malicious actors alike. So, why are they still effective? The answer is that standard cybersecurity methods are designed to respond to incidents but rarely, if ever, actively disrupt the adversary’s attempted attack. For example, cyberattacks or intrusions are generally handled via alerting after a breach has already occurred. The alert triggers a cybersecurity team to go in and clean up the affected systems, then set up or modify a firewall to block future attacks. Afterward, companies are required by law to file a data breach notification.
The most common approach to blocking attacks involves analyzing past and current threats, then distilling the results into indicators of compromise (IoCs), such as IP addresses, domain names or hashes of known bad files. These IoCs are fed into available cybersecurity tools, which are wielded like a giant hammer, blocking or denying any traffic associated with them. This method aims to prevent attacks that are exactly like prior attacks. It perpetuates the problem, however, as hackers know about these protection methods and adjust: Subsequent attacks are designed to differ just enough from prior attacks to elude the new protections. This process is repeated time and time again. The cybersecurity industry is almost entirely locked in this detect-respond-recover approach, with little to no effort being made to actually prevent cyberattacks in the first place.

Embrace a new approach to stem the flow of cyberattacks

Rather than treat the cyber threat like a natural disaster, the cybersecurity industry needs to embrace a fundamentally new approach. Instead of relying solely on the insufficient incident response and recovery methods that have been used for many years, a more sophisticated approach is needed to prevent current cyberattacks and to predict and prevent future ones on a meaningful scale. At Trinity Cyber, for instance, we provide this preventive assurance by invisibly monitoring threats outside a network’s perimeter and adapting to the adversary’s techniques to intercept and neutralize cyberattacks before they get in. Operational challenges in the cyber realm have been exacerbated by an ever-growing landscape of disparate endpoints, the heightened sophistication of cyber-attackers and an increasing number of cybersecurity tools. While these challenges led to the development and implementation of security orchestration, automation and response (SOAR) platforms, which add efficiencies, most tools remain inherently reactive.
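To make the limitation concrete, here is a minimal sketch in Python of the IoC blocklist filtering described above. Every indicator value (the IP address, domain and payload) is a hypothetical example, not real threat data:

```python
# Minimal sketch of indicator-of-compromise (IoC) blocklist filtering.
# All indicator values below are hypothetical examples.
import hashlib

# IoCs distilled from past attacks: IP addresses, domains, file hashes.
IOC_BLOCKLIST = {
    "ips": {"203.0.113.7"},  # documentation-range example IP
    "domains": {"malicious.example.com"},
    "file_hashes": {hashlib.sha256(b"known-bad-payload").hexdigest()},
}

def is_blocked(event: dict) -> bool:
    """Return True if any field of a traffic event matches a known IoC."""
    if event.get("src_ip") in IOC_BLOCKLIST["ips"]:
        return True
    if event.get("domain") in IOC_BLOCKLIST["domains"]:
        return True
    payload = event.get("payload", b"")
    return hashlib.sha256(payload).hexdigest() in IOC_BLOCKLIST["file_hashes"]

# An attack that exactly matches a prior indicator is blocked...
assert is_blocked({"src_ip": "203.0.113.7"})
# ...but a trivially modified variant slips through -- the weakness
# of preventing only attacks that are exactly like prior attacks.
assert not is_blocked({"src_ip": "203.0.113.8"})
```

The second assertion illustrates the article's point: an adversary who changes a single indicator evades the entire defense, which is why purely IoC-driven blocking stays one step behind.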
See also: Best Practices in Cyber Security

Cyberattacks or attempts at compromising a system are human-made events within a human-made environment. By focusing on disrupting the adversary’s methods, analysts can determine tactics, techniques and procedures (TTP) and then use those to develop solutions that provide truly preventive cybersecurity.

Finally Realizing the Promise of AI

Augmented automated underwriting, or AAU for short, is destined to become one of the key talking points in insurtech.

It’s almost inevitable. Spend your working life identifying, analyzing, quantifying and ascribing monetary value to risk, and you’re likely to have a fairly strong aversion to it – more accurately, an aversion to undertaking new endeavors with inadequately understood consequences. The insurance industry is, on any number of levels, the very definition of risk-averse.

Yet, for all the commentary suggesting otherwise, insurance still has an appetite for innovation. If the insurtech sector is any indication, an interest in and requirement for new solutions is being recognized and slowly addressed. Insurance may not employ the language of disruption that runs through the wider fintech market, may be short a few unicorns and may be unable to boast some of the record-breaking funding rounds, but a quiet tech evolution has been building in insurance nonetheless. Hence the advent of automated underwriting facilitated by more advanced algorithms and data analysis. Where insurtech does overlap with its more vocal fintech counterparts is in the greater use of artificial intelligence (AI) and machine learning to solve age-old problems around data analysis and interpretation.

It’s been about five years since AI first became a topic of conversation in insurance. Since then, despite the intensity of the debate, it has often felt like a reality that is always just over the horizon – a destination that kept moving even as more and more efforts were directed toward it. But recent research suggests that the journeys made so far have not been in vain. We are at a point where the embrace of AI is about to step up a gear. The global value of insurance premiums underwritten by AI has reached an estimated $1.3 billion this year, according to Juniper Research, and is expected to top $20 billion in the next five years. As a destination, AI is closer and more attainable than ever before.

See also: Untapped Potential of Artificial Intelligence

However, AI is not an island.
Its promise of $2.3 billion in global cost savings, to be achieved through greater efficiencies and automation of resource-intensive tasks, will not be realized in isolation. AI remains part of a more complex ecosystem of data gathering and analysis. It can apply new technologies to get the best out of the already established and still-emerging data sources that feature in underwriting offices around the world. It emphatically does not require these existing investments to be ripped out, replaced or downgraded. It is more helpful, therefore, to see AI as the differentiating factor in the latest generation of insurance IT: augmented automated underwriting, or AAU for short.

AAU lets underwriters spot patterns and connections that are, frankly, either invisible to the human eye or that take normal, human-assisted processes unfeasible amounts of time and resources to identify. Whereas earlier generations of automation were able to pick up the low-hanging fruit of insurance markets – the individuals whose driving history fit into clearly delineated boxes, for example – AAU can take into account all of the rich complexity of the human experience. It can spot the nuances and individualities that populate the life market, for example, and translate those into accurate policies.

That’s good news for both underwriters and their customers. AAU can significantly reduce the need for separate medicals, repeated questions and lengthy decision-making processes and drastically increase the speed at which a potential customer can get a quote and cover – while continually improving the way risk is calculated and managed. AAU can keep the decision-making process in the hands of underwriters rather than IT departments, enabling them to set and update the rules and parameters as befits their preferred business model.
It consequently makes advanced, complex and precise decision-making available to a broader range of underwriting businesses – which is good for those businesses, good for customers and ultimately good for the entire industry.

See also: Strategist’s Guide to Artificial Intelligence

AAU – augmented automated underwriting – is an example of the realization of AI’s promise. As such, it’s set to become one of the key talking points and disruptive technologies of the insurance industry. And this time, AAU is both a journey and a destination that all progressive insurance organizations need to be considering.

Intersection of Tech and Holistic Health

Holistic healthcare providers see technology as depersonalizing. In fact, tech is at the heart of moving from diagnosis to prevention.

There is a misconception in the healthcare field that technological innovation is opposed to the philosophy of holistic care. Tech is viewed as artificial, manufactured and impersonal — it values human experience only for the sake of developing better algorithms and treats the physical body of the patient without care for their personhood. In contrast, holistic care is seen as organic and natural, elevating the physical, emotional and spiritual needs of an individual and seeing, not a patient, but a person with a particular and unique social and cultural background. It is easy to see why those invested in whole person care have resisted the march of innovation or flat out rejected it. But to see these two things in tension is to ignore the character of technology — that is, as a tool rather than an end in itself.

Take the smartphone. It can be used to ignore interpersonal connections, while endlessly scrolling through social media feeds or web content — but it has also provided the means of keeping up and strengthening interpersonal connections, through video chat, text messaging, phone calls and countless other forms of communication. The lesson of the smartphone generation? Tech is what you make of it. And that lesson should be applied to tech’s role in healthcare. Tech is not a threat to holistic care, but rather a means to scale its interpersonal experiences, and we need to start viewing it in that light.

Tech overcomes the real threat to holistic care — human limitation. In an ideal world, caregivers would be available 24/7 to hand deliver personalized care to their patients. In a tech-enabled world, we can come close to realizing this vision, as digital tools provide the means to extend care into a person's home while overcoming the physical obstacles of distance and time exacerbated by the shortage of caregivers.

See also: Can InsurTech Make Miracles in Health?
Of course, the philosophy of holistic care is not based solely on interactions between patient and caregiver. In fact, if holistic care is successful, the need for a caregiver drops away, as the goal is to create independent individuals who can sustain and manage their own health. Tech can foster this independence by delivering accessible educational content to patients right where they are, enabling them to take an active role in their own health and become self-reliant. From this angle, the patient is not just another diagnosis but an individual who, with the proper means, can be empowered to control their own health and wellness. Tech can help that patient learn about and be encouraged to pursue healthy lifestyle choices that fit their individual needs, and can keep them off medications and out of the hospital: a primary goal of holistic care.

This type of individualized care plan is made possible by “big data” — the very algorithms that at first glance seem so impersonal, reducing the patient to a statistic. Far from depersonalizing care, these algorithms are at the heart of driving the shift from diagnostic to preventive care. They reveal novel insights and help create plans tailored to the individual person, encouraging patients to view themselves as uniquely structured individuals whose care management should reflect that, rather than be delivered through a one-size-fits-all approach.

See also: Social Determinants of Workforce Health

Imagine a world where the human connection of whole person care isn’t limited to physical touchpoints, where, instead of reaching five or 10 patients in a day, a care provider can reach hundreds. A world where hospitals are emptied of preventable admissions, where the drug industry is needed only in exceptional cases and where redundant protocols don’t exist.
This is the promise that technology has fulfilled in other industries, and it’s time that the champions of holistic healthcare set aside their skepticism and take a second look.

D2C Model Needs New Customer Approach

Customers need constant reassurance that things will be explained, the next steps will be clear and the company is there for them.

According to home insurance provider Hippo, over half of insurance customers would rather go to the dentist than communicate with their provider. That sentiment represents a big business opportunity, however, as insurance increasingly becomes a direct-to-consumer (D2C) business. Many believe that, in the next few years, the large amounts of marketing and ad dollars traditionally spent to drive traffic to mobile apps and websites will struggle to turn web visits into customers. Insurance carriers, now more than ever, have the opportunity to address friction within the customer journey, as customers expect a transparent and more intuitive experience. Today’s insurance consumer embraces the right engagement at the right time. Providing certainty and clarity to customers reduces anxiety and hesitation and drives success for the customer and the business.

In 2013, Geico’s marketing budget topped $1 billion, with a majority of the spending on advertising. Not much has changed. D2C newcomers have acquired early customers with design-first thinking, an emphasis on lower prices and more modern policy terms. But these approaches are meant to acquire customers; none of them focuses much on engaging the customer.

According to a recent Watermark customer experience survey, CX leaders outperformed the broader market, generating a total return that was 45 points higher than the S&P 500 Index. And customer experience leaders generated a total cumulative return that was nearly three times greater than that of the "Customer Experience Laggards." Those are numbers any CEO of a traditional insurance company or founder of a major insurtech can rally behind.

How insurance can embrace a different type of customer acquisition

For legacy insurers moving to a more D2C model, customer experience represents a fundamental and essential shift in mindset.
By providing the lower-friction, more customer-centric experience that today’s consumers prefer, legacy insurers and insurtechs can modernize their position in the market. This can all be done by guiding the customer’s experience online through engagement.

See also: How to Earn Consumers’ Trust

Tom Super, director of J.D. Power’s insurance practice, recently noted: “According to our 2019 J.D. Power Digital Experience Study, 37% of consumers have never spoken with their agent, and one in 10 consumers report they have never interacted with their insurance company at all.” This clearly leaves a lot of room to grow customer engagement, and the insurance industry should look more closely at how it calculates customer lifetime value (CLV). There is an opportunity to understand where exactly customer engagement produces sales.

For insurtechs and legacy providers alike, the question has become: How do you think about engagement when your customers don’t really want to be there or don’t understand exactly where to go? This is a lot different from typical direct-to-consumer marketing and brand challenges.

Creating a more direct customer experience

To win in this fast-evolving insurance marketplace, providers of all types will need to move quickly beyond branding and focus on the customer experience. The first few seconds are critical—as are all the seconds that follow. Customers will need constant reassurance that things will be explained, the next steps will be clear and the company is there for them. The key to Guided Digital Commerce is automation for the majority of contacts and preserving your live channels for more complex inquiries. If you can give the customer the right answer to a concern the overwhelming majority of the time, you can deploy a profitable engagement solution that reaches all of your customers instead of just a few.
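One common way to reason about the CLV idea mentioned above is as the discounted stream of margin from a customer who renews with some probability each year. The sketch below uses the standard geometric-series form of that calculation; every figure in it is a hypothetical illustration, not an industry benchmark:

```python
def customer_lifetime_value(annual_premium: float,
                            margin_rate: float,
                            retention_rate: float,
                            discount_rate: float) -> float:
    """Simple geometric-series CLV: expected discounted margin over an
    infinite horizon, assuming a constant annual retention rate.
    All inputs are hypothetical illustrations."""
    annual_margin = annual_premium * margin_rate
    # Sum over years t >= 0 of margin * (retention / (1 + discount))^t
    # collapses to the closed form below.
    return annual_margin * (1 + discount_rate) / (1 + discount_rate - retention_rate)

# Hypothetical policyholder: $1,000 premium, 10% margin,
# 85% annual retention, 8% discount rate.
clv = customer_lifetime_value(1000, 0.10, 0.85, 0.08)
# A customer who never renews (retention 0) is worth just one year's margin.
assert customer_lifetime_value(1000, 0.10, 0.0, 0.08) == 100.0
```

Under these assumptions the example customer is worth roughly $470, most of it from renewals rather than the first sale, which is one way to quantify the article's point that engagement and retention deserve as much attention as acquisition.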
In sometimes opaque insurance products, this is key to building effective customer engagement that supports money spent on the brand. In practical terms, this means providing relevant guidance to help customers complete their onsite journey quickly and easily. Site design and golf tournament sponsorships are only the beginning.

From the moment the customer lands on the home page, the provider should watch for signs of hesitation, struggle or opportunity. Site analytics can help the insurer understand the nature of hesitance as well as how to address it. If visitors tend to get stuck at a given point in the process, offer relevant information in context, explaining what to do and what to expect next. Keeping the customer within the digital channel and increasing self-sufficiency is good for the customer and good for business. Anticipate and act on customer behavior in real time. In a sense, be the best possible kind of insurance agent: one who’s clear, helpful and attentive to the customer’s needs but never pushy.

See also: How Insurtechs Can Win Consumers’ Trust

Branding, data science, risk-pricing, terms, customer reviews—these are all part of the mix for competitive success. But none of it matters if you can’t keep customers on your site long enough to see what sets you apart. By offering a new and better kind of engagement experience, insurance can start changing customer perceptions from the moment they arrive. When customers are guided to the information they need to make confident buying decisions, they’re more likely to bind policies, give accurate information to enable accurate risk calculation, update their coverage and generate revenue for the business. And that sure beats a trip to the dentist.
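The stuck-point analysis described above can be sketched as a simple funnel drop-off report. The step names and session logs below are hypothetical, invented purely to illustrate the idea:

```python
# Minimal sketch of funnel drop-off analysis for an online quote flow.
# Step names and session data are hypothetical.
from collections import Counter

FUNNEL_STEPS = ["landing", "quote_form", "coverage_select", "payment", "bound"]

def drop_off_report(sessions):
    """For each funnel step, compute the share of sessions lost before
    the next step -- the 'stuck points' worth addressing with
    in-context guidance."""
    reached = Counter()
    for steps in sessions:
        for step in steps:
            reached[step] += 1
    report = {}
    for here, there in zip(FUNNEL_STEPS, FUNNEL_STEPS[1:]):
        if reached[here]:
            report[here] = 1 - reached[there] / reached[here]
    return report

sessions = [
    ["landing", "quote_form", "coverage_select", "payment", "bound"],
    ["landing", "quote_form"],
    ["landing"],
    ["landing", "quote_form", "coverage_select"],
]
report = drop_off_report(sessions)
# Half of the visitors who pick coverage never reach payment -- that
# step is where in-context guidance would pay off most.
assert report["coverage_select"] == 0.5
```

A real implementation would read event data from site analytics rather than in-memory lists, but the logic of locating the worst drop-off step is the same.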

7 Questions on Taking Online Payments

American consumers prefer to pay bills online, but, for the businesses, it’s not quite as easy. Here's some advice.

When American consumers go to pay their bills, they prefer going online to mailing a check because going online is simple. But for the businesses collecting those payments, it’s not quite as easy. One reason online payments are so complicated is that eight players are involved: the cardholder, the credit card company, the merchant, the merchant's bank, the payment gateway, yet another bank, the credit card network and, finally, the Federal Reserve. When you sit down with a payment provider, here are seven big questions you should ask:

What are some reasons to accept payments online?

You never want to turn down a customer payment. Customers like to pay online, using either a credit card or a checking account number, because it’s quicker than hunting around for the checkbook and a stamp, and they can be certain when the payment has arrived, avoiding late fees or cancellations. In fact, the timing issue is the main reason people have been willing to pay fees for online payments. Business owners like online payments, too. They speed up your receivables, allowing you to be more productive and avoid difficult situations where a customer is having trouble paying promptly.

What are some drawbacks to accepting online payments?

It’s tricky. Many vendors add processing fees, set-up fees, chargeback fees and other hidden fees that can eat directly into your already tight profit margins. Integrating payment into your website can also be costly, especially if you need custom coding. There are also security concerns that require an understanding of proper handling of customer data and banking information. And, in the insurance industry, there are additional state regulations to meet.

See also: Important Perspective for Insurance Agents

Why is compliance important?

Running your operations with disregard for, or ignorance of, the law is never a good business model.
Card rules violations may lead to card companies taking away your ability to accept payments or, in some cases, levying fines of up to $25,000 per violation.

What can I do to ensure my payment system is secure?

Look for PCI Level 1 validation, the highest level of security, from any vendors you choose. Ask how long they have been in business. Read the contract to find out who is responsible for a breach: you or the vendor.

Does every state have the same payment processing regulations?

No. States have varying payment processing regulations, some of which prohibit the agent from charging additional fees. For the insurance industry, states also have rules on how premium funds must be handled, and credit card companies impose regulations of their own. With multiple layers involved in staying compliant, it is important to choose a payment provider that ensures you are operating within the rules at all times.

Does my state allow people in the insurance business, such as agents, to charge additional fees?

While other businesses have the ability to absorb processing costs as a standard part of their pricing models, typically this is not the case in the insurance industry, but it varies from state to state.

Does my state have a “convenience fee” law?

A convenience fee is an additional charge to your consumer on top of the payment due. It is called a “convenience” fee because your business has provided the consumer with another avenue to make a payment outside of the standard ways of paying. The rules on convenience fees are state laws, not guidelines, and violating them can come with significant consequences. To make things more complicated, convenience fee laws may be the subject of pending legal action in your state. Make sure your payment provider satisfies the requirements of these laws.
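In arithmetic terms, a convenience fee simply sits on top of the amount due. The sketch below is purely illustrative — the function name and fee amount are invented for this example, and whether you may charge such a fee at all depends on your state's law and card-network rules:

```python
def total_with_convenience_fee(amount_due, convenience_fee):
    """Add a convenience fee on top of the payment due.

    Illustrative only: state law and card-network rules govern
    whether, and how much, such a fee may be charged.
    """
    if amount_due <= 0:
        raise ValueError("amount_due must be positive")
    return round(amount_due + convenience_fee, 2)

# A $250.00 premium payment with a $3.50 online-payment convenience fee:
total = total_with_convenience_fee(250.00, 3.50)  # 253.50
```

Before deploying anything like this, a compliant payment provider should confirm the fee is permitted in each state where you operate.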
See also: Find Your Voice as an Insurance Agent

By understanding the laws and regulations for your state, you can confidently run your agency knowing what you can and cannot do when it comes to payment processing. Not all payment processors are created equal, and a poor decision could cost your business in custom development work, unnecessary features, hidden fees and insecure data, and even land you in big legal trouble.

3 Steps to Achieve a Digital Architecture

Moving from a legacy to a digital architecture requires: fixing the core, organizing the data and extending the architecture.

Progressing from a legacy application architecture that constrains organizational strategy to a cutting-edge, digital-first application architecture that drives it is a goal for many insurers. To best achieve the transition, several steps are advisable. This article proposes a framework categorizing these steps into three distinct phases: fixing the core, organizing the data and extending the architecture.

Fixing the core

The first step requires addressing debt, both technical and operational, in an insurance application architecture, taking it to a baseline upon which the insurer can build. A number of elements should be considered.

First, each application in the as-is architecture should be assessed to identify candidate applications for de-duplication, extension and replacement. De-duplication is justified where two or more applications perform overlapping functions. A common example is having claims applications that each cover a stage of the journey, even though each application could handle the entire claim. Another example is having multiple integration solutions, each covering a specific part of the architecture, such as policy, billing and claims.

Extension is advisable where there is benefit in maintaining a legacy application but significant improvements are required to specific elements. An example is a core policy-handling mainframe application that carries benefits in cost and stability of operations but imposes significant constraints on usability. The application could be extended by implementing a modern front end for users. A key point to consider is that extending legacy applications may introduce architectural complexity, increasing maintenance costs.
Where the underlying technology or functionality of a legacy application is significantly misaligned with the application strategy of the insurer, or where technical debt is so high that refactoring is no longer an option, application replacement should be considered. Consult the insurer's application road map; if none of the applications on it are based on technologies akin to those of the application in question, then the application is likely not in line with the strategic direction. Technical debt, on the other hand, can be seen by analyzing the run cost of an application year on year; if the costs increase without a matching increase in supported functionality, then technical debt is accumulating.

See also: Digital Insurance, Anyone?

Second, operational processes should be analyzed to determine whether there are effort-heavy activities being performed as workarounds for technology limitations. An example is handlers manually keying claims data into a policy-handling application, then having to key the same data into a claims application because the integration between policy and claims does not work correctly. Another example is underwriters having to access multiple applications to find the data required to validate a policy renewal, because there is no single application where the data can be found. Where it becomes apparent that an operational process is merely acting as a band-aid for a technology limitation, the relevant applications should be modified, extended or replaced.

Organizing the data

The second step is to organize the data held within the application architecture so that its value can be maximized:
  • The insurer should analyze whether it has a single view of its customers across all applications, or whether records exist separately in multiple applications. If the latter, then the risk is that updates made in one application only may cause misalignments between party records. One option is to implement a Master Data Management (MDM) solution holding the golden record of the customer, then having all other applications refer to the golden record.
  • The insurer should look at where its books of business are residing, and whether data migrations are required to move policies, claims, billing and other records from the source applications they reside on, to the insurer's strategic applications. If so, the insurer should determine the approach by which the data migrations will be performed, with the key options being automated Extract Transform Load (ETL) solutions, robotics applications and manual re-keying. If the source and target applications are multiple and varied, or if complex data transformations are required, then the migration process may require significant effort and planning.
  • The insurer should consider how it can enrich its data through integration with third-party services, such as those providing credit score reports, or those providing specialized data such as real-time flood risk for specific postcodes. The insurer should keep in mind that every integration with a third-party application carries a financial, complexity and performance cost; the value of the data obtained should be greater than the cost of obtaining it.
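The data-migration option above can be illustrated with a minimal Extract Transform Load skeleton. This is a sketch, not a reference implementation; the record shapes and field names are invented for illustration:

```python
def migrate_records(extract, transform, load):
    """Minimal ETL loop: pull records from the source application,
    reshape them for the target application and load them. Records
    that fail transformation or loading are collected for manual
    review instead of aborting the whole migration."""
    failures = []
    for record in extract():
        try:
            load(transform(record))
        except Exception as exc:
            failures.append((record, exc))
    return failures

# Illustrative use, with invented source and target field names:
source = [
    {"policy_no": "P-001", "premium": "250.00"},
    {"policy_no": "P-002", "premium": "not-a-number"},  # bad source data
]
target = []
failures = migrate_records(
    extract=lambda: iter(source),
    transform=lambda r: {"policyNumber": r["policy_no"],
                         "premium": float(r["premium"])},
    load=target.append,
)
# One record migrates cleanly; the malformed one is set aside for review.
```

In a real migration, the extract and load steps would be adapters to the source and strategic applications, and the failure list would feed the manual re-keying workstream mentioned above.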
Extending the architecture

Having built a solid platform, and having organized the data on it, the insurer can focus on extending the application architecture to achieve competitive differentiation. This can be done in innumerable ways; below are some ideas:
  • The insurer could consider whether technology may support different operating models. For example, an insurer transacting through brokers alone may consider building a web front end allowing customers to perform quote and buy transactions. Similarly, an insurer lacking a customer-facing claims portal may consider building one.
  • The insurer could consider whether its existing operating model could be extended through new channels. For example, a chatbot solution for claims First Notification of Loss (FNOL) could be built and integrated with a claims handling application. Similarly, an insurer could choose to integrate with a niche aggregator website to sell business that it previously only sold through brokers.
  • The insurer could consider advanced analytics solutions. For example, the insurer may evaluate building an insights engine extracting data from key applications, normalizing and cleaning it to remove inconsistencies and duplication and presenting the data in management dashboards.
  • The insurer could look at how to extend its application architecture to fit into its ambitions with regard to new technologies. For example, could anything be done to extend the architecture to accept data from IoT devices? Is artificial intelligence a real possibility on the current platform, or would structural changes be required? Robotics for operations is well established, but could robotics be applied to other use cases with the existing application landscape?
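The insights-engine idea above — extract data from key applications, normalize it, remove inconsistencies and duplication, then present it — can be sketched in a few lines. The field names are invented for illustration:

```python
def normalize_claims(records):
    """Normalize raw claim rows pulled from several applications and
    drop duplicates: claim numbers are trimmed and upper-cased so the
    same claim keyed slightly differently in two systems collapses to
    one row (first occurrence wins)."""
    seen = set()
    rows = []
    for rec in records:
        key = rec["claim_no"].strip().upper()
        if key in seen:
            continue  # duplicate of a claim already kept
        seen.add(key)
        rows.append({"claim_no": key, "amount": float(rec["amount"])})
    return rows

# The same claim appears in two systems with different formatting:
raw = [
    {"claim_no": "clm-42 ", "amount": "1200.50"},
    {"claim_no": "CLM-42", "amount": "1200.50"},
    {"claim_no": "CLM-43", "amount": "80"},
]
clean = normalize_claims(raw)  # two rows; the duplicate is collapsed
```

A production insights engine would add many more normalization rules and feed the cleaned rows into management dashboards, but the extract-normalize-deduplicate core is the same.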
See also: Digital Insurance 2.0: Benefits

Analysis and Overview

Although the three proposed stages are an approximation, and the individual steps within them far from comprehensive, the hypothesis is that there does exist an order in which activities should be undertaken on the road to insurer application architecture digitization. Supporting this hypothesis are the numerous cases in which insurers have embarked on large-scale digitization programs leveraging cutting-edge, architecture-extending solutions, only to find years down the line that, because their cores had not been fixed and their data not correctly organized, their digitization efforts were hampered, if not blocked.

Key Points:
  • The journey to insurer application architecture digitization is a multi-step process, and the steps can be categorized into three phases: fixing the core, organizing the data and extending the architecture.
  • Fixing the core requires removing technical and operational debt associated with an insurance application architecture, taking the platform to a stable baseline.
  • To organize the data, an insurer should evaluate whether it has a single view of its customers, whether data migrations are required and whether to integrate with data enrichment solutions.
  • To extend its architecture, an insurer could consider enabling new operating models through technology adoption, implementing new technology-enabled channels, building advanced analytics capabilities and embedding into the architecture capabilities related to new technologies such as artificial intelligence and IoT.
  • The hypothesis is that, although not rigid, an order of activities does exist, and insurers should consider fixing their cores and organizing their data before embarking on large-scale architecture extension.

An Answer for California's Power Shutdowns

Insurers can look at the wildfire problem from the standpoint of the customer and find a series of ways—both big and small—to help.


The awarding of the Nobel Peace Prize last week to Ethiopia's prime minister brought back the memory of a time I interviewed Isaias Afwerki, one of the key factors in this year's prize. It was 1991, and Afwerki had just emerged from the bush in Eritrea. He led a guerrilla movement that, after 30 years, won the country's independence from Ethiopia. In the process, his guerrillas helped depose a brutal Communist government in Ethiopia. The tall, high-cheekboned freedom fighter cut quite the figure, driving by himself to his first press conference in a Jeep that looked like it had just rolled out of the mountains. He charmed the four or five of us reporters who happened to be in the area for the impromptu gathering—and, I'm sorry to report, has since become one of the world's most notorious dictators. Ethiopia's prime minister, Abiy Ahmed, won the Peace Prize largely because he may have figured out how to restore peace with his country's northern neighbor. Afwerki did not share in the prize; he is a big part of the problem, not the solution.

Eritrea comes to mind because of another problem, too: the power blackouts being staged by PG&E to prevent wildfires in California, where many of us on the ITL team live.

When I stayed in Asmara, Eritrea's capital, the electricity in my hotel cut out often. I had hot water an hour a day. Now, here I am in California, one of the world's most dynamic economies, the home of Silicon Valley, and the main utility in Northern California just turned off power to 2 million people because it can't figure out how to keep power lines from arcing and starting fires (or is just too lazy to effectively clear away the brush near the lines that provides fuel for fires).

Eritrea at least had an excuse for its outages in 1991. It is one of the world's poorest countries and had just survived a war that killed 100,000 to 300,000 people. The conflict was so wrenching that if you asked, say, a 50-year-old his age, he'd say he was 20, because the 30 years of civil war didn't count as living. 

But what's California's excuse?

It turns out that everybody has an excuse, or at least someone to blame. The feds blame the state for poor forest management. The state blames the utility, PG&E, for poor management and general indifference to the fire problem. PG&E blames the state, saying it should assume much of the liability for its forests and citizens. Republicans blame Democrats for being tree-huggers and trying to prevent the kinds of fires that would thin out forests and avoid the monster conflagrations that ravaged the state last year. Democrats blame climate change. And so on.

No matter how the blame eventually gets assigned, the one sure thing is that the citizen/consumer will eventually foot the bill. A tax is coming. It may be in traditional form, with the state collecting money to pay for forest management, to absorb liability from PG&E, etc. Or, the tax may be collected through higher electricity rates. But a tax is coming, and it will be heavy.

Maybe it's time for insurance to ride to the rescue.

We won't be able to do much this time around, but maybe we can do more to prevent a next time.

At the moment, insurers are fleeing from the wildfire issue. That's natural. The amount of uncertainty is enormous, and so are the potential damages. But, in a day and age when every company claims to be customer-centric, perhaps we can, in fact, look at the problem from the standpoint of the customer and find a series of ways—both big and small—to help.

Think about the blackout issue from the standpoint of, say, the venues in Napa where weddings were scheduled over the weekend. It's hard to have a reception without electricity. That band doesn't sound so great without an amp, and not many people will dance if they have to gather around the DJ's iPhone—for as long as the battery lasts. Maybe those venues lined up generators and ran up other expenses to make sure they could pull off the event. But maybe the venue had to cancel and refund the money to the devastated couple. 

Now imagine the wineries that lost temperature control in the vats recently filled with this year's grape harvest. Or the restaurants that had food go bad as it sat in warming refrigerators. Or all the facilities that lost tourism business because of uncertainty. Or the parents who had to skip work because their kids' schools were closed, then check into a hotel because the power was off at home. Or...or...or.... 

The only folks I know of who were happy were Cal kids who saw that lights were off on the Berkeley campus, meaning that the threatened blackout had hit and that they could stop studying for mid-terms. 

Some of those affected will be covered by some form of business interruption insurance, but policies weren't written with today's California in mind. What would qualify as an event triggering a policy? The fact that the utility deliberately turned off your power? Hmmm. Many policies require an interruption of a certain length, such as 48 hours, but the blackouts were often shorter—long enough to cause uncertainty and mess up a business but maybe not long enough to trigger a claim. 

The wildfire/blackout problem isn't going away, even if the state government and PG&E—the two big villains, in my book—get their acts together. So, people and businesses will need and welcome help adjusting to the risk of outages for years to come.

Some of that help will come in the form of insurance: policies adapted to a world of occasional blackouts. But some can be provided through means that will feel unusual for insurers.

Perhaps information services can alert people sooner about impending blackouts—PG&E's communications were lousy this time around. Maybe risk management services can help line up backup so that the fish doesn't rot in the refrigerator and the wine harvest doesn't spoil. Businesses could find ways to help parents whose kids' schools have closed—Bring Your Child to Work During a Blackout, anyone? The answer for many could even just be a battery.

Whatever the answer, the idea is to think through the blackout issue from the standpoint of all those millions of individuals who have been affected and who will be affected by future blackouts. Then we can see what, in the name of providing peace of mind, the insurance industry can do to help, even if that means stretching beyond traditional boundaries. 

There won't be a Nobel Peace Prize in it for you, but there will likely be plenty of profit streams, and there will certainly be lots of grateful clients.

Cheers,

Paul Carroll
Editor-in-Chief


Paul Carroll

Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

How Climate Change Is Increasing Rates

More climate change-related claims means higher insurance costs and insurers becoming stricter about who even gets coverage.

A growing number of policymakers, advocates and experts predict that extreme weather may lead to higher costs for home and flood insurance. Some analysts are even predicting that the effects of climate change may make home insurance and flood insurance unaffordable for many Americans. Home insurance companies charge higher premiums to cover property associated with higher risks. Added insurance costs could lead to lower home prices.

"As insurance rates rise commensurate with increasing risk related to weather hazards, and property taxes rise to cover the costs of climate mitigation and adaptation, real estate values for properties in vulnerable areas will fall," predicts Donna Childs, author of the book "Prepare for the Worst, Plan for the Best: Disaster Preparedness and Recovery for Small Businesses." "The insurance premiums and property taxes for these properties become higher," Childs said.

Daren Blomquist, senior vice president at ATTOM Data Solutions, observes that natural disaster risks have affected home prices. Blomquist notes that home price appreciation in cities with the highest flood risk was half that for the U.S. housing market overall during the past decade. It's been one-third that for cities with the highest hurricane surge risk. "The broader market has also outperformed appreciation in cities with the highest wildfire risk during the last decade, although the gap is much narrower," Blomquist said.

Climate change top insurer issue

Many insurance experts consider climate change one of the industry's most pressing issues. That concern may lead to higher insurance costs for homeowners. The riskier the property, the more an insurer charges. The result is that more climate change-related claims mean:
  • Higher insurance costs
  • Insurers becoming stricter about who even gets coverage
"It could limit coverage availability in vulnerable areas that have not taken appropriate mitigation/adaptation measures," warns Childs, founder and CEO of Prisere, a software developer providing technical assistance and training for climate and disaster resilience. Todd Teta, chief product officer at ATTOM Data Solutions, was recently affected personally when an insurer rejected him for a homeowners policy in California. The reason: wildfire risk. Teta said the community suffered a small fire three years earlier, but no structures were destroyed. However, the insurer was still concerned about potential risk. "Insurance companies are outright rejecting entire ZIP codes because of wildfire risk, even in areas they previously wrote policies in,” Teta laments. See also: Role of Big Data in Fighting Climate Risk   Wildfires more common One likely byproduct of climate change is more forest fires. The Center for Climate and Energy Solutions said research shows that climate change, particularly earlier snow melt, leads to hot, dry conditions and more fires in the summer. The U.S. Department of Forest Service forecast that an average annual one-degree Celsius increase would increase the median burned area by as much as 600% in some forests. There were 58,083 wildfires in the U.S. in 2018 and 71,499 wildfires in 2017, according to the National Interagency Fire Center. Roughly 8.8 million acres burned in 2018, compared with 10 million in 2017. The Insurance Information Institute (III) estimates that insured losses from the 2018 Butte County “Camp Fire” will ultimately reach between $8.5 billion and $10.5 billion. Home insurance typically covers wildfire damage. However, if your area is prone to forest fires that spread to homes, your insurer may exclude covering that damage. Look through the exclusion section in your home insurance policy, so you know if wildfire damage is excluded from your coverage. Flooding claims About 90% of natural disasters in the U.S. 
are tied to flooding, according to the Federal Emergency Management Agency (FEMA). There is a lack of consensus on whether climate change is leading to more flooding. The Natural Resources Defense Council (NRDC) recently said that it’s tricky to connect the effects to flooding. However, the Intergovernmental Panel on Climate Change noted in its special report on extremes that it's becoming clearer that climate change "has detectably influenced several of the water-related variables that contribute to floods, such as rainfall and snow melt." Flooding complicates things when it comes to insurance. Home insurance doesn’t usually cover flood damage. Instead, you need a separate insurance policy for flooding that comes from outside your home. FEMA’s National Flood Insurance Program (NFIP) administers flood insurance. Federal flood insurance is available "where the local government has adopted adequate floodplain management regulations under the NFIP -- and many communities participate in the program." Avoiding coastal areas and flood zones won’t necessarily protect you from flooding. III indicates that 20% of flood claims come from areas with low to moderate flood risk. "Recovering from just one inch of water inside your building can cost about $27,000," Janet Ruiz of the III explains. Insurers are bracing themselves for more flooding claims in the coming years. More flooding claims will result in higher rates and can even affect home purchase prices. Those who own homes in higher-risk areas are seeing their values increase at a lower rate than the national average. Here's how flooding claims have increased in recent years. Despite the increase in claims and average flood claim amounts, flood insurance policies are purchased less frequently today than they were a decade ago. In 2009, insurance companies sold 5.7 million flood insurance policies. In 2017, the number dipped to slightly more than 5 million. 
Tornadoes, hurricanes and climate change

The Center for Climate and Energy Solutions says some areas, such as the North Atlantic, have seen more hurricanes over the past three decades. Scientists predict Category 4 and 5 hurricanes will increase in the coming years, though the overall number of hurricanes may decrease. "Although scientists are uncertain whether climate change will lead to an increase in the number of hurricanes, warmer ocean temperatures and higher sea levels are expected to intensify their impacts," according to the center.

States prone to hurricanes feature hurricane deductibles. If your home gets damaged in a hurricane, you'll have to pay a hurricane deductible after filing a claim. These deductibles are different from regular home insurance deductibles. Depending on an area's risk, hurricane deductibles are based on a percentage of a home's insured value, usually between 2% and 5%, though Florida allows insurers to charge up to 10%. Whether your home policy covers you for hurricane damage depends on the fine print. You may need a windstorm rider to cover hurricane damage, such as lost siding, lost shingles or shattered windows.

Combating climate change and rate hikes

Childs said taking preventive actions can lower risks. "For example, when I purchased my home, the land on the western side slopes downward at a 30-degree angle, and the basement windows are flush with the ground, with the result that water would come downhill, creating the risk of water intrusion into the basement," Childs said.

See also: Parametric Solution for Wildfire Risk

Childs trenched this area and installed a perforated pipe that connects to the sewer system. She also made a significant energy retrofit that reduced her utility bills by 40% and protects against the risk of extreme heat.
Childs said home buyers should factor in climate risks when purchasing a home, including figuring out whether to buy flood insurance, even if you’re not in a high-risk area. When buying a home:
  • Shop around for insurance and know what you’re buying. If you need additional coverage, ask the insurer about riders and other coverage.
  • Take precautions to protect your home. If you’re building a new home, talk to the builder about the materials being used. If you live along the coast, check on storm shutters. Explore fire-suppression systems. All of these additions could lead to lower rates and even home insurance discounts.
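Part of knowing what you're buying is knowing what you'd owe out of pocket. The hurricane-deductible math described earlier is simple percentage arithmetic; the home value and percentage below are invented examples:

```python
def hurricane_deductible(insured_value, percent):
    """A hurricane deductible is a percentage of the home's insured
    value, commonly 2%-5%, with Florida allowing insurers to charge
    up to 10%. Figures passed in here are illustrative."""
    if not 0 < percent <= 10:
        raise ValueError("percent outside the range discussed (0-10)")
    return insured_value * percent / 100

# A $300,000 home with a 2% hurricane deductible means paying the
# first $6,000 of hurricane damage out of pocket:
deductible = hurricane_deductible(300_000, 2)  # 6000.0
```

At Florida's 10% cap, the same home would carry a $30,000 deductible, which is why the deductible percentage deserves as much attention as the premium when shopping around.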
You can’t completely safeguard against climate change-related weather damage. But it’s wise to take precautions and know how you’re covered to minimize later problems. You can find the original article published here on Insure.com.