Solution for Biggest Cyber Risk Is Emerging

New payments technologies will remove the risk of identity theft at the point of sale, the focus of most cyber policies.

As the demand for cyber insurance has skyrocketed, so, too, has the cost. One broker estimates that sales in 2014 will double from the $1 billion in premium collected in 2013. Much of the increase in demand and cost has been a result of the widely publicized hacks of the point-of-sale systems at large retailers, and the primary emphasis of most cyber policies is to address liability arising from such events.

New payment technologies, however, will change the need for this type of cyber insurance. American Express recently announced a token service; Apple incorporated Apple Pay into its new iPhones; and a group of retailers, the Merchant Customer Exchange, is working on the release of a new payment technology, as well. These technologies, although different in detail, eliminate the need for merchants to collect unencrypted payment card information from customers, significantly reducing the risk created by point-of-sale malware.

These technologies work by generating tokens or cryptograms for use at the point of sale. Financial institutions are able to determine whether the tokens or cryptograms are associated with a customer's account, even though it is virtually impossible for a third party possessing the token or cryptogram alone to identify the account. The specifics of the technologies vary, but the result is that the merchant does not need access to the customer's unencrypted account information, and any data obtained through point-of-sale malware becomes virtually worthless.

As these payment technologies become prevalent in the U.S., the need for cyber insurance protecting retailers against point-of-sale malware should plunge. There still will be a need for coverages protecting against other cyber risks, including other forms of malware and security breaches, as well as against business interruptions arising from cyber events.
However, the need and demand for cyber insurance covering privacy breaches should be reduced and the pressure on much of the current cyber insurance market removed. This article first appeared on the Privacy and Information Security Law Blog.

Lon Berk

Lon Berk’s practice focuses on counseling and assisting clients with complex insurance recoveries. Lon assists clients in resolving insurance disputes relating to mass torts, catastrophic events and cyber security issues. He advises clients on liabilities arising out of emerging technologies, including issues concerning Internet security, and provides advice regarding insurance covering such exposures.

Document, Document, Document, Document

In handling workers' comp claims, the first rule is, essentially, "If it ain't written down, it didn't happen."

When I was a fledgling claims examiner, I was taught the universal file mantra: res ipsa loquitur ("the thing speaks for itself"). In claim terms, it means the file should tell the story. Not a difficult concept if you think about it. I suppose a less classy version would be, “If it ain’t written down, it didn’t happen.” But res ipsa loquitur is more than just jotting down crib notes in a file. It’s about documenting the direction in which you are headed and why. A second, equally important mantra came from a former claim manager, who said, "If you were hit by a bus, someone should be able to pick up your file and know exactly where you left off.” He went on to say that could only be achieved through clear, copious and comprehensive documentation.

As a senior work comp claims consultant, I often dinged examiners for poorly documented files and a dearth of memorialized thought processes. Occasionally, their response was akin to, “Well, obviously you don’t know Alaska comp!” (or whatever state I was auditing). While I realize they were just being defensive, they missed the point entirely. I wasn’t criticizing them for something statute-related; I was simply telling them they failed the res ipsa loquitur test, which is the most critical common denominator in all the various best-practice file criteria available. An employer always has the right to know how and why a certain decision was made. Pronouncements made by insurance companies and third-party administrators (TPAs) – which obviously affect the employer – should never be made in a vacuum.

In my claim manager days, I came up with a whimsical way for my staff to visualize the documentation process. I called it HIPPO:
  • History: What happened? Who/what/where/how?
  • Issues: How serious was the injury? Is there time-loss involved?
  • Plan of Action: What is the examiner’s initial plan? Is return-to-work possible?
  • Procedure: How will the examiner begin the compensability process?
  • Outcome: What is the examiner’s best guess at this point?
Obviously, HIPPO would need to be revisited from time to time as the file matures and becomes more complicated. And, of course, the compensability aspect of the claim would be much different than a plan of action for continuing medical management. But you get the drift. And, yes, I found little plastic hippopotamuses that I placed on their desks to help them remember! For the younger generation, who seem compelled to use social media to document what they had for breakfast, perhaps this concept will come easier. Then again, documenting what you ate is a far cry from explaining why you chose that particular food item; why it was the best choice over all others; why it made financial sense; and what gastronomical ramifications might ensue because of the choice. Bon appétit.
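For readers who think in code, the HIPPO checklist maps naturally onto a small record type. This is a hypothetical sketch, not an industry-standard schema; the field names simply mirror the mnemonic, and the sample claim facts are invented.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class HippoEntry:
    """One HIPPO-style file note. Illustrative structure only."""
    history: str          # What happened? Who/what/where/how?
    issues: str           # How serious? Is time-loss involved?
    plan_of_action: str   # Examiner's initial plan; return-to-work?
    procedure: str        # How the compensability process will begin
    outcome: str          # Examiner's best guess at this point
    noted_on: date = field(default_factory=date.today)

    def summary(self) -> str:
        # The "hit by a bus" test: the note should stand on its own.
        return "\n".join(
            f"{label}: {text}" for label, text in [
                ("History", self.history),
                ("Issues", self.issues),
                ("Plan", self.plan_of_action),
                ("Procedure", self.procedure),
                ("Outcome", self.outcome),
            ]
        )

# A hypothetical claim note:
note = HippoEntry(
    history="Forklift struck racking; strain to lower back.",
    issues="Time-loss likely; MRI pending.",
    plan_of_action="Modified duty offered; follow up in 7 days.",
    procedure="Compensability review within statutory window.",
    outcome="Expect acceptance; monitor for surgery referral.",
)
```

A file built from entries like this passes the res ipsa loquitur test by construction: any successor examiner can read `note.summary()` and know exactly where the file left off.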

Daniel Holden

Dan Holden is the manager of corporate risk and insurance for Daimler Trucks North America (formerly Freightliner), a multinational truck manufacturer with total annual revenue of $15 billion. Holden has been in the insurance field for more than 30 years.

Little-Known Loophole Inflates Health Costs

The federal 340B program, designed to help the poor afford drugs, is being exploited by rich hospitals and driving up premiums for all.

The rising cost of insurance is putting a squeeze on American families. And this problem could get even worse if lawmakers don't fix a little-known federal drug program called "340B." Created by Congress in 1992, 340B was originally intended to provide low-income people access to needed medications. This program allows hospitals, clinics and other healthcare providers serving large numbers of poor and uninsured patients to buy drugs at a deep discount. The idea was that these facilities would pass along those savings to their patients. But 340B is not working as intended. Instead, it's being manipulated by hospital systems to increase profits. It isn't helping the poor. And this exploitation is driving up health insurance costs for all Americans.

Price Disparity

The program's major flaw is that it doesn't actually require healthcare providers to pass along those drug discounts to low-income patients. Participating facilities are free to buy huge volumes of cheap medicines and then sell them at full price to insured patients -- and pocket the difference. That's exactly what many participants are doing. Duke University Hospital has accumulated $280 million in profits from 340B over the last five years. The drug chain Walgreens is projected to make a quarter of a billion dollars off the program over the next half decade. Established hospital systems have increased their revenue from 340B by buying up specialty clinics. These smaller practices often use a high volume of expensive drugs. By acquiring these clinics, hospitals can purchase even more discounted medicines through 340B and further boost profits. In 2012, hospitals enrolled more clinics in 340B than in the previous 20 years combined. A new University of Chicago study shows that most of these clinics are located in relatively affluent areas. In other words, they aren't even pretending to serve the low-income and uninsured populations 340B was intended to help.
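The "buy at a discount, sell at full price, pocket the difference" arithmetic is simple enough to write down. The function and all the numbers below are hypothetical, chosen only to illustrate the incentive; actual 340B ceiling prices and reimbursement rates vary by drug and payer.

```python
def spread_per_unit(list_price: float, discount_rate: float,
                    reimbursement: float) -> float:
    """Profit a 340B-covered entity keeps per unit when it buys at
    the discounted price but is reimbursed at (or near) full price.
    Purely illustrative; real 340B pricing is drug-specific."""
    acquisition_cost = list_price * (1 - discount_rate)
    return reimbursement - acquisition_cost

# Hypothetical oncology drug: $1,000 list price, 50% 340B discount,
# reimbursed by the insurer at the full list price.
margin = spread_per_unit(1000.0, 0.50, 1000.0)
# The covered entity keeps $500 per unit, with no obligation to
# pass any of the discount on to the patient.
```

Multiply a spread like this across the high drug volumes of an acquired specialty clinic and the profit figures cited above become easy to understand.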
Unfortunately, lawmakers have not responded to these abuses by fixing 340B's structural flaw. Instead, they've blindly expanded the program. Back in the early '90s, just 90 health care facilities participated in 340B. Today, that figure is more than 2,000. The acquisition of smaller clinics, precipitated by 340B, will seriously drive up insurance costs for average Americans. Large, established health providers tend to charge more than smaller, independent clinics. And insurance companies respond to these higher treatment expenses by raising premiums. Indeed, a study from three Duke University researchers published in the October issue of the journal Health Affairs looked into the price disparity between key cancer drugs provided at both corporate hospitals and clinics. Researchers noted that, between 2005 and 2011, the proportion of cancer services administered at independent clinics dropped by 90%. They found that the price gap between the two settings can be as much as 50%. Pharmaceutical manufacturers are now incurring heavy losses from 340B abuse. In 2010, this program cost the industry $6 billion. By 2016, that's expected to more than double, to $13 billion. Simple economics forces firms to compensate for losses by raising their prices, leading to higher medical expenses for average patients.

Noble Purpose

340B has a noble purpose. But it's not fulfilling its mission to provide vulnerable patients with discounted drugs. Instead, 340B is being exploited by rich hospitals to boost their bottom lines. And these abuses are leading to higher insurance costs for everyone else.

Sally Pipes

Sally C. Pipes is president and chief executive officer of the Pacific Research Institute, a San Francisco-based think tank founded in 1979. In November 2010, she was named the Taube Fellow in Health Care Studies. Prior to becoming president of PRI in 1991, she was assistant director of the Fraser Institute, based in Vancouver, Canada.

How to Avoid Paying for Hospitals' Errors

An employer pays an average of $8,000 for every error by a hospital -- but it doesn't have to be that way.

There’s been a lot of talk lately about value-based purchasing and price transparency in the U.S. healthcare system. With the proliferation of high-deductible health plans, consumers and payers are now actively chasing “value”—high-quality care at the right price. But what happens when “value” calculates to a grand total of zero—or even less than zero? Only in healthcare is that even possible. “Zero value” occurs when healthcare is harmful—and you, the patient or purchaser, pay extra for the privilege of that harm. This is the issue currently facing employers and other purchasers paying out of their own pockets when a hospital commits an error that results in injury, infection or other harm to a patient. It’s backwards and incomprehensible, but healthcare purchasers are at the mercy of these zero-value “hidden surcharges.” The payer gets the bill for the added length of stay and treatment of the infection or the medication error, even if they were preventable. This is common, and it’s not cheap.

The Leapfrog Group created the Hidden Surcharge Calculator, which estimates that, on average, an employer pays approximately $8,000 per hospital admission for errors, injuries, accidents and infections. The calculator was recently awarded a “Certificate of Validation Seal” by the Care Innovations Validation Institute, an organization established by Intel and GE to rate healthcare tools, plans and vendors to help industry consumers make educated choices. The Hidden Surcharge Calculator is free and allows plans and employers to determine the surcharges they pay for their covered lives. To build on the findings from the calculator, Leapfrog crafted additional tools to help purchasers apply their leverage with hospitals in their communities, communicate effectively with their employees about patient safety and try to reduce some of these shocking surcharges.
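A back-of-envelope version of that estimate is straightforward. Note the hedge: Leapfrog's actual calculator is more sophisticated (it adjusts for factors such as hospital safety performance and case mix); the function below only multiplies admissions by the article's $8,000-per-admission average, and the admission count is hypothetical.

```python
def hidden_surcharge_estimate(admissions: int,
                              avg_surcharge: float = 8000.0) -> float:
    """Rough annual 'hidden surcharge' a purchaser pays for hospital
    errors, using the $8,000-per-admission average cited above.
    A simplification of the Leapfrog calculator, which applies
    additional adjustments."""
    return admissions * avg_surcharge

# A hypothetical mid-sized employer plan with 250 admissions a year:
estimate = hidden_surcharge_estimate(250)  # $2,000,000 per year
```

Even at this crude level, the exercise shows why a purchaser with a few hundred admissions a year has millions of dollars riding on hospital safety.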
So we launched the Hospital Safety Score Purchaser Toolkit, also free, created with the support of a grant from the Robert Wood Johnson Foundation. The toolkit is being released at a crucial time of year—the beginning of open enrollment season. We know that employers want to help their employees make the best decisions about their healthcare, and we hope that our toolkit will foster genuine conversations on these issues. We include downloadable “plug-and-play” communications, including newsletter articles, internal memos, template emails and even sample tweets. Messages educate employees about the problem of patient safety and what they can do to protect themselves and their families. The toolkit provides background and instructions for using the Hospital Safety Score, letter grades that assess the safety of general hospitals. There’s also a series of whiteboard videos that explain the issues in plain language and can be downloaded at no cost.

Just as importantly, we want to encourage purchasers to use their own leverage to effect change. Despite the harm to employees and expense to the bottom line, patient safety is rarely observable in claims data. Purchasers have to rely on hospitals to voluntarily report on safety to the Leapfrog Hospital Survey. By putting pressure on hospitals to publicly report to Leapfrog, healthcare purchasers can ensure that transparency and accountability are at the top of every hospital’s agenda. The toolkit offers suggestions on joining local business coalitions on health to maximize regional leverage, communicating with hospitals and getting needed provisions in contract language with plans. Value-based purchasing is nonsensical when value is less than zero, so plans and purchasers need to be more aggressive on patient safety. Otherwise, payment reform loses its raison d’être. Because the safety problem is so large and hard to pinpoint, many payers just give up.
The Purchaser Toolkit, Hospital Safety Score and Surcharge Calculator are meant to provide them with concrete steps that will make a difference immediately.

Modernization: The Key Role for HR

Modernization is not just about changes to processes and technology. There also are potentially profound organizational and talent-related changes.

Insurance modernization results in core business and administrative functions using commonly trusted sources of data to inform and enhance business decisions and reporting. Benefits of this transformation include greater efficiencies, improved decision-making and better risk management. However, modernization is not just about changes to processes and technology. There also are potentially profound organizational and talent-related changes. Given the broad impact insurance modernization will have throughout an organization, HR has a key role during the journey.

The case for change

Insurance modernization is not simply another transformation program that affects select capabilities within specific organizational functions. Rather, the changes that insurance modernization brings are far more broad and systemic. HR executives, in particular, understand how modernization reshapes their organization’s talent agenda, influences priorities and changes how HR invests in services that enable the transformation. Many insurers already face critical skill shortages, particularly in the areas where insurance modernization makes the greatest impact: risk, actuarial, financial and technical competencies. Modernization efforts also hinge on an organization’s ability to attract and retain people who have the skills to drive transformational change. Combined with an aging workforce, the case for adjusting talent strategy is clear. Modernization also requires stronger links between performance management and critical regulatory and risk management objectives. Changes in organizational structure, governance and decision rights may be necessary to strengthen accountability for modernization goals and objectives.

Characteristics of a modernized HR function

In the journey toward a modernized organization, HR leaders are uniquely positioned to drive workforce changes for positive impact. Leading HR functions draw on their capabilities and services to provide support by:
  • Driving organization structure, governance and decision rights changes to strengthen accountability for modernization goals and objectives. HR will help manage changes to the work processes and systems that drive a substantial redistribution of work effort by redeveloping job families or job design and making logical and lasting changes to interaction models.
  • Shifting key talent management strategies – ranging from recruitment to career development to succession planning – to address gaps in critical actuarial, financial and technical capabilities. The modernized organization will require talent with new and enhanced skills. In some roles, for example, employees will devote less time to activities such as data scrubbing and more time to analysis and consultation.
  • Motivating broader cultural change through employee engagement initiatives that promote modernization vision and goals. Leading HR organizations will leverage insurance modernization as a key growth opportunity for both the organization and for high performers who demonstrate skills development and leadership capabilities.
As a key partner in the transformation, HR leaders sit alongside business leaders to plan for critical modernization initiatives and develop strategies to drive desired outcomes. At the same time, other important HR capabilities, such as performance management and goal-setting processes, will be critical to driving organization-wide performance outcomes.

The benefits

As a steward of organizational capabilities, the leading practice HR function is a key supporter of insurance modernization. The main challenge for the HR function is to assess organizational needs during the transformation while focusing on broader organizational objectives at the same time. Engaging HR early in the transformational journey sets the foundation for realizing objectives:
  • Defining program impacts to both people and the organization and developing a holistic plan for protecting and expanding the organization’s brain trust and key capabilities.
  • Directing organizational development, including organizational redesign and job analysis activities, by redirecting limited resources toward critical needs and forecasting and managing HR capabilities and capacity in concert with demands.
  • Reshaping talent strategy for risk, actuarial and finance roles, as well as select IT roles, to address shifting skill profiles and talent requirements.
  • Designing a change management plan from an HR perspective that aligns culture-related change management and communications activities to promote organizational focus on modernization’s most critical objectives.
From an HR perspective, insurance modernization enables the institutionalization of key capabilities, rather than the loss of key assets as talent moves within and outside the organization. Modernized IT services and platforms enable both performance and scalability by facilitating more efficient use of talent.

Critical success factors

The degree of HR’s impact on modernization hinges on the degree to which it can partner with key business leaders. HR can provide critical support when it is engaged early on and acts as counsel to risk, actuarial and finance leaders on the people and organizational impacts stemming from modernization. The following factors apply:
  • Early involvement to design appropriate governance, oversight and decision-making processes, as well as evaluate the impacts of technology and process changes that can drive changes to talent, structure and performance requirements.
  • Sustained participation in decision making to enable thorough discussion of broader organizational- and talent-related impacts, as well as redirection of activities as needed to align decisions to broader organizational goals, drive accountability and protect against critical talent loss.
  • Active engagement with talent at all levels of the organization to understand the impacts of the transformation on roles, employee satisfaction and engagement within the organization, as well as the talent risks that may need to be managed.
HR also often has a cross-functional view of impacts and can identify key talent from other parts of the organization to lead transformation efforts. With an eye on the future needs of the organization, HR can provide advice to keep leaders on the same page and help provide solutions to issues that arise along the modernization journey.

Next steps

To be successful, transformation requires discipline and continued leadership commitment to achieve goals. HR is well-positioned to help drive progress and momentum toward achieving higher levels of organizational maturity. HR also can help to keep leaders connected, as well as monitor and maintain alignment between organizational and individual goals. It also provides a people-focused perspective to modernization objectives and initiatives and works with business leaders to protect the organization’s talent assets.

Elaine Miller

Elaine Miller is a managing director in New York and a leader of PwC’s financial services advisory people and change practice in the U.S. Miller has more than 25 years of management and consulting experience leading numerous projects to help clients design and implement strategic programs to build organizational capabilities and improve business performance.

A Wakeup Call for Benefits Brokers

The Aetna acquisition of bswift shows that the rules of the game are changing -- and you don't get to make the rules.

More news from the technology front: Aetna acquires bswift, shortly after Hodges-Mace announced the purchase of SmartBen. Last year, it was Towers Watson buying Liazon. Next year, it will be someone else. Is this just the beginning of the dance where everyone in employee benefits needs to choose a partner? What does this mean for the benefits market and the benefits broker?

For some, the Aetna acquisition of bswift may be strange. Aetna buys a company that provides technology that is used by its competitors and that handles enrollment for many employers that don’t have Aetna insurance. Similarly, Towers Watson bought a company whose products and services are distributed by its competitors, other brokers. What most people aren’t realizing is that the world has changed. If you view this acquisition in the old world, where competitors don’t work together, you may see it one way, but in a new world it may look a little different -- in many industries, companies that compete in one segment may be partners in another. My message to brokers on this is to start thinking differently. Those who don’t will get left behind. The rules of the game are changing, and you don’t get to make all the rules. I have been fortunate to have worked in some capacity with Mark Bertolini, CEO of Aetna, and Rich Gallun, CEO of bswift. Both are outside-the-box thinkers. Aetna has invested billions in technology preparing for what it views as a consumer-centric healthcare model. Aetna wants to reinvent the patient experience. To quote Bertolini, "We're going to begin to change the healthcare industry by giving people tools they can put in the palm of their hand.” Here is another quote from Bertolini that would make brokers pause. When asked about the future of healthcare, Bertolini responded: “There wouldn't be plan designs. You wouldn't need them. What you would do is invest in all those things that are necessary to keep people healthy.” You can see a full overview of the Aetna model by viewing this presentation from its 2013 investor conference. Some may see the bswift acquisition as a benefits enrollment platform for Aetna. But I see this as another step by Aetna to execute on a plan to compete effectively in a new healthcare world. A world where consumers are in more control.
Where provider systems are engaged in a patient's wellness and not just providing treatment after the fact. Where health information and communication are moved via Web and mobile. Bswift made a strategic move into the consumer-centric world through private exchange technology, with individual rating and decision support tools. Now it has paid off. This made bswift attractive to Aetna. Congratulations to bswift for a job well done.

So what does this mean for benefits brokers? A few weeks ago, I wrote an article titled “Does Apple’s HealthKit signal the end of employer-based insurance?” Some may not relate Apple’s investment to the Aetna acquisition of bswift; however, I think they are related. Apple is clearly one of the top consumer technology vendors in the market. Aetna is driving consumer-centric healthcare. They are pieces of the same puzzle. It is a puzzle benefits brokers need to pay attention to because the market is changing around them. A carrier buying an enrollment vendor says one thing; Aetna’s and Apple’s investments mean something different. The healthcare world is changing in a way that most brokers are not recognizing. Consumer-centric; mobile; doctors as wellness facilitators; employers out of the risk business? Maybe. So get ready.

Joe Markland

Joe Markland is president and founder of HR Technology Advisors (HRT). HRT consults with benefits brokers and their customers on how to leverage technology to simplify HR and benefits administration.

Innovation in Insurance Begins to Refocus

Study finds more focus on core systems, as the foundation for innovation, and use of perspectives from outside the industry.

With today’s fast pace of change, innovation is no longer a nice-to-have initiative, but rather a must-have, strategic mandate that is defining a new era for insurance – and separating future winners and losers. Today, it is not any one thing that is creating change, but the convergence of many things that are creating a seismic shift.

Strategy Meets Action (SMA) has actively tracked and promoted innovation in the marketplace for several years and has been publishing formal innovation research since 2012. In our latest report, Innovation in Insurance: Expanding Focus and Growing Momentum, we see continued progress, but with a refocus. SMA believes this reflects the realization that modernization of core systems is a foundational requirement for innovation.

At the same time, insurers’ innovation approaches and efforts are broadening. Insurers are getting outside-in views, engaging in open innovation and developing an ecosystem of outside resources to fuel the innovation journey. This move reflects a best practice from outside the industry: acknowledging that no business can expect to harness the future and all its conceivable possibilities on its own. Within their ecosystems, insurers are primarily engaging with agents, business partners, software partners, customers, other insurers and a supply chain as catalysts for innovation. However, more outside-in relationships with high tech, other industries, futurists, venture capital firms and academia are beginning to take shape, as well.

The ecosystem is gaining importance because leading insurers recognize that day-to-day operational demands leave too little time and too few resources for tracking and assessing outside developments and putting their implications for insurance into context. Also, the whole network benefits from the integration of new thinking, as the input of the outside organizations helps to break down legacy assumptions. The expanding focus and growing momentum for innovation are reflected in some key survey results, including:
  • More than a fourth of insurers (26%) have focused on innovation for five years or more, and 33% have focused on it for two to five years. That puts 59% of insurers focused on innovation for at least the last two years, highlighting the growing momentum. A further 32% have made innovation a focus for two years or less.
  • Innovation leadership and organizational approaches take many different forms. Only 7% of insurers have a dedicated innovation area. More than half of insurers (51%) have no single area of the organization leading innovation. Nearly 28% of insurers have their strategy or R&D leadership/areas lead innovation. SMA believes this reflects the resurgence of strategy and R&D to provide an enterprisewide approach for innovation, maximizing the strategic impact and value of innovation initiatives to the organization’s Next-Gen Insurer vision and strategy.
  • Encouragingly, more insurers believe their investments are positioning them well ahead on the innovation journey as market leaders (9%) and movers (33%) as contrasted with those that are at the early stages of the journey as mainstreamers (22%) and those at the very beginning stages or not focused on innovation as laggards (9%). SMA believes this reflects the broadening focus of continued implementation of modern core insurance systems and innovation.
  • The top four industries influencing insurance in the next year are: healthcare (46%), with the potential influence of the healthcare insurance exchanges; high tech (45%), with the potential of Google, Amazon and Apple entering or disrupting insurance; telecom (32%), with the race for the customer’s connectivity, data and services; and government (32%), with some states aggressively piloting new technologies such as driverless vehicles.
  • The focus and business drivers for innovation are changing, reflecting the shifting landscape of influencers, threats and competitors for insurance. Enabling growth (42%) and profitability (30%) moved into the top spots, up from second and fourth in 2013. But a bigger shift has also emerged, reflecting the demands of the digital revolution and outside industry influencers. In 2013, improving existing products and providing great service were in the top six. In 2014, there is a shift in focus to developing new products and engaging and strengthening customer relationships, which is directly related to meeting the new expectations of customers.
A new future is rapidly unfolding, and the pressure is on. Innovation can never cease. It must advance with urgency. Each and every day, insurers must recommit to their innovation journey and the culture they have created for it – and avoid falling into an operational trap. As Charles Darwin said: “It is not the strongest of the species that survives, nor the most intelligent that survives. It is the one that is the most adaptable to change.” Innovation will be a journey of great disruption, great opportunity and great change. Have you started your innovation journey?

Denise Garth

Denise Garth is senior vice president, strategic marketing, responsible for leading marketing, industry relations and innovation in support of Majesco's client-centric strategy.

Riding Out the Storm: the New Models

This article is the second in a series on how the evolution of catastrophe models provides a foundation for much-needed innovation in insurance.

In our last article, When Nature Calls, we looked back at an insurance industry reeling from several consecutive natural catastrophes that generated combined insured losses exceeding $30 billion. Those massive losses were a direct result of an industry overconfident in its ability to gauge the frequency and severity of catastrophic events. Insurers were using only history and their limited experience as their guide, resulting in a tragic loss of years’ worth of policyholder surplus.

The turmoil of this period cannot be overstated. Many insurers became insolvent, and those that survived needed substantial capital infusions to continue functioning. Property owners in many states were left with no affordable options for adequate coverage and, in many cases, were forced to go without any coverage at all. The property markets seized up. Without the ability to properly estimate how catastrophic events would affect insured properties, it looked as though the market would remain broken indefinitely.

Luckily, in the mid-1980s, two people on different sides of the country were already working on solutions to this daunting problem. Both had asked themselves: If the problem is lack of data because of the rarity of recorded historical catastrophic events, then could we plug the historical data available now, along with mechanisms for how catastrophic events behave, into a computer and then extrapolate the full picture of the historical data needed? Could we then take that data and create a catalog of millions of simulated events occurring over thousands of years and use it to tell us where and how often we can expect events to occur, as well as how severe they could be? The answer was unequivocally yes, but with caveats.

In 1987, Karen Clark, a former insurance executive out of Boston, formed Applied Insurance Research (now AIR Worldwide).
She spent much of the 1980s with a team of researchers and programmers designing a system that could estimate where hurricanes would strike the coastal U.S., how often they would strike and ultimately, based on input insurance policy terms and conditions, how much loss an insurer could expect from those events. Simultaneously, on the West Coast at Stanford University, Hemant Shah was completing his graduate degree in engineering and attempting to answer those same questions, only he was focusing on the effects of earthquakes occurring around Los Angeles and San Francisco. In 1988, Clark released the first commercially available catastrophe model for U.S. hurricanes. Shah released his earthquake model a year later through his company, Risk Management Solutions (RMS). Their models were incredibly slow, limited and, according to many insurers, unnecessary. However, for the first time, loss estimates were being calculated based on actual scientific data of the day along with extrapolated probability and statistics in place of the extremely limited historical data previously used. These new “modeled” loss estimates were not in line with what insurers were used to seeing and certainly could not be justified based on historical record. Clark’s model generated hurricane storm losses in the tens of billions of dollars while, up until that point, the largest insured loss ever recorded did not even reach $1 billion! Insurers scoffed at the comparison. But all of that quickly changed in August 1992, when Hurricane Andrew struck southern Florida. Using her hurricane model, Clark estimated that insured losses from Andrew might exceed $13 billion. Even in the face of heavy industry doubt, Clark published her prediction. She was immediately derided and questioned by her peers, the press and virtually everyone around. They said her estimates were unprecedented and far too high. 
In the end, though, when it turned out that actual losses, as recorded by Property Claims Services, exceeded $15 billion, a virtual catastrophe model feeding frenzy began. Insurers quickly changed their tune and began asking AIR and RMS for model demonstrations. The property insurance market would never be the same. So what exactly are these revolutionary models, which are now affectionately referred to as “cat models”? Regardless of the model vendor, every cat model uses the same three components:
  1. Event Catalog – A catalog of hypothetical stochastic (randomized) events, which informs the modeler about the frequency and severity of catastrophic events. The events contained in the catalog are based on millions of years of computerized simulations using recorded historical data, scientific estimation and the physics of how these types of events are formed and behave. Additionally, for each of these events, associated hazard and local intensity data is available, which answers the questions: Where? How big? And how often?
  2. Damage Estimation – The models employ damage functions, which mathematically relate the local intensity of an event to the damage sustained by the buildings exposed to it, including both their structural and nonstructural components, as well as their contents. The damage functions have been developed by experts in wind and structural engineering and are based on published engineering research and engineering analyses. They have also been validated against the results of extensive damage surveys undertaken in the aftermath of catastrophic events and against billions of dollars of actual industry claims data.
  3. Financial Loss – The financial module calculates the final losses after applying all limits and deductibles on a damaged structure. These losses can be linked back to events with specific probabilities of occurrence. Now an insurer not only knows what it is exposed to, but also what its worst-case scenarios are and how frequently those may occur.
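The pipeline described above can be sketched in miniature. The following Python toy is purely illustrative: the event frequency, severity distribution, deductible and limit are made-up numbers, not parameters from any real vendor model, and the sampled severity stands in for the full hazard-plus-damage-estimation steps. It shows the shape of the idea: simulate many years of events, pass each event through the financial terms, then read off how often a given annual loss is exceeded.

```python
import random

random.seed(42)  # reproducible illustration

def simulate_catalog(n_years, annual_rate=0.6, mean_severity=2e9):
    """Toy event catalog: for each simulated year, draw a Poisson-distributed
    number of events (via exponential inter-arrival times) and an exponential
    ground-up loss for each event. All parameters are invented."""
    catalog = []
    for _ in range(n_years):
        t, events = random.expovariate(annual_rate), []
        while t < 1.0:  # arrivals within one simulated year
            events.append(random.expovariate(1.0 / mean_severity))
            t += random.expovariate(annual_rate)
        catalog.append(events)
    return catalog

def insured_loss(ground_up, deductible, limit):
    """Financial module: apply a per-event deductible and limit."""
    return min(max(ground_up - deductible, 0.0), limit)

def exceedance_probability(annual_losses, threshold):
    """Fraction of simulated years whose total insured loss exceeds threshold."""
    return sum(1 for loss in annual_losses if loss > threshold) / len(annual_losses)

catalog = simulate_catalog(n_years=10_000)
annual_losses = [
    sum(insured_loss(gu, deductible=50e6, limit=5e9) for gu in year)
    for year in catalog
]
# e.g., how often does this toy portfolio lose more than $1 billion in a year?
print(f"P(annual loss > $1B) ~ {exceedance_probability(annual_losses, 1e9):.3f}")
```

The last step is where the "worst-case scenarios and how frequently those may occur" framing comes from: sweeping the threshold traces out an exceedance-probability curve, and real models do the same thing over far richer catalogs and policy terms.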
When cat models first became commercially available, industry adoption was slow. It took Hurricane Andrew in 1992 followed by the Northridge earthquake in 1994 to literally and figuratively shake the industry out of its overconfidence. Reinsurers and large insurers were the first to use the models, mostly due to their vast exposure to loss and their ability to afford the high license fees. Over time, however, much of the industry followed suit. Insurers that were unable to afford the models (or who were skeptical of them) could get access to all the available major models via reinsurance brokers that, at that time, also began rolling out suites of analytic solutions around catastrophe model results. Today, the models are ubiquitous in the industry. Rating agencies require model output based on prescribed model parameters in their supplementary rating questionnaires to understand whether or not insurers can economically withstand certain levels of catastrophic loss. Reinsurers expect insurers to provide modeled loss output on their submissions when applying for reinsurance. The state of Florida has even set up a commission, the Florida Commission on Hurricane Loss Projection Methodology, which consists of “an independent body of experts created by the Florida Legislature in 1995 for the purpose of developing standards and reviewing hurricane loss models used in the development of residential property insurance rates and the calculation of probable maximum loss levels.” Models are available for tropical cyclones, extratropical cyclones, earthquakes, tornadoes, hail, coastal and inland flooding, tsunamis and even for pandemics and certain types of terrorist attacks. The first set of models started out as simulated catastrophes for U.S.-based perils, but now models exist globally for countries in Europe, Australia, Japan, China and South America. 
In an effort to get ahead of the potential impact of climate change, all leading model vendors even provide U.S. hurricane event catalogs, which simulate potential catastrophic scenarios under the assumption that the Atlantic Ocean sea-surface temperatures will be warmer on average. And with advancing technologies, open-source platforms are being developed, which will help scores of researchers working globally on catastrophes to become entrepreneurs by allowing “plug and play” use of their models. This is the virtual equivalent of a cat modeling app store. Catastrophe models have provided the insurance industry with an innovative solution to a major problem. Ironically, the solution itself is now an industry in its own right, as estimated revenues from model licenses now annually exceed $500 million (based on conversations with industry experts). But how have the models performed over time? Have they made a difference in the industry’s ability to help manage catastrophic loss? Those are not easy questions to answer, but we believe they have. All the chaos from Hurricane Andrew and the Northridge earthquake taught the industry some invaluable lessons. After the horrific 2004 and 2005 hurricane seasons, which ravaged Florida with four major hurricanes in a single year, followed by a year that saw two major hurricanes striking the Gulf Coast – one of them being Hurricane Katrina, the single most costly natural disaster in history – there were no ensuing major insurance company insolvencies. This was a profound success. The industry withstood a two-year period of major catastrophic losses. Clearly, something had changed. Cat models played a significant role in this transformation. The hurricane losses from 2004 and 2005 were large and painful, but did not come as a surprise. Using model results, the industry now had a framework to place those losses in proper context. 
In fact, each model vendor has many simulated hurricane events in its catalog that resemble Hurricane Katrina. Insurers knew, from the models, that Katrina could happen and were therefore prepared for that possible, albeit unlikely, outcome. However, with the universal use of cat models in property insurance come other issues. Are we misusing these tools? Are we becoming overly dependent on them? Are models being treated as a panacea for vexing business and scientific questions instead of as a simple framework for understanding potential loss? Next in this series, we will illustrate how modeling results are being used in the industry and how overconfidence in the models could, once again, lead to crisis.

James Rice

Profile picture for user JamesRice

James Rice

James Rice is senior business development director at Xuber, a provider of insurance software solutions serving 180+ brokers and carriers in nearly 50 countries worldwide. Rice brings more than 20 years of experience to the insurance technology, predictive analytics, BI, information services and business process management (BPM) sectors.


Nick Lamparelli

Profile picture for user NickLamparelli

Nick Lamparelli

Nick Lamparelli has been working in the insurance industry for nearly 20 years as an agent, broker and underwriter for firms including AIR Worldwide, Aon, Marsh and QBE. Simulation and modeling of natural catastrophes occupy most of his day-to-day thinking. Billions of dollars of properties exposed to catastrophe that were once uninsurable are now insured because of his novel approaches.

3 Reasons Why Risks Are Mismanaged

ERM practitioners must determine how much they can rely on what colleagues in other functions or units say about a situation and its risks.

ERM can bring great benefits. By managing risk, it helps to minimize loss as well as maximize strategic profitability, optimize opportunities and enhance culture and reputation. Thus, when a loss occurs in a company that has been practicing ERM, the reaction is to be disappointed in ERM as a practice or to blame the ERM leader for faulty execution. It would be unwise, however, to react without further analysis and greater understanding. No process or person is perfect; there will be times when ERM may fail to live up to expectations. Even with excellent execution, there will be times when a risk is too opaque or too complicated to be identified or managed effectively. All ERM practitioners must determine how much they can rely on what their colleagues in other functions or business units say about a business situation and how much risk it holds, because there are at least three circumstances that might cause a business leader or “expert” to overlook or underestimate a risk. They are:
  1. Reluctance to expose or report a risk, for whatever reason.
  2. Lack of sufficient expertise or experience to recognize a risk or determine its size.
  3. Reliance on imprecise or inadequate standards and models that fail to signal a risk.
Reluctance  It is really not such a mystery why a business leader or staff member might be reluctant to identify a risk. Among the reasons are:
  • Fear of being labeled a naysayer,
  • Fear of derailing an initiative that has favor in the C-Suite, thus becoming persona non grata,
  • Concern that identifying a risk might hurt personal compensation, at least in the short term.
An environment that is rife with these cultural stimuli will never produce transparency in risk identification and mitigation. Factors that can give rise to such an environment include:
  • Senior management who cannot distinguish between a naysayer and someone who is risk-aware and committed,
  • Senior management who have shown themselves to be closed-minded or have “shot the messenger” when presented with an issue,
  • Staff at any level who are not able or willing to consider the long-term health of the organization.
An environment where risk is openly discussed is a prerequisite to being able to manage risk well, but producing such an atmosphere takes time and effort. Much has been written about ERM and culture, and this literature holds great advice about how to build a risk-aware culture.  Among the collected wisdom is:
  • The board and CEO must continuously champion ERM,
  • Risk must be represented in strategy discussions, organizational performance management, various employee communications and individual performance plans,
  • ERM must be given effective resources,
  • The ERM process should be robust and repeatable,
  • Mitigation plans should be closely monitored,
  • Rewards or lack thereof should be determined on the basis of how well risk is managed per plans. 
Lack of Expertise Less sinister but no less dangerous a situation exists when the presumed experts do not have the knowledge or skill to identify risks within their spheres of responsibility. The person involved could be a business unit leader, a plant manager, a department/function head or a member of the C-Suite. Consider the testimony given by Jamie Dimon, chairman and CEO of JPMorgan Chase, about the bank’s chief investment office (CIO). His bank lost billions of dollars from a large accumulation of synthetic derivatives tied to credit default swaps that crashed in value. These investments were handled by staff based in London, in a debacle nicknamed “The London Whale.” The following is an excerpt of that testimony before the Committee on Banking, Housing, and Urban Affairs in the U.S. Senate on June 13, 2012:
  • "CIO’s strategy for reducing the synthetic credit portfolio was poorly conceived and vetted. The strategy was not carefully analyzed or subjected to rigorous stress testing within CIO and was not reviewed outside CIO.
  • "In hindsight, CIO’s traders did not have the requisite understanding of the risks they took. When the positions began to experience losses in March and early April, they incorrectly concluded that those losses were the result of anomalous and temporary market movements, and therefore were likely to reverse themselves.
  • "The risk limits for the synthetic credit portfolio should have been specific to the portfolio and much more granular, i.e., only allowing lower limits on each specific risk being taken.
  • "Personnel in key control roles in CIO were in transition, and risk control functions were generally ineffective in challenging the judgment of CIO’s trading personnel. Risk committee structures and processes in CIO were not as formal or robust as they should have been.
  • "CIO, particularly the synthetic credit portfolio, should have gotten more scrutiny from both senior management and the firmwide risk control function."
This is truly a wake-up call to all organizations. It is an example of consciously adopted risk that produced billions of dollars of loss. The reason for its having reached the proportions that it did is described by the CEO as a lack of expertise, whether it be in terms of market knowledge, management controls and processes or something else. To help ensure appropriate levels of expertise, an organization should ask these questions and act when the answer is negative:
  • Do the leaders of significant areas of the organization have deep knowledge of their operations?
  • Do the leaders of significant areas of the organization understand the importance of managing risk?
  • Do the leaders of significant areas of the organization have critical thinking capabilities and the communication skills to articulate what the risk profile of their operation looks like?
  • Do the leaders of significant areas of the organization ask for input from others who may be expert about risk?
  • Are those who facilitate the risk management process adequately knowledgeable and given sufficient resources?
  • Are there specialized risk management professionals in place in key areas, e.g. a chief information security officer (CISO) for information technology, as needed? Alternatively, is this role competently outsourced? 
Inadequate Standards or Models Organizations of all sizes rely on standards or models, either self-designed or designed by an expert group (governmental or professional), which indicate when some aspect of the business is exceeding a safe level of operation. Insurers use loss-modeling tools; banks use “value at risk” models; manufacturers use all sorts of gauges, such as air safety levels and equipment safe usage levels. There are also standards of safety applied to all manner of things both public and private, from buildings to transportation to infrastructure such as bridges, power grids and so on. These are routinely inspected to ascertain performance against pre-established standards of acceptability. As can be readily appreciated, if the standard or model is faulty, then the business leader, staff or risk professional is placed at a disadvantage in identifying or evaluating the likelihood or the size of a risk. Consider that the models used by many banks and investment houses before the financial crisis of 2008 did not help them avoid major losses. The testimony quoted above shows issues with the model used to monitor the synthetic credit portfolio at JPMorgan Chase. Consider that, according to the Associated Press in 2013, “Of 607,380 bridges, the most recent Federal Bridge Inventory showed that 65,605 were classified as structurally deficient and 20,808 were classified as fracture critical. . . . Officials say the bridges are safe.” How can a state or city risk manager know how to handle risk associated with the bridges when the standard of safety is so confusing? Not surprisingly, there have been some major bridge failures in the recent past. Organizations need to vet their standards and models. For example, they could:
  • Get second opinions on the model of choice,
  • Use multiple models, not just one,
  • Stress test the model at regular intervals,
  • Establish contingency plans in case the model fails.
No organization will eliminate all uncertainty. However, with the right risk culture, knowledgeable leaders and robust models, an organization can minimize exposure to unanticipated and unmitigated risk.

Donna Galer

Profile picture for user DonnaGaler

Donna Galer

Donna Galer is a consultant, author and lecturer. 

She has written three books on ERM: Enterprise Risk Management – Straight To The Point, Enterprise Risk Management – Straight To The Value and Enterprise Risk Management – Straight Talk For Nonprofits, with co-author Al Decker. She is an active contributor to the Insurance Thought Leadership website and other industry publications. In addition, she has given presentations at RIMS, CPCU, PCI (now APCIA) and university events.

Currently, she is an independent consultant on ERM, ESG and strategic planning. She was recently a senior adviser at Hanover Stone Solutions. She served as the chairwoman of the Spencer Educational Foundation from 2006-2010. From 1989 to 2006, she was with Zurich Insurance Group, where she held many positions both in the U.S. and in Switzerland, including: EVP corporate development, global head of investor relations, EVP compliance and governance and regional manager for North America. Her last position at Zurich was executive vice president and chief administrative officer for Zurich’s worldwide general insurance business ($36 billion GWP), with responsibility for strategic planning and other areas. She began her insurance career at Crum & Forster Insurance.

She has served on numerous industry and academic boards. Among these are: NC State’s Poole School of Business’ Enterprise Risk Management’s Advisory Board, Illinois State University’s Katie School of Insurance, Spencer Educational Foundation. She won “The Editor’s Choice Award” from the Society of Financial Examiners in 2017 for her co-written articles on KRIs/KPIs and related subjects. She was named among the “Top 100 Insurance Women” by Business Insurance in 2000.

I'm Spending a Fortune on Digital...So Where Are the Profits?

Part two of a series on the Digital Experience: Return on Empathy is the new ROE.

I doubt any readers of this post work with a CFO who is measuring Return on Empathy. Empathy? How can something as soft, as emotional, as seemingly non-quantifiable as identifying with people’s feelings, thoughts and emotions translate not only into hard-core financial benefit but also value to customers, patients, agents, employees or other participants in your digital experience? The fact is, the more you demonstrate empathy for your digital-experience participants, and connect that experience to your key performance indicators (KPIs), the more value you will uncover. I’ll share an example of how the credit card industry established a transformational industry practice by showing empathy via an innovative digital experience. It started with the simple insight that, by relieving customer stress, debt repayment rates could be improved. Here’s the story: We all have an image of how credit card companies collect past due balances. Late payers get a “friendly” phone call from the Collections Department, followed eventually by more persistent calling from collections agents to whom severely past due accounts are outsourced. These guys make pennies on the dollar extracting and following through on what the industry calls “promises to pay.” From the customer’s perspective, this is a confrontational and embarrassing situation. It’s full of stress that is probably only adding to what got the customer into a financial pickle to begin with. The reality is that most people don’t plan to find themselves at the other end of a phone call from a debt collector. But life happens. Medical emergencies, job loss and other surprises simply overwhelm cash flow and savings. In an industry where the interests of the institution and the customer are not always well aligned, the standard was historically an adversarial approach. 
Using digital technology, innovators within the industry were able to prevail against the belief that collections could only happen through outbound calling. Innovators advanced the notion that collections rates could be improved by providing late payers with the means to set repayment terms online. Good for the customer. Good for the company. Avoiding the confrontation of the phone call, and providing a private way to work through the issue, actually gave the customer a new opportunity to strike a deal. This led to meaningful increases in recovery of past due balances. So meaningful that the capability to set repayment terms online went from being an outlier, crazy idea to an industry standard that is spreading globally. Online collections didn’t arise from spreadsheet analysis or financial engineering. It started from the simple insight that relieving stress – showing empathy by giving the customer a private way to settle up – would tap into people’s real needs. This simple insight, based totally on emotion and flying in the face of industry practices and beliefs, opened a big opportunity that leveraged digital technology to improve the customer experience and by so doing created a whole new source of value for credit card issuers. Where is the learning transferable within the insurance and wealth management sectors? The way I see it, we are operating in categories where emotion plays a big role. And where there is emotion, there is potential for Return on Empathy. There are numerous opportunities to translate empathy into experience design using digital capabilities that will translate into results. Here are three starting points:
  1. Look for broken “moments of truth.” Across the opportunities for improved revenue cycle management, examine the “moments of truth.” Which ones are working and not working for your constituents? What are your constituents worrying about in the larger context of their lives, not just within the insurance transaction? Tiny adjustments can have a large impact. Testing and learning is required to tease out the benefits. Consider that application submission and processing, billing, payments, account management, servicing, inbound inquiries and outbound communications are all areas to explore. Within the healthcare category, these same principles may apply more specifically to population health management efforts.
  2. Focus on the bottom three dissatisfiers with your experience. Some very successful brands build their value story around addressing areas of dissatisfaction. Capital One is one example. What are the three worst areas of dissatisfaction with your experience based on your customer satisfaction tracking studies? What is the emotional basis for the dissatisfaction? How can you fix the experience by leveraging digital, mobile and social capabilities to close gaps? Can your team develop some quick mockups and share them in a usability lab?
  3. It isn’t always about pricing. I know some readers are thinking, “well, my customers just care about price; none of this emotional stuff really matters.” My rule of thumb is that one-third of the market for insurance and financial products may be truly, truly price-driven. But for most people there is a “value for the money” calculation that will readily trade off price for perceived additional value. That value is often an intangible, emotional connection to the brand and offering. Just ask all the people who willingly pay more for Apple products: “Better feature functionality at lower price” will not come up as an answer. And even where price is a heavier factor (say, in P&C, where pricing is more transparent and where the industry emphasizes low-cost offers), emotion rules more heavily inside the experience than may appear at first look. That means the potential for Return on Empathy is high.

Amy Radin

Profile picture for user AmyRadin

Amy Radin

Amy Radin is a transformation strategist, a scholar-practitioner at Columbia University and an executive adviser.

She partners with senior executives to navigate complex organizational transformations, bringing fresh perspectives shaped by decades of experience across regulated industries and emerging technology landscapes. As a strategic adviser, keynote speaker and workshop facilitator, she helps leaders translate ambitious visions into tangible results that align with evolving stakeholder expectations.

At Columbia University's School of Professional Studies, Radin serves as a scholar-practitioner, where she designed and teaches strategic advocacy in the MS Technology Management program. This role exemplifies her commitment to bridging academic insights with practical business applications, particularly crucial as organizations navigate the complexities of Industry 5.0.

Her approach challenges traditional change management paradigms, introducing frameworks that embrace the realities of today's business environment – from AI and advanced analytics to shifting workforce dynamics. Her methodology, refined through extensive corporate leadership experience, enables executives to build the capabilities needed to drive sustainable transformation in highly regulated environments.

As a member of the Fast Company Executive Board and author of the award-winning book, "The Change Maker's Playbook: How to Seek, Seed and Scale Innovation in Any Company," Radin regularly shares insights that help leaders reimagine their approach to organizational change. Her thought leadership draws from both her scholarly work and hands-on experience implementing transformative initiatives in complex business environments.

Previously, she held senior roles at American Express, served as chief digital officer and one of the corporate world’s first chief innovation officers at Citi and was chief marketing officer at AXA (now Equitable) in the U.S. 

Radin holds degrees from Wesleyan University and the Wharton School.

To explore collaboration opportunities or learn more about her work, visit her website or connect with her on LinkedIn.