
If Data Is the New Oil, Why Do We Insure So Little of It?

Organizations insure only about 19% of their information assets, vs. 60% of physical assets -- even though losses are far more likely on information assets.


We've all seen the claim that "data is the new oil," and a recent global survey by Aon found that, in fact, the value of information assets at major organizations is 14% greater than that of property, plant and equipment (PP&E). But Aon also reports that organizations insure only about 19% of their information assets, vs. 60% of their physical assets -- despite believing they have at least 2 1/2 times the likelihood of a loss on their information assets.

That sounds to me like a need for customers and an opportunity for insurers. 

So let's have a look at that report, along with some other items of interest that caught my eye over the holiday weekend:

  • A pretty thorough argument that climate change is NOT increasing the number of Atlantic hurricanes (even though it appears to be raising their intensity and causing other sorts of storms).
  • A move by Google that may radically change search and, thus, render obsolete many of the social marketing tactics used by agents and insurers.
  • A really bad sign for commercial real estate, which insurers not only provide coverage for but invest in. 
  • And, for fun, the latest high-profile glitches by generative AI. (Hint: Don't use Elmer's glue to keep the cheese from sliding off your pizza.) 

The Aon report says a major reason for the discrepancy between tangible and intangible assets is that intangible assets are too new and too volatile for insurers to be comfortable with them. 

"The insurance industry typically builds actuarial loss models based on decades of data," the report says. "However, due to the dynamic and fluid nature of intangible assets, we will never have decades worth of static intangible assets and risks data. Therefore, 'retain risk' versus 'transfer risk' decisions require fresh thinking."

How do we achieve that fresh thinking? The short answer in the report is: "Risk management typically considers frequency and severity of perils. With respect to intangible assets (especially artificial intelligence and cyber), we should add velocity of evolving risk profiles."

The long answer requires some serious engagement with the report. Here, I'll just cite a few more of the stats that show the serious underinsurance of intangible assets:

  • "70 percent of respondents say their organizations are still not purchasing standalone cyber insurance coverage. The average limit for those organizations purchasing cyber insurance is $17 million." 
  • "Only 35 percent of respondents say they have a trade secret theft insurance policy, and a similar percentage of respondents (34 percent) have an intellectual property liability policy."
  • "Business disruption has a greater impact on information assets ($324 million) than on PP&E ($144 million)."

So, while the issue of information assets is truly tricky, the need is huge.

An article in Forbes, "Climate Change, Though Quite Real, Isn't Spawning More Hurricanes," says: "A search for answers about climate and hurricane connections reveals little or no evidence that major landfalling hurricanes in the Eastern United States have increased in frequency since data collection started around 1850."

The author acknowledges that the most commonly used data suggest otherwise but argues that we are simply better at detecting storms in the Atlantic and that the storms have become more memorable in the decades during which they've been named. He also acknowledges what is obvious to the insurance industry: that damage from hurricanes has soared, because at-risk areas are being built up and more people are moving into them.

He adds that "there is some recent evidence that storms may be intensifying (i.e., increasing in severity) faster and traveling slower, which are subjects of active research."

The article challenged some of my assumptions and is an interesting read.

An article in Platformer describes the vast implications of an announcement Google made last week about providing more AI capabilities in search.

"Google’s idea for the future of search," the article says, "is to deliver ever more answers within its walled garden, collapsing projects that would once have required a host of visits to individual web pages into a single answer delivered within Google itself. The company’s AI-powered search results, which it calls Search Generative Experience, are coming to all users in the United States soon, Google announced.... By the end of 2024, they will appear at the top of results for 1 billion users."

We've all seen this trend developing for some time. If you Google a matchup in the NBA playoffs, you get all the relevant information in a box: the time of the game, the win-loss record in the series, where it's broadcast, the score if a game is in progress and so on. No visit to any other site necessary. But Google is planning to go much further -- a slogan that appeared frequently at the announcement was, "Let Google do the Googling for you."

That will be great for Google and often for the user, but all those sites that people used to click through to visit will be starved of traffic. Gartner Group predicts that traffic to the web from search engines will fall 25% by 2026.

That will mostly hit the publishing business, but it will also affect anyone -- including many carriers and agents -- who provides information online in the hope that people will click through to learn more. Why click through if Google's AI already tells you everything you want to know?

You've been warned.

An article in Bloomberg reports the sort of warning sign I've worried about and have been watching for in commercial real estate: 

"For the first time since the financial crisis, investors in top-rated bonds backed by commercial real estate debt are getting hit with losses. Buyers of the AAA portion of a $308 million note backed by the mortgage on a building in midtown Manhattan got back less than three-quarters of their original investment after the loan was sold at a steep discount. It’s the first such loss of the post-crisis era, says Barclays."

The five groups of lower-ranking creditors were wiped out, but what concerned Bloomberg most was that "the pain is reaching all the way up to top-ranked holders, overwhelming safeguards put in place to ensure their full repayment." Those losses are "a testament to how deeply distressed pockets of the US commercial real estate market have become. 'These losses,' warns Barclays strategist Lea Overby, 'may be a sign that the commercial real estate market is starting to hit rock bottom.'"

That is a concern for insurers on multiple levels. The most straightforward issue is for those that have invested in commercial real estate. But there will be ripple effects in other areas, too, for instance as workers' comp carriers have to adjust to the new world of hybrid work and as P&C carriers support construction companies as they turn many office buildings into apartments.

Finally, an article in the Washington Post documents a series of embarrassing errors by generative AI. The article says: 

"In search results, Google’s AI recently suggested mixing glue into pizza ingredients so the cheese doesn’t slide off. (Don’t do this.) It has previously said drinking plenty of urine can help you pass a kidney stone. (Don’t do this. And Google said it fixed this one.)

"Google’s AI said John F. Kennedy graduated from the University of Wisconsin at Madison in six different years, including 1993. It said no African countries start with the letter “K.” (Nope. Kenya.)"

The article explains at length where these stupid answers come from. For instance, the glue suggestion somehow came from a joking Reddit post from 11 years ago. Google's AI missed the joke part. 

But I mostly read the article to have a chuckle about that stupid AI... while it lets me.

Cheers,

Paul

 

 

Is Your Customer Portal Good Enough?

Having a customer portal is a non-negotiable, but deciding how much to invest in its design is the challenge.


It is no longer a question of whether a business needs a customer portal. That question has been asked and answered with a resounding yes, and most businesses have developed one, though quality varies greatly. But the question remains: How good does your customer portal have to be?

When considering whether a customer portal is good enough to satisfy customers, remember that the competition is not only other insurers. The portal has to stand up to scrutiny from consumers who compare it with the other digital portals they use, like Uber, Amazon and JetBlue -- and they have high expectations. 68% of consumers surveyed say they wouldn't use a company's chatbot again if they had a poor experience, and 32% agree they'd leave a brand they love after one bad experience -- proof of how critical it is to get digital portal services right the first time.

Focus on the Right Parts of Your Digital Portal

With the seemingly endless customizations possible, insurers need to narrow their focus to the most critical components when it comes to designing a digital portal. Keeping customer data secure is non-negotiable. Other essential features customers expect from their portal include:

  • Mobile-friendly web experience
  • Easy, integrated payment capabilities
  • Ability to submit and track a claim
  • Data-driven product recommendations 
  • Alerts and notifications for policy updates or severe weather warnings 

In addition to these top customer concerns, businesses should focus on these considerations when designing their portal:

Understanding Your Customer’s Needs

When developing a digital portal, consider the reasons why customers engage with companies, then use this information to prioritize the project. The top three reasons for customer inquiries are billing issues, support with a product or service, and order status. Start by adding or improving these aspects of the digital portal to add immediate value. 

Insurers can look at actual usage logs provided by most portals to determine what is most important to users and to gain insights into what may need to be corrected. Finally, insurers can also survey their customers to learn more about their needs and wants. 

By analyzing user interaction data within the portal, insurers can identify which capabilities are most used and valued by customers. This data-driven approach can lead to better decisions on where to invest future funds when refining the portal to offer more tangible user benefits. The balance between innovation and cost-efficiency helps to ensure strategic business goals are met while customers are satisfied.
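As a minimal sketch of that data-driven approach -- the event format and feature names here are hypothetical, not any real portal's log schema -- ranking capabilities by usage can be as simple as counting events per feature:

```python
from collections import Counter

# Hypothetical portal event log: each entry records which feature a user touched.
events = [
    {"user": "u1", "feature": "pay_bill"},
    {"user": "u2", "feature": "view_claim_status"},
    {"user": "u1", "feature": "pay_bill"},
    {"user": "u3", "feature": "download_id_card"},
    {"user": "u2", "feature": "pay_bill"},
]

def rank_features(log):
    """Return portal features ordered by usage, most-used first."""
    counts = Counter(e["feature"] for e in log)
    return counts.most_common()

for feature, count in rank_features(events):
    print(feature, count)
```

Even a tally this simple makes the investment conversation concrete: the features at the top of the list are where design and refinement dollars pay off first.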

With this knowledge, insurers can enhance their digital portal to meet these top needs first, optimizing the experiences customers use most often. By making these critical parts of the customer journey simple and intuitive, insurers will see more value than if they spent time improving parts of the portal that are rarely used. 

Flexibility

Flexibility is an important consideration when designing a portal. Consumers expect real-time experiences that provide value when they need it, so adapting to changing customer demands is critical. The ability to add new features or update existing ones rapidly is an integral part of the flexibility customers expect. A configurable portal, along with regular vendor updates, allows an insurer to keep up with customer expectations.

Research has shown 65% of customers expect the companies they do business with to adapt to their changing needs over time. Working with a vendor that offers continuous improvements through system upgrades can help ensure portals are flexible and adapt to changing consumer needs.

User Experience

Consumers expect a logical user experience. They are familiar with cool technology that is intuitive, easy, and functional — and they demand the same from their insurance portal experience. Any customer-facing portal must be easy to navigate and should match the look and feel of the insurer’s brand. 

Consumers demand mobile-friendly experiences and enjoy performing many different tasks on their smart phones, so any customer portal must be optimized for the mobile experience. Consumers pay bills, do research and chat with live agents and bots within other apps, so they expect this same functionality to work smoothly when logging into their insurer’s portal from their mobile device.

The hyper-personalization trend continues, with a McKinsey study showing 71% of customers expect personalized interactions, like tailored content, customized recommendations and personalized messages, and a full 76% are frustrated when that doesn’t happen. Configurable portal designs should allow insurers to personalize interactions for returning customers. Insurers can accomplish this by knowing their customers’ wants and working with developers to include these personalizations.

Single Interface 

Having an omnichannel experience means customers can decide how they want to interact with their insurer, having the freedom to choose different methods of communication at different times. But this experience should still be streamlined and optimized for the consumer, so even if an insurer operates on different core systems for different lines of business, the customer should see one integrated portal. They shouldn’t need to log out and back into different systems to accomplish their needs when a single interface provides a more streamlined customer experience. 

Balancing Cost and Capabilities

It never makes business sense to overspend on technology. However, insurers need to allocate a reasonable budget toward developing and maintaining their customer portal, and may need to overcome technical debt left from previous solutions. Some signs a business is grappling with technical debt include: 

  • Slow release cycles, which may result from workarounds or complexities in the code that need to be managed each time a change is made. 
  • Frequent outages or system crashes that may derive from shortcuts during development. 
  • Challenges scaling the technology or integrating it with new tools as the original design did not account for growth. 
  • Maintenance outweighs new development as IT resources are forced to spend more time fixing bugs than developing new features. 

Insurers with greater technical debt and those who have not kept up with digital trends will likely need to spend more to remain competitive. The trend toward digital self-service is only accelerating as consumers demand robust online capabilities. 

To calculate the return on investment for a digital portal, factor in gains in customer satisfaction and retention. A good customer portal can significantly increase retention, with a recent survey showing customers who repeatedly use multiple self-service channels have a 25% higher retention rate.
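One hedged way to sketch that ROI arithmetic -- the 25% retention uplift comes from the survey above, but every other number (customer count, share of self-service users, per-customer value, portal cost) is purely illustrative:

```python
def portal_roi(customers, baseline_retention, retention_uplift,
               annual_value_per_customer, annual_portal_cost):
    """Rough ROI: incremental retained customers times their annual value,
    net of portal cost, relative to that cost. All inputs are assumptions."""
    retained_base = customers * baseline_retention
    retained_new = customers * baseline_retention * (1 + retention_uplift)
    incremental_revenue = (retained_new - retained_base) * annual_value_per_customer
    return (incremental_revenue - annual_portal_cost) / annual_portal_cost

# Illustrative only: 100,000 customers, 80% baseline retention, the 25%
# retention uplift applied to a 10% slice of self-service users (0.25 * 0.10),
# $1,000 annual value per customer, $1.5M annual portal cost.
roi = portal_roi(100_000, 0.80, 0.25 * 0.10, 1_000, 1_500_000)
print(f"{roi:.2f}")
```

The point is not the specific output but the shape of the calculation: retention gains compound against per-customer value, so even a small uplift on a large book can justify a substantial portal budget.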

Balance the cost and the desire to add more features by prioritizing customers’ demands. With survey data and an understanding of what customers need, insurers can control costs by adding the right features and capabilities. And with a continuous feedback loop and regular vendor updates, the customer portal always remains current.

The Role of the Agent

The agent plays an important role in leveraging digital portals to help their clients through the purchase experience. Agents often help customers access the portal for the first time, helping them to set up and personalize the portal. Policyholders often need access to their ID cards or dec pages, and agents may help them access those documents. 

Policyholders often call their agent for help with filing a claim or inquiring about their claim status. Agents may also use digital portals to educate policyholders about their coverage, provide updated quotes, or offer recommendations based on their current coverage and personal needs. Agents may also help with basic updates for their clients, like adding an email address or updating employer information. 

Because the agent is often the first point of contact for policyholders when something goes wrong, insurers should invest in training agents on how to best use their portals. This best practice will help both the insurer and the agent sell more and spend less time on customer support. Providing incentives -- for both insurers and policyholders -- will help encourage greater adoption.

Portals can be valuable tools for agents to upsell policies and provide more personalized service to customers. Agents can add value by helping their policyholders navigate the portal to service their needs. 

How Good Does Your Customer Portal Have to Be?

In today's digital age, the quality of the customer portal can define a brand's success. To truly stand out, it must rival not only others in the insurance industry but also the best in any industry, as consumers compare their digital experiences universally. Staying ahead means embracing flexibility and continually refining the user experience based on actual customer feedback and emerging trends.

So how good does an insurer’s portal have to be? It must score well with consumers based on customer satisfaction surveys. Insurers should analyze how much the portal reduces their customer support costs, and decide if there is room for further improvements compared to industry benchmarks. Consider how the portal supports the agent network and whether it provides a positive experience that makes agents want to promote the insurer’s brand. Finally, benchmark how the portal stands up to competitors to be sure it compares favorably.

By investing in a customer portal that excels in these key areas, insurers can not only meet the high standards consumers now demand but also create lasting engagement that fosters consumer loyalty and drives growth.

 

Sponsored by: ITL Partner: insured.io

 

 

External Sources:

  1. https://www.salesforce.com/content/dam/web/en_us/www/documents/research/State-of-the-Connected-Customer.pdf
  2. https://www.pwc.com/us/en/advisory-services/publications/consumer-intelligence-series/pwc-consumer-intelligence-series-customer-experience.pdf 
  3. https://www.forbes.com/sites/forbestechcouncil/2023/05/19/the-changing-face-of-customer-experience-in-the-self-service-economy/?sh=7bc6bc94545f
  4. https://www.mckinsey.com/capabilities/growth-marketing-and-sales/our-insights/the-value-of-getting-personalization-right-or-wrong-is-multiplying

ITL Partner: insured.io

Insured.IO provides mid-market insurance carriers with the most complete and modern SaaS customer self-service platform for mobile, desktop, and telephone IVR that is affordable and can be maintained with minimal ongoing technical support. It serves the complete insurance product lifecycle, including sales, payment, FNOL, and analytics. Using cloud-native technology, the platform easily and quickly integrates with any insurance core systems and can be tailored to each carrier’s unique needs. It delivers real-time data synchronized across all channels, providing greater process automation, reduced CSR utilization, and great business intelligence that improves operating performance. Insured.IO can be up and running in as little as 60-90 days.

Navigating Tough Markets: The Case for Advanced Risk Selection in P&C Insurance

Insurance carriers facing profitability challenges and shrinking market options turn to behavioral deep learning for sustainable solutions, managing risks effectively while serving stakeholders.


We’ve all read the headlines about major insurance carriers reducing their presence in certain markets. In California alone, 90% of companies are either not offering property insurance or heavily restricting it, while 70% are not currently offering new plans. This trend raises important questions about how insurers can navigate profitability challenges without limiting customer options. The term "double inflation" is increasingly heard: economic inflation is causing continuing uncertainty, and the industry’s pains have been exacerbated by social inflation.

Rising costs, unprecedented claims payouts and consumer expectations are posing challenges for insurers. Many are struggling to navigate the combined pressures of macroeconomic issues and escalating loss frequency and severity in personal lines. The situation demands that P&C insurers adapt, focusing on profitability while delivering the relevant experiences consumers deserve and require.

One lever carriers have used to address profitability is filing for rate increases. However, there comes a point when requested increases are limited or denied by regulators, forcing carriers to make the difficult decision to stop offering policies. This leaves consumers with fewer choices, driving premiums up for those that remain -- a death spiral driven by adverse selection.

Next Era Risk Identifier

A better approach is to leverage risk intelligence at an individual level, powered by behavioral deep learning. This type of predictive intelligence helps carriers prepare more effectively and avoid future profitability issues or adverse selection as more people shop for insurance. By incorporating individual-level risk selection, insurers can identify consumers with the highest risk propensity before they even become policyholders.

In its latest report, AM Best projects a combined ratio of 100.7 for all P&C underwriting lines in 2024. With advanced intelligence available earlier in the buying process, insurers can guide consumers down different underwriting and pricing paths, evaluate critical financial implications and avoid the damage caused by a lack of early intelligence.

Enhancing Fairness Through Advanced Risk Selection

Fairness in insurance goes beyond industry standards to include a deeper understanding of the individual. Even if a customer is unbanked or has no credit history, carriers can still leverage behavioral data. Insuring people more appropriately, rather than exiting a market to alleviate profitability and rating constraints, is fairer and better serves all stakeholders, including customers.

A Better Approach for P&C Insurers in Tough Markets

Utilizing individualized predictive intelligence allows carriers to more precisely identify and quantify risks, placing policyholders into the appropriate tiers without the need for continuous premium increases or drastic market exits.

Furthermore, entering a market can also benefit from this cutting-edge individual-level intelligence. For example, a carrier entering a market like California can use this approach to make informed decisions, identify customers who match its risk profile and gather better intelligence on potential agency partners for both the short and long term.

Conclusion

Insurers have a responsibility to serve all stakeholders, even in volatile times. It’s our duty to keep the company sustainable, uphold promises to consumers, and deliver returns for investors or owners.

The challenges of achieving sustainable profitability amid rising costs of claims, litigation and social inflation require adapting to new and growing risks. By failing to use new technologies and intelligence, insurers miss the opportunity to build stronger companies and fulfill their obligations to stakeholders.

The key is more effective risk selection and assessment. Planning well and ensuring robust financial management are essential attributes for any company. The insurance industry, at its core, protects people and their dreams, providing peace of mind. By improving our practices and solidifying the industry, we leave a legacy we can all be proud of.

 

About Scott Ham

Scott Ham is the CEO of Pinpoint Predictive and was previously the president and CEO of Transamerica’s life, P&C and non-life business, where he provided strategic vision and leadership to a global organization. At Transamerica, he oversaw day-to-day operations and management of over 6,000 employees, $1.6 billion in new sales and over $6 billion in overall premium. More recently, Scott was extensively involved with insurtech startups as a strategic advisor at McKinsey & Co., where he also led a small commercial insurance SaaS platform that enabled real-time underwriting and pricing decisions through cutting-edge machine learning analytics and third-party data. Scott joined Pinpoint to lead the company through a period of unprecedented growth.

Sponsored by ITL Partner: Pinpoint Predictive


ITL Partner: Pinpoint Predictive

Pinpoint Predictive provides P&C insurers the earliest and most accurate loss predictions and risk scores to fast-track profitable growth and improve loss ratios. Unlike traditional methods, Pinpoint’s platform leverages deep learning, proprietary behavioral economics data, and trillions of individual behavioral predictors to help insurers identify the risk costs associated with customers and prospects.

Insurtech 100 Awards 2022 | Insurtech Vanguard | AI Breakthrough Awards 2023 | Global Tech Awards 2023 - Category Winner for AI, AnalyticsTech and Insurtech | Insurance Awards 2023 - Category winner for Insurtech in World Finance Magazine 

Are You Fraud-Friendly?

Insurance companies have money and don't protect it as well as banks do. We as an industry look like we deserve to be stolen from.


No matter how much you say you protect yourself from fraud, you don’t know what server your last email came from and certainly don’t know where the next email you send goes.  

Trust is invisible yet somehow tangible in daily digital and electronic transactions.

This is not solely an insurance industry phenomenon, but because everyone has risk and almost everyone has insurance, yes, it is ours, too. In this respect, we insure tangible, invisible transactions -- sometimes our own operational work processes, many times interactions with vendors, policyholders and claimants, and other times actual named perils in a cyber policy.

See also: How Technology Is Changing Fraud Detection

You can’t see a digital punch, but you can feel it

Trust issues can be like the wind. They can hit everywhere all at once at global scale, so digital identity is unworldly in every possible respect. That is a digital blind spot, and as we stumble toward modernization it is getting bigger.

The current digital, social, internet, connected, IoT-enabled world, and its surrounding cybersphere, is full of high-trust transactions as well as risky ones. The high-trust stuff is end-to-end encrypted with established security and verification protocols. The risky stuff is everything else to varying degrees.

Our legacy of paper and people process underpins our approach to technology adoption. And in a P&C world of people, places, vehicles, services and businesses, that means physical world thinking dominates our day-to-day practices. Digital is not that. 

Digital stands on trust alone. We won’t dive into AI fakes and deep cyber shenanigans here but focus on a more mundane topic -- identity capture.

The data we think of in physical terms is usually captured in free-form fields, where data is simply entered as names, addresses, titles, numbers and such. Much of it exists to establish identity and the ability to contact someone, which now may include digital labels like an email or website address.

While it is easy to assume a digital label relates to a physical presence, IT DOES NOT.

For example, one bad actor can have several variations of their legitimate name, address, phone number and email address. With a little effort, that same bad actor can create additional alternative addresses, business aliases, phone numbers and email addresses. Some are simple ways to hide their identity; others are fake data that make it impossible to contact them outside their preferred channel. And each variation disaggregates the monetary impact you can recognize for any perpetrator, their extended network of identities and those of their conspirators.

If you knew you were paying the same person multiple times for the same thing, or that a host of payments were all going to a fictitious identity, of course you would stop it. Because you don’t... you are fraud-friendly.  
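A deliberately crude sketch of the idea -- real entity resolution uses fuzzy matching and dedicated services, and every record here is hypothetical -- shows how trivial identity variations collapse once contact data is normalized:

```python
import re
from collections import defaultdict

def normalize(record):
    """Crude canonical key: lowercase the name and strip punctuation,
    keep only digits of the phone, ignore +tag tricks in the email."""
    name = re.sub(r"\W+", " ", record["name"].lower()).strip()
    phone = re.sub(r"\D", "", record["phone"])
    local, _, domain = record["email"].lower().partition("@")
    email = local.split("+")[0] + "@" + domain
    return (name, phone, email)

def flag_duplicate_payees(payments):
    """Group payment amounts whose payee identities resolve to the same key."""
    groups = defaultdict(list)
    for p in payments:
        groups[normalize(p)].append(p["amount"])
    return {k: v for k, v in groups.items() if len(v) > 1}

payments = [
    {"name": "John Q. Smith", "phone": "(555) 010-2000",
     "email": "jq.smith@mail.com", "amount": 4200},
    {"name": "JOHN Q SMITH", "phone": "555-010-2000",
     "email": "jq.smith+claims@mail.com", "amount": 3100},
    {"name": "Ana Ruiz", "phone": "555-010-3000",
     "email": "ana@mail.com", "amount": 900},
]
print(flag_duplicate_payees(payments))
```

The two "John Q. Smith" variants resolve to one identity with two payouts. Production-grade entity resolution goes far beyond exact keys -- phonetic and fuzzy name matching, address standardization, graph links across conspirators -- but even this toy version surfaces payments a free-form-field system would treat as unrelated.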

Ghosts and mirages of every kind seem to exist in your data, from marketing and advertising to underwriting, pricing, schedules and claims. Some are data errors; others are intentional. Both breed doubt and mistrust -- along with an ennui that accepts the same data sitting in multiple systems, inaccessible and unmatchable for most practical purposes.

If you fall in cyberspace, only headlines might catch you

Data, private information, system access, operational interruptions, trade secrets, ransom, reputation and other forms of value are fungible -- latent, perhaps, but still a form of money. While some devious use cases have non-financial motivations, most people steal from insurance companies because they have money and don't protect it as well as a bank does. Add to that all the manual processes and siloed data architectures, and we as an industry look like we deserve to be stolen from.

It's not as if the old menagerie of fraud, waste and abuse threats has gone away. And add to the list of “not to be trusted”: e-pickpockets, scammers, skimmers, online hackers, deceptive practices, fake claims, false identities, synthetic accounts, paper-mill billings, forgeries, data stealers, phishing, viruses, ransomware attacks, malicious actors, insider threats, corporate espionage, misinformation, malinformation, hostile governments in some cases, and clever business email compromises and deepfakes of all varieties. All continue to make headlines.

See also: Disaster Fraud — The Dark Side of Insurance Claims

Maintenance that matters

Without persisting entity resolution processes over your data, you make it easy for untrustworthy transactions to proliferate across your enterprise and throughout the insurance ecosystem.  

Savvy companies keep their multi-tiered supply chains of connected partners under persistent identification monitoring, and they add pre-fill and form-flow application programming interfaces (APIs) to get the data right the first time while avoiding duplicate data entry. Ensemble data feeds augment analytic efforts, while governance programs improve master data management.

It is a common issue everywhere, as company after company misses its digital transformation targets. It is as if we meant to take five years and do nothing more than move a few core applications to the cloud with little if any process change -- just a virtual mainframe lift and shift. Corporate-speak now calls that continuous modernization.

If you don’t know the businesses you do business with or underwrite, why not look them up using a digital entity resolution service? This outside-in master data management approach can then have all the digital attributes and technographic features and values kept current for your daily use.  

These data can also be kept historically accessible, to show changes over time and to monitor all sorts of necessary updates, changes and links. That last bit has the added benefit of logged caching, so you can forensically build feedback and a plan for managing your relationships with digital entities based on their behavior and readiness to work with you over time -- at a renewal event or at a transaction event in real time.

If you don’t stay up to date on your vulnerabilities, how do you address them? Less obvious, yet equally on point: your connected partners and suppliers may not know their vulnerabilities (which are now yours, too).

If you don’t know yourself, and you don’t know others, your “unknown unknowns” form a much larger window than you realize. No one can hear an AI scream in cyberspace -- but those voices all say, “send money.”

Catastrophic financial events and operational and reputational compromises often follow long periods of latent system access by culprits. Gaining such access will be easier than ever with newer AI capabilities, even for the most novice of bad actors. 

Be savvy. Invest in a full-spectrum entity resolution service to power trust and to repel fraud. Don’t be fraud-friendly.

Lessons From Florida's Hurricane 'Mean Season'

In the 20 years since four hurricanes hit the state in just six weeks, insurers have learned many lessons -- and have much better technology. 

Photo of Columbus Clouds

Florida is one of America’s most disaster-prone states, especially vulnerable to hurricanes due to its location between the Atlantic Ocean and the Gulf of Mexico, so it should come as no surprise that Florida ranks fifth among states for major disasters since 1953, with 160 events hitting the Sunshine State over this 71-year period.

With the notable exception of 1992’s Category 5 Hurricane Andrew, the most destructive hurricane to ever hit Florida in terms of structures damaged or destroyed, the state was spared major hurricane-driven losses across several decades in the 20th century. This was even as it became the country’s third-most-populated state and exposure increased year-on-year as population centers expanded.

However, in 2004, what is remembered today as the “mean season,” four hurricanes – Charley, Frances, Ivan and Jeanne – struck the state in a single year, one after the other. 

This was the first such season in the U.S. since 1886. In just six weeks, those four hurricanes caused 125 deaths, damaged hundreds of thousands of properties, left millions without power for weeks and resulted in around $40 billion in property damages in Florida alone. 

While the sheer destruction of Hurricane Katrina might exert a stronger pull on our collective imaginations, given that it’s now 20 years since those terrible storms in Florida, it’s worth asking what the industry has learned since then.

See also: The Hurricane Forecast Keeps Getting Darker

The many challenges of natural catastrophe response

It’s important to recognize the increasing scale of the problem. Florida has suffered more hurricanes than any other U.S. state, and the Environmental Defense Fund says they are becoming more unpredictable and causing more flooding.

Policies can offer inadequate coverage. Insurers may refuse to cover many homes and businesses in particularly hard-hit areas. And, of course, claims can take months to settle, leading to bankruptcy for many businesses and great distress for domestic policyholders.   

Important lessons have been learned

But society is more prepared overall, and the insurance industry, in turn, has upped its game regarding hurricane event response. These changes include:

  • Publicly underwritten property insurance. Some of the changes in Florida predate 2004. They came about in response to 1992’s Hurricane Andrew, which drove 16 insurance companies into insolvency and caused several others to leave the Florida home insurance market altogether. As a result, in the 1990s, the state of Florida created the Florida Windstorm Underwriting Association (FWUA), which provided coastal residents with wind-only coverage, along with the Florida Residential Property and Casualty Joint Underwriting Association (JUA). These two combined in 2002 to form the Citizens Property Insurance Corporation, a not-for-profit, tax-exempt government entity to provide property insurance to eligible Florida property owners unable to find insurance coverage in the private market. 
  • Cat modeling. A key change since 2004 has been the industry’s approach to catastrophe (Cat) modeling. Once the preserve of reinsurance buyers, Cat modeling and analytics are now widely adopted across the industry. Virtually every risk with catastrophe exposure is now run through one or more models to consider potential loss scenarios, while commercial policyholders will often seek to understand their Cat exposures so they can negotiate with insurers, according to Marsh. 
  • Increased focus on data quality. We’ve also seen a massively increased focus on data quality. For example, in the wake of Hurricane Katrina, the industry noticed a significant mismatch between modeled and actual insurance losses, because the Cat models of the time overestimated the impact of wind damage and underestimated the damage from flooding and storm surge. Initial damage estimates for Hurricane Katrina represented only about a quarter of the final tally. This issue remains to this day. In the 2000s, credit rating agencies also started looking at how insurers and reinsurers used Cat modeling in their capital management strategies. As Marsh puts it, credit rating agencies’ attention “sharpened insurers’ focus on accurate risk modeling.”
  • Better understanding of the scope of coverage. In the wake of 2004 and particularly 2005, many businesses were shocked to learn that they weren’t covered for storm surge. Yet storm surge was responsible for much of the damage post-Hurricane Katrina—even more so in New Orleans compared with Florida. Many disputes ended up in court, and insurers had to pay out for significant extra losses. Since then, many policyholders have become more careful about the coverage they purchase, while insurers have tightened their policy wordings. Even so, the possibility of misunderstanding, underinsurance and expensive litigation persists. 
  • Business continuity and crisis management. The mid-2000s hurricanes also increased the industry’s – and policyholders’ – awareness of the need for business continuity, crisis management and mitigation. This awareness has been reinforced by the rising costs of coverage and by advances in technologies that can sense and track storm damage and issue more timely warnings ahead of storms making landfall. 

Building on the lessons of the last 20 years

The history of the insurance industry’s hurricane modeling, mitigation and claims response shows an industry keen to learn and progress but also one that is still hindered by the scale of the issues and the limitations of technology. For example, a keen awareness of the value of accurate data is of little use if the data available is not accurate enough to deliver an actionable response.

Fortunately, developments in both new technologies and approaches are now available to address this and leverage the value of what’s been learned over the last 20 years. For instance, insurers are now ready to embrace a more data-driven approach to event response and claims. 

See also: Geomagnetic Storm for Insurance?

What’s new and making a difference?

Solutions now exist that empower insurers with timely and accurate data and intelligence to respond swiftly and effectively to hurricanes and other catastrophes. Using near real-time, multi-sourced data, such as satellite, aircraft, drone, IoT and on-the-ground vehicle-based imagery overlaid with human insights, now delivers actionable intelligence. 

With this data, loss-reserving teams gain rapid quantification of events, at both the individual and portfolio level. Exposure managers benefit from the ability to refine ultimate loss estimates to a finer degree, streamline processes and optimize resource allocation. 

Claims managers gain rapid on-site assessments immediately following catastrophic events, even in hard-to-reach areas. This expedites claims decisions and facilitates efficient on-ground resource management, including triage and oversight. 

Last but not least, underwriting teams gain a better understanding of the quantum of loss and a more accurate assessment of risk for future pricing or risk acceptance.

Of course, the speed and accuracy of this data mean that, in addition to getting a more accurate assessment of the damage as it happens and a faster claims response time, insurers can also get an accurate advance picture of claims, even before first notice of loss, giving them a high level of reserving accuracy and oversight of how hurricanes are likely to affect their portfolio.

Many forward-thinking carriers, reinsurers and their clients already benefit from these advances, building on the lessons learned over the last 20 years of increasing storm damage. With solutions like these available, it will be fascinating to see what lessons the leading insurers of the next 20 years will learn from the hurricane seasons that lie in wait. 

3 Key Life Insurance Agency Challenges

Manual processes, a lack of centralized data and ineffective commissions processes consume valuable time and introduce risks for human error. 

Two Yellow Flowers Surrounded by Rocks

Life insurance distributors are more important now than ever before! They play a pivotal role in the industry, acting as a bridge between carriers, clients and potential policyholders. However, despite technological advancements and an abundance of solutions, many distributors continue to grapple with outdated manual processes that impede efficiency and hinder growth. From commission calculation to policy submission tracking and sales follow-ups, these agency-critical tasks are often mired in ad hoc methods and manual, spreadsheet-based workflows.

Manual processes, a lack of centralized data and ineffective commissions processes are all challenges that not only consume valuable time and resources but also introduce a significant risk of human error, leading to dissatisfaction among advisers and clients alike. These challenges can affect an agency’s ability to function and can hinder industry-wide success.

In the following sections, we’ll take a closer look at all three of these critical life insurance agency management challenges and detail how they can be resolved. 

Challenge #1 – Manual processes degrade visibility into the policy approval process. 

Managing new business applications as they move from adviser to distributor to carrier is a process that still relies on too many manual steps. This ultimately delays the time to issuance and creates frustration among clients, a sentiment that then gets passed through the rest of the chain, with advisers flustered over being left in the dark about where an application is in the process and why it’s been held up. Manual workflows are more error-prone, often poorly documented or updated, and filled with inefficiencies, creating an abundance of preventable frustration. 

This is an issue that can easily be addressed through technology. Many other industries have already digitized similar manual processes. It’s time for life insurance carriers and agencies to catch up by accelerating their adoption of integrated solutions that streamline the application process and increase policy status visibility. 

Challenge #2 – Lack of centralized data causes inefficiencies.

Unfortunately, many distributors still don’t have automated access to the carrier data that’s vital to their business. That lack damages their ability to track and manage policy submission status. And when manual processes must be used to move data where it needs to go, inefficiencies multiply.

For advisers, the lack of centralized client data from multiple carriers makes it difficult and time-consuming to get a full 360-degree picture of a client’s insurance products. Often, advisers need to log in to each carrier’s portal and flip back and forth to find the information they need, wasting time that could be spent on solving client challenges, acquiring new business or providing essential client consultation support. 

The clear solution to these challenges is to centralize all carrier data feeds in the agency management systems to provide an exhaustive view of a client’s insurance portfolio and build on the visibility created by automating manual agency processes. 

Challenge #3 – Commission frustrations erode financial adviser trust.

Commissions are a continuing sore spot for advisers. Quick and accurate payment isn’t as universal as one might think. When distributors and carriers don’t share commission feeds, and distributors lack robust commission management systems, significant obstacles can arise. 

According to Equisoft-commissioned Forrester research, one of the top four challenging aspects of a broker's relationship with a carrier is their inability to track compensation and performance in real time. Only 14% of broker respondents said digital commission accounting capabilities are fully integrated into their agency management systems, and 11% of carriers said it takes them more than 60 days to pay brokers. Additionally, brokers said that carriers’ compensation plans are overly complicated. 

Administering complex commission splits among all parties involved in a sale can be hard, and reconciling expected commissions with actual payouts is also challenging. But advisers are just like everyone else. They expect to have transparency and visibility into their commissions and payments, and they’re frustrated by any errors, delays or complications. 

To win adviser trust, build stronger relationships and attract new advisers into the industry, distributors need to take back control of their commissions by implementing robust systems that automatically calculate and pay advisers — accurately and on time.

The solution: an effective agency management system, complete with a data management process.

Distributors’ unique position in the insurance ecosystem makes it essential for them to have a panoramic view of data from all sources. They need to be able to access, leverage and present a comprehensive data overview to give advisers and stakeholders transparency into every step of the policy lifecycle. To solve the issues detailed above and enable agency staff to focus on higher-value, customer-centric activities, life insurance distributors must tackle their challenges head on with an agency management system (AMS) and a strong data management strategy. 

Not only does an AMS provide a full scope of prospect and client data to help advisers streamline their process and manage the service aspect of their client interactions, but when leveraged properly, the system can provide updates on applications in underwriting, allowing advisers to manage client expectations. An AMS can also help open opportunities to cross-sell and up-sell and enhance compliance actions like client review meetings. 

See also: Predictions for Life Insurance in 2024

State-of-the-art agency management starts with centralized data feeds and good data management.

The data that distributors need to digitally transform their operations and increase efficiency and adviser effectiveness exists; the issue is consolidating it, getting it into the system and being able to share it with all the other necessary stakeholders. 

For some insurance distributors, their data’s true potential isn’t being harnessed because they’re continuing to operate with an outdated mindset. In some cases, even with an AMS in place, they’re still relying on the same types of manual processes they have always used, even though technology now makes it possible to automate and streamline many workflows. 

New business models and systems provide the opportunity for insurance distributors to re-think their approach to operations and optimize their organizations. That means leaving behind legacy thinking. Distributors should ask themselves how they can eliminate old processes, like data entry and manual review, and shift their resources to higher-value work within their organization.

Accessing a consolidated 360-degree view of clients’ data doesn’t need to be difficult. It just takes the adoption of an agency management system that has built-in integrations to the relevant carrier feeds and rich data sources from all value chain stakeholders. 

See also: Revolutionizing Life Insurance Uptake in Younger Markets

A new approach to centralizing data and overseeing agency management is needed. 

To compete in the evolving independent channel and current digital era, distributors need to find AMS solutions that will speed the digitalization of their business. They need solutions that automate all core processes, enhance transparency and productivity among agents and staff, reduce errors and costs and create efficiencies. 

AMS solutions can be management workhorses that give distributors complete visibility into data and activity throughout the value chain. But this can only work when the AMS is coupled with a strong data management process. Implementing a comprehensive approach to centralizing, storing, and managing data will position distributors for success. 


Grace Apea

Profile picture for user GraceApea

Grace Apea

Grace Apea is AVP, product development, at Equisoft.

She has led the development of several game-changing digital transformation initiatives within the automotive lending sector and insurance and wealth management industries in North America. 

Social Inflation and Reserve Development

An analysis of the causes of adverse development in the 2016-2019 accident years and what this means for more recent accident years.

Office worker standing with papers in hand

Deterioration of accident year 2016-2019 reserves

The adequacy of reserves is a critical issue for the property and casualty (P&C) insurance industry, as it affects the profitability, solvency and reputation of insurers. Reserves are estimates of future liabilities based on past claim experience and actuarial assumptions. These estimates are subject to uncertainty and may change over time due to various factors, such as changes in claim frequency and severity, changes in legal environment, changes in social attitudes and changes in economic conditions. These factors may cause the actual payments to differ from the expected payments, resulting in reserve development, which can be either favorable or unfavorable.

One of the factors that has been widely discussed in recent years is social inflation, also called legal system abuse, which refers to the phenomenon of rising litigation costs, higher jury awards and broader contract interpretations that increase insurers' liabilities beyond their original expectations. Social inflation can have a significant impact on the reserve adequacy of P&C segments, especially in lines of business that involve long-tail claims, such as liability lines. In this article, we analyze the trends and patterns of reserve development in P&C segments, with a focus on the effects of social inflation and other factors that affect the adequacy of reserves. We will focus on the causes of adverse reserve development in the 2016-2019 accident years and what this means for future development in more recent accident years.

Adverse reserve development in the 2016-2019 accident years of liability segments could be a precursor to deterioration in 2020 and subsequent accident years. As can be seen in Figure 1, the booked net ultimate loss ratios in the Other Liability Occurrence line of business for the U.S. P&C industry for accident years 2016-2019 have developed adversely every calendar year since they were initially established (the initially established reserves are represented by the yellow bar at 12 months of development in Figures 1 to 3). The net ultimate loss and defense and cost containment expense (DCCE) ratios for the Other Liability Occurrence line of business have increased six to 10 loss ratio points, depending on the accident year, between the initial estimate after 12 months of development and the most recent estimate at year-end 2023.

Other Liability Occurrence is not alone in experiencing development in accident years 2016 through 2019. Other Liability Claims Made has also seen almost universal adverse development in report years 2016 through 2019, with loss and DCCE ratios in report years 2018 and 2019 developing more than 13% and 8%, respectively. See Figure 2. The Commercial Auto Liability industry net loss and DCCE ratios for the 2016-2019 accident years have seen strengthening of over 10% since they were initially established. See Figure 3.

Our analysis focused on industry aggregate net results, as only net results are currently available. We suspect that direct results will display even more adverse development, because nuclear verdicts would have a greater impact on the reinsurance layer. Direct results are not currently available but will be available in a month. 

See also: Insurers' Social Inflation Problem

Figure 1: Aggregated Other Liability Occurrence Industry Data From 2016-2023 Year-End Schedule Ps

Figure 2: Aggregated Other Liability Claims Made Industry Data From 2016-2023 Year-End Schedule Ps

Figure 3: Aggregated Commercial Auto Liability Industry Data From 2016-2023 Year-End Schedule Ps

Some insurers’ press releases include comments on social inflation increasing prior accident/report years’ reserves. When commenting on 2023 financial results, Vince Tizzio, president and CEO of AXIS Capital, said, “The decisive actions we are taking this quarter address reserve development that is predominantly related to 2019 and older accident years, as current economic and social inflation trends impact the overall U.S. casualty market. We undertook a rigorous review that included an examination of trend assumptions, emerging development patterns, new industry data and current legal trends.” AXIS Capital’s president is not the only one pointing to social inflation. Everest Re’s CEO, Juan Andrade, noted that Everest Re strengthened insurance segment reserves by $392 million for the impact of social inflation on long-tail lines with a focus on the 2016-2019 accident years. While not all point to social inflation as a root cause, companies are noting latent development in the 2016-2019 accident/report years in their press releases. 

See also: What to Do About Rising Inflation?

Social inflation – the driver of latent reserve development

The large development of more seasoned reserves stems at least in part from components of social inflation such as juror sentiments, nuclear verdicts, third-party litigation funding and evolving legal tactics.

Juror sentiments and nuclear verdicts

The 2007-2008 global financial crisis damaged people's trust in corporations, and many people now want to see big corporations pay more for their actions. This has led to higher-severity claims, especially in lines of business where lawsuits can be framed as a corporation against an individual. We are defining nuclear verdicts as those with awards over $10 million. Nearly two-thirds of the nuclear verdicts in personal injury and wrongful death cases studied by the U.S. Chamber of Commerce Institute for Legal Reform over a 10-year period came from product liability (24%), auto accidents (23%) and medical liability (21%) cases. Commercial Auto Liability has experienced nuclear verdicts for many years now; however, 2021 brought a new landmark, a $1 billion wrongful death verdict against two trucking companies found to be negligent. As nuclear verdicts become increasingly prevalent, nuclear award amounts are reported by the media, making very large verdicts feel more mainstream to potential jurors, regardless of whether the plaintiff is able to collect the full verdict amount. This rise of nuclear verdicts is contributing to rising social inflation. 

Third-party litigation funding

Jury attitudes are not the only factor in the increase in nuclear verdicts. Third-party litigation funding (TPLF) also continues to drive up the cost of claims. TPLF is when an external party invests capital in the litigation process in exchange for a portion of the settlement amount. The rise of TPLF has created larger verdicts as plaintiffs are able to pursue cases with stronger funding for longer periods. Just between 2019 and 2022, the TPLF market grew by 44% in the U.S. This market growth shows no signs of slowing, which will likely compound the upward pressure on liability severities.

According to Swiss Re, TPLF-funded cases generated average internal rates of return of between 20% and 35% in 2019 and 2020. The rates of return on TPLF outperform those of risky asset classes such as venture capital and private equity. The TPLF market rose to $17 billion in 2021, with over 50% of that occurring in the U.S.

Nine states have pending legislation regarding TPLF. The pending legislation in many states is aimed at requiring disclosure of TPLF to a jury as well as consumer protections. While this is a start, it may be quite some time before we see TPLF legislation soften the impact of TPLF on claim severity. TPLF thrives in environments where the plaintiff can be seen as the little guy against a big corporation, such as Commercial Auto and Professional Liability lines. 
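As a back-of-envelope check on the growth figure above, 44% total growth between 2019 and 2022 implies roughly 13% compound annual growth. This calculation is ours, not from the cited sources, and it assumes three full years of compounding:

```python
# TPLF market grew 44% between 2019 and 2022 (three years of compounding assumed).
total_growth = 1.44
years = 3
cagr = total_growth ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.1%}")
```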

Evolving legal tactics – a defense attorney’s perspective

Inflation, both social and economic, continues to drive settlements and judgments higher. Although inflation seems to have cooled from its 2022 high, it continues to take a toll on consumer sentiment and the cost of resolving claims. Law firms, experts and vendors are still raising hourly rates to keep up with inflation, which drives the cost of defense beyond initially projected budgets. Social inflation goes hand in hand with that and is also driving up costs. As nuclear verdicts desensitize potential jurors, we have seen plaintiffs’ firms become more bullish about jury trials and the potential for excess-of-policy judgments. Plaintiffs also seem to be using inflation to increase their models of alleged damages to put pressure on pretrial settlements. These phenomena are part of the cause of a substantial increase in claim severity in professional liability lines.

In light of plaintiffs’ increased expectations, we are seeing evidence that it is harder to resolve professional liability claims, as plaintiffs seek larger monetary compensation than they have in the past. Mediators have reported that it is rare these days for complex professional liability cases to settle early, before considerable expense is incurred. Why there is a sudden divergence in case assessments is unclear, but what it means is that cases are dragging on for longer than they have in the past, especially large, eight- and nine-figure cases, where plaintiffs are refusing to settle early for fear of leaving any money on the table. This is especially true in cases involving failed or bankrupt companies, as investor and creditor losses can be substantial and lawyers and accountants are viewed as ”insurance policies” when things go wrong. Investors who have lost money in bad deals often feel entitled to compensation from professional advisers, even when they struggle to link any acts by those advisers to the ultimate losses. Further, the judiciary seems disinclined to accept strong legal defenses such as in pari delicto and grant dispositive motions. What this means is that more cases are going all the way through discovery and into trial—or at least to the courthouse steps—substantially increasing the defense spending and ultimate cost of a claim.

Furthermore, regulators seem to be affecting claim inflation, as well. According to a recent Cornerstone Research article, the number of accounting and auditing enforcement actions increased 22% in fiscal year 2023 over 2022. This is affecting publicly traded companies as well as their professional service providers, and there has been a marked increase in claim notifications involving claims, subpoenas, inquiries or other actions by regulators against professionals. Although fines and penalties from such actions are not always covered by professional liability policies, the cost of investigating and defending these claims is, and that is also increasing the overall cost of liability claims. 

What does this mean for future reserves?

The adverse development trends in these liability lines lead us to question whether we will see similar deterioration in reserves for the 2020-2023 accident years. Unfortunately, the adverse reserve development may not be limited to the 2016-2019 accident/report years. Many actuarial methods rely on the loss ratios or ultimate losses from more mature accident/report years to predict the ultimate losses on recent accident/report years, where very little is known from actual claim data. Therefore, the deterioration of the 2016-2019 reserves could mean that more recent years might also undergo a similar period of adverse development, as pricing estimates in more recent years initially relied on the 2016-2019 years.

The Other Liability Occurrence industry-weighted average booked loss and DCCE ratio at 12 months of development for the 2020 through 2023 accident years is 64%, which is slightly below where the weighted average 2016 through 2019 booked net ultimate loss and DCCE ratio started out at 12 months of development (65%) and significantly below where the 2016 through 2019 booked net ultimate loss and DCCE ratios were at year-end 2023 (72%). Likewise, Other Liability Claims Made and Commercial Auto Liability also have initially established weighted-average loss and DCCE ratios for the 2020-2023 period that are similar in magnitude to the initially established weighted-average loss and DCCE ratios from 2016-2019, which have developed significantly since their establishment. This is exhibited in Figure 4.

There have been significant rate increases over the past few years, which might mean the 2020-2023 ultimate loss ratios are not overly optimistic. However, only time will tell where loss ratios will ultimately settle.
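The reliance on mature-year loss ratios can be illustrated with a simplified Bornhuetter-Ferguson-style calculation, one standard actuarial method of the kind described above. This is a sketch with illustrative figures: the 65% and 72% a priori ratios echo the initial and developed Other Liability Occurrence averages, while the premium, paid losses and reporting pattern are assumptions of ours, not Schedule P data:

```python
def bf_ultimate(earned_premium, paid_to_date, expected_loss_ratio, pct_reported):
    """Bornhuetter-Ferguson: ultimate = paid to date + expected unreported losses."""
    expected_losses = earned_premium * expected_loss_ratio
    return paid_to_date + expected_losses * (1 - pct_reported)

premium = 100.0
paid = 10.0
pct_reported = 0.25  # long-tail line, early in development (assumed)

# A priori anchored to the initially booked 2016-2019 experience...
optimistic = bf_ultimate(premium, paid, expected_loss_ratio=0.65, pct_reported=pct_reported)
# ...versus an a priori reflecting the developed 2016-2019 ratios.
developed = bf_ultimate(premium, paid, expected_loss_ratio=0.72, pct_reported=pct_reported)

print(f"Ultimate at 65% a priori: {optimistic:.2f}")  # 58.75
print(f"Ultimate at 72% a priori: {developed:.2f}")   # 64.00
```

The point of the sketch: when the a priori loss ratio is anchored to years that later prove understated, the projected ultimate for immature years is understated in proportion to the still-unreported share of losses.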

See also: A Tipping Point for P&C Litigation 

Figure 4: Net Ultimate Loss and DCCE Ratios by Accident/Report Year at 12 months of Development and at 12/31/2023 (from 2016-2023 year-end Schedule Ps)

As of year-end 2023, accident year 2020 for Other Liability Occurrence, Other Liability Claims Made and Commercial Auto Liability has seen adverse development in the last 12 months. However, as of year-end 2023, the 2021 and subsequent net ultimate loss and DCCE ratios for Other Liability Occurrence and Other Liability Claims Made have experienced negligible adverse development. The same cannot be said of Commercial Auto Liability, which has already seen its 2021 loss and DCCE ratios deteriorate from 69% to 71% and its 2022 ultimate loss and DCCE ratios deteriorate from 70% to 72%. Figure 5 displays the development of the Other Liability Occurrence, Other Liability Claims Made and Commercial Auto Liability net ultimate loss and DCCE ratios over the past 12 months.

Figure 5: Development of Net Ultimate Loss and DCCE Ratios by Accident/Report Year Between 12/31/2022 and 12/31/2023 (from 2022 and 2023 year-end Schedule Ps)

Juror sentiments, nuclear verdicts, TPLF and evolving legal tactics have given rise to social inflation. Social inflation’s influence has been prevalent in the Other Liability Occurrence, Other Liability Claims Made and Commercial Auto Liability lines of business. In recent years, these lines of business have seen a deterioration of the 2016-2019 net ultimate loss and DCCE ratios. Only time will confirm whether more recent accident/report years will deteriorate under the pressure of social inflation as the 2016-2019 accident/report years have deteriorated. 


Brian Brown

Brian Brown is a principal and consulting actuary for Milliman.

His areas of expertise are property and casualty insurance, especially ratemaking, loss reserve analysis and actuarial appraisals for mergers and acquisitions. Brown’s clients include many of the largest insurers/reinsurers in the world.

He is a past CAS president and was Milliman’s global casualty practice director.


Katherine Pipkorn

Katherine (Katie) Pipkorn is a fellow of the Casualty Actuarial Society, a member of the American Academy of Actuaries and a consulting actuary at Milliman.

She has experience in various property and casualty lines of business, most notably personal lines, professional liability and credit risk. She is a frequent presenter at internal and external conferences.


Christopher Fredericks

Christopher Fredericks is a partner at Mendes & Mount.

His work is primarily on professional liability claims involving lawyers, accountants and directors and officers. He also represents reinsurers with respect to reinsurance programs involving those same professionals.  

Tailoring Solutions for Affinity Groups

The focus on affinity groups is gaining momentum in tandem with consumers' growing preference for highly personalized solutions. 

The "personalization economy" has had a transformative effect on every industry, including insurance. As more consumers demand customization from all their transactional experiences with favorite brands, insurance carriers are being pressed to offer unique programs and plans. Fortunately, focusing on serving affinity groups helps carriers meet and exceed member expectations and has a broad and positive ripple effect on bigger communities.  

The Rise of Affinity Groups  

Affinity groups have been a part of insurance for a long time. Yet they are gaining momentum in tandem with consumers' growing preference for highly personalized solutions.

What makes affinity group programs so valuable is the basis of their creation. In business, an affinity group is a collection of individuals who share interests, characteristics or affiliations. Consequently, an affinity group's members tend to have common challenges. Insurance packages and complementary resources designed to address those challenges can give members the exact security and protection they need.  

The effectiveness of affinity group programs is exemplified by the American Medical Association's (AMA) recent initiative that provides unique insurance options specifically for medical students. In my roles as the president of the Professional Insurance Marketing Association (PIMA) and the president of AMA Insurance, I have had the privilege of overseeing the development of various affinity group insurance programs, including this program designed to meet the unique needs of medical students.  

Why target medical students, though? The answer goes back to conversations the AMA was having through its House of Delegates, a body whose mission is to advocate and bring forth ideas for physicians and physicians in training. Though physicians had access to disability insurance through the AMA's insurance packages, medical students didn't. That left a significant gap, because becoming disabled can be financially and emotionally devastating for medical students.

According to statistics from a 2023 University of Pennsylvania study, having a disability can reduce a physician's salary by more than 20%. That's significant on its own, but for medical students shouldering student loan debt and other financial obligations, it can lead to personal and fiscal crises.   

As a result, the AMA decided to view medical students as an affinity group and find carriers that could provide a custom disability insurance product for them. It took time, but the program came to fruition. The insurance offerings included a unique student loan payoff provision and preventive tools and resources to help students stay in school. These features helped the AMA enhance the plan and make it even more practical and customized for medical students. Consequently, medical schools that opted for disability insurance were also tapping into these prevention-based wellness programs for their students.

This one-of-a-kind support for medical students had a macro impact that can't be overstated. By giving medical students tools to prevent and handle disabilities, the AMA enabled them to move forward more confidently to become practicing physicians and extend their knowledge, compassion and skills to countless patients.  

See also: How Parametric Insurance Fills in Gaps

Key Insights From Healthcare-Related Affinity Groups  

Several insights and trends arose throughout the medical student affinity group experience. The first was that personalized insurance solutions greatly appealed to the end users. Consequently, it is critical to have a deep understanding of the target affinity group audience. From this understanding, you can more easily seek out partner carriers and options.

For instance, the AMA considered the mental health aspects of becoming disabled as part of our mission. In the demanding world of healthcare, addressing and protecting physicians' well-being is vital for their ability to deliver high-quality patient care. Recognizing this crucial need, the Accreditation Council for Graduate Medical Education (ACGME) established new requirements in 2017 to prioritize and promote the well-being of residents, fellows and faculty members. The AMA began providing students and residents with preventative programs and resources.   

A second insight from the medical student affinity group was the importance of technology in insurance. Technology is changing how carriers can be found and how quotes are produced. It also makes the consumer experience more streamlined and personal, such as with mobile apps and texting capabilities. Consumers no longer have to accept cookie-cutter engagements with carriers. Those who become part of affinity groups can avail themselves of services designed and customized for them.

A final insight is that cost is still a barrier. It's hard to get organizations and even end users to pay for insurance, even when it's delivered in a personalized, affinity group-directed way. This is where more education is needed on the true context and benefits of affinity group insurance programs. Affinity group packages aren't limited to offering discounts, lower premiums or accessibility. On the contrary, they can improve the health, welfare and stability of entire communities.  

See also: Insurance's New Math

Looking Beyond Physician Affinity Groups 

The medical student disability insurance example illustrates how leveraging affinity groups can have sweeping outcomes. Yet affinity groups aren't limited to healthcare professional insurance or medicine alone; their application is more universal. Coverage for members of any affinity group can leave a lasting mark on communities around the world.

Although it can take time and energy to identify and advocate for affinity groups in the insurance realm, it can be a worthwhile venture. It can also open doors for insurance carriers to build trust with consumers who may not have been exposed to those carriers' brands or product lines. The affinity group members get the insurance to support them and their families, and the carriers get to expand their reach and influence. 

In the future, we expect to see affinity group programs evolve and become more commonplace. This exciting evolution has the potential to make inroads for individuals and families across the nation. All it takes is a willingness to think beyond the status quo and begin leaning into the power — and possibilities — of hyper-personalized insurance solutions from affinity groups.

The New Era of Underwriting

With AI handling data processing, information retrieval and automated recommendations, underwriters can reinvent their role.

Woman in Purple Blazer Smiling while Holding a Pen

AI is everywhere in insurance now, transforming traditional processes and methodologies. If you’re in the industry, you’re no doubt already experiencing its effects in a number of ways. My own work, for example, involves using AI to streamline and automate many aspects of property insurance. We use a combination of aerial, ground-level and satellite imagery analysis for insights into the physical characteristics and risk profiles of buildings, enriched further with additional pertinent insurance data such as tax records and building permits.

Generative AI is currently the most visible aspect of AI to most people. That’s because anybody can make use of it, regardless of technical competence. You just talk to it and (if all goes well) get some astonishingly useful answers. We’ll come back later to what happens when things don’t go quite so well.  

Despite the widespread applicability and visibility of generative AI across various sectors, its potential within the insurance industry remains somewhat untapped. However, with the emergence of tools facilitating the parsing, comprehension and transformation of intricate insurance documents, the landscape is shifting. With these developments, generative AI is poised to significantly influence underwriting practices, promising improvements in risk analysis, operational streamlining and overall process efficiency.

Let’s zoom out and think about the likely impacts of generative AI on underwriting in general. It’s already clear that integrating GenAI into this centuries-old process has the potential to transform it, improving risk analysis, streamlining operations and enhancing efficiency.

See also: The Promise of Continuous Underwriting

The operational perspective

Generative text solutions based on large language models are already powering new business underwriter-focused chat applications, providing instant access to comprehensive answers from underwriting manuals. These AI-driven assistants appear almost magical in their ability to interpret complex queries, synthesize relevant information and provide tailored responses. 
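To make the mechanics less magical, here is a deliberately simplified sketch of the retrieval step behind such an assistant: find the manual passages most relevant to an underwriter's question, which would then be handed to an LLM as context. The manual excerpts, the keyword-overlap scoring and the function names are all invented for illustration; production systems typically use embedding-based search rather than word overlap.

```python
# Minimal retrieval sketch for an underwriting Q&A assistant.
# Passages and scoring are illustrative, not from any real carrier.

def score(question: str, passage: str) -> int:
    """Count keywords shared between the question and a passage."""
    q_words = set(question.lower().split())
    return sum(1 for w in passage.lower().split() if w in q_words)

manual = [
    "Frame construction requires a sprinkler system above three stories.",
    "Coastal properties within one mile of shore need wind mitigation review.",
    "Vacant buildings are ineligible without a vacancy endorsement.",
]

def retrieve(question: str, passages: list[str], k: int = 1) -> list[str]:
    """Return the k passages most relevant to the question."""
    return sorted(passages, key=lambda p: score(question, p), reverse=True)[:k]

print(retrieve("Is a vacant building eligible for coverage?", manual))
```

The LLM then synthesizes a tailored answer from the retrieved passages rather than from its general training data, which is what keeps responses grounded in the carrier's own underwriting manual.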

Underwriters used to have to consult various manuals, guidelines and policy documents to do their work. Not anymore. GenAI’s automated data extraction cuts out drudgery and minimizes errors. It analyzes vast amounts of data, including applicant information, risk factors and historical data, leaving humans to focus on higher-level decision-making.

For anyone brought up on manual underwriting processes, the time savings on offer are mind-boggling. But that’s a one-dimensional view. As well as saving time, GenAI helps identify potential risks or opportunities underwriters may have missed, leading to more accurate, consistent and objective underwriting decisions.  

And beyond the evaluation and pricing of risk, it can potentially automate the generation of policy documents and contracts, multiplying time savings and ensuring consistency across policies.

We’re still not done: AI systems can monitor underwriting workflows, identify bottlenecks, and suggest process improvements. This continuous monitoring and optimization can lead to continuing efficiency gains and cost savings.

What are the challenges?

That’s the case in favor. But there are also challenges. Underwriting involves intricate rules, regulations and industry-specific knowledge that AI models will struggle to fully absorb without proper training and oversight. Concerns also remain about potential biases in the data used to train large language models (LLMs), and about the potential for unfair practices if they are not carefully monitored.

These questions about accountability and transparency are not unique to insurance. All industries are wrestling with the legal and ethical implications of GenAI. But the highly regulated nature of our sector means that insurers must carefully navigate these challenges, without losing the power of AI to enhance operational efficiency and decision-making accuracy.

That should be possible, as AI frees people to focus on more complex and high-value tasks. However, successful implementation of AI in underwriting will require careful planning, training and change management.

See also: Underwriters' Productivity Can Double

IT considerations

If AI puts new demands on people, the same is true for tech. Integrating AI into existing underwriting systems can affect infrastructure and support requirements. AI models rely heavily on data for training and inference, and while the data may be plentiful, insurers need a robust data infrastructure in place, along with the data governance and quality processes that are crucial to ensuring the accuracy and integrity of the data feeding the AI models.
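As a concrete (and hypothetical) flavor of what such data-quality processes involve, the sketch below validates policy records before they feed a model. The field names, thresholds and record shapes are assumptions made for the example, not a standard schema.

```python
# Illustrative data-quality gate of the kind data governance implies:
# flag problem records before they reach an AI model. Field names and
# plausibility thresholds are invented for this sketch.

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality issues found in one policy record."""
    issues = []
    for field in ("policy_id", "year_built", "construction_type"):
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    year = record.get("year_built")
    if isinstance(year, int) and not (1800 <= year <= 2025):
        issues.append(f"implausible year_built: {year}")
    return issues

clean = {"policy_id": "P-001", "year_built": 1985, "construction_type": "masonry"}
dirty = {"policy_id": "", "year_built": 3000, "construction_type": "frame"}

print(validate_record(clean))  # no issues found
print(validate_record(dirty))
```

In practice such gates run inside the data pipeline, so only records that pass (or are repaired) ever reach training or inference.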

Integrating with existing underwriting systems, policy administration systems and other core insurance platforms may also require new APIs, data pipelines and middleware for smooth data exchange and communication with legacy systems.

On the security front, AI systems handle sensitive customer and underwriting data, which raises a red flag. Robust cybersecurity measures and data privacy protocols are essential to protect against breaches or misuse of data and to ensure full compliance with industry regulations and data protection laws.

And with persisting skills shortages, insurers will probably need to invest in upskilling the IT workforce or partnering with external AI experts to ensure effective development, deployment and maintenance.

The underwriter of the future

With AI handling data processing, information retrieval and automated recommendations, underwriters can reinvent their role to focus on more nuanced decision-making, client advice and driving innovation. 

Ultimately, successful adoption of AI in underwriting is going to be about a best-of-both worlds approach that combines human expertise and experience with the speed and insights of AI technologies. By getting the balance right, insurers can unlock the full potential of these tools while ensuring fairness, transparency and responsible decision-making.


Jesse Canella

Jesse Canella is chief executive officer at Tensorflight.

An AI- and imaging-based insurtech focused on the commercial property industry, Tensorflight uses satellite, aerial and ground-level imagery to automate commercial property inspections and claims processing.

Previously, Canella was a non-commissioned officer in the U.S. Marine Corps, serving in the Infantry as a rifleman and squad leader during Operation Iraqi Freedom.

Changing Face of Cyber Insurance

It's no longer enough to ask if companies are following protocols to avoid cyber attacks. It's time to ask, "Are those protocols the right ones?"

Man Reclining and Looking at his Laptop

The world should be bracing for an increase in cybercrime, with the global cost of cyberattacks expected to surge over the next few years from $9.22 trillion in 2024 to $13.82 trillion by 2028, the latter equivalent to more than half of U.S. gross domestic product (GDP).
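Those two figures imply a striking compound growth rate, which a few lines of arithmetic make explicit (the dollar amounts come from the projection above; the calculation itself is just the standard CAGR formula):

```python
# Implied compound annual growth rate of global cyberattack costs,
# from $9.22 trillion (2024) to $13.82 trillion (2028): four years
# of compounding.
start, end, years = 9.22, 13.82, 4
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 10-11% per year
```

In other words, the projection assumes cybercrime costs growing far faster than global GDP for the rest of the decade.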

Over the last five years, companies have continued to increase their investment in cyber tools, yet fewer than 25% of organizations say they are "extremely confident" in their ability to respond to a cybersecurity event. This raises alarm bells for the future of cybersecurity and our collective ability to insure its risks.

In 2023, over 60,000 emails from U.S. State Department accounts were stolen when Chinese hackers breached Microsoft’s cloud-based Exchange email platform. Clorox suffered more than $350 million in damages from a 2023 cyberattack. And in December 2023, genetic-testing company 23andMe admitted that nearly seven million people’s information was accessed by threat actors.

Add UnitedHealth Group’s recent Change Healthcare breach to the list, and the pattern is clear: Operational disruption is replacing mere data loss in cybercrime, and traditional exclusion language in riders is not proving as useful in the cyber domain as it has been in the property and casualty domains. We are in a new era.

For the last five years, increasing cyber insurance premiums and more rigorous attention to the presence and maturity of key cyber risk reduction measures at companies – such as regular tabletops and implementation of controls such as the NIST Cyber Security Framework (NIST-CSF) or the CIS Critical Security Controls (CIS-CSC) – have proved sufficient for insurers to improve their underwriting and profitability. It may feel like we are in equilibrium, but we really are just in the calm before the next storm.

Why? Because, as so many times in the past, all we have been doing so far is asking, “Are you doing the right things?” We have not been asking the crucial question: “Are they effective?”

The new era in cyber risks calls for those who manage, underwrite and insure them to take all defenses to that last step: to underwrite not just on paper confirmations about the presence, maturity and tabletop practice of company controls and cyber response and restoration procedures, but on (i) how effective those measures actually are in practice and (ii) whether that effectiveness is being regularly determined, using the actual defensive tools, under realistic, severe event circumstances.

Underwriting using effectiveness criteria is called Efficacy-Based Underwriting (EBU), and this is how more underwriters can take advantage of the kinds of information an increasing number of companies can now provide.

Large swaths of companies around the world learned how to efficacy-test their controls over financial reporting in the wake of the passage of Sarbanes-Oxley. CFOs led the way. Now, in the wake of new cyber regulation (DORA in the E.U. and the new SEC cyber rules in the U.S.), CISOs are leading the way in regularly testing and demonstrating the efficacy of their cyber controls. This is the key new lever for underwriters to use as they confront this new era of operationally disruptive cyber risk.

See also: How to Build a Solid Cybersecurity Program

Contextualizing the financial risks of cybercrime 

Cyber insurance, or cyber liability insurance, tries to protect businesses and people from the financial consequences of cyber incidents. The global average cost of a data breach stands at $4.5 million, an increase of 15% over the last three years. As the cost of cybercrime balloons and cyberattacks become bigger threats to a company’s infrastructure, cyber insurance and a comprehensive view of actual and residual risk exposures in cyber are no longer a luxury.

Recent research indicates that only 19% of companies claim to have coverage for cyber events beyond $600,000, and just 55% have any form of cyber coverage at all. An even larger hurdle, however, is that ineffective underwriting models continue to lessen both businesses’ appetite for cyber insurance and insurers’ appetite to provide it.

Companies’ acute lack of insight into their own ability to withstand severe cyber incidents helps explain why boards are unsure whether to cover those cyber risks and why premiums can be so expensive. If the industry demands a data-driven, efficacy-based approach to finding where that line lies in cyber, it will give companies and insurers alike a more adequate option.

Weaknesses in legacy underwriting models

Contemporary cybersecurity underwriting remains reliant on inputs from paper-based assessments. There have been improvements in recent years, including more available data on large losses, which has enabled underwriting models to cater more accurately to industry and company characteristics. Using extensive datasets detailing escalating losses significantly enhances the precision of risk evaluations, thereby advancing comprehension of how companies can manage and alleviate cyber threats. This advancement is illustrated by the NIST Cybersecurity Framework 2.0 and the Critical Security Controls outlined by the Center for Internet Security (CIS), which have collectively refined the conventional approach.

Nevertheless, despite the implementation of more exhaustive risk management frameworks, insurance underwriting models persist in primarily using paper-based evaluations of "control maturities" and generic risk exposure models. These serve as substitutes for assessing how efficiently an organization can deploy security measures or restoration protocols during a significant cyber event.

The expanding repository of material losses underscores the limitations of relying on paper-based assessments as practical performance indicators. Recent incidents, like the contentious settlement of Merck's $1.4 billion cyber insurance claim, underscore that exclusionary provisions are inadequate solutions for the rapidly evolving nature of cyber threats, along with their increasingly diverse methods of causing financial harm to companies.

The efficacy-based underwriting model

By changing the cyber underwriting process to center on the consistently evaluated effectiveness of a company's cybersecurity measures, rather than relying solely on paper evaluations, insurers can establish the threshold necessary for accurate underwriting. Rigorous, real-time stress testing also gives insured enterprises more incentive to adopt preventive postures grounded in efficacy and proficiency, so they are ready when a real cyberattack hits.

What’s encouraging is that numerous companies in the U.S. and worldwide have been conducting efficacy testing and refining their cyber controls over the last few years. From optimizing tech stacks to subjecting systems to stress tests, these methods bolster an organization's security posture across its people, processes and technology. This strategy entails maintaining high-fidelity replicas of the organization's networks and subjecting them to regular attacks, ranging from minor to severe cyber threats, until failure occurs. This approach enables companies to verify that their teams, tools and procedures remain effective even against the most serious cyber threats. The objective is to continuously assess the effectiveness of individual components as well as the collective effectiveness of all of a company’s controls. Consequently, metrics-based efficacy testing in cybersecurity can be, and indeed already is being, implemented.
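One hedged sketch of how the results of such simulated attacks might be rolled up into a single number an underwriter could use: weight each exercised scenario by its severity and measure the share that was contained. The scenarios, severities, weights and function names below are all invented for illustration; real efficacy-based underwriting metrics would be considerably richer.

```python
# Hypothetical roll-up of cyber-range exercise results into a
# severity-weighted efficacy score. All inputs are illustrative.

def efficacy_score(results: list[dict]) -> float:
    """Severity-weighted share of simulated attacks that were contained."""
    total = sum(r["severity"] for r in results)
    contained = sum(r["severity"] for r in results if r["contained"])
    return round(contained / total, 2)

exercise = [
    {"scenario": "phishing foothold", "severity": 1, "contained": True},
    {"scenario": "ransomware spread", "severity": 3, "contained": True},
    {"scenario": "cloud exfiltration", "severity": 5, "contained": False},
]

print(efficacy_score(exercise))
```

Tracked quarter over quarter, a score like this gives the insurer a trend line of demonstrated (not merely attested) defensive capability.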

This approach can be seen in the airline industry. Flight crews regularly practice their responses to severe engine outages and hydraulic and other systems failures in high-fidelity simulations of the Airbus or Boeing planes they fly. They are allowed to fail in this environment and often do. The data collected from such exercises points out where responses are correct and goes a long way in ensuring pilots are proficient and prepared to handle such events during real-life commercial flights. This analogy demonstrates the effectiveness and necessity of testing out-of-production networks, allowing companies to understand their flaws in a simulation rather than the real world. 

See also: The Weak Point in Cyber Security

The sum of all parts

Cyber underwriters are no longer constrained to evaluating companies solely on theoretical effectiveness. Now, companies can furnish evidence of their effectiveness against a comprehensive array of the latest potentially significant cyber threats, enabling insurers to readily leverage this evidence.

For insured organizations, this means quarterly insights into how well their people, processes and technology can perform against the most severe cyber threats. The granular efficacy data collected allows them to fine-tune the performance of their defenders and their defenses, as well as to sharpen, and provide visibility into, their cyber risk exposures. In turn, companies can be rewarded with lower cyber insurance premiums.

Cyber insurers that switch their underwriting models to ones based on proven efficacy will better understand the extent to which risk and event exposures can be contained, with minimal damage and disruption. This will speed the insured organization's ability to get back to business as usual post-incident. This approach will also enable smaller companies, which are currently priced out of cyber insurance altogether, to (re)access coverage.

Efficacy-driven underwriting provides a mutually beneficial arrangement for both insurers and the insured. Insurers are able to offer lower premiums to entities that truly merit them. Consequently, businesses will be motivated to integrate cybersecurity best practices into the foundation of their operations. This eliminates the disparity between paper reports boasting effectiveness and actual severe cyberattacks revealing the contrary. Both the insured parties and their insurers will have assurance in their capacity to withstand such events in advance, leading to improved outcomes for all.


James Gerber

James Gerber is the CFO of SimSpace.  

Prior to joining SimSpace in 2022, he was the CFO of venture- and private equity-backed companies in the cybersecurity and education spaces.  During his time at the Pension Benefit Guaranty Corporation, he oversaw risk forecasting for most of the companies in the S&P 500, and he managed an institutional investment portfolio with over $50 billion of assets.

Gerber has a bachelor of science degree in mechanical and aerospace engineering from Princeton University and an M.B.A. from the Harvard Business School. He began his career as an electronics and communication systems engineer and later founded the Automated Systems Division of Morrison Knudsen.