
The Blind Spots in Catastrophe Models

Traditional catastrophe models fall short as climate change intensifies natural disaster risks, demanding smarter approaches to assessment.


To outsmart the uncertainties posed by complex and interrelated physical climate risks, organizations need to consider whether their current approach to modeling and assessing natural catastrophe risk is fit for purpose.

As the world heads toward global warming potentially beyond two degrees Celsius by 2050, we're already seeing greater volatility in weather-related natural catastrophe events, as well as an increased impact of chronic hazards, such as heat and cold stress. This is leading to greater uncertainty in economic losses and insurability.

By better understanding and quantifying the true cost implications of climate-amplified natural catastrophe risks, organizations can better prepare for them. This may mean checking that they are not over-reliant on, or misinterpreting, catastrophe risk models, to avoid gaps in their protection.

Traditional models leaving businesses exposed

Some traditional models for quantifying natural catastrophe risk are leading businesses to miscalculate or underestimate their exposure to catastrophic events. Because of data gaps and functionality limitations, traditional natural catastrophe modeling typically struggles to capture the wider financial impact of external value chain interdependencies and operational disruption.

For example, during the 2021 floods in Western Europe, water utility companies authorized water management interventions on several major rivers. As part of the emergency response procedures for severe, low-likelihood events, these interventions prevented catastrophic dam failures but increased the severity of flooding further downstream. We understand some private sector organizations did not factor these amplifying effects into their risk management and risk financing strategies, having based their decision-making predominantly on theoretical models and their own operational resilience.

Such cases illustrate the importance of moving away from relying solely on theoretical models and instead combining "what-if" severe event scenario stress testing, risk engineering and theoretical modeling that looks beyond organizational boundaries. Organizations should also be prepared to review the publicly available emergency response procedures of utility companies to enhance the modeled loss perspectives for flood risk.

Getting these wider perspectives can enhance a company's ability to understand, quantify and manage the impact of severe events that are becoming more frequent due to climate change. This may also involve revisiting recent and historic natural catastrophe events, claims histories and lessons learned to better evaluate and scrutinize the theoretical models and their underlying uncertainties, potentially in collaboration with academic or other external partners where an organization lacks the required skill sets internally.

Smarter modeling means harder-working risk spending

Outsmarting natural catastrophe exposures exacerbated by climate change isn't just about closing protection gaps. An evolved natural catastrophe modeling approach that is bespoke to an organization and better reflects the potential impact of different climate scenarios puts decision makers in the driver's seat on what to spend on protection. By moving from a single natural catastrophe model to a more nuanced, multi-method approach, organizations can optimize their risk spending.

That's because a wider, clearer view of a company's risks will clarify what does and doesn't represent good value in insurance markets. Organizations will have better insight into questions like: Is my risk better or worse than my peers', and why? They will also know how to better attract capital to their risk.

In a fragile insurance environment, evolving a company's modeling approach puts it in a much firmer position than organizations that lack a clear, data-driven view of their risks.

Secondary perils and the amplifying effects of climate change

A 'secondary' peril is a natural hazard that typically leads to small or mid-size damage compared with primary perils such as earthquakes or hurricanes. However, secondary perils, such as landslides following heavy rains or flooding, can often be as damaging as the primary events, meaning organizations need to factor them into how they assess their natural catastrophe risk.

In fact, we're seeing more organizations needing to address how perils such as landslides can be triggered by primary events like earthquakes, floods and tsunamis. Such hazards introduce additional layers of risk that traditional catastrophe models often don't capture.

For instance, a primary event like heavy rainfall may not only cause immediate structural damage but also trigger landslides that block access routes, disrupt supply chains and prolong business interruptions. This can further damage critical infrastructure and hinder recovery efforts.

That's why strengthening physical climate risk resilience means incorporating scenario testing and stress testing beyond traditional catastrophe modeling to gain that crucial, more comprehensive view of a company's risk exposures, including the potential impacts of secondary perils.

By understanding these compounded threats, organizations can better prepare and build resilience, ensuring they can maintain operations even when faced with complex and connected challenges.

To get ahead of natural catastrophe and physical climate risks today, scenario testing has a valuable role to play. By combining traditional catastrophe and climate analysis with additional stress testing, catastrophe risk engineering and scenario testing, an organization can get a more robust risk management view based on a deeper understanding of their risk profile and impacts across their value chain.

In some cases, this can lead to the business prioritizing non-insurance risk mitigation controls and action plans such as business continuity plans, recovery plans and crisis management readiness, rather than relying on traditional insurance, to improve resilience.

Advanced modeling approaches can also help inform conversations with insurance markets, help address coverage gaps and optimize decisions on risk financing and transfer. This could lead to alternative risk transfer and parametric solutions, depending on a company's risk tolerance, particularly when sufficient capacity is a challenge.

Risk managers looking to take a more strategic role can also leverage methodologies that quantify the current and future value of their company's assets and explore how investors view the organization. Quantifying the financial impact of climate-related risks in this way can enable a better response to climate risks and opportunities (while also potentially meeting certain climate disclosure requirements) and inform strategic conversations on the business's future ability to achieve targets, realize organic growth and access capital.

Pet Insurance Business Thrives

Pet insurance growth drives inland marine market changes as carriers begin reporting it separately in 2024.


For the first time in 2024, insurance company quarterly and annual financial statements started separating pet insurance from the rest of the inland marine premium and loss data. Because several pet insurers offer no other additional coverages that are considered inland marine, prior statement data for these identified insurers can provide insight on pet insurance.

Pet owners have been dramatically increasing purchases of pet health insurance over the last several years. Through the first three quarters of 2024, pet insurance premium came to just over $3.4 billion, meaning the full-year total could be over $4 billion—possibly even $4.5 billion. Based on reporting from the North American Pet Health Insurance Association (NAPHIA), pet health insurance premium more than doubled in the five-year period to 2023, to $3.9 billion from $1.6 billion, with at least 20% growth per year. 
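As a sanity check on the arithmetic above, the annualization and growth figures can be reproduced in a few lines. The fourth-quarter assumption (that it matches the average of the first three) is ours for illustration, not NAPHIA's:

```python
# Illustrative run-rate check of the pet insurance premium figures cited above.
q1_q3_premium = 3.4e9  # direct premium through first three quarters of 2024

# Naive annualization: assume Q4 matches the average of the first three quarters.
naive_full_year = q1_q3_premium * 4 / 3

# Five-year growth check: $1.6B (2018) to $3.9B (2023), per NAPHIA.
cagr = (3.9 / 1.6) ** (1 / 5) - 1

print(f"Naive full-year estimate: ${naive_full_year / 1e9:.2f}B")
print(f"Implied 5-year compound annual growth rate: {cagr:.1%}")
```

The naive annualization lands at roughly $4.5 billion, the upper end of the range cited above, and the implied compound growth rate is just under 20% per year.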

The loss ratio for pet insurance through the first nine months of 2024 was higher than for the rest of inland marine insurance, due possibly to growing demand for insurance to help cover rising veterinary costs. Including pet insurance, the inland marine line's loss ratio for the period remained in the post-pandemic 44%-to-49% range.

The top 10 pet insurers account for 90% of the pet insurance market, making it highly concentrated. At the group level, the market is even more concentrated, because National Casualty (No. 2) and Veterinary Pet Insurance (No. 9) are both part of the Nationwide Property & Casualty Group. American Pet Insurance is the No. 1 U.S. pet insurer, with $819 million in direct premium through third-quarter 2024.

Top 10 Pet Insurance Writers, by 3Q2024 DPW

Five of the top 10 write no other inland marine coverage, and for two other companies in the top 10, pet insurance accounted for more than 97% of the inland marine premiums. This concentration provides credible insight into the historical underwriting performance of their pet health coverage. Direct combined ratios are mixed, with close to an equal number on either side of the 100.0 breakeven point.

Inland Marine Outlook

For the rest of the inland marine market, results have been very consistent over the past 10 years, with 2020 being an anomaly owing to the pandemic. Event cancellation and travel insurance are two of the catch-all classes captured under the inland marine line of business, and the business line has its roots covering goods in transit, a good measure of which is the U.S. Freight Transportation Services Index.

A rising index indicates that more goods are in transit, which suggests a growing need for insurance to cover those goods. The index has fluctuated moderately the last five years, except for the severe drop early in the COVID pandemic and the subsequent rebound when operations restarted. Even if the amount of goods in transit remains flat, sustained inflationary pressures on the value of those goods could result in higher premium.

Transportation Security Administration checkpoints at airports also can indicate the amount of travel in the U.S. and provide insight into premium volume specific to trip cancellation. During 2020, when travel was limited due to COVID restrictions, TSA throughput dropped substantially. Total inland marine direct premium written (DPW) thus experienced its only decline since at least 2011, and possibly even before that. For the first time in 2024, TSA throughput exceeded pre-pandemic levels, signaling that travel is back to normal and supporting the likelihood that trip cancellation coverage will increase for the year.

Overall, inland marine remains profitable, outperforming the entire property/casualty insurance industry by a wide margin and doing so with steady growth, buoyed by growing construction and increasing travel. The line's direct loss ratio was more than 20 points better than that of the property/casualty industry in 2023 and has been worse than the P/C industry's only once in the past 14 years—in 2020, when contingency claims (i.e., event and travel cancellations) spiked due to the pandemic shutdowns.

Given that pet insurance accounts for about 10% of inland marine insurance and pet insurance results have been only marginally profitable, inland marine results excluding pet insurance are even more favorable.

Rethinking Data Management in Commercial Lines

Automated data ingestion transforms commercial insurance operations, driving efficiency and revenue growth while laying the groundwork for AI advancement.


The digitization of the insurance submission, quoting, and claims processes is prevalent in personal insurance, but commercial lines have been slower to adopt these capabilities because of the complexity of the risks and the data required for underwriting. Those who master the complexities and transform these processes will crack the code for faster growth in the highly competitive commercial lines of insurance.

There is growing demand in the market from both retail and wholesale brokers for streamlined interactions with carriers that result in timely and accurate quotes, policies issued, and claims settled. In response, large P&C carriers such as Nationwide, Markel, and Arch Insurance have adopted digital capabilities wherein a user can upload unstructured documents to pre-populate a complex submission, generate a quote on flow business, or adjudicate a claim in a matter of seconds. In markets where speed matters, such as the excess and surplus (E&S) market, this capability can be a differentiator and give competitive advantage to carriers that prioritize data ingestion.

Automating the process of ingesting data has gained traction in the commercial insurance industry and is deemed a keystone capability in carriers' and brokers' digital evolution. Additionally, as GenAI capabilities are increasingly developed to assist underwriters and claims adjusters in assessing risks and claims, the breadth and accuracy of the data ingested will correlate strongly with the efficacy of the GenAI tool and become increasingly valuable.

Understanding Data Ingestion: Challenges

Data ingestion refers to the process of extracting, validating, curating, and processing data from various third-party sources into a system of record or centralized data repositories/data lakes. In the context of the insurance industry, where massive amounts of data are generated daily, the manual handling of such a task can be overwhelming, error-prone, and time-consuming. This data can span an array of use cases from risk assessment, quoting and issuing insurance policies, to filing/settling claims, or even booking premium from insurance policies that have been delegated to third parties.

The insurance industry has had challenges getting to this level, starting with the diversity of data types, including structured and unstructured data from sources such as policy documents, bordereaux forms, claims forms, and insured statements of values.

  • Structured data typically refers to system/database data that is organized in a specific and predefined manner, typically following a schema or a data model, or "semi-structured" document formats such as CSV, JSON, or XML. Structures and definitions are typically pre-determined between the two parties exchanging the data for ease of processing.
  • Unstructured data can take the form of emails (e.g. broker submission), Word/PDF documents (insured statement of values), images (e.g., car collision), financial documents (e.g., delegated premium), etc. and pose a greater challenge for organizing, processing, and synthesizing meaningful insights. New technologies are making it easier to unlock the potential within this enormous trove of untapped insights.

The sheer volume of data generated in the insurance sector is staggering—efficiently handling and processing this data is a daunting task, burdening staff whose time would be better spent on other activities. Additionally, getting to sufficient data velocity is an issue, as real-time data processing and speed to market are crucial to driving premium for commercial insurers. Processing claims in a timely manner can affect client retention and revenue from recurring business. Finally, ensuring data quality—the accuracy and reliability of data—is paramount for optimizing commercial insurers' portfolios and protecting their bottom line. Manual data entry, which even some major carriers still practice, is error-prone and can lead to inaccurate insights and decisions. Automating this process can address key challenges by streamlining the collection and integration of data from diverse sources.
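As a rough illustration of what such automation involves, here is a minimal extract-validate-quarantine sketch in Python. The field names, schema, and validation rules are hypothetical assumptions for illustration, not any carrier's actual requirements:

```python
import csv
import io
import json

# Hypothetical required fields for a commercial submission record.
REQUIRED_FIELDS = {"insured_name", "total_insured_value", "state"}

def extract(raw: str, fmt: str) -> list[dict]:
    """Extract records from a structured or semi-structured payload."""
    if fmt == "csv":
        return list(csv.DictReader(io.StringIO(raw)))
    if fmt == "json":
        data = json.loads(raw)
        return data if isinstance(data, list) else [data]
    raise ValueError(f"unsupported format: {fmt}")

def validate(record: dict) -> list[str]:
    """Return a list of data-quality errors for one record."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    try:
        if float(record.get("total_insured_value", 0)) <= 0:
            errors.append("total_insured_value must be positive")
    except ValueError:
        errors.append("total_insured_value is not numeric")
    return errors

def ingest(raw: str, fmt: str) -> tuple[list[dict], list[dict]]:
    """Split extracted records into clean and quarantined sets."""
    clean, quarantined = [], []
    for rec in extract(raw, fmt):
        errs = validate(rec)
        (quarantined if errs else clean).append({**rec, "errors": errs})
    return clean, quarantined

# One well-formed row and one with a blank insured value (illustrative data).
csv_payload = "insured_name,total_insured_value,state\nAcme Co,1500000,TX\nBeta LLC,,CA\n"
clean, quarantined = ingest(csv_payload, "csv")
print(len(clean), len(quarantined))
```

The design point is that validation failures route records to a quarantine for human review rather than silently entering downstream systems, which is where manual re-keying errors typically originate.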

Digital Maturity and the Advantages of Automated Data Ingestion

The digital maturity for deploying a data ingestion solution in the insurance industry typically progresses through several stages as the organization evolves its operational and IT capabilities. These include ad hoc data ingestion, basic automation, standardization and optimization, real-time data ingestion, advanced analytics, AI integration, and predictive and descriptive analytics.

A cornerstone capability to build into each of these ingestion maturity phases is natural language processing (NLP). NLP plays a critical role in data ingestion by helping to extract, understand, and process unstructured textual data. In the commercial insurance industry, NLP enables policy/claims document parsing, analyzing exposure documents for underwriting assessment, fraud detection, and more.
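To make the idea concrete, the sketch below is a toy stand-in for the kind of field extraction NLP performs on a broker submission email. It uses simple regular expressions with hypothetical field names; a production system would rely on trained NLP/NER models rather than patterns like these:

```python
import re

# Hypothetical unstructured broker submission email (illustrative only).
EMAIL = """\
Hi team, please quote the attached risk.
Insured: Acme Manufacturing
TIV: $12,500,000
Effective date: 2025-07-01
"""

# Illustrative patterns; real submissions vary far too much for regexes alone.
PATTERNS = {
    "insured": re.compile(r"Insured:\s*(.+)"),
    "tiv": re.compile(r"TIV:\s*\$([\d,]+)"),
    "effective_date": re.compile(r"Effective date:\s*(\d{4}-\d{2}-\d{2})"),
}

def parse_submission(text: str) -> dict:
    """Extract key underwriting fields; missing fields map to None."""
    out = {}
    for field, pattern in PATTERNS.items():
        m = pattern.search(text)
        out[field] = m.group(1).strip() if m else None
    if out["tiv"]:
        out["tiv"] = int(out["tiv"].replace(",", ""))  # normalize to an integer
    return out

print(parse_submission(EMAIL))
```

The output is a structured record that can pre-populate a submission, which is precisely the gap between an inbox full of emails and a quotable risk.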

In the final stage of digital maturity, the organization is able to leverage fully ingested, curated, and enriched data, not only for descriptive and diagnostic analytics but also for predictive and prescriptive purposes. Advanced machine learning models can be trained on ingested data to forecast trends, identify potential errors or risks, and recommend optimal remediation actions. Eventually, the data ingestion approach becomes an integral part of the organization's decision-making process, driving business growth and profitability and lowering expenses.

Different insurance carriers, brokers, and intermediaries are at different stages of this data ingestion maturity curve, and some insurtechs provide unique capabilities to achieve such maturity. As these capabilities eventually come to be a necessary part of effective business practices, it is imperative for commercial insurers to make investment decisions to remain competitive in the marketplace.

Automated data ingestion solutions bring a number of advantages to the insurance industry and are a necessary prerequisite for the broad set of data needed for GenAI capabilities in the future. Having a "generative assistant capability" requires that an insurer can ingest the appropriate data and feed it into that capability.

Other benefits in the here and now include time-saving efficiencies that allow underwriters, operations, and claims professionals to focus on higher-value tasks; greater accuracy, reducing the risk of human error; and real-time processing that allows for up-to-the-minute insights for better decision-making, fraud detection, and customer service. In terms of adoption, automated data ingestion solutions can seamlessly integrate data from a variety of sources, including legacy systems, external databases, and IoT devices.

Once the foundation of a data ingestion tool is established, insurance organizations can scale this solution across their various geographies and product offerings, particularly in high-volume insurance programs. As the volume of data grows, machine learning kicks in to make automated solutions more robust and able to handle the increased data loads, ensuring that insurers can adapt to changing business requirements.

The sustained success of deploying an automated data ingestion solution is limited in the absence of data governance. Data governance is a vital protocol for enabling data consistency, accessibility, quality, and flow through the organization, while also delivering compliance for industries like insurance. In the context of deploying automated data ingestion solutions in the insurance industry, data governance becomes crucial for several reasons, such as:

  1. Ensuring data quality to support accurate underwriting/claims decision-making
  2. Abiding by state-by-state data privacy and security laws to avoid reputational risk
  3. Managing potential risks of data breaches or cyber-attacks
  4. Ensuring business continuity/data standardization for interoperability across the organization. 

Without a data governance framework in place, it becomes increasingly difficult for an insurance organization to effectively manage and leverage its data to drive business value, mitigate risks, and maintain a competitive edge in the dynamic insurance marketplace.

Case Study: Data Ingestion Grows Premium Revenue

A consulting firm worked with a leading P&C insurance carrier in the U.S. that had extremely long onboarding times—previously, it could take them six to 12 months to onboard one new customer, due to their manual data ingestion process and onerous data requirements. This sub-par data system inhibited their ability to drive incremental revenue, adopt standardized data protocols, and gain visibility into their portfolio. It also posed various operational, IT, and downstream data challenges to the organization.

In response, the company introduced a tool to automate their data ingestion process. Soon after implementing it, the carrier was able to automate manual, error-prone tasks associated with the extraction, mapping, curation, and piping of the data into downstream systems. Automating the management of this data served to optimize the carrier's operational processes.

Further, the carrier was able to grow premium revenue because it was able to onboard subsequent customers more rapidly. This led to a stronger reputation in the market, attracting additional business. The company's loss ratios were enhanced by better portfolio transparency, gained from the improved data being leveraged for actuarial analyses and risk assessment. The associated benefits with deploying this approach to data ingestion significantly outweighed the cost—$5.2 million—generating a net present value of $20.4 million for the company over the span of five years. Possibly even more important in the long term, the initiative established a foundation for revolutionizing the company's digital ecosystem.
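The NPV figure can be made concrete with a standard discounted cash flow calculation. The annual benefit and discount rate below are assumptions chosen to illustrate the mechanics and roughly reproduce the reported result; they are not figures from the engagement:

```python
def npv(rate: float, upfront_cost: float, annual_benefits: list[float]) -> float:
    """Net present value: discounted future benefits minus the upfront cost."""
    return sum(
        b / (1 + rate) ** t for t, b in enumerate(annual_benefits, start=1)
    ) - upfront_cost

# Hypothetical inputs: a $5.2M upfront cost (from the case study) plus an
# assumed steady $6.75M annual benefit discounted at 10% over five years.
result = npv(0.10, 5.2e6, [6.75e6] * 5)
print(f"NPV: ${result / 1e6:.1f}M")
```

With those assumed inputs, the calculation lands near the $20.4 million NPV reported for the engagement, showing how a multiple of the upfront cost can emerge from relatively modest recurring benefits once they compound over several years.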

Automating Can Be Transformational

The adoption of automated data ingestion solutions represents a revolutionary shift in the insurance industry. By harnessing the power of automation, insurers can overcome the challenges posed by the increasing volume, variety, and velocity of data. The benefits extend beyond operational efficiency to improved decision-making, enhanced customer experience, and a more competitive position in the market. As the insurance landscape continues to evolve, embracing automated data ingestion is not just a technological choice – it's a strategic imperative for success.


Brian Nordyke


Brian Nordyke is a vice president in the financial services practice at SSA, a global management consulting firm.

He leads teams as an engagement manager in areas such as organizational and operational model redesign, cost-to-serve and market profitability analysis, consolidation and relocation strategies and portfolio optimization and resource allocation. 


Billy Jernigan


Billy Jernigan is managing director, leading insurance engagements, and associate director in the financial services practice of SSA & Co., a global management consulting firm. 

Trump Tariff Uncertainty Will Whack Insurers

The "will-he-or-won't-he" debate has created an "uncertainty economy" that will raise costs for insurers no matter what Trump ultimately decides.


Just when it seemed that auto insurers had caught up with the surge in car prices that followed COVID-19's disruptions, what I'm calling "the uncertainty economy" has tossed a hand grenade into the industry. Even if everything about U.S. tariff policy became crystal clear tomorrow — and it won't — major damage has already been done. 

The president's threats of double-digit tariffs on all imports, followed by all the waffling on what he'll do and when, are freezing the world of business. What to invest, where to invest, whom to hire, where to hire, what geographies to sell into: All those decisions depend on the rules of the road, and nobody knows what they are.

Trump could, of course, clear up the situation by disavowing his bold tariff plans entirely. The stock and bond markets are certainly signaling that he should. But backing away from tariffs would be an unfathomable loss of face.

Anything short of a total repudiation of a transition to a tariff-based economy will leave us in this uncertainty economy. Prices for cars and parts will float higher even before underlying costs increase, simply because of the possibility that Trump's tariffs could make prices soar. Prices for lumber and other imported supplies that are major factors in home insurers' replacement costs will climb, too. Insurers will face losses on policies currently in force, and already restless policyholders will be shocked when they see what a new policy or a renewal will cost them. 

And those are just the disruptions happening in the near term. If Trump follows through on the extreme form of his tariff plan that he often broadcasts, he will reorder the global economic system that has existed for the past century, in the process disrupting for years the supply chains that insurers rely on. 

To sort through all the variables here, I sat down on Friday with Michel Leonard, the chief economist at the Insurance Information Institute.

Michel says the current situation should have some parallels with the COVID-19 pandemic, given the effect it had on supply chains, especially for autos. 

"For motor vehicles, we will inevitably see a significant drop in underlying growth, even if tariffs are suspended indefinitely, because the uncertainty is already present," Michel said. "During COVID, replacement costs for motor vehicles — both personal and commercial — rose around 60%, largely due to used auto prices. I could see this happening again, following a similar timeline.

"Ultimately, I believe we'll... end up with a cosmetic renegotiation — remember, the current North American free trade agreement was negotiated by President Trump. The real question is timing — whether it takes one month, three months, or six months. Insurance underwriters and industry professionals should be prepared for increases that could approach those COVID-era numbers for motor vehicles, the longer this situation persists.

"And these double-digit increases in tariffs on lumber, auto parts and other materials don't just mean higher replacement costs – they mean many materials aren't available through the supply chain."

Michel said he remains an optimist about the economy, in general — though, given how fast things have been changing, we noted for this version of our quarterly chats that we were speaking at 1pm EST on Friday, March 7. 

"As of this first quarter, I believe we still have enough GDP momentum to weather what we're seeing now, even if these conditions continue for a full quarter," he said. "However, the risk of GDP contraction rather than growth is certainly present.... My current concern is market sentiment."

He sees inflation "hovering between 2.5% and slightly above 3%" but says the Fed won't be able to do much about inflation driven by tariffs. "Higher interest rates can drive down demand," Michel says. "However, when inflation is driven by tariffs or other factors related to consumption issues, raising interest rates doesn't work."

He cautions that other countries seem to be taking Trump's threats much more seriously and reacting much more negatively than they did during Trump's first term, even though much of his language about trading partners is similar. 

"I've been shocked by how seriously other countries are taking these developments on protectionism," Michel said. 

When I asked if he saw any historical parallels to Trump's attempt to reverse a century of increasingly free trade, Michel said:

"What we're facing is potentially comparable to Margaret Thatcher's first two years as prime minister in the U.K. [which began in 1979]. When Thatcher came in, she cut government spending, privatized industries, and moved extremely rapidly. She was actually on track to lose her position until the Falklands War came about and allowed her to recover politically. During this period, the U.K. experienced one of the deepest recessions since World War II.

"That's the kind of economic pain we could be talking about here, but potentially on an even larger scale because it's the U.S. The agenda we're seeing now is even more transformative."

He says there is the potential for surging unemployment and a major market correction. 

"I haven’t given up," Michel says. "I’m still an optimist. But the potential implications go far beyond simple replacement costs, and that's what makes this situation particularly challenging."

I'll add that I believe Trump will have to back off his tariff plan. He just doesn't have the support for it at any level outside of the circle he controls in Washington, DC. 

His rich backers during the campaign certainly heard him talking about putting tariffs on every country, but they say they thought he was either blustering or was simply staking out an extreme position to gain an edge in negotiating. 

Right-wing economists don't support him. The Wall Street Journal editorial page has run headlines about Trump's "Dumb Tariffs" and "Dumbest Possible Tariffs." Even Stephen Moore, a longtime ally whom Trump once nominated for a seat on the Fed, said recently that the tariffs aren't a good idea for now.

Polls show that the public at large mostly dislikes tariffs... and that's just at the theoretical level. To the extent that tariffs are imposed, they will raise prices and cause supply disruptions and bring the costs home to people. Trump's supporters argue that tariffs will encourage manufacturers to move production to the U.S., and that's surely true, at least to an extent, but it takes an awful lot longer to build a plant and staff it up than it does to raise a price. Besides, the U.S. isn't the only country that can raise tariffs; the U.S. will lose overseas markets as other countries retaliate. In any case, I don't see how Trump can sustain support for tariffs for many months or even quarters while waiting for any benefits to kick in. 

He's certainly winning the public relations battle at the moment. He's benefiting from the normal surge of enthusiasm from supporters in the early days of a term. He's also unleashed a barrage of appearances on television, and he's benefited from the stunning pace of activity by Elon Musk and DOGE to cut government programs that Trump supporters dislike. But I think that PR wave is cresting. 

DOGE has had to back off on many of its cost-cutting claims. Cabinet members are pushing back on Musk's slash-and-burn tactics when their departments are involved. Judges are ruling in some cases that Musk has overstepped the bounds set by the Constitution. Musk himself has lost some of the Iron Man mystique now that Tesla, the main source of his wealth, has seen its stock price fall 50% since its post-election peak. And, increasingly, we'll all see what the DOGE cuts do to service at federal agencies, to recipients of the aid that is no longer being provided, to the tens of thousands or hundreds of thousands who have been fired (many of them Trump voters) and so on. 

If someone doesn't pay their mortgage one month, they may save a few thousand dollars, but that's just the first part of the story. So far, we've just seen claims about the DOGE savings. Some will be welcomed, at least by Trump voters, but some will not. There is another part of the story coming. 

And, of course, the stock market has been plummeting. The Dow Jones Industrial Average is down 2,700 points since Feb. 19, or 9.4%, including a nearly 900-point drop on Monday, almost entirely because of the uncertainty about Trump's trade war and the related possibility of a recession. The market is maybe the most important point, because Trump seems to care deeply about how the stock market reacts to him.  

But how does this all end? I simply don't know. I'm quite sure the trade war isn't sustainable politically, but I don't see how Trump can back away. 

We'll just have to wait and see. In the meantime, we'll have to keep swimming in all the uncertainty.

Cheers,

Paul 

 

Strategies for Meaningful Healthcare Reform

Healthcare reform's cost burden shifts to employer plans, driving the need for innovative solutions on pricing and risk allocation.


Fifteen years after landmark health reforms were introduced under the Affordable Care Act, the financial strain on employer-sponsored healthcare plans remains significant. Reforms primarily expanded coverage through taxpayer-subsidized programs like Medicaid and public exchanges. Those actions shifted costs both directly to taxpayers and indirectly to participants in employer-sponsored plans—leading to increased premiums, higher deductibles and growing employer burdens.

Looking ahead, sustainable solutions must address persistent inefficiencies and create a more accurate distribution of healthcare costs. By embracing innovative approaches such as reference-based pricing (RBP) with participant representation, rethinking risk allocation with safeguards against excessive out-of-network charges and balance billing, employers can reduce costs while maintaining quality coverage.

The Persistent Challenges of Health Reform

Health reform aimed to increase access and affordability. However, employer-sponsored plans—covering more than 160 million Americans—bear an ever-increasing share of healthcare costs incurred elsewhere under taxpayer-subsidized programs:

  • Premium Increases: Employer-sponsored plan premiums surged over 50% in the last 12 years.
  • Higher Deductibles: The share of plans with deductibles exceeding $2,000 more than tripled, from 10% to 32%.
  • Cost Shifting: Covered charges for employer-sponsored plans average over 250% of Medicare rates for identical services, by the same provider, at the same facility.

Meanwhile, publicly subsidized coverage expanded rapidly. Medicaid enrollment increased from 54 million in 2010 to 91 million in 2022, a 68% rise. Public exchange enrollment now stands at 24 million, with over 90% receiving taxpayer subsidies.

While these programs helped reduce the uninsured rate from 48 million to 25 million, they imposed new burdens on taxpayers and on private-sector employers and their workers, contributing to annual deficits and the $36 trillion national debt. Almost all of the cost of providing access through the exchanges and Medicaid was shifted to employers, workers and taxpayers—today and tomorrow.

Rethinking Health Reform: Toward a Sustainable Model

True reform requires addressing inefficiencies and creating a balanced framework that allocates risk more equitably among individuals, employers and society. Below are key areas of opportunity:

1. Transparency and Reference-Based Pricing

One of the most pressing issues is the lack of pricing transparency, which enables medical overbilling and excessive charges. Employer-sponsored plans often pay inflated rates because of opaque pricing structures. RBP offers a promising solution:

  • How It Works: RBP sets reimbursement rates for medical services based on a reasonable percentage (e.g., 120%-150%) of Medicare rates.
  • Benefits: Employers can reduce costs by aligning pricing with those rates, while participants benefit from lower out-of-pocket expenses.
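As a toy illustration of the RBP mechanics described above, the allowed amount is simply a multiple of the Medicare rate. The 120%-150% band comes from the text; the function name and the 140% default are hypothetical:

```python
def rbp_allowed_amount(medicare_rate: float, multiplier: float = 1.4) -> float:
    """Reference-based price: a fixed multiple of the Medicare rate.

    The 120%-150% band comes from the article; the 1.4 default and this
    function are illustrative, not a real plan design.
    """
    if not 1.2 <= multiplier <= 1.5:
        raise ValueError("multiplier outside the illustrative 120%-150% band")
    return round(medicare_rate * multiplier, 2)

# A procedure Medicare prices at $1,000, reimbursed at 140% of Medicare:
print(rbp_allowed_amount(1000.00))  # 1400.0
```

Anchoring reimbursement to a published benchmark like this is what makes the price auditable by participants and TPAs alike.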

To succeed, RBP requires robust participant advocacy to help individuals navigate disputes with providers. Employers should partner with third-party administrators (TPAs) and auditors to ensure accurate pricing and billing, safeguarding participants from excessive out-of-network charges and balance billing. This empowers participants while reducing unnecessary costs.

2. Innovative HSA-Capable Coverage Options

High-deductible health plans (HDHPs) paired with Health Savings Accounts (HSAs) are another avenue for cost control. HSAs encourage consumerism in healthcare by giving participants financial responsibility for their choices:

  • Consumer Engagement: Participants with HSAs are more likely to shop for cost-effective services, reducing overall spending.
  • Tax Advantages: Contributions to HSAs are tax-free, and funds roll over year-to-year, helping participants save for future medical expenses.

"High deductible" is actually a misnomer, because the required minimum deductible to be eligible to contribute to HSAs is less than today's average deductible for employer-sponsored coverage.

Employers can enhance adoption and worker preparation for out-of-pocket expenses by offering HSA matching contributions, integrating HSAs with wellness programs and providing tools to help employees make informed decisions about healthcare utilization.

3. Allocating Risk Equitably

A sustainable healthcare system must balance individual responsibility with societal safety nets:

  • Individual Risk: Individuals should bear financial responsibility for healthcare costs tied to preventable conditions or risky behaviors. This provides incentives for wellness and preventive care.
  • Societal Risk: For catastrophic expenses (e.g., costs exceeding $25,000 a year), society should assume responsibility through broad-based stop-loss or reinsurance mechanisms.

This model would not only promote healthier behaviors but also improve financial resiliency when managing anticipated and unforeseen medical expenses.
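The two-layer split described above can be sketched in a few lines. The $25,000 attachment point comes from the text; the function and its structure are a simplified illustration, not a full stop-loss or reinsurance treaty:

```python
def allocate_claim(annual_claims: float, attachment: float = 25_000.0) -> tuple[float, float]:
    """Split a participant's annual claims between the individual/plan
    layer (below the attachment point) and a societal stop-loss or
    reinsurance layer (above it).

    The $25,000 threshold comes from the article; the two-layer split
    is a simplified sketch.
    """
    retained = min(annual_claims, attachment)   # borne by individual/plan
    reinsured = max(annual_claims - attachment, 0.0)  # societal layer
    return retained, reinsured

print(allocate_claim(80_000.0))  # (25000.0, 55000.0)
print(allocate_claim(12_000.0))  # (12000.0, 0.0)
```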

Challenges From Cost Shifting and Policy Decisions

Despite well-intentioned reforms, policy decisions have unintentionally worsened cost disparities in healthcare. Legislation such as the Inflation Reduction Act will increasingly shift prescription drug costs from Medicare beneficiaries to employer-sponsored plans.

Additionally, significant gaps in reimbursement rates between Medicare and Medicaid exacerbate these challenges. Medicare rates are approximately 40% higher than Medicaid rates, while employer-sponsored plans often pay as much as 300% of Medicaid rates for identical services, disproportionately affecting employers and employees alike.

Reforms have successfully increased access by reducing the cost of coverage for those enrolled in taxpayer-subsidized programs. Meanwhile, new mandates, taxes and other requirements have increased the cost of employer-sponsored plans. Rising premiums and deductibles have eroded wage gains, leaving many workers without meaningful financial relief.

Empowering Participants Through Behavioral Economics

Behavioral economics provides powerful tools to enhance decision-making and reduce inefficiencies in healthcare. One such approach is default enrollment, including both enrollment in coverage and enrollment in and contributions to a Health Savings Account.

Another effective measure is the use of advanced explanations of benefits (EOBs), which offer clear, upfront cost estimates to participants. These estimates empower individuals to make more informed choices, reducing the likelihood of surprise medical bills.

Comprehensive reporting tools that track claim status, disputed amounts and recovery rates can further enhance transparency, providing participants and employers with actionable insights.

Additionally, employers can play a vital role by investing in educational initiatives that improve healthcare literacy. By helping participants better understand their coverage options and the implications of their healthcare decisions, these programs foster greater engagement and more thoughtful healthcare utilization. Together, these behavioral economics strategies create a more efficient and participant-centered system.

Charting the Path Forward: Strategies for Meaningful Healthcare Reform

The past 15 years have made it clear that incremental reforms are insufficient to overcome the systemic challenges posed by rising healthcare costs. To create meaningful change, employers, policymakers and participants must adopt innovative solutions that target inefficiencies and promote equity within the system.

A key priority is the adoption of transparent pricing models (such as RBP with participant representation), safeguards against overbilling and support through customizable reporting tools.

Post-payment reviews and recovery processes help mitigate these billing challenges by identifying and recovering overpayments made to medical providers, correcting billing errors and recouping funds.

Expanding the use of Health Savings Accounts (HSAs) fosters consumer-driven healthcare, lowering costs while improving financial preparedness for participants.

Rethinking risk allocation is equally critical, as a balanced approach to sharing responsibilities between individuals and society builds a more resilient system.

Addressing these challenges demands swift and decisive action. Employers and policymakers can create a healthcare framework that reduces financial strain, enhances efficiency and ensures equitable access for all stakeholders. This can be achieved by embracing strategies that involve collaboration with TPAs, stop-loss carriers and cost-containment companies, while leveraging data-driven insights.

The time to act is now. Employers and policymakers must lead the charge toward meaningful reform.


Jack Towarnicky


Jack Towarnicky is an ERISA/employee benefits compliance and planning attorney with over 40 years of experience in human resources and plan sponsor leadership roles. 

This includes 25 years as the leader of a Fortune 100 corporation’s benefits function. 

The Cyber Insurance Checklist

As cyber threats evolve, here are tips for businesses to ensure that they're not left uninsurable. 


The cyber threat landscape is evolving from an era of large data breaches (Yahoo, Vodafone) to the modern ransomware economy, in which banking Trojans such as Emotet and TrickBot deliver ransomware payloads like Ryuk.

Understanding this backdrop is crucial as businesses need to think carefully about how they can protect themselves from an attack but also insure their assets. Cyber risks used to be almost uninsurable; however, as the landscape continues to change, cyber insurance is becoming essential for CISOs. Yet insurance also has its limitations and therefore must be integrated into a layered defense strategy to be effective.

Minimum Requirements for Cyber Insurance

Insurers today are primarily concerned with claims emanating from human-operated ransomware attacks, which disrupt systems and operations through encryption, data exfiltration and ransom demands. To purchase insurance cover, companies must demonstrate their ability to defend against threats by deploying controls to block attackers' strategies. Notable strategies include the following:

1. Preventing Attacker Footholds

• Multi-factor authentication for end users and external access

• Endpoint protection and endpoint detection and response (EDR) solutions, including managed (MDR) and extended (XDR) variants

• Cybersecurity awareness training and phishing campaigns

• Email filtering and web security

• Comprehensive patch and vulnerability management policies

• Hardening techniques, including addressing common issues such as remote access, bring your own device, and cloud security configuration

2. Stopping Lateral Movement and Reducing Blast Radius

• Network segmentation and segregation of high-risk/high-value networks

• Privileged access management (PAM) for administrator and service accounts

• Logging, monitoring, and correlation

• Digital and service supply chain risk management

• Cyber incident response planning and testing

• Replacing or protecting end-of-life systems

3. Protecting Key Digital Assets

• Encrypted and secured, tested backups

• Enhanced protections for critical assets (encryption at rest, second-layer authentication, zoning of critical applications)

How to Improve and Obtain Value From Conversations With Insurers

The best insurer relations are developed through regular and open dialogue.

By offering you insurance, insurers make your risk their risk. Good insurers will thus offer what I call "loss intelligence": information relating to the most recent and significant claims in the cyber insurance space.

This free intelligence can help you prioritize your cyber program investments. For instance, I put "multi-factor authentication" first in the list above because insurer data tells us that over 80% of malicious cyber incidents start with a compromise of user credentials.

Another example is looking at what questions insurers focus on. They will ask detailed questions about how you back up your data because they see many insureds suffer data loss, exfiltration, and extortion attacks as a result of poor controls in this area.

Measuring and Protecting Value at Risk

All modern organizations are evolving and transforming digitally, but all do so in a unique manner and at a different pace. Measuring how dependent an organization is on its technology for generating revenue, meeting compliance obligations, and avoiding reputational harm is critical.

If an organization's operational resilience is materially the same as its digital resilience -- meaning there is no possibility to revert to paper-based or traditional processes in the event of a technology failure -- then its cyber program is critical. Conversely, if the organization can continue to operate unhindered, then it is not digitally dependent. For most organizations, the degree of dependence can be measured on a sliding scale we can refer to as the "percentage of value at risk."
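One rough way to put a number on that sliding scale is to ask what share of revenue halts when technology fails. The sketch below assumes revenue can be attributed to individual processes; all process names and figures are hypothetical:

```python
def percent_value_at_risk(revenue_by_process: dict[str, float],
                          digitally_dependent: set[str]) -> float:
    """Share of revenue that stops if technology fails: a crude proxy
    for the 'percentage of value at risk' sliding scale."""
    total = sum(revenue_by_process.values())
    at_risk = sum(v for name, v in revenue_by_process.items()
                  if name in digitally_dependent)
    return round(100.0 * at_risk / total, 1)

# Hypothetical revenue streams, in $M; only online sales halt on IT failure.
revenue = {"online_sales": 6.0, "branch_sales": 3.0, "licensing": 1.0}
print(percent_value_at_risk(revenue, {"online_sales"}))  # 60.0
```

A fully digital business with no paper fallback would score near 100%; one that can revert to traditional processes would score near zero.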

Regulators are now also aligning with this approach to avoid major disruptions, losses, and harms, as can be seen in DORA, NIS2, and GDPR, all of which look at the criticality of assets under protection as a means of determining control requirements.

Carrying out a Risk Assessment

To best understand and measure the value at risk, a structured risk assessment should be carried out in several distinct phases. These should determine the value at risk in the digital domain (impact), the quantum of risk exposure (in financial terms), and the probability of risk occurrence through expert-led assessments, as follows:

Assess Impact by developing and stress-testing key loss scenarios and areas of exposure. This is a qualitative assessment that looks at the material exposures of a business to technology loss and develops a small number of significant events it wishes to avoid, mitigate, or reduce the impact of.

Quantify Cyber Risk of each material scenario through the use of actuarial methods such as stochastic modeling or other industry standards such as Factor Analysis of Information Risk. This step is critical to compare cyber and digital risk with other strategic risks such as supply chain, environmental, political or competitive risk.

Expert-Led Controls Assessment via direct and indirect means. Direct, independent, expert-led controls testing through audit, penetration testing, and code review is essential for material control requirements on both first-party and third-party technology implementations. Having this independent attention is crucial should an organization's insurance attestations be challenged during a cyber insurance claim or a regulatory investigation. Indirect methods such as digital, open-source and dark web assessments are becoming commonplace as well, with insurers often conducting their own due diligence to determine if an insured has been compromised or mandating direct scans to detect vulnerabilities as a precursor to offering cyber insurance.
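The quantification phase above can be sketched as a toy frequency/severity Monte Carlo, one common shape for the stochastic modeling the text mentions. The assumptions here are illustrative, not actuarial guidance: at most one event per year, with lognormal severity, and all parameters are hypothetical:

```python
import random

def simulate_annual_loss(p_event: float, mu_log: float, sigma_log: float,
                         n_trials: int = 50_000, seed: int = 42) -> dict:
    """Toy frequency/severity Monte Carlo for one cyber loss scenario.

    Illustrative assumptions: at most one event per year, occurring with
    probability p_event; severity is lognormal with log-scale parameters
    mu_log and sigma_log.
    """
    rng = random.Random(seed)
    losses = sorted(
        rng.lognormvariate(mu_log, sigma_log) if rng.random() < p_event else 0.0
        for _ in range(n_trials)
    )
    return {
        "expected_annual_loss": sum(losses) / n_trials,
        "p95": losses[int(0.95 * n_trials)],  # 95th-percentile annual loss
    }

# Hypothetical scenario: 10% annual likelihood, median severity exp(13) ≈ $440k.
result = simulate_annual_loss(p_event=0.10, mu_log=13.0, sigma_log=1.0)
print(f"Expected annual loss: ${result['expected_annual_loss']:,.0f}")
```

Expressing each scenario as an expected annual loss and a tail percentile is what makes cyber risk directly comparable with other strategic risks on the register.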

To Transfer or Not

Once an organization has carried out such a risk assessment, it can determine how much insurance it may need, as well as how likely it is to suffer a cyber incident and how severe one could be.

The organization should have evaluated its cyber resilience to its key scenarios and will be in a position to discuss with insurers the feasibility and cost of insuring certain scenarios.

It is important to understand that insurance may not be the most cost-effective option for transferring risk. For example, if an insured is worried about data being stored and processed in an outsourced HR system, other efficient routes could include negotiating effective risk transfer mechanisms such as contractual penalties, security assurance, and financial liabilities directly with its third-party supplier. Another common approach is to invest in better mitigations as a precursor to cyber insurance if one of the key controls mentioned above is not yet effective.


Jano Bermudes


Jano Bermudes serves as chief operations officer at CyXcel.

Prior to joining CyXcel, he served in senior cyber leadership roles at KPMG, Navigant Consulting, Ankura Consulting and Marsh McLennan.

AI Is Reshaping Insurance: 6 Trends to Watch

AI adoption in insurance accelerates as executives embrace real-time analytics and specialized technology for improved operations.


AI adoption in the insurance industry is gaining traction. According to Earnix's recent Industry Trends Report, 70% of insurance executives plan to implement AI models that use real-time data predictions within the next two years — more than double today's adoption rate. It's clear that insurers are increasingly relying on real-time predictive analytics as AI adoption accelerates.

As AI technology advances within the industry, insurers are leveraging both traditional AI, which analyzes data and predicts outcomes, and generative AI, which creates content and explains concepts. As they adopt these technologies, insurers can expect significant improvements across operations. 

Here are six trends shaping the future of AI in insurance.

1. AI-Powered Next Best Actions for Underwriting and Claims

Currently, insurers use AI for tasks like processing straightforward data inputs or automatically categorizing claims. In the future, insurers will also leverage AI to provide next best action recommendations for both underwriting and claims management.

"Next best action" recommendations will provide underwriters with specific suggestions such as adjusting coverage limits or gathering additional data sources to better assess risk, based on AI-driven insights. Similarly, for claims adjusters, AI will suggest next best actions such as seeking out additional documentation or identifying an optimal settlement path based on previous claims data. By automating complex decision-making, AI enhances efficiency, consistency, and speed across underwriting and claims operations.

2. Industry-Specific GenAI for Insurance

While many insurance companies have experimented with general generative AI tools, these often fall short when it comes to the industry-specific needs of insurers, like analyzing policy language or understanding regulatory requirements.

As demand grows, insurers will increasingly adopt specialized GenAI models designed specifically for the insurance sector. These advanced tools efficiently interpret intricate policy documents, generate custom pricing models, and draft precise policy language, far beyond the capabilities of general models. This shift enables insurers to better navigate complex regulatory environments and improve their ability to design personalized insurance products, ultimately leading to greater accuracy and customer satisfaction. As technology evolves, insurers will increasingly rely on these specialized solutions to enhance efficiency, streamline operations, and deliver more customized coverage for policyholders.

3. Phased Digital Transformation

Nearly half (49%) of insurers report falling behind in updating legacy systems, according to Earnix's report. While many insurers have attempted to modernize their legacy systems, these efforts often fail due to their complexity and scope. Replacing all core systems at once often overwhelms insurers, leading to costly setbacks.

In the future, insurers will increasingly rely on a phased approach to digital transformation, implementing targeted upgrades instead of complete system overhauls. This approach allows companies to focus on specific areas like customer portals or underwriting modules, integrating AI gradually and minimizing disruption to existing operations. This incremental strategy helps insurers avoid the pitfalls of past modernization attempts while still improving efficiency and competitiveness in the market. The key is to focus on smaller, high-impact projects that deliver immediate value, driving change and building momentum over time.

4. Specialized Data Sources for Smarter Risk Assessment

Insurers today typically rely on broad data sets like historical loss data, demographic information, and general weather reports to assess risk.

In the future, we will see AI models that incorporate more specialized data sources to improve the accuracy of risk assessments. For example, insurers may use climate change information from coastal surveillance or community-based environmental organizations to better assess property or health risks. Health insurers may leverage fitness-related information from smartwatches to differentiate between policyholders who may look the same in a policy application but actually represent vastly different risk profiles (e.g., active triathletes vs. sedentary individuals with significant latent health risks). By integrating precise data into their models, insurers gain a holistic view that enhances risk assessment and policy pricing accuracy.

5. Smarter, Scalable Document Processing

Currently, many insurers use basic document automation for tasks like summarizing brief records or extracting information from structured documents. However, processing large, complex files like medical records still requires significant manual input.

In the future, AI-driven document processing will change how insurers manage documents at scale. Insurers will be able to process thousands of lengthy, multi-format documents efficiently, handling everything from de-duplicating redundant data to categorizing unstructured records. For example, AI will swiftly summarize and organize critical information, eliminating time-consuming, manual reviews for complex claims. This means that claims processing will be faster and more accurate, allowing insurers to manage even higher volumes of claims while reducing error rates and improving customer satisfaction.

6. AI Models That Align With New Regulatory Demands

As insurers expand their use of AI to streamline underwriting and claims management, industry standards for responsible AI use are becoming more uniform across states. Recently, the National Association of Insurance Commissioners (NAIC) adopted a model bulletin outlining compliance requirements for insurers' AI systems. This bulletin clarifies expectations for development, deployment, and documentation of AI technologies to ensure adherence to state and federal laws. So far, at least 11 states, along with Washington, D.C., have issued bulletins incorporating NAIC's language. They are: Alaska, Connecticut, Illinois, Kentucky, Maryland, Nevada, New Hampshire, Pennsylvania, Rhode Island, Vermont, and Washington.

This year, expect to see insurers adopt transparent, AI-driven models that align with both NAIC guidelines and specific state regulations. These models include explainable AI components to ensure clear, auditable insights into predictive processes to meet compliance demands. For insurers operating across state lines, these advancements streamline regulatory adherence and enhance consumer trust by offering transparent, ethical risk assessments. This shift establishes responsible AI use as a foundational standard in the industry.

AI is now an integral part of the insurance industry, helping insurers streamline operations, enhance processes, and navigate regulatory requirements. As its role expands, so does its value, providing deeper insights that support better business decisions. The future of insurance will depend on how effectively companies adopt AI to meet evolving expectations while building customer trust and confidence.


Stan Smith


Stan Smith is the founder and CEO of Gradient AI.

He has been working with AI and technology companies for nearly 30 years. Prior to Gradient AI, he held founding or executive-level roles with multiple startup companies, including MatrixOne, Agile Software, and OpenRatings. He also led development of several patents, including technology that predicts bankruptcies, a global database to improve supplier performance, and technology that enhances performance management through lean initiatives. 

Smith earned his bachelor’s degree from Dartmouth College.

How to Ensure Catastrophe Doesn’t Lead to Catastrophic Fraud

As California wildfires surge, insurers battle an $8 billion claims wave while fighting sophisticated fraud schemes.


As California once again faces devastating wildfires, insurers are dealing with the inevitable surge in claims.

Among the avalanche of genuine cases, however, lies a darker reality: opportunistic fraud. From outright false claims to sophisticated scams, bad actors are poised to exploit the chaos, risking both financial loss for insurers and delays for those in desperate need of payouts.

Balancing the detection of opportunistic fraud with the need to validate and prioritize the most severe and vulnerable cases -- while ensuring fair treatment of legitimate claimants -- will be a critical challenge. At the same time, insurers must oversee extensive repair and rebuilding efforts, implement new risk-mitigation measures, and navigate the potential for misconduct across all these activities.

As if the situation weren't serious enough, all of this unfolds against the backdrop of an already highly challenging economic landscape. Insurance Insider reported that State Farm has around $2.5 billion to $4 billion of reinsurance cover specifically for California risk.

However, reinsurers are braced for a total loss on the State Farm program as wildfire-related claims are estimated to reach around $8 billion. Commenting on LinkedIn, Adam Denninger, global insurance leader at Capgemini, painted a concerning picture, "Expect genuinely massive changes to rates and insurance availability over the next couple of years. Nobody can sustain these types of losses."

It's the perfect collision of scale, complexity and capital intensity, and it will test insurers like never before. So if we add human error, IT failure and bad behavior into the mix, this problem will only intensify. As I heard an industry professional say at a conference recently, "It's like pouring gasoline onto this bonfire, only we are burning actual money now."

A Homeland Security official says, "In every large-scale tragedy, greedy individuals seek to line their pockets and divert critical funds from those most in need." It's no wonder that District Attorney Nathan Hochman's office, along with Homeland Security and every law enforcement agency from the FBI to the sheriff's office, has formed a Joint Regional Fire Crimes Task Force specifically to investigate and prosecute criminal actors seeking to exploit the wildfire crisis.

Wherever there is money to be made, good or bad, there will be people willing to take it. The GAO reported in July 2023 that the federal government has made an estimated $2.4 trillion in improper payments out of emergency assistance programs since 2003. So this problem is significant and systemic.

Of course, this isn't just a huge problem during a disaster; fraud is rife all the time. It is a long-term threat that demands immediate action. An estimated $308.6 billion annually is lost to insurance fraud in the U.S., according to the Coalition Against Insurance Fraud. That equates to an estimated $900 per consumer in fraud costs passed on through premiums. Breaking this down further, property and casualty insurance fraud alone accounts for around $45 billion.

These are huge numbers; broadly, these estimates suggest that about 20% of claims involve some element of fraud.

Now imagine that in the context and scale of the LA wildfires.

Fraudsters also evolve their tactics over time. Therefore, implementing advanced predictive models now will provide lasting protection against medium- and long-term schemes.

To effectively combat evolving fraud risks, insurers need advanced AI-driven models that can identify unique policy and fraud patterns. By leveraging rapid learning, contextual analysis, and adaptive technology, these models enable insurers to stay ahead of both known and emerging fraud threats across the entire claims lifecycle.

A well-designed anti-fraud system can train on an insurer's proprietary data, ensuring it adapts to their specific fraud risks rather than relying only on generalized market trends. This approach allows insurers to pinpoint fraud, streamline claims processing, and flag cases that require careful handling at an earlier stage.

Equally important is the ability to customize fraud detection thresholds and adapt to new types of fraud methods. This will ensure insurers can dynamically adjust to shifting fraud patterns while maintaining fast, efficient payouts for legitimate claims.
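A toy sketch of such an adjustable threshold, assuming a simple weighted-indicator score: the indicators, weights, and threshold below are all hypothetical stand-ins for a model trained on an insurer's own claims data:

```python
WEIGHTS = {
    # Hypothetical red flags with hand-picked weights; a production model
    # would learn these from the insurer's own claims history.
    "policy_inception_within_30d": 0.4,
    "no_supporting_documents": 0.3,
    "prior_disputed_claims": 0.3,
}

def fraud_score(claim: dict) -> float:
    """Share of weighted indicators present on the claim (0.0-1.0)."""
    return sum(w for flag, w in WEIGHTS.items() if claim.get(flag)) / sum(WEIGHTS.values())

def triage(claim: dict, threshold: float = 0.5) -> str:
    """Route a claim. The threshold is the tunable dial: lower it when
    fraud patterns shift, raise it to fast-track more genuine claims."""
    return "investigate" if fraud_score(claim) >= threshold else "fast_track"

print(triage({"no_supporting_documents": True}))  # fast_track
print(triage({"policy_inception_within_30d": True,
              "prior_disputed_claims": True}))    # investigate
```

The point of the dial is operational: the same model can be tuned toward speed during a catastrophe surge and toward scrutiny once the initial wave of claims has passed.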

By centralizing all relevant internal and external data into a unified investigative platform, insurers can analyze fraud risks across multiple dimensions -- such as policy type, region, and exposure -- enabling more precise decision-making and cross-departmental insights.

Advanced fraud detection solutions like our ClaimSmart solution ensure insurers can not only detect fraud more effectively but also future-proof their fraud prevention strategies against evolving threats. However, to get the most from AI models, we also need to break down data silos to enable rapid adaptation, apply fraud detection across the full lifecycle from policy inception onward, and share data through fraud bureaus. This will let us build ever-stronger data models.

This is also why creating data-fluid ecosystems is critical for AI and machine learning to thrive generally, offering rapid learning and adaptability in the fight against fraud.

Staying one step ahead also means applying tools like Clearspeed, which can quickly assess risk through voice. When people are asked a yes/no question of consequence, their voice characteristics may change in ways known as voice response signals. These signals are imperceptible to humans but can be detected and measured by Clearspeed, providing a way to fast-track low-risk, genuine customers and help them get back on their feet, while focusing stretched resources where high risk is flagged.

Adopting this kind of technology can make a huge difference, but doing so efficiently and effectively requires data-fluid, highly adaptive core technologies in the cloud, with the ability to apply the technology easily to claims processes.

There will be bumps in the road to combating fraud, but if there's one thing we must see, it's more adaptive insurance businesses: ones capable of applying risk mitigation, embedding themselves into home ecosystems, and interoperating with governments, fire departments and others.

Insurance needs to enter a new era of dynamism, and it needs to do it sooner than many might have predicted.


Rory Yates


Rory Yates is the SVP of corporate strategy at EIS, a global core technology platform provider for the insurance sector.

He works with clients, partners and advisers to help them jump across the digital divide and build the new business models the future needs.

AI, Machine Learning Trends to Watch in 2025

Seven emerging AI technologies are reshaping business operations and industry capabilities through 2025 and beyond.

As we progress through 2025, several groundbreaking technologies in artificial intelligence (AI) and machine learning (ML) are emerging. Each is poised to address complex challenges and unlock unprecedented opportunities. 

Below, we explore the most important trends shaping the AI landscape.

1. Domain-Specific Generative AI Models

Generative AI has demonstrated remarkable versatility, but its future lies in specialization. Enterprises are increasingly adopting domain-specific generative AI models tailored for industries or business functions. These models leverage vast amounts of specialized data to produce highly accurate outputs, making them invaluable in areas like healthcare (e.g., personalized treatment planning) and finance (e.g., risk analysis). In the insurance space, this includes automated policy generation, risk assessment and underwriting, fraud detection or customer profiling.
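As a minimal sketch of what "domain-specific" looks like in practice (the task names and template wording below are illustrative assumptions, not any vendor's API), much of the value comes from pairing each insurance task with its own specialized prompt template rather than relying on one general-purpose prompt:

```python
# Sketch: routing insurance tasks to domain-specific prompt templates.
# Task names and template wording are illustrative assumptions.

TEMPLATES = {
    "underwriting": (
        "You are an underwriting assistant. Assess the risk profile for: {details}. "
        "Return a risk tier (low/medium/high) with a one-line rationale."
    ),
    "fraud_triage": (
        "You are a claims-fraud analyst. Review this claim: {details}. "
        "List any inconsistencies between the loss description and the policy terms."
    ),
}

def build_prompt(task: str, details: str) -> str:
    """Select the domain-specific template for a task and fill in the details."""
    if task not in TEMPLATES:
        raise ValueError(f"No domain template for task: {task}")
    return TEMPLATES[task].format(details=details)

prompt = build_prompt("underwriting", "40-year-old driver, two prior claims")
print(prompt.startswith("You are an underwriting assistant"))  # True
```

In a real deployment the template would feed a model fine-tuned on the insurer's own policy and claims data, which is where the domain-specific accuracy gains come from.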

By 2027, over 50% of generative AI models used by enterprises are expected to be domain-specific, a sharp rise from just 1% today.

By leveraging domain-specific generative AI, insurers can significantly improve efficiency, reduce costs, and deliver superior customer experiences across various aspects of their operations.

2. Multimodal AI as the New Standard

Multimodal AI assimilates diverse data types—text, images, audio, and video—into cohesive models capable of delivering more personalized and sophisticated user experiences. Applications range from healthcare diagnostics using combined visual and audio inputs to automotive assistants that respond to voice commands while analyzing visual cues. This capability is revolutionizing customer interactions across industries by enabling seamless, context-aware solutions.

3. Optimization of the AI Stack

2025 marks a shift from experimentation to optimization in AI deployment. Organizations are focusing on maximizing the value of their AI investments by refining infrastructure for training and inference. For instance, advancements in hardware like GPUs and TPUs have reduced processing times by over 50%, significantly cutting costs while improving efficiency. The emphasis on optimization also extends to selecting the most suitable models for specific use cases, ensuring long-term relevance and effectiveness.
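The model-selection side of that optimization can be sketched as a simple cost/quality trade-off; the model names, accuracy figures, and prices below are illustrative assumptions, not benchmarks:

```python
# Sketch: picking the cheapest model that still meets a quality bar.
# All names, accuracies, and prices are illustrative assumptions.

candidates = [
    {"name": "large-general", "accuracy": 0.94, "cost_per_1k_tokens": 0.030},
    {"name": "mid-tuned",     "accuracy": 0.92, "cost_per_1k_tokens": 0.008},
    {"name": "small-tuned",   "accuracy": 0.86, "cost_per_1k_tokens": 0.002},
]

def cheapest_adequate(models, min_accuracy):
    """Return the lowest-cost model meeting the accuracy threshold, or None."""
    adequate = [m for m in models if m["accuracy"] >= min_accuracy]
    return min(adequate, key=lambda m: m["cost_per_1k_tokens"]) if adequate else None

best = cheapest_adequate(candidates, 0.90)
print(best["name"])  # mid-tuned
```

The point of the sketch is the decision rule, not the numbers: once a smaller tuned model clears the use case's quality bar, paying for the largest model is pure overhead.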

4. Agentic AI: Autonomous Collaboration

Agentic AI refers to systems capable of performing tasks independently with minimal human intervention. These autonomous agents are expected to collaborate across networks to execute complex workflows efficiently. While still evolving, agentic AI holds promise for automating routine tasks and enabling human-in-the-loop systems that boost productivity and innovation across sectors like logistics, customer service, and software development.
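A toy illustration of the idea, with illustrative agent and task names: agents declare a capability, pull matching tasks from a shared queue, and anything no agent can handle is escalated to a human in the loop:

```python
# Sketch: a toy agentic workflow. Agents with declared capabilities pull
# matching tasks from a shared queue; unmatched tasks are escalated.
# Agent and task names are illustrative assumptions.
from collections import deque

class Agent:
    def __init__(self, name, capability):
        self.name = name
        self.capability = capability
        self.completed = []

    def can_handle(self, task):
        return task["type"] == self.capability

    def run(self, task):
        self.completed.append(task["id"])

def orchestrate(agents, tasks):
    """Dispatch each task to the first agent able to handle it."""
    queue = deque(tasks)
    unassigned = []
    while queue:
        task = queue.popleft()
        agent = next((a for a in agents if a.can_handle(task)), None)
        if agent:
            agent.run(task)
        else:
            unassigned.append(task)  # escalate to a human in the loop
    return unassigned

agents = [Agent("router", "classify"), Agent("drafter", "respond")]
tasks = [{"id": 1, "type": "classify"}, {"id": 2, "type": "respond"},
         {"id": 3, "type": "audit"}]
leftover = orchestrate(agents, tasks)
print([t["id"] for t in leftover])  # [3]
```

Production agent frameworks add planning, tool use, and inter-agent messaging on top, but the dispatch-plus-escalation loop is the structural core.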

5. AI for Sustainability

AI is playing a pivotal role in addressing global sustainability challenges. From optimizing energy consumption in smart grids to enhancing climate modeling accuracy, these technologies are helping industries reduce their environmental footprint. AI-driven solutions are also being employed in agriculture for precision farming and in manufacturing for waste reduction.

6. Quantum Computing Meets AI

Quantum computing is beginning to intersect with AI, offering exponential processing power for specific tasks such as cryptography and molecular simulation in drug discovery. Although still nascent, this technology has the potential to solve problems that are currently intractable for classical computers, further expanding the horizons of what AI can achieve.

7. Breaking Down Silos With Generative AI

Generative AI is democratizing access to advanced tools within organizations by breaking down departmental silos. This fosters collaboration and accelerates innovation by enabling non-technical users to leverage AI for creative problem-solving. For example, Generative AI-powered chatbots and virtual assistants can access information from various departments to provide comprehensive customer support, bridging gaps between sales, service, and claims teams.

Conclusion

The emerging technologies in AI and ML for 2025 underscore a shift toward greater specialization, enhanced collaboration, and optimized performance across industries. From domain-specific applications to the integration of quantum computing, these advancements promise not only to drive business growth but also address broader societal challenges like sustainability and efficiency. Organizations that embrace these trends early will be well-positioned to lead in an increasingly competitive landscape shaped by intelligent systems.

Insurers Must Resolve Cloud Adoption Challenges

Insurance companies face three critical challenges when migrating to the cloud: security, legacy integration and cost management.

Protecting data, closely adhering to pertinent rules and regulations, cleanly importing existing architecture, and controlling costs are all imperative for insurance companies when they migrate to and work in the cloud.

Recent survey data shows that 91% of banks and insurance firms are migrating to the cloud. And no wonder: Cloud migration offers significant benefits, such as better security, effective resource management, and cost optimizations. But these benefits don't come without challenges. 

Addressing Data Security and Compliance Concerns

Properly managing the vast amounts of personal data insurance companies handle – data critical to day-to-day operations – involves changing the infrastructure, networks, access controls, and firewalls, among other things. All of these changes create significant security challenges. Fortunately, cloud providers offer multilayered security measures, such as advanced network protections, continuous monitoring, end-to-end data encryption, secure backups, and strict user permissions.

Combining these approaches with employee best practices such as multifactor authentication, security awareness training, and role-playing scenarios around social engineering can give insurers a multilayered data defense posture.

Cloud providers also help insurance companies adhere to industry-specific regulations (e.g., GDPR and HIPAA) that require detailed security audits of access to restricted data. They do this by automatically creating and updating access logs, better preparing insurers for quarterly or annual compliance audits.
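One way such audit logs are made trustworthy is hash chaining: each entry's hash covers the previous entry's hash, so any after-the-fact edit to history breaks the chain and is detectable. A minimal sketch of the technique, not a production implementation:

```python
# Sketch: a tamper-evident, append-only access log of the kind compliance
# audits rely on. Each entry's hash covers the previous hash, so editing
# any past entry breaks verification. Illustrative, not production code.
import hashlib
import json
import time

class AuditLog:
    def __init__(self):
        self.entries = []

    def record(self, user, resource, action):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"user": user, "resource": resource, "action": action,
                 "ts": time.time(), "prev": prev_hash}
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute every hash; return False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev"] != prev:
                return False
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("analyst1", "policy/123", "read")
log.record("analyst2", "claim/456", "update")
print(log.verify())  # True
log.entries[0]["user"] = "intruder"  # simulate tampering
print(log.verify())  # False
```

Cloud providers implement far richer versions of this (immutable storage, write-once retention policies), but the verification principle is the same.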

Clearing Legacy System Integration Hurdles

Insurance firms have huge hurdles to clear to successfully integrate legacy systems into the cloud. That's because a typical company's traditional infrastructure is weighted down with a mainframe administration system that may contain decades of policy information, claims and customer data. Systems that use obsolete programming languages, idiosyncratic architectures, or antiquated data formats face a formidable challenge.

Maintaining legacy systems can waste time, slow digital transformation, and impair network performance. This is a priority concern for insurers, which report spending 70% of their IT budget on that task. Moreover, per-policy IT costs can be 41% higher on legacy platforms.

Installing a modern system lowers maintenance costs by making legacy skills less necessary, fostering automation, cutting the time and energy businesses need to introduce new initiatives, and making IT and business teams more efficient.

There are ways to effectively migrate these systems:

  1. Moving one or two systems at a time, in phases, reduces the likelihood of downtime and improves customer satisfaction. Testing and validating each migration ensures the highest-level performance.
  2. By adopting a hybrid cloud approach, insurers can keep immovable, critical systems on-site or in legacy infrastructure while moving testing environments, data warehouses, or customer-facing software-as-a-service (SaaS) applications into public or private clouds. This lets insurers scale more modern systems cost-effectively without importing traditional architecture before it's ready.
  3. Microservices disassemble siloed, legacy infrastructure into smaller, independent applications, making it easier to modify outdated software. APIs take it from there, communicating in real time to cloud servers or third-party vendors so insurance companies can improve their reliability, build faster deployments, and conduct a controlled, well-paced cloud migration over time.
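The phased approach in step 1 can be planned mechanically once dependencies between systems are known: migrate in waves, so nothing moves before the systems it depends on. A small sketch with illustrative system names:

```python
# Sketch: grouping systems into migration waves so that no system moves
# before its dependencies. System names and dependencies are illustrative.

def plan_waves(deps):
    """deps maps system -> set of systems it depends on. Returns ordered waves."""
    remaining = {s: set(d) for s, d in deps.items()}
    waves = []
    while remaining:
        # A system is ready when none of its dependencies are still waiting.
        ready = sorted(s for s, d in remaining.items() if not (d & remaining.keys()))
        if not ready:
            raise ValueError("circular dependency; needs manual untangling")
        waves.append(ready)
        for s in ready:
            del remaining[s]
    return waves

deps = {
    "customer-portal": {"policy-admin"},
    "data-warehouse": set(),
    "policy-admin": {"data-warehouse"},
    "claims-api": {"policy-admin", "data-warehouse"},
}
print(plan_waves(deps))
# [['data-warehouse'], ['policy-admin'], ['claims-api', 'customer-portal']]
```

Each wave can then be migrated, tested, and validated before the next begins, which is exactly what keeps downtime low in a phased rollout.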

Optimizing Cloud Costs

Migrating to the cloud requires insurers to apply careful IT cost management to efficiently store data and process workloads. Most cloud providers offer built-in cost management tools with user-friendly, easy-to-understand interfaces that provide both high-level preconfigured views and granular custom reports. This way, companies can see what they're spending, learn how to better control costs, and project costs as their businesses grow.

By right-sizing cloud resources, insurers can auto-scale to accommodate peak demand periods (e.g., the major increases in server capacity that health insurance providers experience during open enrollment season when potentially hundreds of thousands of users try to register all at once). Dynamically expanding server capacity makes it easy for customers to sign up for new policies – which cuts downtime and boosts revenue.
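The underlying scaling decision can be sketched as a simple threshold rule; the utilization thresholds and fleet bounds below are illustrative assumptions, and real autoscalers add cooldowns and smoothing on top:

```python
# Sketch: a threshold-based auto-scaling decision of the kind cloud
# autoscalers implement. Thresholds and fleet bounds are illustrative.

def scale_decision(current_servers, cpu_utilization, low=0.30, high=0.75,
                   min_servers=2, max_servers=50):
    """Scale out when hot, scale in when idle, stay within fleet bounds."""
    if cpu_utilization > high:
        target = min(max_servers, current_servers * 2)   # double under load spikes
    elif cpu_utilization < low:
        target = max(min_servers, current_servers // 2)  # halve when idle
    else:
        target = current_servers
    return target

print(scale_decision(10, 0.85))  # 20 (open-enrollment-style spike)
print(scale_decision(10, 0.10))  # 5  (quiet period)
print(scale_decision(10, 0.50))  # 10 (steady state)
```

The fleet bounds matter as much as the thresholds: the ceiling caps runaway spend during a spike, while the floor preserves baseline availability.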

Two of the most effective ways to optimize cloud costs are reserved instances, which support core business functionality with stable, consistent capacity purchased for longer periods, and spot instances, which use idle cloud capacity at a lower cost for testing or data processing tasks.
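A back-of-the-envelope comparison shows why the mix matters; the hourly rate and discount levels below are illustrative assumptions, not vendor quotes:

```python
# Sketch: comparing on-demand, reserved, and spot pricing for a workload.
# The rate and discounts are illustrative assumptions, not vendor quotes.

def monthly_cost(hours, on_demand_rate, reserved_discount=0.40, spot_discount=0.70):
    on_demand = hours * on_demand_rate
    reserved = on_demand * (1 - reserved_discount)  # longer-term commitment
    spot = on_demand * (1 - spot_discount)          # interruptible capacity
    return {"on_demand": on_demand, "reserved": reserved, "spot": spot}

# A core system running 24/7 (roughly 730 hours/month) at a notional $0.10/hour:
costs = monthly_cost(730, 0.10)
print(f"on-demand ${costs['on_demand']:.0f}, reserved ${costs['reserved']:.0f}, "
      f"spot ${costs['spot']:.0f}")
# on-demand $73, reserved $44, spot $22
```

The rule of thumb that falls out: steady, always-on systems belong on reserved capacity, while interruptible batch work (testing, data processing) belongs on spot.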

Taking a Strategic Approach

Insurance providers confront unique data security, legacy system, and cost management challenges in moving to the cloud. But the long-term benefits – faster, nimbler digital transformation, a competitive edge, greater employee productivity, and more secure network infrastructure – frequently outweigh the short-term frustrations of the process.


Karina Myers


Karina Myers is the Microsoft Cloud practice lead at Centric Consulting.

She leads teams focused on Microsoft 365 and Azure deployments and migrations, cloud governance and adoption, security and compliance, and managed services.