
The Hidden Costs of Standing Still

Like a flip phone in a smartphone world, legacy systems slow you down, frustrate users and make it hard to keep up, let alone get ahead.


Remember flip phones? They could make calls, send texts (if you were patient) and maybe take a blurry photo or two. Back then, they felt like cutting-edge tech, and for a while, they got the job done. But imagine trading your iPhone for one of those old Nokias today. Good luck ordering an Uber, paying for coffee or getting to your next meeting without a printed paper map.

That's exactly what it's like when insurance companies cling to outdated legacy systems.

Sure, those platforms might still "work," in the most technical sense, but they weren't built for today's speed, scale or customer expectations. Just like a flip phone in a smartphone world, legacy systems slow you down, frustrate your users, and make it harder to keep up, let alone get ahead.

Outdated Tech, Real-World Consequences

And that's not a hypothetical problem; 74% of insurers are still running on legacy systems, and the consequences are stacking up fast.

Legacy systems are one of the leading causes of customer dissatisfaction in the insurance industry today. These systems slow insurers' ability to deliver the kind of seamless, digital-first experience customers expect, especially younger, tech-savvy ones who compare every interaction against Amazon or Apple.

And the risks go way beyond frustration. In 2021, CNA Financial was hit with a ransomware attack that shut down systems, exposed sensitive customer data, and reportedly cost the company $40 million in ransom. The incident made national headlines; customers were rattled, and the brand's reputation took a hit (one it's still working to recover from). In today's climate, legacy systems aren't just inefficient; they're a liability. In an industry built on trust, you can't afford that kind of failure.

The damage isn't just visible. There's also the money you're bleeding without even realizing it. Research shows 5–10% of premiums are lost every year to "premium leakage." Translation: That's revenue slipping through the cracks when legacy processes and disconnected or fragmented systems fail to accurately assess or price risk.

Legacy systems may still "work," but they're quietly (or not so quietly) slowing you down, eroding trust, and draining profits.

Modernizing Doesn't Mean Breaking the Bank

Upgrading technology makes people nervous, and honestly, who can blame them? Upgrades bring up all the right, but tough-to-answer, questions: How much will this cost? Why do we need it now? How long is this going to take?

Those concerns are valid. But here's the catch: Sticking with outdated systems creates bigger, more expensive problems. And the industry knows it; 71% of insurance executives are frustrated by how hard it is to launch digital programs. That's not just noise; it's a red flag. The industry is ready for change, and customers already expect it.

The good news is you don't have to blow up your budget or your team's bandwidth to modernize. Today's platforms are leaner, faster and designed to make life easier for your people and your policyholders.

If you are still on the fence, know that insurers using modern policy admin systems have slashed IT costs per policy by up to 41%, according to McKinsey. That's real money you can redirect into innovation, training and delivering the kind of experience that earns trust and keeps it.

Taking the First Step

Modernizing your business doesn't have to mean burning everything down and starting from scratch. In fact, the smartest companies start small, zeroing in on high-impact areas like claims or customer service, and roll out modern solutions in phases. It's a faster path to real results without throwing your operations into chaos.

Legacy systems might've done the job once, just like that old flip phone you couldn't live without. But today, they're more liability than asset. The longer you hang on, the harder it gets to keep up. The real shift isn't just about tech. It's about staying competitive, staying secure and staying relevant. The best advice I can give is to start where it matters, scale thoughtfully and stay ahead.


Ewa Maj

Ewa Maj (pronounced "Ava My") joined Input 1 in November 1999 as vice president of operations. 

Previously, she was the chief operating officer of two large multi-state premium finance companies in the northeast U.S. 

Maj was educated in Poland, England, and the U.S. She has a degree in finance from Rutgers University School of Business.

Health Insurance Enters Uncharted Waters

Sponsored by Verikai: Developments in gene therapy and drugs hold remarkable promise, but how do insurers set premiums when there's no historical data for them?


Paul Carroll

There are so many exciting developments in drugs, testing and treatments in healthcare, but there’s little or no historical data about them. How do you provide the information that lets underwriters evaluate risk and develop pricing?

Colin Condie

Data is received from many sources, including insurance carriers, clients, and data vendors. This data is analyzed extensively, with a focus on the frequency and severity of claims and their underlying conditions and prescription drug histories. The predictive model algorithms examine condition categories and prescriptions that occurred in the past and their correlations with claims, and risk scores are developed based on these relationships. These risk scores are then applied to the current conditions and prescriptions of the group's members to develop expected claims predictions for the group, which underwriters use in developing premiums.

In cases where there is little historical information on drugs to rely on [e.g., a new drug], additional methods, such as reliance on clinical data, can be used when developing the risk score. The risk scoring models are then updated continuously as new data on drugs becomes available.
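To make the workflow described above more concrete, here is a minimal, purely illustrative sketch of condition-based risk scoring (not Verikai's actual model). The data layout, condition codes, and weighting scheme are assumptions for the example; a production model would use far richer features and more rigorous statistical fitting.

```python
# Illustrative only -- not Verikai's model. Assumes a toy data layout:
# historical records of (condition/drug codes, observed annual claims) and a
# current group whose members have known condition/drug codes.
from collections import defaultdict

def fit_risk_weights(history, base_cost):
    """Estimate a relative risk weight per condition/prescription code as the
    ratio of average claims for members with that code to the overall average."""
    totals, counts = defaultdict(float), defaultdict(int)
    for codes, claims in history:
        for code in set(codes):
            totals[code] += claims
            counts[code] += 1
    return {code: (totals[code] / counts[code]) / base_cost for code in totals}

def score_group(members, weights):
    """Group risk score = average of member scores; a member's score is the
    product of the weights for their current codes (1.0 if a code is unseen)."""
    member_scores = []
    for codes in members:
        score = 1.0
        for code in set(codes):
            score *= weights.get(code, 1.0)
        member_scores.append(score)
    return sum(member_scores) / len(member_scores)

# Toy example: two hypothetical condition/drug codes and three members
history = [({"E11", "RX_METFORMIN"}, 9000.0), ({"I10"}, 4000.0), (set(), 2500.0)]
base_cost = sum(claims for _, claims in history) / len(history)
weights = fit_risk_weights(history, base_cost)

group = [{"E11"}, {"I10"}, set()]
group_score = score_group(group, weights)
expected_group_claims = group_score * base_cost * len(group)  # fed to underwriting
```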

Paul Carroll

What are the most significant cell and gene therapy [CGT] developments and FDA approvals that industry professionals should be monitoring for potential market exposure? 

Colin Condie

Currently, cell and gene therapy drugs are primarily targeting three main areas: rare genetic diseases, blood disorders, and certain cancers.

For blood disorders, there are treatments such as Hemgenix for hemophilia B and Roctavian for hemophilia A. For rare genetic disorders, there are treatments such as Skysona for neurodegenerative disorders [i.e. CALD], and Zolgensma for spinal muscular atrophy [SMA]. In oncology, there are treatments such as Kymriah for leukemia and Abecma for relapsed or refractory multiple myeloma.

Another significant area is retinal disease, where Luxturna is a gene therapy that treats vision loss caused by inherited retinal dystrophy.

Looking to the future, we're seeing an expansion beyond these traditional areas. New trials are focusing on conditions with greater prevalence than the current rare diseases that typically affect a small segment of the population. For instance, treatments are being developed for refractory angina (chest pain caused by reduced blood flow to the heart) and ischemic stroke (where blood flow to the brain is blocked).

The future FDA approvals will continue to focus on familiar therapeutic areas while expanding into new areas such as muscular dystrophies and inherited conditions. Neurology is emerging as a new frontier, with gene therapies targeting inherited neurological disorders such as Alzheimer's. 

About 80% of FDA approvals in the pipeline will be for specialty drugs, including cell and gene therapies. These specialty drugs are expected to cost on average between $200,000 and $400,000 annually, representing a continued shift toward innovative, high-impact, and high-cost therapies targeting rare diseases and chronic conditions. 

Paul Carroll

How is machine learning technology helping underwriters navigate the complexities of healthcare data analysis? 

Colin Condie

Machine learning, predictive modeling, and artificial intelligence are already making significant impacts. These technologies are particularly effective at identifying high-cost conditions and prescription histories and predicting associated costs based on this information via the risk score that is used by the underwriter. 

One key application is analyzing data for "one and done" therapies. Unlike maintenance medications that require continuous administration, CGTs are generally designed as single-dose treatments. The artificial intelligence and actuarial models are used to estimate the expected occurrence and cost of these therapies, and this information is then reflected in the risk score. 

In the early stages of CGTs, occurrence rates are relatively low, and therefore there may be insufficient claims data available to use as the basis to reflect their impact in the risk scores and the resulting prediction of future costs. Therefore, the focus is on identifying condition categories that might indicate where the therapy could be appropriate. Additional variables are analyzed, such as patient age, specific diagnoses, diagnosis codes, disease severity, and drug specific data such as FDA clinical trial information. Indicators of prior unsuccessful treatments are examined, as many cell and gene therapies are typically prescribed after other treatment regimens have failed. 

There are also external factors that need to be adjusted for in the predictive modeling. For instance, some therapies may be ineffective for patients with certain antibodies and therefore an adjustment to the assumed frequency of the therapy may be required in the predictive modeling. 

The main challenge is the limited availability of data. The focus is on gathering as much information as possible to determine the expected prevalence and costs for the CGTs, which includes an analysis of medical conditions. Machine learning tools help assimilate data from the different sources so the models can generate accurate predictions.

Paul Carroll

How much does AI reduce turnaround time in underwriting while maintaining actuarial integrity?

Colin Condie

The concept of automated underwriting and quick turnaround times has been a focus in the industry. The predictive modeling uses AI-based algorithms to generate risk scores that represent the predicted health status or morbidity of the members of a group. Along with other variables, the risk scores help determine expected future claims costs, creating a data-driven foundation for underwriting decisions that optimizes efficiency and accuracy.

With the predictive models, underwriting decisions can be automated for groups that are determined to be very low risk or very high risk based on the risk score that is generated. The predictive model results can also be used as an indicator when underwriter review is necessary. For example, the predictive model can flag cases where underwriter review is necessary, such as for a group that has one high-risk member driving the prediction while the other members of the group have low risk. 
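As an illustration of the triage logic Condie describes (auto-decisioning very low- and very high-risk groups and flagging mixed groups for review), here is a minimal sketch. The thresholds, route names, and outlier rule are hypothetical, not Verikai's.

```python
# Illustrative triage logic only; thresholds and route names are hypothetical.
def triage(group_score, member_scores, low=0.8, high=2.0, outlier_ratio=3.0):
    """Return an underwriting route for a group based on its risk score."""
    # Flag for manual review when a single high-risk member drives the group
    # prediction while the rest of the group is low risk.
    others = sorted(member_scores)[:-1]
    if others and max(member_scores) > outlier_ratio * (sum(others) / len(others)):
        return "underwriter_review"
    if group_score <= low:
        return "auto_approve_standard_rates"
    if group_score >= high:
        return "auto_refer_high_risk"
    return "underwriter_review"

print(triage(1.1, [0.7, 0.8, 4.5]))  # underwriter_review: one member drives the score
print(triage(0.7, [0.6, 0.7, 0.8]))  # auto_approve_standard_rates
```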

Paul Carroll

How will predictive models incorporate data from wearable devices and remote patient monitoring systems into underwriting processes over the next few years? 

Colin Condie

It's interesting. Risk vendors currently base their predictions using health status [morbidity] risk scores based on historical medical and prescription data. Some risk vendors also incorporate lifestyle-related factors as predictors of risk, such as smoking habits, alcohol consumption, eating habits, exercise patterns, body mass index levels [BMI], and sleep behaviors. Predictive models that use wearable devices or remote patient monitoring [RPM] systems can provide both health status-based and lifestyle-based factors for vendors’ risk scoring models. 

Predictive models that use wearable devices and remote patient monitoring systems analyze continuous streams of biometric data, including heart rate, blood glucose levels, blood pressure, respiratory rate, activity levels, and sleep patterns. Targeted interventions can be implemented for members based on these metrics.

Diabetic members can use AI-driven RPM devices to adjust insulin doses, which can lead to a reduction in hypoglycemic events. Wearables can identify heart rate irregularities, which can lower the risk of having a stroke. AI devices can analyze sleep patterns and heart rate to predict potential anxiety or depression episodes, which can lead to improved mental health treatment outcomes.

Paul Carroll

How accurate and effective are health risk scores in predicting medical costs and outcomes? 

Colin Condie

Health status risk scores are accurate. Vendors that also generate lifestyle risk scores provide additional information in terms of the risk of the group over the longer term.

The accuracy of these AI-generated scores is continually evaluated through real-time studies that compare the actual claims experienced against the claims predicted based on the risk score. The lifestyle risk scores are generally evaluated over a longer time horizon because the impact of lifestyle factors on claim costs generally occurs over a longer period (e.g., the number of years it takes for tobacco use to cause the onset of medical conditions). A challenge involves keeping pace with new drug approvals: risk-scoring models require constant updating to include new drugs and their corresponding national drug codes [NDC]. Additionally, the models must account for evolving condition and drug cost and utilization patterns and incorporate scenarios where the claims experience is immature or infrequent.
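A minimal sketch of the kind of actual-to-expected comparison described above, with assumed field names and toy numbers:

```python
# Minimal actual-to-expected (A/E) sketch; data fields are assumed for illustration.
def actual_to_expected(groups):
    """A/E ratio across groups: total actual claims / total predicted claims.
    A value near 1.0 indicates the risk scores predicted costs well overall."""
    actual = sum(g["actual_claims"] for g in groups)
    expected = sum(g["predicted_claims"] for g in groups)
    return actual / expected

groups = [
    {"predicted_claims": 1_200_000, "actual_claims": 1_150_000},
    {"predicted_claims": 800_000, "actual_claims": 910_000},
]
print(f"A/E ratio: {actual_to_expected(groups):.2f}")  # ~1.03
```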

An additional challenge is developing assumptions for CGT treatments that have limited frequency. For example, there haven’t been many CGT occurrences overall, and only a few of the approved therapies are widely used. When medical conditions are relied on to estimate the use of CGTs, a consideration is that the claims data may not have the specificity that is required to identify the ideal conditions for CGT treatment. To address these and other issues, data scientists, actuaries, and engineers analyze multiple data sources, including available medical and drug claims data and clinical drug trial information, to determine the impact on risk scores of future CGT utilization and cost.

Paul Carroll

Thanks, Colin. This is super informative.

About Colin Condie


Colin Condie is a senior healthcare actuary at Verikai, where he leverages his extensive actuarial expertise to enhance the company's risk adjustment solutions. With nearly three decades of experience in the field, Colin specializes in predictive analytics for both fully insured and self-insured markets, serving insurers, MGUs, stop-loss carriers, employer groups, and PEOs.

Prior to joining Verikai in September 2024, Colin served as director and actuary at ExtensisHR for over seven years, where he developed a deep understanding of the PEO industry. His career also includes actuarial roles at Aon, AXIS Global Accident and Health and Munich Re, building a diverse background across consulting, insurance, and reinsurance sectors.

Colin's expertise lies in data-driven decision-making, risk assessment, experience monitoring and reporting, and pricing models. At Verikai, he collaborates with cross-functional teams including product experts, data scientists and engineers to refine risk adjustment strategies and expand the company's presence in the fully insured and self-insured markets.

A Rutgers University graduate with a bachelor’s degree in economics and an MBA in finance, Colin brings a unique blend of analytical skill and business acumen to his role. Based in Marco Island, Florida, he approaches actuarial challenges with both technical precision and a broader business perspective.


Insurance Thought Leadership

Insurance Thought Leadership (ITL) delivers engaging, informative articles from our global network of thought leaders and decision makers. Their insights are transforming the insurance and risk management marketplace through knowledge sharing, big ideas on a wide variety of topics, and lessons learned through real-life applications of innovative technology.

We also connect our network of authors and readers in ways that help them uncover opportunities and that lead to innovation and strategic advantage.

Exploring Secure Data Collaboration Tech

Munich Re's proof-of-concept explores using multi-party computation to enable secure data sharing and collaboration without compromising privacy.


As insurers leverage the power of cutting-edge technologies like machine learning, artificial intelligence, and third-party data integrations to improve their data-driven decision-making, they're faced with the critical task of ensuring the security and control of sensitive data, which is essential for maintaining customer trust and regulatory compliance. One promising solution to this challenge is a collaborative intelligence ecosystem, in which multiple parties come together to form a shared knowledge pool that facilitates secure, collective learning through multi-party computation. Each party's sensitive data and information remain private and undisclosed.

Munich Re Life North America's Integrated Analytics team believes this innovative technology has the capacity to unlock significant computing and analytics opportunities within the insurance industry. We investigated its capabilities by partnering with an external vendor to conduct a proof of concept (POC). This article shares the details of our POC experience, which evaluated the performance, accuracy, efficiency, and scalability of the vendor's multi-party computation platform.

Purpose

A vast amount of data is generated daily and growing exponentially due to technologies like social media, cloud services, the Internet of Things (IoT), and artificial intelligence (AI). While this data holds significant value for businesses, concerns about privacy, trust, and other risks leave much of it inaccessible. This is especially true for industries like insurance, where sensitive personal information is prevalent and privacy is a priority.

However, advancements in privacy-retaining technologies offer a more secure framework for collaborative research. Multi-party computation is a cryptographic technique that allows participants to jointly compute a function on their private inputs without revealing those inputs to each other. Our overarching POC goal was to evaluate its potential promise in reducing challenges around data transfers, barriers to collaboration, and vulnerability to bad actors in driving insurance use cases for Munich Re.

How does the technology work?

We worked with a vendor experienced in developing and deploying privacy-enhancing technologies for commercial use. They have pioneered cryptographic privacy-preserving computation technology for analytics and machine learning. Their product enables sensitive data to be distributed across teams, organizations, and regulated jurisdictions by deploying privacy zones (on-premises or cloud) within the infrastructure and network of each party sharing insights.

Through these privacy zones, a compute function is run on each local network using its own data. The function then sends updates to a central server that aggregates the results across parties, effectively keeping sensitive information private within the respective local networks.

For example, if the goal is to train a global machine learning model using insights from different sources, each source would train a local model on its own data and only send the updated model parameters (e.g., weights) to the central server. In our case, the central server was part of a cloud-based Software as a Service (SaaS) framework provided by the POC vendor.

This concept is called federated learning, which is a decentralized machine learning approach that leverages multi-party computation principles to allow parties to train a model collaboratively, with each preserving its data locally. Instead of sharing raw data, parties share model updates or encrypted data, allowing for secure computation and aggregation of model parameters.
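A minimal federated-averaging sketch of the pattern described above, assuming two parties each holding private tabular data; it is illustrative only and not the vendor's platform:

```python
# Minimal federated-averaging sketch (not the vendor's platform). Each party
# trains locally and shares only model parameters; the server averages them.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One party's local training: a few gradient steps of linear regression
    on its private data. Only the updated weights leave the privacy zone."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, parties):
    """Central server: collect each party's updated weights and average them,
    weighted by local sample count. Raw data is never shared."""
    updates = [(local_update(global_w, X, y), len(y)) for X, y in parties]
    total = sum(n for _, n in updates)
    return sum(w * (n / total) for w, n in updates)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
parties = []
for n in (200, 300):  # two privacy zones with private data
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    parties.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, parties)
print(w)  # approaches [2.0, -1.0] without either party sharing raw data
```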

Method of evaluation

The POC had two phases: Phase I assessed the basic multi-party computation functionality within the vendor’s sandbox environment, while Phase II ran a synthetic two-party computation within Munich Re’s cloud environment. For the latter, we trained a popular advanced machine learning algorithm on the multi-party computation platform. This allowed us to securely leverage insights from our internal historical data, which was hosted in one privacy zone, and enhance it with externally acquired sociodemographic data, which was hosted in another.

We ultimately drew insights from both datasets while ensuring privacy within each computation zone. Our goal was to evaluate the predictive accuracy of the resulting model, lift from increasing samples and features, ease of setup, and efficacy of the privacy-preserving computation process.

Our evaluation was split into four broad categories:

Functionality:

  • Preservation of data privacy and security behind each party’s firewall
  • Dedicated functions to join private data (private set intersections)
  • Granular data privacy controls
  • Complex operations available to be implemented for data preparation
  • Functionality to run advanced algorithms

Efficiency:

  • Speed and quality of technical support
  • Speed and efficiency in running functions and algorithms
  • Extent of coordination required among parties for data processing and modeling

Correctness:

  • Accuracy of results from running statistical functions, algorithms, and data processing operations

Scalability:

  • Potential to scale up for faster, optimized multi-thread processing
  • Level of data engineering expertise required for initial setup

Overall, our assessment of the technology and its potential is favorable. We believe the integration of innovative techniques like multi-party computation can provide a reliable way to enhance business capabilities, enabling secure and private data sharing and analysis across multiple stakeholders while maintaining data confidentiality and integrity.

Potential use cases

Internally within larger companies: To utilize data across entities/departments to increase data size and features for analytics or model development.

Externally with partners and third-party data vendors: For fast and efficient data evaluation. Raw data is never shared, and sensitive information is protected, mitigating data security and privacy risks.

We believe this technology holds significant potential. It offers an efficient way to meet existing privacy compliance and data sharing best practices in building collaborative intelligence ecosystems within and across organizations.

It's only a matter of time before companies use multi-party computation frameworks to enhance their informational edge.

 

Sponsored by ITL Partner: Munich Re


ITL Partner: Munich Re

Munich Re Life US, a subsidiary of Munich Re Group, is a leading US reinsurer with a significant market presence and extensive technical depth in all areas of life and disability reinsurance. Beyond vast reinsurance capacity and unrivaled risk expertise, the company is recognized as an innovator in digital transformation and aims to guide carriers through the changing industry landscape with dynamic solutions insightfully designed to grow and support their business. Munich Re Life US also offers tailored financial reinsurance solutions to help life and disability insurance carriers manage organic growth and capital efficiency, as well as M&A support to help achieve transaction success. Established in 1959, Munich Re Life US boasts A+ and AA ratings from A.M. Best Company and Standard & Poor's, respectively, and serves US clients from its locations in New York and Atlanta.


Additional Resources

Drug deaths a concern for life carriers

A 25% increase in substance abuse death rates in the college-educated population is a particularly worrying trend for the life insurance industry.


EHRs transform life underwriting

Our extensive study confirms the value of electronic health records (EHRs) across life underwriting use cases.


Life insurance fraud trends

Munich Re’s survey reveals which types of fraud have been on the rise for U.S. life insurers in recent years.


Recent patterns in cancer claims

Cancer is the most common cause of death for the life insurance population. Munich Re analyzes recent trends.


The digital future of life insurance

Leverage emerging technologies to improve operational efficiency, enhance underwriting processes, and expand insurance accessibility.


Leveraging Agentic AI to Address Inflation

Agentic AI emerges as insurers' strategic solution to inflationary pressures across the value chain.


My husband and I do our grocery shopping once a week for just the two of us and our five cats—though we don't include cat food in our weekly grocery budget. Typically, we purchase the usual staples: fruits, vegetables, meats, bottled water, soft drinks, and occasionally snack items. By the time we reach the checkout, our total averages more than $180 per trip. Yes, nearly $200 a week—for just two people.

If groceries alone cost us this much, what are households of three or more people paying—and how do they manage to afford it? And this is before adding in other common household expenses, like mortgage or rent, utilities, transportation, debt, internet and other subscriptions, family needs, savings—and finally, insurance.

Inflation pressure is real, and we continue to live it. MarketWatch reported that as of December 2024, "consumer prices were up 2.9% year over year," contributing to a steady rise in the cost of living.

And it's not just groceries and household expenses. We are also seeing our insurance premiums increase year over year.

Effects on Consumers and Business Owners

According to the U.S. Department of the Treasury, average homeowners' insurance premiums per policy increased 8.7% faster than the rate of inflation between 2018 and 2022. More recently, in 2024, the national average cost of homeowners insurance rose to $2,728 per year.

Auto insurance hasn't been spared either. Bankrate reports that the average cost of full coverage car insurance climbed to $2,638 in 2025, up 12% from 2024.

Consumers like us aren't the only ones affected. Commercial businesses are grappling with increased operating costs, pricing pressures, supply chain disruptions, wage inflation, rising financing and marketing expenses, shrinking consumer demand, and uncertainty in long-term financial planning. Like households, businesses also face rising insurance premiums. According to WTW's Commercial Lines Insurance Pricing Survey, U.S. commercial insurance rates increased by 6.1% during the third quarter of 2024. Specific lines, such as commercial auto insurance, have seen even steeper increases, with rates continuing to rise in double digits.

Effects on Insurance Carriers

Insurance carriers, like homeowners and business owners, are equally affected by these economic pressures. Rising repair costs for homes and vehicles are driving up claim payouts. The National Oceanic and Atmospheric Administration (NOAA) reported that damages from weather-related disasters in the U.S. amounted to approximately $92.9 billion in 2023. These factors are forcing insurers to raise property insurance rates.

At the same time, insurers face a raft of other issues—more frequent and severe weather events, rising reinsurance costs, and a surge in lawsuits and settlement amounts. This last factor, social inflation, was cited by Swiss Re as a key driver of higher liability claims costs, particularly in lines of business exposed to bodily injury claims.

Combatting Inflationary Pressure With Agentic AI

To manage these mounting pressures without relying solely on rate increases, insurers are turning to agentic artificial intelligence (AI) solutions. An agentic AI platform provides specialized AI agents that automate routine business processes across various operational domains. These agents operate continuously, delivering high accuracy in tasks such as data extraction, document classification, and workflow orchestration.

This approach equips insurers to combat inflationary pressures through smarter operations, tighter cost control, and enhanced customer service by delivering:

  • Operational optimization: Agentic AI automates repetitive tasks such as data entry, document processing, and information verification. This minimizes manual data review, reduces the risk of errors, and accelerates process timelines to increase productivity.
  • Cost savings: Boosting operational efficiency can significantly reduce administrative overhead and processing costs. Freed from routine tasks, underwriting and claims experts are able to focus on product innovation, customer engagement, and other higher-value tasks to improve overall ROI.
  • Service improvement: Agentic AI enhances service delivery across underwriting and claims management workflows. It automates critical stages such as policy intake, document triage, reserve allocation, and policyholder communications. The result is greater accuracy, shorter cycle times, and faster claims resolution, giving policyholders quicker, more responsive service with fewer disputes or delays.

These innovations position insurers to enhance profitability, elevate customer satisfaction, and build a lasting competitive advantage, even in today's demanding economic landscape.

Faced with escalating economic challenges, including inflation that affects every part of the insurance value chain, insurers must turn to AI as a vital tool for resilience and growth.

Financial pressures from rising claims severity, operational inefficiencies, and other challenges highlight the urgent need for transformative solutions. Agentic AI offers a strategic path forward—reducing costs, enhancing service delivery, and helping insurers do more with less. As inflation makes consumers increasingly price-sensitive, AI-driven automation enables insurers to stay resilient, responsive, and competitive. By investing in AI, insurance companies can not only navigate today's economic pressures but also build a more equitable future for policyholders.


Diane Brassard

Diane Brassard serves as head of education and advocacy at Roots.

Before joining Roots, she held senior roles at WR Berkley and leadership roles at Colony Specialty (Argo Group). She spent over two decades at The Main Street America Group.

What Gen Z Wants From Auto Insurers

Auto insurers must transform digital experiences to win over transparency-seeking Gen Z drivers entering the market.


The auto insurance industry is experiencing another transformation, led by the preferences of the newest generation of drivers entering the market – Gen Z.

Born between the mid-1990s and early 2010s, this tech-savvy group has grown up in a digital world and demands more from auto insurers than previous generations. They seek seamless online experiences, personalized services, and transparency in pricing and claims processes.

As they continue to become a significant force in the market, insurance companies must adapt their strategies and processes to cater to the unique needs and values of this latest generation of drivers. Here are three key things Gen Z wants that auto insurers should keep in mind.

1. More transparency in pricing

Gen Z drivers are less likely to shop around for auto insurance than other generations. According to a 2024 report on car insurance shopping trends, only 19% of Gen Z respondents compare auto insurance prices annually, compared with 42% of Millennials.

Hidden fees and complicated pricing structures are major turn-offs for Gen Z. They want clear, upfront pricing that outlines exactly what they are paying for. When choosing an auto insurance policy, Gen Z drivers prioritize coverage options (61%), followed by premium cost (49%), and customer service reputation (34%).

Interestingly, Gen Z also places less emphasis on premium costs compared with older generations, with coverage options being their top priority. For insurers, it's important to have transparent pricing and provide detailed explanations of policies.

2. Digital-first claims experience

To keep Gen Z drivers happy, the claims process also needs to be frictionless and digital-first. This generation of drivers has grown up in a world where digital interactions are the norm, and they expect the same level of digital engagement from their auto insurers.

Allowing drivers to instantly upload damage evidence and necessary documents, track claim status in real-time, and receive clear communication at every step can greatly improve customer satisfaction with this generation. Insurers can upgrade any outdated claims processes through the use of artificial intelligence (AI) technologies like visual intelligence. An advanced type of computer vision AI, visual intelligence can accelerate new policy subscriptions, claims assessments and cash settlements. The technology also enables policyholders to submit evidence remotely at the incident scene so insurers can analyze and potentially resolve a claim within minutes of first notice of loss.

Auto insurers are increasingly adopting AI technology to guide younger policyholders through a remote, yet thorough, evidence-gathering process. It is also helping auto insurers and body shops provide pre-repair cost estimations faster, decreasing the need for physical vehicle inspections. Once evidence is collected remotely, visual intelligence allows insurers to compare it immediately against a database of previously gathered evidence, identifying unusual cases and flagging them for human intervention. This not only speeds up the claims process but also reduces the risk of fraud.
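As a rough illustration of how newly submitted evidence can be compared against previously gathered evidence and unusual cases flagged for human review, here is a hedged sketch; the embedding vectors and similarity threshold are assumptions, not any specific vendor's visual-intelligence product.

```python
# Illustrative anomaly-flagging sketch; the embedding model and threshold are
# assumptions, not a specific vendor's visual-intelligence product.
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def flag_for_review(new_embedding, historical_embeddings, threshold=0.75):
    """Compare a new damage photo's embedding against a database of prior
    evidence; if nothing in the database looks similar, route to a human."""
    best = max(cosine(new_embedding, h) for h in historical_embeddings)
    return best < threshold  # True -> unusual case, flag for human intervention

rng = np.random.default_rng(1)
database = [rng.normal(size=128) for _ in range(1000)]   # prior claims evidence
typical = database[0] + 0.05 * rng.normal(size=128)      # resembles past evidence
unusual = rng.normal(size=128)                           # resembles nothing on file
print(flag_for_review(typical, database))  # False -> auto-process
print(flag_for_review(unusual, database))  # True  -> human review
```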

A smooth mobile experience enables young drivers to manage their insurance policies and submit claims anytime and anywhere they need it. Furthermore, whether it's through chatbots, live chats, or social media, AI-powered customer support helps insurers meet Gen Z's demand for instant responses, freeing humans to focus on more complex cases.

3. Personalization in services and support

The era of a single policy that meets the varied needs of all drivers is over. Gen Z wants policies that reflect their individual driving habits and lifestyles. Usage-based insurance (UBI) models, which adjust premiums based on actual driving behavior, resonate well with this demographic. By using telematics data, insurers can offer personalized policies that reward safe driving and give better value for money.
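A minimal sketch of how a usage-based policy might translate telematics signals into a premium adjustment; the factors, weights, and discount band are hypothetical and used only to illustrate the idea.

```python
# Minimal usage-based insurance (UBI) sketch; the telematics factors, weights,
# and discount band are hypothetical, for illustration only.
def driving_score(hard_brakes_per_100mi, night_miles_pct, avg_speed_over_limit):
    """Score driving behavior on a 0-100 scale (higher is safer)."""
    score = 100.0
    score -= 4.0 * hard_brakes_per_100mi
    score -= 0.3 * night_miles_pct
    score -= 2.5 * max(avg_speed_over_limit, 0.0)
    return max(0.0, min(100.0, score))

def ubi_premium(base_premium, score, max_discount=0.30, max_surcharge=0.20):
    """Map the score to a premium adjustment: safe drivers earn a discount,
    risky drivers pay a surcharge, anchored at a neutral score of 70."""
    if score >= 70:
        factor = 1.0 - max_discount * (score - 70) / 30
    else:
        factor = 1.0 + max_surcharge * (70 - score) / 70
    return round(base_premium * factor, 2)

score = driving_score(hard_brakes_per_100mi=1.5, night_miles_pct=10, avg_speed_over_limit=2)
print(score, ubi_premium(2638.00, score))  # safe driving earns a discount
```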

Gen Z drivers also show a strong preference for embedded insurance options: 79% would prefer having insurance integrated into the car deal itself, and 81% said they would like the option to purchase insurance at the point of buying their vehicle.

To connect with Gen Z drivers, automated emails and generic customer support messages also won't cut it. Customer communication needs to feel personal. AI-driven tools can help insurers create personalized strategies; whether it's promotions or policy updates, every interaction can feel unique and relevant. AI algorithms can analyze customer data, driving habits, and social media activity to create messages that resonate with Gen Z drivers' interests and needs.

The next generation of auto insurance

By adopting a digital-first strategy, offering personalized services, and being more transparent, insurers can build strong connections with the latest generation of drivers. Using advanced technologies like AI can help insurers enhance the customer experience and stay relevant in the market.

Navigating the Softening Property Insurance Market

Commercial property's softening market creates strategic opportunity for insureds to reclaim lost coverage and program flexibility.


The commercial property insurance market continues to soften, with rate reductions accelerating across many sectors. Yet, despite this easing, overall premium spending remains higher than pre-2019 levels. For insureds, this presents a unique opportunity and one that requires thoughtful action.

We're advising clients to use this window to restore lost coverages, reassess deductibles, and rethink how they structure their programs. These conditions won't last forever, and policyholders that act now stand to regain both value and flexibility.

From Fatigue to Relief

This current soft market follows years of steady rate increases that began in late 2017 and peaked around 2020. Clients have experienced "rate fatigue" from successive, steep premium increases, which have been exacerbated by rising rebuilding costs during and after the pandemic. Now, as rates decline, many are seeing meaningful premium relief on their property renewals.

So, what's driving the change? In short: capacity. There's an abundance of it. Incumbent carriers are expanding line sizes and aggressively targeting new business. While most new entrants are in the E&S space, where growth has been significant, even smaller players collectively add pressure by creating more competition on individual programs.

Legacy markets, particularly in the U.S., London and Bermuda, are stretching up or down program layers and leveraging facultative reinsurance to gain traction. That's reshaping pricing, risk-sharing strategies, and overall program structure.

Take Back What Was Lost

During the hard market, many insureds were forced to accept terms that reduced coverage, raised deductibles, and increased non-concurrency across shared and layered programs just to get deals done. Now is the time to reverse that.

Clients should be challenging their brokers to leave no stone unturned. A pre-renewal strategy session is essential to prioritize coverages and establish a smart approach to market engagement. Rebuilding programs means more than just rate reduction. It's about reclaiming terms and removing restrictive language that crept in over the past few years.

The New Program Design

One of the most notable shifts we're seeing is in program structure. Larger line sizes and more streamlined layering are helping eliminate expensive, opportunistic capacity. This directly improves rates and simplifies negotiation.

Captive usage is also on the rise. As clients look to gain more control and mitigate volatility, captives offer a path to retain risk strategically. In parallel, there's a renewed interest in long-term agreements, which can provide premium stability over multiple years and reduce annual pricing uncertainty. We're also seeing increased interest in parametric solutions, which complement traditional insurance placements.

What's Next: Opportunities and Uncertainties

Despite the current optimism, the market remains fragile. Several variables could quickly shift conditions. With the right broker and a strategic approach, insureds can regain lost ground, improve coverage, and build programs that are resilient, cost-effective, and better aligned to their evolving risk profile.


Duncan Milne

Duncan Milne is head of U.S. property at McGill and Partners.

With nearly two decades of experience in property insurance, Milne advises clients across industries on innovative program structures. Milne's expertise spans sectors including real estate, financial institutions, hospitality and technology, with a focus on global exposures and complex captives.

Mortality Considerations for Underserved Markets

New mortality data reveals underserved insurance markets present unique risks beyond traditional underwriting solutions.


The life insurance industry has long grappled with closing the insurance gap by reaching underserved populations – those who have traditionally not owned life insurance, particularly in the middle market.

As insurers explore new approaches to access these previously untapped segments, they face a hurdle: a lack of familiarity with these markets. Reinsurance Group of America (RGA) conducted an analysis to help address this shortcoming.

Beyond underwriting: Market matters

RGA has long understood that there is more to mortality outcomes than underwriting alone can explain. To demonstrate this, we compared fully insured lives across various carriers to determine the degree to which mortality could vary based on factors other than underwriting. The analysis focused on the best preferred class available at similar ages, policy amounts, and product types, and was limited to carriers that had very similar preferred criteria. In other words, we examined policies with the same level of underwriting rigor. The results showed the differential across carriers ranged as high as 1.3 times.

Two lessons emerged:

  1. Insured mortality outcomes are driven by more than the traditional questions and data collected.
  2. All risk cannot be underwritten away in an underlying market, as there are other considerations (i.e., who is buying it, who is selling it, etc.).

Figure 1: Term mortality by company with similar best class underwriting

Given that this disparity can exist for business that is known and understood – this type of insurance has been sold by the industry for nearly three decades – it raises the question of what results might look like in less familiar markets.

Three questions arise:

  1. Is there a difference in mortality between the historical insured population and the prospective insured population?
  2. If yes, how big is the difference?
  3. Can medical underwriting resolve the issue?

(Definitions: Historical insureds are those identified as having had at least one policy reinsured with RGA at any time between 2007 and 2022. Prospective insureds are those identified as not having had any policies reinsured with RGA at any time between 2007 and 2022.)

The mortality divide

Historically, there have been material mortality differences between the insured population and the general population. The insured population has consistently demonstrated better mortality outcomes. This leaves the uninsured portion of the general population – the prospective insureds – as a group with potentially higher mortality risk.

To explore these differences, RGA conducted a comprehensive study comparing mortality outcomes between the historical insured population and the prospective insured population. The study leveraged unique data that allowed for the examination of mortality outcomes and risk profiles across a broad spectrum of the U.S. population.

The study's findings revealed significant mortality differences between the two. When compared with fully underwritten (FUW) policies, the prospective insured population showed mortality rates more than 2.5 times higher. Even when compared with simplified issue policies, the prospective insureds still exhibited about 50% higher mortality.

Interestingly, final expense insurance – typically associated with higher-risk individuals and simplified underwriting that allows for more impairments – was the only segment where the prospective insured population showed lower mortality, at about half the rate of the insured group. These stark contrasts underscore the complexity of the mortality landscape when considering underserved markets.

Figure 2: Prospective insured mortality relative to historical insured by underwriting category

What can medical underwriting accomplish?

Rigorous medical underwriting could presumably bridge this mortality gap; however, the study's results suggest otherwise. After applying the same risk assessment standards using a proxy for medical underwriting (i.e., prescription and medical claims history) to both populations, mortality differences persisted between the prospective and historical insured populations.

For fully underwritten policies, even when comparing only the lowest-risk quintiles of both populations, the mortality ratio remained materially greater than one. This suggests there may be inherent differences between these populations that extend beyond what traditional medical underwriting can capture.

The simplified issue category showed the most similarity between historical insured and prospective insured populations. As underwriting restrictions tightened, the mortality ratio crossed below the 1.0 threshold for the best risk quintiles, indicating prospective insureds might be well-suited for simplified issue products in today's market. Also, the prospective insured population consistently had lower mortality outcomes than final expense insureds.

Figure 3: Prospective insured mortality relative to historical insured by risk profile

Age and duration

The study also revealed interesting trends when examining mortality differences by age and policy duration. For fully underwritten policies, excluding the highest-risk individuals, younger ages showed a more significant mortality ratio between historical insureds and prospective populations. This finding is particularly noteworthy as younger individuals are often a target demographic when attempting to enter the middle market.

Figure 4: Prospective insured mortality relative to FUW and AUW, quintiles 1-4

Duration analysis showed that even historical insured policies assessed more than 20 years ago still exhibited better mortality than the prospective insured population, with the mortality ratio remaining well above 1.0. This persistent difference suggests that factors beyond underwriting continue to have a long-lasting influence on mortality outcomes.

Figure 5: Prospective insured mortality relative to FUW and AUW, quintiles 1-4

Strategies for responsible market expansion

These findings highlight the challenges of expanding into underserved markets and underscore the need for innovative approaches to responsibly reduce the insurance gap. However, this is not to suggest that this market and its challenges should be avoided.

The insurance industry must continue to unlock the full potential of digital evidence. As individuals' digital footprints grow, leveraging electronic health records and other unstructured data sources could pave the way for more effective and efficient underwriting. The shift toward digital applications also presents an opportunity to balance customer experience with accurate disclosures, potentially through the implementation of behavioral science principles.

It is crucial for insurers to have a deep understanding of not only who they are targeting but also who is seeking them. This may involve developing effective lead generation strategies and partnering with quality third-party providers, especially when entering the direct-to-consumer market. Starting with familiar or well-defined markets, such as affinity groups, could provide a valuable entry point into underserved segments.

For those underserved markets (and all segments), it is essential for the industry to ensure it is fulfilling a true insurance need with its product offerings, not merely seizing an opportunity. This may involve creating offerings that fill the gap between simplified issue and fully underwritten products, both in terms of underwriting rigor and pricing.

Insurers should strive to manage price sensitivity, striking a balance between adequate pricing for the risk and affordability for the target market. For example, individuals, particularly those who are younger, may have an appetite for paying a higher cost for a more palatable experience along with a product that is more understandable.

Conclusion: A cautious but committed approach

RGA's study provides valuable insights into the mortality considerations for underserved markets. While it reveals significant challenges, including higher mortality outcomes and limitations of traditional underwriting methods, it also provides a path toward innovative solutions.

As the insurance industry strives to close the protection gap, it must be cognizant of the realities of mortality differences in untapped markets. This means developing new underwriting techniques, leveraging advanced data analytics, and creating products tailored to the needs and risk profiles of underserved populations.

By approaching this challenge with a combination of caution and commitment, insurers can navigate the unknown territories of underserved markets, expanding their reach while maintaining responsible risk management practices. RGA is excited to work alongside carriers to provide solutions that will make financial protection accessible to all.

Secure Collaboration Tools Are Critical

Purpose-built collaboration platforms enable insurers to transform operations while maintaining regulatory compliance and data security.


For insurance providers, collaboration tools are no longer just a support function – they're a mission-critical capability. Whether managing underwriting decisions, coordinating claims, responding to regulatory inquiries, investigating potential fraud, or responding to cyber incidents, insurers must collaborate in real time, across teams, and with full confidence in data security and compliance.

However, many organizations rely on outdated systems or off-the-shelf communication tools that were never designed for the complexity and regulatory rigor of the insurance industry. This results in operational silos, inefficiencies, and increased exposure to compliance risks.

A Perfect Storm of Compliance, Complexity, and Communication Gaps

As the insurance industry accelerates its digital transformation, the need for streamlined, coordinated collaboration has become more urgent – and more complicated. Insurers must find ways to bridge teams and geographies while preserving centralized oversight, transparency, and trust in how information is shared.

Handling vast volumes of sensitive data and adhering to regulations like FINRA, HIPAA, and GDPR requires more than just reliable messaging; it demands secure platforms purpose-built for transparency, data governance, and compliance. Yet many off-the-shelf tools were not built with the specific needs of regulated industries in mind.

These platforms may support quick chats and ad hoc meetings, but they fall short when it comes to protracted, multi-stakeholder collaboration. Conversations can become fragmented, key decisions get buried, and critical actions are missed, all of which undermine response times and create compliance gaps. Retrieving information for audits or litigation becomes time-consuming and unreliable.

Compounding the issue is the challenge of secure integration. Insurers need real-time access to critical data – capabilities that off-the-shelf platforms rarely deliver.

Meanwhile, insurers' tools remain attractive targets for cyber attackers. Persistent login sessions, high user volumes, and weak security controls expose insurers to threats such as phishing, malware, data breaches, and credential theft.

Best Practices for Secure and Scalable Collaboration

To meet the demands of today's insurance industry, collaboration platforms must go beyond messaging. They must break down silos, facilitate secure real-time decision making, ensure audit readiness, and boost operational efficiencies.

Below are six things to consider when evaluating secure, flexible collaboration solutions for insurance environments.

1. Self-sovereign secure data management

To reduce risk, any platform must be built with enterprise-grade security, privacy, and compliance architecture and practices.

Key questions to ask:

  • Is data always stored in environments under your direct control, whether on-premises or in a private cloud? This is essential for maintaining data sovereignty, especially when handling sensitive customer or financial data.
  • Does the provider enable compliance with GDPR, HIPAA, and FINRA standards?
  • Can data be easily exported for compliance purposes?
  • What identity and access controls are in place? Look for multi-factor authentication, session duration configuration, role-based access control, SAML-based single sign-on, enterprise mobile management, and more.
  • Is data encrypted at rest and in transit?

2. Integrated capabilities

To prevent information silos and inefficiencies, the right platform must combine messaging, task management, and documentation in one place. This eliminates the need to toggle between tools and enables faster, more consistent execution of critical processes such as incident response or claims investigations.

3. Dedicated channels for organized, auditable messages

Projects like underwriting, fraud detection, or regulatory reviews often span weeks or months. The ability to capture who did what, when, and why – across every conversation and workflow – is vital.

Rather than a single confusing and fractured thread for collaboration, look for a platform that offers channels: core communication spaces where team members can share messages and work together. This makes collaboration more streamlined and organized. These channels can be open to everyone, restricted to invited members only, allow one-on-one conversations, or span multiple teams.

4. Automated workflows and playbooks

Consider the automation workflows and out-of-the-box playbooks the vendor offers. These features can greatly improve efficiency, ensure consistent execution of critical tasks, reduce risk, and speed service delivery.

Playbooks make it easy to streamline and accelerate routine tasks such as claims processing, customer onboarding, policy renewals, compliance checks, and incident response.

For example, a top three bank implemented an out-of-the-box security playbook to support its incident response efforts. By following a structured process, the team was able to respond more quickly and effectively, reducing their response time from 20 minutes to just two.
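As a schematic of what such a playbook might look like under the hood, the sketch below represents one as structured, repeatable steps with owners and SLAs; the step names, roles, and timings are hypothetical and not tied to any particular vendor's feature set.

```python
# Schematic only: one way a claims-intake playbook could be represented as
# structured, repeatable steps. Step names, owners, and SLAs are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    owner_role: str
    sla_minutes: int
    done: bool = False

@dataclass
class Playbook:
    title: str
    steps: list = field(default_factory=list)

    def next_step(self):
        """Return the first incomplete step so the channel always shows
        who does what next and by when."""
        return next((s for s in self.steps if not s.done), None)

claims_intake = Playbook("New claim intake", [
    Step("Acknowledge first notice of loss", "claims_intake", sla_minutes=15),
    Step("Collect policy and coverage details", "claims_examiner", sla_minutes=60),
    Step("Triage severity and assign adjuster", "claims_supervisor", sla_minutes=120),
    Step("Notify policyholder of next steps", "claims_examiner", sla_minutes=240),
])

step = claims_intake.next_step()
print(f"Next: {step.name} ({step.owner_role}, due in {step.sla_minutes} min)")
```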

5. Audit-ready architecture

Collaboration platforms should offer straightforward, reliable tools for generating compliance reports on user activity and communication. Features like audit logging are essential for capturing detailed records of user actions, system events, and potential insider threats. In addition, message data and file attachments should be easily exportable to support legal discovery, audits, and regulatory reviews.

6. Flexibility and ease of adoption

Any new tool should be intuitive for users and integrate with existing systems. This reduces training overhead and encourages adoption across business and technical teams.

Enabling Faster, Smarter, and Safer Decisions

When implemented correctly, a secure collaboration platform can become a force multiplier across the enterprise. Insurers can break down internal silos, reduce risk, and improve decision-making while ensuring full transparency and compliance.

Robust platforms can also increase resiliency. Outages happen often; a secure collaboration hub can keep teams connected and minimize business disruptions when primary chat systems go down.

The Bottom Line for Insurance Leaders

Senior executives in insurance must now ask: Are our communication systems truly compliant, resilient, and efficient enough to support the demands of our business? With the rise of secure, self-sovereign collaboration platforms, the industry has a clear path forward.


Gavin Beeman

Gavin Beeman is the director of sales, Americas, at Mattermost.

He previously worked at Databricks, New Relic and UBM Tech.

Beeman holds a B.A. in mass communications from the University of California, Berkeley.

Past, Present, and Future of AI in Claims 

AI-powered claims management is evolving beyond efficiency, promising deeper insights while keeping human expertise central.

An artist’s illustration of artificial intelligence

The digital transformation occurring throughout all levels of the claims management industry is not new. It began years ago and is only now beginning to truly shake up this sector. At Sedgwick, we have been focused on the next generation of digital tools, but if companies want to ensure they are maximizing this exciting new technology, it's critical to understand where we've been, where we are now, and where we're going.

Where We've Been

The claims industry has been leveraging analytical AI for years, such as in predictive modeling programs. More recently, generative and agentic AI have rapidly emerged as key differentiators for third-party administrators (TPAs). Understanding how companies can leverage this technology, and creatively and effectively apply it to different lines of business to drive new and improved outcomes, will be a deciding factor in terms of success.

Most companies in our industry started in the same place when generative AI burst on the scene two and a half years ago: document summarization. While we were all starting to recognize the power of generative AI, that technology began to evolve at exponential speed. Everyone learned to fail fast in this space and spent significant time understanding the new features as they came out every month. Many companies introduced initiatives using data extraction and analysis from claims-related documents, while others explored whether large language models (LLMs) could support automated customer support options. Keeping up with AI's advances drove IT teams to prototype rapidly, but generative AI itself prompted companies to re-evaluate their initial use cases every day.

What did we learn? The keys to success haven't changed: Robust data sets, data science maturity to understand what the data means, and industry-leading claims practice are still the critical components to establish a foundation for digital transformation of the industry. Changing an age-old claims workflow is no easy task and cannot be accomplished overnight. Tech-enabled best practices must be paired with specific goals that prioritize efficiency, customer experience, and results to achieve a claims life cycle that is smarter, faster, and more engaging than ever.

Where We Are Now

Generative AI is already changing the way the industry responds to customers and handles the rapid pace of claims. However, most current efforts are now centered on incorporating agentic AI solutions into claim workflows. At a basic level, applying this technology means orchestrating a series of small microservices (utility agents) driven by the intelligence and logic of generative AI's analysis of new information. An agentic AI system breaks a workflow down into tasks and subtasks, then assembles utility agents into a "squad" to perform a particular function.
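
As a simplified illustration of that pattern, the sketch below shows a hypothetical orchestrator breaking a claim-intake workflow into tasks and dispatching each to a small utility agent. The agent names and logic are assumptions made for the example, not Sedgwick's implementation:

    # Hypothetical "squad" of utility agents, each handling one narrow task.
    def extract_documents(claim):
        return {"documents": f"parsed attachments for claim {claim['id']}"}

    def summarize_loss(claim):
        return {"summary": f"summary of reported loss: {claim['description']}"}

    def flag_fraud_signals(claim):
        return {"fraud_score": 0.12}  # placeholder output from a scoring model

    UTILITY_AGENTS = {
        "extract_documents": extract_documents,
        "summarize_loss": summarize_loss,
        "flag_fraud_signals": flag_fraud_signals,
    }

    def run_agentic_workflow(claim, tasks):
        """Decompose the workflow into tasks, dispatch each to its utility
        agent, and merge the results into a single working record."""
        record = dict(claim)
        for task in tasks:
            record.update(UTILITY_AGENTS[task](claim))
        return record

    claim = {"id": "CLM-1001", "description": "hail damage to roof"}
    print(run_agentic_workflow(claim, ["extract_documents", "summarize_loss", "flag_fraud_signals"]))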

Businesses will be working through transformation exercises for years to come as they evaluate true agentic AI-driven digital transformation opportunities.

The next and most vital step is looking beyond efficiency and driving valuable insights for claims examiners. The role of professionals in the claims process cannot be discounted and will never be fully automated. With examiners working in remote and hybrid roles, it has become that much harder to train new colleagues. It's critical to transfer institutional knowledge to the next generation of workers, and AI is now able to provide insights and guidance that are often missed in a remote work environment. Sedgwick has introduced tools at the desk level that support our incredible professionals in a way that ensures their success and eliminates administrative red tape. This approach allows our people to spend their time focusing on and speaking with claimants so they know they are in great hands.

This concept reinforces that, as exciting as these advances are, technologists and executives should always remember the people they serve. Our best-case customer journey is that our customers, who are experiencing a difficult time in their lives, will see a rapid resolution to their claim while engaging with caring, real people who solve their problems with accuracy and insight. This goal is perhaps the most important driver that we in the claims industry can deliver.

Where We're Going

Harnessing the power of agentic AI requires a creativity that recognizes long-term strategic goals and short-term tactical bites that guide a claims administrator from A to Z. It is about problem solving in a very human way and allowing our colleagues to take advantage of these incredible tools to broaden understanding and transform the industry forever.

I'm very excited as we think forward to the new capabilities that we expect from generative and agentic AI this year. Virtual agents will finally become proficient in holding real conversations with customers. The pairing of robust data science capabilities with agentic AI frameworks will allow companies to reimagine their claims processes from end-to-end. And tools that drive insight and action at the desk level will become a part of the daily routine as humans work side by side with agents.

This is all to say that the extensive work and experience that got us to where we are today is as important as where we are heading. Through exceptional colleagues, data science, robust models, the very best practices in claims, and agentic AI, the industry is poised to deliver a bright future for clients and claimants around the world.


Leah Cooper

Leah Cooper is the chief digital officer for Sedgwick.

She is a recognized thought leader at national conferences, speaking on the digital shakeup occurring at all levels of administration, on customer experience, and on advancements in technology and AI within the insurance industry.

She holds a B.A. in economics and theatre from Vanderbilt and a certificate of specialization in entrepreneurship and innovation from Harvard Business School Online.

Solar Farms Face Rising Storm Risks

Severe convective storm losses surge past $50 billion as utility-scale solar farms face mounting weather resilience challenges.

Hundreds of dark solar panels in lines across a field in front of bright green trees and a blue sky

In 2024, insured losses from U.S. severe convective storms (SCSs) exceeded $50 billion for the second consecutive year. This category of peril — which includes tornadoes, hail and straight-line winds — has grown increasingly prominent in recent years, posing significant challenges for risk managers across multiple sectors, including energy, agriculture, insurance, construction and transportation.

The 2024 season began at a rapid pace, with 1,264 preliminary tornado reports from January to June — the second-highest total for this period since 2010. This momentum carried into the second half of the year, culminating in 1,855 preliminary reports for 2024 (Table 1), surpassed only by the 2,240 reports in 2011.

Additionally, 2024 experienced above-average large hail activity — historically the leading cause of SCS-related property damage in the U.S. — with 829 preliminary reports. Straight-line wind activity also exceeded historical norms, with 16,701 reports, making 2024 the third most active year since 2010.

Storm clouds over solar farms

One sector facing increasing risk from SCS events is utility-scale solar energy — large solar installations that generate electricity for the power grid. This industry has seen rapid growth in recent years, particularly in Texas. Since 2014, solar energy generation in Tornado Alley and Dixie Alley states has increased by almost a factor of 50, from one terawatt hour to 48 terawatt hours. Most of this growth comes from Texas due to its high solar irradiance levels and streamlined process for approving and building solar energy projects.

However, as more solar farms are built in storm-prone states, the risk of large losses for farm owners and insurers is increasing. The risk is heightened by a recent trend toward larger, thinner solar panels, which are more vulnerable to damage. In March 2024, for example, a hailstorm damaged thousands of solar panels at the Fighting Jays Solar Farm in Fort Bend County, Texas. This event resulted in costly panel replacements and reduced energy output. Insurers anticipated paying out $50 million, reaching the farm's hail coverage sublimit.

The risk is not restricted to the Central U.S. In October 2024, an EF2 tornado spawned by Hurricane Milton tore through a solar farm in central Florida, also damaging 30 homes in the area.

The property insurance market for utility-scale solar has struggled with high premiums and limited coverage availability. These challenges stem from significant losses in recent years and the unique vulnerability of solar panels, which complicates risk assessment. As a result, utility-scale developers have turned toward improving resiliency through engineering design and innovations in tracking technology.

Harnessing resilience in a changing climate

The most significant natural peril loss drivers for solar projects are hail and named windstorms. Solar panel modules are the components most vulnerable to windborne debris and hail-related damage, and that vulnerability depends heavily on the module glass thickness. While the exposure value of solar modules is project-dependent, they typically account for a significant proportion of the insurable risk.

As a result of the recent increase in SCS loss activity, risk managers for solar projects are increasingly considering a range of mitigation strategies, such as:

  • Stowing solar panel modules at specific tilt angles, decreasing the angle of impact for hailstones and reducing the likelihood of wind-related damage
  • Implementing real-time weather monitoring and automation that initiates protective measures, such as tilting, as storms approach (see the sketch after this list)
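
The second strategy can be sketched in code: a hypothetical controller that watches the forecast and commands the trackers to a protective stow position when a threshold is crossed. The thresholds, angles, and function are illustrative assumptions, not any tracker vendor's API:

    # Hypothetical automated stow logic driven by a real-time weather forecast.
    HAIL_PROBABILITY_THRESHOLD = 0.5   # act when forecast hail probability exceeds 50%
    PROTECTIVE_STOW_ANGLE_DEG = 60     # steeper tilt reduces the angle of hail impact

    def decide_stow(forecast):
        """Return the stow angle to command, or None if no action is needed."""
        if forecast.get("hail_probability", 0.0) >= HAIL_PROBABILITY_THRESHOLD:
            return PROTECTIVE_STOW_ANGLE_DEG
        if forecast.get("wind_gust_mph", 0.0) >= 60:
            return 0.0  # flat stow to reduce wind loading
        return None

    # Example: an approaching storm with a 70% hail probability triggers a stow.
    angle = decide_stow({"hail_probability": 0.7, "wind_gust_mph": 35})
    print(f"Commanded stow angle: {angle} degrees" if angle is not None else "No action")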

New and existing solar projects can also benefit from a comprehensive risk assessment, including geographic and historical analyses of hail, tornado and straight-line wind events. WTW works with utility-scale solar developers and operators to evaluate and quantify probable maximum losses, considering site-specific engineering design, risk mitigation and tracking system stow strategies for both wind and hail, so natural catastrophe risk can be assessed precisely.

Additionally, understanding how the risk is evolving over time is vital for effective risk management. WTW Research Network partner Columbia University has found a two-to-threefold increase in tornado outbreaks across the southeastern U.S., particularly during winter and spring, over the past four decades.

By combining these risk assessment and mitigation methods into a comprehensive approach, the solar industry can better prepare for severe weather events and navigate an evolving risk landscape.