Renovations Create Critical Insurance Risks

Nearly half of homeowners plan 2025 renovations, but insurance adjustments remain overlooked despite potentially catastrophic consequences.

Renovation remains a defining trend in the U.S. housing market. In fact, nearly half of homeowners (48%) plan to renovate in 2025. Median budgets are climbing to around $24,000, while high-end projects often top $150,000. For high-net-worth homeowners, those numbers multiply — expansions, specialty rooms and luxury finishes are increasingly common.

Yet amid design plans and contractor negotiations, insurance is often overlooked. Homeowners should always notify their insurer before any renovation project begins. Failing to do so can result in higher deductibles, denied claims or policies that no longer fit the new risk profile. For affluent households, the stakes are especially high: The wrong coverage approach could mean hundreds of thousands in uncovered loss.

1. Unreported Exposure

Risk: Projects that cost more than 10% of a home's insured value, extend beyond a year, or require temporarily moving out all alter the home's risk profile. If insurers aren't informed, claims could be contested.

Best Practice: Notify your broker early. A simple litmus test: If you're moving out, disabling security systems or investing more than 10% of insured value, call your advisor. This allows for adjustments before the risk materializes.
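The litmus test above can be expressed as a simple decision rule. This is an illustrative sketch only; the function name and thresholds mirror the rule of thumb in the text, not any carrier's actual underwriting guidelines:

```python
def should_notify_insurer(project_cost, insured_value,
                          moving_out=False, security_disabled=False,
                          duration_months=0):
    """Flag renovation projects that warrant a call to the broker.

    Mirrors the litmus test in the text: notify if the project exceeds
    10% of insured value, runs longer than a year, or involves moving
    out or disabling security systems.
    """
    reasons = []
    if project_cost > 0.10 * insured_value:
        reasons.append("project exceeds 10% of insured value")
    if duration_months > 12:
        reasons.append("project extends beyond a year")
    if moving_out:
        reasons.append("home will be temporarily vacant")
    if security_disabled:
        reasons.append("security systems will be disabled")
    return reasons  # a non-empty list means: call your advisor


# A $30,000 kitchen remodel on a home insured for $250,000:
print(should_notify_insurer(30_000, 250_000))
```

Any single triggered condition is enough to justify the call; the point of the rule is that it errs toward notification, since adjustments are only possible before the risk materializes.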

2. Policy Reclassification and Deductible Shifts

Risk: Large-scale renovations can require a shift from a standard homeowner's policy to a builder's risk or course of construction policy. If overlooked, deductible surprises can surface. Some carriers apply a construction-related deductible many times larger than a typical homeowner's deductible.

Best Practice: Confirm with the insurer whether builder's risk coverage is required. These policies are designed for homes "in transition." Establishing them early prevents costly disputes if a fire, water loss or theft occurs mid-project.

3. Contractor and Subcontractor Liability

Risk: Renovations introduce third-party exposures. Hiring a contractor with inadequate general liability (GL) or workers' compensation (WC) coverage creates liability exposure. If a subcontractor is injured or damages property without proper insurance, a carrier may be left without recourse.

Best Practice: Require certificates of insurance from all contractors and subcontractors. For high-value properties, ensure GL limits are consistent with the replacement value of the home. Carriers frequently request this documentation and can help validate that coverage is adequate.

4. Underinsurance During and After Renovation

Risk: Renovations increase replacement costs. Without a coverage adjustment, reimbursement may only be for the pre-renovation value. Replacement costs surged more than 55% between 2020 and 2022, driven by inflation and supply chain challenges. If the homeowner's coverage hasn't kept pace, a catastrophic loss could leave the homeowner significantly underinsured.

Best Practice: Request periodic revaluation during and after construction. Policies with extended replacement-cost features or inflation guards can help, but they aren't substitutes for accurate dwelling limits. Insuring your home to value is critical after a renovation project.
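To see how quickly a dwelling limit can fall behind, combine the 55% replacement-cost surge cited above with a sizable renovation. The figures below are hypothetical; an actual revaluation should come from the carrier or an appraiser:

```python
def insured_to_value(dwelling_limit, replacement_cost):
    """Return the insured-to-value ratio; below 1.0 means underinsured."""
    return dwelling_limit / replacement_cost


# Hypothetical: a home insured for $400,000 in 2020. Replacement costs
# rose ~55% (per the trend cited above), and a $150,000 renovation
# added further rebuild cost on top of that.
pre_renovation_cost = 400_000 * 1.55          # inflation-adjusted rebuild cost
post_renovation_cost = pre_renovation_cost + 150_000

ratio = insured_to_value(400_000, post_renovation_cost)
print(f"Insured-to-value: {ratio:.0%}")       # roughly half of the rebuild cost
```

In this sketch, an unchanged $400,000 dwelling limit covers only about half of what a post-renovation rebuild would cost, which is exactly the gap a periodic revaluation is meant to close.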

5. Vacancy, Theft, and Fire Hazards

Risk: Many renovations involve temporary vacancy or disabled security systems, which dramatically change exposure. Standard homeowner's insurance often excludes theft or vandalism after 30 or 60 days of vacancy. Fire hazards from activities like sanding floors or rewiring electrical systems elevate risk.

Best Practice: Inform the carrier if living elsewhere during a project. Confirm that belongings in storage remain covered and that valuables such as artworks, if moved off premises, are stored in approved environments. Ask whether endorsements for theft of building materials or a course of construction policy should be added while work is underway.

Closing Perspective

The numbers are clear: 98% of homeowner's insurance claims involve property damage, with average claim severity approaching $24,000 in higher-risk areas. For wealthy homeowners undertaking renovations, those costs can climb into six figures, making it essential to put the proper coverages in place before a project begins.

Renovation is a fundamental change to a home's risk profile. Treat it accordingly. By contacting the broker early, validating contractor coverage, adjusting limits during construction, and re-evaluating after completion, homeowners protect both their property and their investment.

How to Manage Rising Stop-Loss Premiums

Rising stop-loss costs and the transparency advantages of self-funded arrangements are creating a fundamental shift in how smart employers approach healthcare benefits.

After weathering the initial shockwaves of the pandemic, employers thought they had seen the worst of healthcare cost volatility. They were wrong. What started as delayed screenings and deferred care in 2020 has morphed into a sustained surge in catastrophic claims that's pushing stop-loss insurance premiums to breaking points.

According to the Segal Group, medical stop-loss premiums increased an average of 9.7% for plans that maintained their deductibles – a figure that understates the pain many employers actually experienced.

The scale of these increases reflects a perfect storm of cost pressures that have been building since COVID-19. The delayed impact of missed cancer screenings during the pandemic is now showing up as advanced-stage diagnoses requiring expensive treatments. Meanwhile, specialty drug spending continues its relentless climb and will account for more than half of all drug spending this year, according to Mercer.

The industry is also seeing significant increases in costs associated with premature birth and neonatal care as medical advances allow healthcare providers to save babies who wouldn't have survived in previous decades. While these outcomes represent medical miracles, they come with substantial financial implications for employers and their stop-loss carriers.

Managing the unmanageable

This cost crisis is accelerating a shift toward self-funded arrangements that provide something fully insured plans cannot: visibility into where healthcare dollars are actually going. While employers can't stop specialty drug prices from rising or prevent the continuing impact of delayed screenings, claims data transparency has become a strategic necessity for managing these unavoidable pressures.

The difference comes down to who controls the data. In fully insured arrangements, carriers essentially own the claims information. When employers receive renewal quotes, they get limited visibility into what's driving their costs.

Self-funded arrangements with third-party administrators (TPAs) break open this black box. TPAs work for the employer, not the insurance carrier, and provide accurate, meaningful claims data on a regular basis. This transparency creates opportunities that simply don't exist in fully insured plans.

The symbiotic relationship between TPAs and stop-loss carriers amplifies this advantage. Stop-loss insurers require detailed claims data to assess risk and process payments, which means employers gain access to comprehensive information about their healthcare spending patterns. This visibility enables strategic decision-making about how to navigate an increasingly expensive environment.

Strategic cost management

Armed with detailed claims data, employers can move beyond simply absorbing premium increases to actively managing their healthcare costs. The transparency provides insights that enable targeted interventions and strategic adjustments to plan design.

Given the overall higher costs associated with providing health coverage to employees, employers must analyze their deductible levels and associated claims activity on an annual basis. The data helps employers evaluate critical questions: What impact would a high-dollar claim have on cash flow at a specific deductible amount? What is the risk tolerance for higher deductibles versus the cost savings from lower premiums?

Many employers are increasing their risk tolerance to manage stop-loss costs. A common approach involves raising deductibles, which increases the employer's financial exposure but significantly reduces stop-loss premiums. This strategy requires careful analysis of cash flow capacity and risk tolerance, but the premium savings can be substantial. The key is using claims data to make these decisions strategically rather than just reacting to price increases.
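The deductible tradeoff described above can be framed as a simple expected-cost comparison. The premiums, deductibles, and claim probabilities below are invented for illustration, not market quotes:

```python
def expected_annual_cost(premium, deductible, expected_large_claims):
    """Employer's expected cost: fixed stop-loss premium plus retained claims.

    expected_large_claims: list of (probability, claim_amount) pairs for
    claims large enough to approach or pierce the stop-loss deductible.
    The employer retains each claim up to the deductible; the carrier
    pays the excess.
    """
    retained = sum(prob * min(amount, deductible)
                   for prob, amount in expected_large_claims)
    return premium + retained


# Hypothetical: two quotes for the same group.
claims = [(0.10, 750_000), (0.30, 150_000)]   # (annual probability, severity)

low_ded = expected_annual_cost(premium=500_000, deductible=100_000,
                               expected_large_claims=claims)
high_ded = expected_annual_cost(premium=380_000, deductible=250_000,
                                expected_large_claims=claims)
print(f"$100k deductible: ${low_ded:,.0f} | $250k deductible: ${high_ded:,.0f}")
```

In this sketch the higher deductible is cheaper in expectation, but the employer also retains more of any single bad claim, which is precisely the cash-flow question the claims data should answer before the decision is made.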

Claims transparency also enables employers to identify trends before they become expensive problems. For example, if data shows a high propensity for diabetes among employees, employers can implement targeted interventions like nutritional counseling or fitness programs. Early intervention costs far less than treating advanced diabetes complications.

Best practices for the new reality

Several strategies can help employers manage rising stop-loss costs while maintaining quality coverage. The foundation is comprehensive data analysis combined with plan management.

Wellness programs are one of the most effective cost-containment strategies. Simple initiatives like offering $100 incentives for cancer screenings or annual physicals can prevent much more expensive treatments down the road. Disease management programs can be particularly effective for common conditions like diabetes, where lifestyle interventions can dramatically reduce complications and costs.

Employers should also consider comprehensive preventive care programs that extend beyond basic screenings. On-site health screenings, flu shot clinics, and partnerships with local healthcare providers can catch health issues early when they're less expensive to treat.

Finally, employers should regularly benchmark their stop-loss coverage against market alternatives. The current environment of rapidly changing costs means that yesterday's optimal coverage structure may no longer be appropriate.

The transparency imperative

The combination of rising stop-loss costs and the transparency advantages of self-funded arrangements is creating a fundamental shift in how smart employers approach healthcare benefits. The tail from COVID is still there, and it remains very prominent. The delayed impact of missed preventive care will continue driving costs for years to come.

Employers that gain access to their claims data and use it strategically will have significant advantages over those operating in the dark. The transparency enabled by self-funded arrangements with TPAs and stop-loss coverage allows employers to take an active role in managing one of the largest expenses on their balance sheets.

Healthcare costs will continue rising, but employers can choose how they respond. For many, that path leads directly to self-funded arrangements that put claims data back where it belongs: in the hands of the employers who ultimately pay the bills.

Farmers Breach Reveals New Security Paradigm

Farmers Insurance's 1.1 million-person breach shows why insurers must abandon prevention-focused security and implement rapid detection strategies.

1.1 million. That's how many people were affected in the Farmers Insurance breach carried out by the ShinyHunters group. It should be a wake-up call across the insurance industry because it shows just how much the ground has shifted under us.

For decades, security strategies focused on keeping attackers out with firewalls, endpoint agents, and endless patching. But that model no longer matches reality. Today, attackers don't need to break in. They simply log in.

The End of the Perimeter

What makes the Farmers breach so striking is how ordinary it was. Attackers did not need to develop novel zero-day exploits or brute-force their way through hardened defenses. Instead, they exploited valid credentials, likely stolen or phished from employees, and used them to move through SaaS and cloud environments as if they belonged there.

Once an attacker holds the right username and password, or tricks a user into granting a malicious authentication token, the perimeter collapses. To the system, the hackers are "trusted" users. And that's exactly how attackers prefer it. This isn't the first time we've seen these tactics, and it won't be the last.

This shift changes everything for insurers. If your threat model is still dominated by malware signatures and intrusion prevention systems, you're preparing for yesterday's war. The front line has moved to identity, SaaS, AI, and cloud.

Why Prevention Is Doomed to Fail

Insurers understand risk better than anyone. You don't build an underwriting model on the assumption that every accident can be prevented. You assume loss will happen, and you plan for how to mitigate and recover. Cybersecurity requires the same realism.

Preventive measures are not useless. They remain essential for hygiene. But they can't be the centerpiece of strategy. Credential compromise, phishing, malicious third-party apps, and insider threats will always get through. Attacks are inevitable. What matters is not whether attackers get in but how quickly you detect them once they do.

Why Speed Is the Differentiator

There's a world of difference between an attacker inside for five minutes and one inside for five days. In the first scenario, the blast radius is limited. In the second, the attacker blends in, exploits the nature of cloud and SaaS to siphon data, escalates privileges, moves laterally, and exfiltrates terabytes of sensitive information.

This is where many organizations, including those in the insurance industry, struggle. Security operations centers routinely get flooded with alerts, many of them false positives. Distinguishing signal from noise is slow, manual, and heavily reliant on already-stretched analysts. That delay turns intrusions into breaches.

Speed, therefore, is the true differentiator. Not perfect prevention. Not larger firewalls. Speed of detection, speed of triage, and speed of response. It's survival of the fastest.

What to Watch For

The practical question is, what exactly should we be monitoring? Attackers using stolen credentials don't raise obvious alarms. But their behavior does.

  • Unusual account activity: A claims processor suddenly accessing systems at 3 a.m. from a foreign location.
  • Data access at scale: A single account downloading thousands of policyholder files in a short window.
  • Privilege abuse: An ordinary user suddenly creating admin accounts or changing access rules.
  • Cross-platform anomalies: A login from one identity provider (Okta, Entra, Ping) that doesn't line up with activity in SaaS platforms like Salesforce or Microsoft 365.

These signals don't always mean compromise. But they are the kinds of weak indicators that, if correlated and investigated quickly, allow defenders to spot intrusions while they're still containable.
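The weak indicators listed above only become actionable when correlated per account. A minimal sketch of that idea follows; the event fields, signal names, weights, and threshold are all hypothetical, not taken from any specific SIEM or detection product:

```python
from collections import defaultdict

# Toy event stream: each event is (account, signal), where signal is one
# of the weak indicators described above.
events = [
    ("claims_user_7", "off_hours_foreign_login"),
    ("claims_user_7", "bulk_download"),
    ("claims_user_7", "idp_saas_mismatch"),
    ("admin_2", "bulk_download"),
]

# Hypothetical weights: no single signal proves compromise, but several
# correlated on one account should trigger an investigation.
WEIGHTS = {
    "off_hours_foreign_login": 2,   # unusual account activity
    "bulk_download": 2,             # data access at scale
    "privilege_escalation": 3,      # privilege abuse
    "idp_saas_mismatch": 2,         # cross-platform anomaly
}
ALERT_THRESHOLD = 5

scores = defaultdict(int)
for account, signal in events:
    scores[account] += WEIGHTS[signal]

flagged = [acct for acct, score in scores.items() if score >= ALERT_THRESHOLD]
print(flagged)  # accounts whose correlated signals exceed the threshold
```

Here `claims_user_7` is flagged because three weak signals stack up, while `admin_2`'s single bulk download stays below the threshold. That is the core of the speed argument: correlation turns noisy individual alerts into a short, investigable queue.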

Lessons for Insurers

The Farmers breach is one more reminder that the insurance industry, by virtue of the sensitive data it holds and the trust it represents, is a high-value target. Attackers are chasing scale, not some esoteric technical exploits. And there's no richer dataset than millions of customer policies, claims histories, and personal identifiers.

For insurers, the lessons are clear:

  1. Assume breach. Just as you assume loss when underwriting, assume intrusions will occur. Build security models around resilience, not perfection.
  2. Invest in visibility. You can't respond to what you can't see. Make sure you have comprehensive logs, correlated across cloud, SaaS, AI infrastructure, and identity systems.
  3. Focus on speed. Measure detection and response not in days or weeks but in minutes. The faster unusual activity is flagged and investigated, the less costly the breach.
  4. Prioritize identity. User credentials are the new perimeter. Multifactor authentication, least-privilege access, and continuous monitoring of account behavior are now the basics.
  5. Test your response. Tabletop exercises, red-team simulations, and cross-team drills aren't luxuries. They are the only way to ensure your organization is ready to act when (not if) an intrusion happens.

A New Mindset

After a breach like Farmers, the instinct is often to add more prevention tools. But history has shown that attackers will simply find another door, often one left ajar by human error.

The real differentiator is preparing for the moment someone inevitably gets inside. Not building a taller fence. Insurers that embrace this mindset, assuming breach, prioritizing visibility, and investing in speed, will be the ones best positioned to protect their customers, their reputations, and their bottom line.

Farmers' experience should be a turning point. Prevention is no longer enough. Detection and rapid response separate a minor incident from a front-page disaster.


Ariel Parnes

Ariel Parnes is the co-founder and chief operating officer of Mitiga.

Prior to co-founding Mitiga in 2020, he had a 20-plus-year career in the Israel Defense Forces’ elite Unit 8200. He rose to the rank of colonel and founded and headed the unit’s cyberwarfare department. He was awarded the prestigious Israel Defense Prize for technological breakthroughs in cybersecurity. 

Parnes holds a master’s degree in computer science from the Hebrew University of Jerusalem.

Why ‘Settle at All Costs’ Is a Bad Idea

Fear of nuclear verdicts stops some carriers from pursuing winnable cases. Here’s how to strike the right balance.

As a wholesale insurance provider, our mission is to mitigate risk for our transportation clients and provide the best service possible, and this includes paying timely claims where warranted. Where warranted is the important qualifier. There is nothing more distressing in insurance than facing a nuclear verdict where the facts are on the side of the insurer but the sentiment is not. 

Recently, we had this experience. In the case at issue, public sentiment was not favorable to us as an insurer, but the claims adjusters believed in their files, knew their facts and were confident they could win. And they did, helping to deliver a unanimous full defense verdict against a plaintiff in North Carolina who was seeking $10 million in a bad-faith claim.

The pressure to settle lawsuits at all costs is building, and trial attorneys know it. Nuclear verdicts, those higher than $10 million, increased by 116% in the past year, according to Marathon Strategies' 2024 report. Carriers representing industries like commercial trucking, which had the third-highest number of nuclear verdicts in 2024 (carrying a $790 million total price tag), feel the squeeze more than others.

As nuclear verdicts become more commonplace, the insurance industry will need to adapt. While it is important for insurers to settle when necessary, there are times when going to trial is your best move. Sometimes the confidence you demonstrate in taking a case to trial can help bring about a more reasonable settlement than if proposing a settlement was your first step. If a plaintiff's attorney does not see a pre-existing pattern of settlements, they are less likely to push for more costly payments.

Unpacking the fear factor

Many carriers settle because they feel like the deck is stacked against them in court, and for good reason. Today's juries are sharper and more informed than ever. The belief that jurors don't grasp the financial stakes of an insurance case no longer holds water. Juries understand that insurance companies with deep pockets stand behind large trucking companies in court, and they are willing to make them pay.

The commercial trucking industry also suffers from negative public perception, spurred by poor over-the-road driving habits and a spate of recently proposed restrictive trucking legislation. Plaintiffs' attorneys are eager to play into the frustration of any plaintiff or juror who has ever been stuck following a slow-moving tractor-trailer on the interstate.

Then there's the very term "nuclear verdict," which trial attorneys use to scare off insurance companies. While the risk for a high-dollar verdict is always possible, there's a difference between what's right, what's wrong and what's unreasonable. And nuclear verdicts are almost always unreasonable.

Why settle-at-all-costs doesn't work

Nobody wants to go to trial. Not the defense counsel. Not the carrier. And sometimes, not even the client. Yet while a settle-at-all-costs mindset may prevent a nuclear verdict, it is not a wise long-term solution.

Like close-knit family members, plaintiffs' attorneys talk with one another regularly. They know a carrier's risk tolerance well, sometimes even better than the carrier itself. If they perceive you will always settle to avoid a costly verdict, they will use that to their advantage in negotiation, potentially costing your organization large sums of money.

Even worse, carriers with a settle-first mentality inevitably submit offers too quickly, without having all the facts. This happens because of a disconnect among carriers, adjusters and defense attorneys. A talented adjuster is worth their weight in gold, but when carriers don't get their staff involved in pending litigation until the trial deadline approaches, they don't give adjusters time to do their job properly. As a result, carriers settle, only to find a key piece of evidence later that would have exonerated their client. The key is to loop in a claims adjuster as soon as a claim is filed, giving them time to investigate thoroughly and avoid missing key pieces of evidence.

Finding the right balance

Instead of settling everything, carriers should weigh each case on its merits and choose which ones to pursue. Claims that carry large policy limits and inherently higher reputational risks, like those involving fatalities, should be settled quickly. Juries tend to react emotionally to these cases, increasing the odds of a nuclear verdict.

Other cases require more discernment. To make the best decision, carriers should investigate claims as quickly as possible and trust their adjusters' opinions. You should pursue litigation, build your case on facts and not hypotheticals, and commit to knowing your case better than the plaintiff.

Preparing to win

A well-defended case will protect your organization from financial loss and strengthen your reputation as a carrier that won't fold under pressure. These five strategies can help you gain confidence, even in the face of a potential nuclear verdict.

  1. Partner with the right legal team. Seek lawyers who will stand by you and focus solely on the claim, without getting caught up in petty details or behind-the-scenes politics. If a firm isn't the right fit, don't be afraid to make a change. In our recent win in North Carolina, there was the potential for a nuclear verdict. Prior counsel wanted us to settle, but we knew we had a strong case. We brought in a firm that helped us score a unanimous jury verdict in our favor.
  2. Trust your attorneys. Once you find the right firm, give them the confidence to succeed. Don't expect them to win at all costs. Trust their judgment, give them the latitude to be aggressive and strong, and maintain open, honest communication throughout the litigation process.
  3. Stay connected with your insured. Don't go to court if your client doesn't believe in the fight. A client with wavering beliefs can derail the defense, and the jury will be able to read their lack of conviction. Ensure your client is on board and prepared to support the case from deposition all the way through verdict.
  4. Be brutally honest in court. Nuclear verdicts happen most often when jurors perceive lawyers or carriers aren't genuine. Juries are smart. Show respect by being honest and admitting when you're wrong. We recently did this in a case where we knew our client was at fault. Instead of arguing about fault, we apologized, admitted the error, then presented the facts about why we should pay less than what was being asked. Our honesty paid off. We achieved a $116,000 verdict in a case where the plaintiff asked for $1.6 million from the jury on medicals that were valued at $250,000.
  5. Accept the results. Going to trial is always a risk. There will be both wins and losses. But if your organization establishes a track record of winning often, you will offset the losses and set yourself up for future success.

Don't be a pushover

Settling may feel like the safest route, but giving in too easily can hurt your organization's reputation and its bottom line. Carriers that choose their battles wisely, trust their adjusters and choose the right attorneys won't be pressured into quick payouts or fear-driven decisions. With the right preparation and mindset, you can go to trial confidently and win.

What If FEMA Is Eliminated?

Home insurers must adapt as Trump administration plans reshape FEMA's role in disaster coverage.

For most of 2025, the future of the Federal Emergency Management Agency (FEMA) has been in doubt.

President Donald Trump originally announced plans to phase out the agency by the end of the year, and the administration has already helped dismantle programs FEMA manages, such as the advisory group responsible for updating national flood maps.

Since then, the Trump administration has backed away from a full cancellation of FEMA, instead expressing plans to "remake" the agency.

Point being, there's quite a lot we don't know. What is clear, though, is that FEMA is already operating much differently than it has before, and this distinction is likely to grow sharper.

That's difficult news for home insurance providers, which work hand-in-hand with the agency on a number of issues, from flood insurance to disaster preparedness and relief. Below, we've broken down four big ways that providers can prepare for the uncertainty to come.

Expanding disaster coverage

If FEMA disappears or is greatly diminished, insurance companies will see the earliest, largest repercussions in the National Flood Insurance Program (NFIP), which works with private providers to sell policies in areas with a high flood risk.

The NFIP is also, notably, very popular. According to data from 2023, more Americans purchased flood insurance from the NFIP than from all private insurers combined, by a split of 43% to 35%.

For providers that can, the easiest solution is to simply expand coverage offerings and fill the gap in this much-needed arena.

As of now, only about 4% of Americans have flood insurance, even though around 10% live in an area with a significant flood risk. With flooding and extreme weather becoming increasingly severe due to climate change, the number of homeowners at risk will only grow.

While the NFIP is a helpful and much-needed resource for many, there are limitations to its coverage. For example, the program typically offers building and contents coverage separately, with restrictions around what's included in both. Here, private providers can improve on an existing framework by offering policies with greater flexibility and customization.

Partnering with state and local governments

Another way providers can adapt to a future without FEMA is by shifting focus.

Instead of working on the national scale, insurers will be wise to form partnerships with state and local governments — particularly in areas with a high flood risk — to create similar programs that benefit all parties, especially the homeowners who need this coverage most.

This principle goes beyond flood insurance, as future collaborations could also include disasters like wildfires and earthquakes, neither of which FEMA directly provides coverage for.

The California Earthquake Authority is a great example of cooperation in action. Much like the NFIP, the publicly managed nonprofit works with private insurers in California to offer earthquake protection where it's needed.

Companies operating in California — or in any of America's other most earthquake-prone states, which include Alaska, Hawaii and Texas — should seek strengthened partnerships with these programs where they exist, and should make efforts to help create them in places where they don't.

The same goes for damage from wildfires, which are not always covered by standard home insurance policies, despite their widespread impact.

In fact, unlike earthquakes, fires are no longer a regional issue. In 2023 alone, 23 states each saw more than 10,000 acres burned by wildfires. The prevalence of this problem is a sign that insurance providers should increase their work with state and local governments across the country, ensuring that more homeowners can get the coverage they need.

Becoming preparedness experts

Even with FEMA working at its full capacity, most Americans are not ready for an emergency like flooding, fires, earthquakes or storms.

That fact is evident in survey data from 2024 — before Trump was in office for a second time — in which 57% of Americans said they felt unprepared for a natural disaster, the highest that figure had been since 2017.

The reality is that many people are not getting the information they need, at least not in the places they're looking. Regardless of what happens to FEMA, insurance companies are in a prime position to become greater subject matter experts on disaster preparedness.

In practice, this could be as simple as educating new homeowners about the most common disasters in their area, with the next level being a custom policy plan that bundles flooding, earthquake or fire insurance with more regularly grouped options, like home and auto.

Insurance providers have a great opportunity here, as they have a captive audience of customers who both trust and rely on them for good information. It should be easy for insurers to activate their customers around the issues they need to be aware of.

Monitoring risk independently

One of FEMA's most notable jobs is its role in monitoring risk for all major disaster types nationwide. This information is relied on by both consumers and insurance companies to calculate risk in a given area.

With this role in jeopardy, providers may be left to navigate the data themselves. For some providers, this could mean creating more advanced internal tracking systems and risk assessment tools, while for others it may mean relying more heavily on the data released by state and local governments.

For some companies, this change will require more effort than for others. However, like all items on this list, it's a case of a short-term problem turning into an avenue for a major long-term opportunity.


Divya Sangameshwar

Divya Sangameshwar is an insurance expert and spokesperson at ValuePenguin by LendingTree and has been telling stories about insurance since 2014.

Her work has been featured on USA Today, Reuters, CNBC, MarketWatch, MSN, Yahoo, Consumer Reports, Consumer Affairs and several other media outlets around the country. 

6 Pillars of Specialty Underwriting

Specialty underwriting demands precision over scale as market dislocation and complex risks reshape insurance landscapes.

Specialty insurance underwriting plays a critical role in markets shaped by dislocation, heightened uncertainty, or generally greater complexity. Typically, higher margins are required to compensate for higher levels of volatility, but navigating this volatility is no easy task.

What Is Specialty?

Around 1000 BC, David defeated the larger, better-armed Goliath with a sling and a stone, highlighting that battles can be won through scale (Goliath) or skill (David). In insurance, neither is inherently superior, and many companies use both scale and skill across teams, business units, or subsidiaries.

Specialty risks are those excluded from standard insurance. Take inland marine, which covers property that is in transit, is under construction, has high values, or has other idiosyncratic traits. This could run the gamut from medical equipment to infrastructure to bitcoin mining to fine art. These are all excluded in common property coverage, and each requires a highly bespoke solution.

There are four types of specialty insurance risk:

Expertise. These risks require a deep understanding of the exposure and underlying loss drivers, along with prior experience and a healthy dose of battle scars. Classic liability examples would be grain elevators, snow-plow operators, or liquor liability. Inland marine is the quintessential property example. Tax liability is a niche professional lines class, focused on unintended tax liability associated with transactions or other changes in tax treatment.

Structure. These are property and liability coverages with unique structural characteristics. The classic example is excess & surplus, where freedom of rate and form gives underwriters flexibility on terms and pricing. Alternative risk transfer, often for larger clients, similarly varies retention, limits, caps, coverage options, and more. Channel relationships (binding authority, MGAs) may also include variable, loss-sensitive performance features. 

Dislocation. For these, the demand for insurance exceeds the supply, resulting in excess rate. Often, the lack of supply is due to loss-driven distress, leading to a pullback of capacity. Cat-exposed property generally represents this risk at hard-market points in the cycle.

Service. These risks require solutions in addition to risk transfer, which in turn requires non-insurance expertise. Examples include property engineering, cyber risk mitigation, or auto telematics. The intention could be to prevent or mitigate loss or provide some insight that allows an insurance carrier to have superior risk selection.

These archetypes aren't mutually exclusive, as some businesses can have several of these features. I tend to de-emphasize certain specialized risks, especially those with higher volatility across the insurance cycle such as terror or remote-return-period property, like earthquake or other non-peak zone perils. These can be profitable but (in my view) resemble picking up pennies in front of a steamroller. It works until it doesn't, and when it goes bad, the losses can be severe.

Specialty can also mean emerging risks with little track record and higher uncertainty, such as intellectual property or contingency, two of the more recent P&C market innovations – which also happen to be distressed insurance products where ultimate losses were underestimated.

Specialty Underwriting Requires Slingshot Precision

Specialty underwriting is about skill over scale. It requires more nimbleness, creativity, and precision than standard risk. There are six core pillars of great specialty underwriting:

1. Scale within the niche. Average line size needs to be balanced relative to the total portfolio. Losses inevitably will happen, and without scale there is less room for error. Balance is commonly measured by premium-to-limit ratios, to ensure there is enough depth to reasonably absorb loss when it happens.

2. Surgical underwriting thesis. Every specialty segment needs a clear rationale. The underwriter might have some unique edge or expertise. In any case, markets inevitably shift, and usually specialty niches become less attractive over time once the crowd catches on and there is more capital availability. Cycle management is a critical feature for any underwriting thesis.

3. Quantifying upside and downside. It's difficult to plan for precise outcomes, particularly over a short horizon. Underwriters need to understand the stochastic distribution of results – the probability of profit relative to the probability of loss. Underwriting and actuarial need to be deeply intertwined: underwriters who understand the quantification of upside and downside, and actuaries who bring business judgment, so the quantification is not mechanical and superficial.

4. Street smarts. It's critical to understand when math might be wrong and avoid over-reliance on models. This applies to any catastrophe model, any probable maximum loss (PML) calculation, and any return on capital model with diversified capital. Street smarts means appreciating that models are directional at best.

5. Exceptional talent. Great specialty portfolios are built by talented and passionate underwriters. Not just technically strong but with market followership across all stakeholders: brokers, reinsurers, and other underwriters. Great underwriters are humble, appreciating what is unknown. The best underwriters have passion, which they exude when they talk about their business.

6. Portfolio balance. Given specialty's inherent volatility, it requires a portfolio of niches, ideally with non-stacking, non-correlating exposure. Diverse exposures lower the standard deviation of results, meaning overall performance should be less volatile. Portfolio breadth also allows more flexibility to dial specific niches up or down in response to the market cycle. There is a critical caveat: the need to avoid "de-worsification." Every niche needs a strong thesis and favorable outlook, or it risks dragging on results.
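
The diversification effect behind portfolio balance can be sketched numerically. All figures below are invented for illustration, not market data: two equally volatile, independent niches (non-stacking, non-correlating) yield a combined book whose annual loss ratios swing less than either niche alone.

```python
import random
import statistics

random.seed(42)

def simulate_loss_ratio(mean_lr, volatility, years=5000):
    """Simulate annual loss ratios with gaussian noise, floored at zero."""
    return [max(0.0, random.gauss(mean_lr, volatility)) for _ in range(years)]

# Two hypothetical specialty niches, each with a 60% expected loss ratio
# and the same volatility, simulated independently.
niche_a = simulate_loss_ratio(mean_lr=0.60, volatility=0.30)
niche_b = simulate_loss_ratio(mean_lr=0.60, volatility=0.30)

# Concentrated book: all premium in niche A.
concentrated = niche_a

# Balanced book: half the premium in each niche; losses are assumed
# independent (non-stacking, non-correlating exposure).
balanced = [0.5 * a + 0.5 * b for a, b in zip(niche_a, niche_b)]

print(f"concentrated std dev: {statistics.pstdev(concentrated):.3f}")
print(f"balanced std dev:     {statistics.pstdev(balanced):.3f}")
```

The balanced book's standard deviation comes out noticeably lower, close to the 1/√2 reduction theory predicts for two independent, equal-sized exposures, without lowering the expected result.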

Conclusion

Like David defeating Goliath, specialty underwriting is all about precision and skill honed through practice. Success in specialty lines requires ensuring every line has a clear thesis for market success, a path to scale within the niche, and the right balance of risk and reward.


Ari Chester

Ari Chester is head of specialty at Argo Group.

He previously served as head of reinsurance for the U.S. and Canada at SiriusPoint. Prior to that, he was a partner at McKinsey, where he held several leadership roles in the insurance practice, focusing generally on commercial lines and specialty markets. 

Chester has a master of business administration from the Wharton School, University of Pennsylvania and a bachelor of fine arts from New York University. He holds the CPCU and ARe designations. 

Physician Performance Measures Must Be Transparent

Opaque physician performance evaluations by AI fuel payer-provider mistrust; evidence-based transparent analytics could rebuild relationships.

Two doctors standing side by side looking at a scan and standing against a blank wall

The way physicians are evaluated has profound consequences — not just for reimbursement but also for clinical practice, professional trust, and ultimately patient outcomes.

Yet too often, performance measurement relies on opaque or "black-box" analytics that lack transparency and fail to resonate with the clinicians whose behavior they are meant to influence. Evidence-based, transparent, and traceable methodologies are essential if health plans and providers are to find common ground and use performance data as a tool for genuine improvement and change.

Among the many friction points in payer–provider relationships, few are as consequential as performance evaluations, which — like prior authorization and reimbursement rates — directly affect both financial outcomes and professional identity.

Like the other two hot-button issues, evaluations affect income, but they also touch on the sensitive matters of clinical outcomes, practice habits, and professional judgment. Low evaluations can be viewed as criticism of a physician's performance, which strikes at the heart of their practice and their personal brand.

A longstanding lack of mutual payer-provider trust compounds this contentiousness. Plans suspect providers of boosting their income by performing as many procedures and ordering as many tests as possible, often with little regard for necessity, resulting in wasteful, low-value care. Providers, for their part, often perceive plans as focused primarily on financial outcomes rather than patient care.

This friction between payers and providers has been exacerbated by health plans' use of opaque methodologies – even AI – to analyze provider behavior. Health plans have long used analytics that providers consider obscure, unfair, or irrelevant. The lack of transparency and accurate attribution in these approaches has fueled the abrasion between these two crucial healthcare stakeholders.

Today's Typical Performance Reviews: Group Level and Aggregate

Health plans today primarily rely on claims data to evaluate provider performance. While clinical data would be ideal and clinical data interoperability is improving under TEFCA, it is not yet widely available at scale.

Most performance reviews occur at the medical group, practice, or health system level. Common approaches include:

  • Cost-efficiency metrics such as total cost of care, usage, and readmission rates.
  • Quality measures like HEDIS scores, chronic disease control, and hospital-level outcomes.
  • Patient experience scores that are typically aggregated through Consumer Assessment of Healthcare Providers and Systems (CAHPS) surveys.
  • AI-driven insights that are increasingly used to identify patterns and trends.

These methods provide a broad view of performance but do not identify or evaluate the wide performance variation that exists between individual clinicians. It's hard for a single physician to see himself or herself in this data — or to trust and act on it in meaningful ways.

For performance measurement to change behavior, physicians must trust it. That trust comes when systems have three essential attributes:

  • Transparency – physicians can see precisely how results were derived, from evidence sources to algorithm design to data application.
  • Traceability – every measure can be linked back to the clinical guidelines or research from which it was derived.
  • Comprehensibility – physicians can understand the methodology and validate the logic themselves.

Evidence-Based Standards: The Foundation for Fair Measurement

The best sources for physician performance measures are evidence-based clinical practice guidelines published by medical societies and professional organizations. These guidelines are based on scientific findings, cumulative clinical experience, and the consensus judgment of practicing clinicians. They are stewarded by respected leaders in each specialty.

Another essential source is peer-reviewed research from leading medical journals such as The Lancet and The New England Journal of Medicine, which can provide convincing evidence that one clinical practice is safer or more effective than another.

Then there is the data that the measure is based on. Today, claims data is the largest and most widely available data set for measuring physician performance, and a great deal about clinician performance can be determined with claims data if it's applied correctly.

Equally important, evaluations should be applied at the individual physician level, not just at the group or system level. Aggregated metrics can mask unwarranted variation in care that lowers quality and increases cost. Individually attributed measures ensure accountability, highlight clinical excellence, and surface opportunities for targeted improvement. Physicians who undergo individual reviews often report feeling empowered by evidence-based data specific to their own practice — and they are often more willing to make meaningful changes.

Of course, some clinicians, in spite of research and professional guidelines, may persist in doing things that are not aligned with evidence. In those cases, plans can apply pressure through mechanisms such as tiered or selective networks, limiting referrals, adjusting reimbursement incentives, or requiring prior authorization.

Finding Common Ground

Fee-for-service reimbursement fuels payer–provider mistrust by rewarding volume rather than outcomes. But even under value-based care, disagreements about performance measurement persist.

The path forward lies in performance analytics that are scientifically sound, mutually acceptable, evidence-based, and transparent to both parties. Only then can health plans and providers share a language that reduces friction, builds trust, and inspires clinicians to improve care delivery.

AI will continue to revolutionize healthcare in many ways. But when it comes to evaluating physician performance, black-box algorithms are not the answer. Evidence-based clinical analytics, grounded in transparency and traceability, remain the fairest and most effective approach — for plans, providers, and patients alike.

Only then can we engage and inspire physicians to change practice behaviors, reduce waste from unnecessary low-value care, enhance patient outcomes and truly arrive at value-based care.

Insurance at a Crossroads

Insurance companies confront mounting litigation, shrinking capacity and regulatory pressures demanding faster, smarter decision-making.

Aerial View of Flyover Roads

The insurance market has been under increased pressure throughout 2025 from every direction. Litigation is becoming more aggressive, capacity is tightening, and regulations are changing fast. For brokers, MGAs, carriers, and capital providers, these forces aren't abstract — they're reshaping day-to-day decisions, from pricing and reserving to partner selection and tech investment.

Stitching it all together is the urgent need for real-time insight into data, operational agility, and underwriting accuracy. Insurance companies that respond quickly, make better-informed decisions, and provide great customer service are already pulling ahead of the competition.

Litigation: Less Predictable, More Costly

In Florida, tort reform has changed the rules, compressing claims timelines and shifting litigation incentives. However, elsewhere in the U.S., third-party litigation funding (TPLF) is making those same rules harder to follow.

In July 2025, Reuters reported that litigation financiers narrowly avoided a proposed 41% tax on their returns — a sign of how embedded and influential the sector has become. At the same time, Burford Capital is poised to collect $6 billion from its investment in a massive oil-and-gas arbitration case against Argentina.

Add in social inflation driven by mass tort advertising and shifting jury sentiment, and the result is a claims environment that's harder to predict, price, or reserve for.

Capacity Is Tightening — Especially in E&S

At the beginning of 2025, I predicted the continued boom of the E&S market. While this market has continued to grow, it's no longer as wide open as it was in 2024, as carriers become more selective. Appetite is narrowing. Loss ratios are under pressure from record climate-driven losses. Across property and casualty (P&C) and professional lines, underwriting discipline is no longer optional; it's a threshold to even stay in the room.

And as margins tighten, speed matters. The ability to launch products, test appetite, and adjust pricing dynamically is now a core advantage.

Reporting and Regulation Are Raising the Bar

The pressure isn't just coming from courts and carriers; it's also coming from regulators and capital partners.

For MGAs and hybrid fronting carriers, real-time bordereaux reporting, audit readiness, and live profitability tracking are now essential to maintaining trust and capacity, and the innovators in these markets are already rethinking their tech infrastructure to meet demand.

The old way — manually assembling spreadsheets to send weeks after the fact — just doesn't cut it anymore.

Agents and Brokers Use AI to Stay Ahead

Insurance agencies and brokerages are not passengers in the AI journey—they're pilots. A recent Agents United report shows how independent agents leverage AI and predictive analytics to gain efficiency, improve client outcomes, and unlock new revenue streams. But getting there will be a massive undertaking. Data will need to be cleaned, unified, and stored in a single, dynamic repository that acts as a reliable source of truth across the organization. I described it previously as a secure container for information that the agent only needs to enter into the system once.

Why This Matters
  • Personalized, Real-Time Client Proposals: AI synthesizes client data such as claim history, risk exposures, and market trends to craft tailored policy suggestions instantly, helping agents win trust and gain bandwidth. Personalization wins new policyholders and helps retain existing clients, as well.
  • Efficiency Gains in Operations: Automating routine tasks, such as lead scoring, document generation, or renewal reminders, frees agents to focus more on advisory and client relationships rather than admin overhead.
  • Regulatory and Risk Alignment: Predictive analytics help flag potential compliance or fraud concerns early in submissions or renewals, supporting agents in maintaining client integrity and agency credibility.
  • Competitive Differentiation: With nearly 70% of brokerages adopting generative AI in some form, early adopters who integrate AI deeply into the sales and underwriting workflow gain a decisive edge.

By integrating AI insights, brokers and agents can operate more strategically, offering personalized, faster service without sacrificing quality. This allows them to stay relevant even as market turbulence increases.

The Bottom Line

From Florida's tort environment to tightening carrier appetite, the story is the same: Faster insight, stronger controls, and greater transparency are now table stakes.

For brokers, MGAs, and hybrid fronting carriers, this means:

  • You need underwriting precision supported by real-time data — not just historical loss trends.
  • You need agility to adapt, launch products, and adjust pricing as litigation and capacity trends shift.
  • You need audit-ready compliance and accurate, transparent bordereaux to maintain relationships with fronting carriers and capital providers.
  • And most of all, you need a tech stack that doesn't just record activity, but drives better decisions.

In a market that is this volatile, leadership depends on how fast you can adapt. The rules have changed, and your tech must keep up.

The U.S. insurance landscape is dynamic, driven by litigation, capacity issues, and regulation. The industry demands agility, precision, and transparency. Brokers, MGAs, and fronting carriers must leverage advanced technology for real-time insights, optimized underwriting, and compliance. Adapting swiftly with innovative tech stacks will ensure survival and leadership in this evolving market.

Where Insurers Fall Short on CX

Fragmented data across legacy systems prevents insurers from delivering the seamless omnichannel experiences customers expect.

Close-up of woman typing on keyboard of laptop

Customer experience (CX) has always been vital to the insurance industry, but fundamental aspects of it have changed. Historically, agents and customer service representatives were the main points of contact with consumers and clients, and they defined CX. Today, CX is distributed across a more complex, hybrid structure: customers interact with insurers through multiple digital channels as well as trusted intermediaries, so insurers must support both direct and agent-led experiences to ensure every client receives the best possible experience.

Many carriers fail to meet the demands of this multichannel CX environment due to outdated, batch-based processing, lack of access to real-time data, and aging or poorly designed systems that don't support digital-first engagement. A survey of 250 producers revealed that agents increasingly support multiple lines of business—life, annuity, and P&C—demonstrating the need for a unified view of the customer experience, free of today's inefficiencies and disjointedness.

Improving customer experience starts with addressing one of the biggest obstacles in insurance: data complexity.

Insurance data is complex, inconsistent and often redundant.

A single carrier can have 35,000 different data attributes in its life products alone. In addition to the natural complexity of the industry, legacy systems and decades of product layering have created overlap between data structures, making them extremely inconsistent. In some cases, a single data attribute is replicated 10 to 18 times across various internal systems.

The result of this overlap and inconsistency is that insurers lack a single source of truth when it comes to their customers. Holistic views are hard to assemble because data is spread across many systems and, in many cases, inaccessible. Business users struggle to find what they need, often resorting to shadow systems and workarounds to piece together elements of a fragmented customer picture. Though it may feel more challenging to implement, data modernization is just as important as system modernization. Without a clean, unified data foundation, carriers struggle to deliver real-time transactions, enable intelligent automation, or personalize experiences in meaningful ways.

And if the picture can't be fully drawn, how can a carrier build customer personas, map customer journeys, or take any of the other more advanced steps in optimizing CX?

Solving the data problem isn't optional — it's the foundation for modern CX.

Unified data is essential for omnichannel success.

A single source of truth is essential for analytics, AI implementation, and optimized client service, but it remains elusive for many insurers. Legacy platforms create data silos, and multiple generations of products cause data to be inconsistently transformed and stored — determining the authoritative source at any given moment becomes a challenge. Traditional approaches to centralizing data often backfire, resulting in rigid structures that restrict access. Instead, carriers should focus on data fabrics, governance models that support usability, and democratized access. If CX platforms rely on outdated or conflicting data, any improvements will be short-lived.

True omnichannel experiences require more than channel availability; they demand consistent, connected service across every touchpoint. Agents and customer service representatives need visibility into all prior interactions, whether through digital self-service, a call center, or an in-person meeting. Agents should be able to see online transactions, even incomplete ones, so they can help clients pick up where they left off. Data governance across all channels is vital to making holistic CX possible.

New PAS technology helps insurers meet CX expectations.

Full-spectrum transparency requires modern policy administration systems (PAS) with real-time application programming interfaces (APIs), common data services, and unified interaction histories. Only then can the entire ecosystem of clients, agents, and employees operate efficiently to deliver a cohesive experience.

The latest PAS technology helps insurers enhance CX with a focus on modularity — like API-first design, microservices, and event-driven architecture. Modern PAS solutions support the real-time data flow critical for creating smooth and responsive customer experiences, allowing changes to propagate instantly across systems without replication.
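
One way to picture that real-time propagation is a small publish/subscribe sketch, assuming an in-process event bus with hypothetical topic names and read models; production PAS platforms would use a message broker or event stream instead.

```python
from collections import defaultdict
from typing import Any, Callable

class EventBus:
    """Minimal in-process event bus: topics map to subscriber callbacks."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every subscriber reacts to the event as it happens,
        # rather than waiting for a nightly batch sync.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
billing_view: dict[str, Any] = {}
portal_view: dict[str, Any] = {}

# Downstream systems keep their own read models current by consuming
# events, instead of replicating the policy record wholesale.
bus.subscribe("policy.updated", lambda e: billing_view.update(e))
bus.subscribe("policy.updated", lambda e: portal_view.update(e))

bus.publish("policy.updated", {"policy_id": "P-1", "premium": 1200.0})
```

After the publish, both the billing and portal views reflect the change at once — the "propagate instantly, without replication" behavior described above, in miniature.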

Carriers are also embracing cross-system product bundling, intelligent workflows, advanced analytics, and, increasingly, agentic AI. These technologies reduce manual intervention, accelerate underwriting and claims, and enable dynamic, personalized engagement. Ultimately, the new generation of PAS empowers insurers to evolve with customer expectations — not just react to them.

Successful CX requires rethinking core technology.

Insurers that treat digital transformation as a front-end exercise will continue to struggle. True CX gains come from rethinking the core — modernizing policy admin systems, untangling data complexity — and embracing omnichannel strategies built on real-time, API-driven infrastructure.

In an age of automated processes, customers' expectations for a fast and responsive customer experience are only rising. The carriers that succeed will be those that can deliver seamless, data-driven, omnichannel experiences by aligning the right technology with a clear, execution-focused strategy.


Brian Carey

Brian Carey is senior director, insurance industry principal, Equisoft.

He holds a master's degree in information systems with honors from Drexel University and bachelor's degrees in computer science and mathematics from Widener University.

How to Build Products Without IT

Insurance product configurators eliminate traditional IT bottlenecks, reducing time-to-market from months to days.

Close up of computer hardware

You work at an insurance company, in an industry where time plays a critical role in gaining a competitive edge. Your team has an idea and a vision for a new insurance product that answers real market needs. You take it to IT, and the response is: We can deliver in two to three months. Do you really have time to wait?

For decades, the process of introducing a new tariff, modifying terms and conditions, or updating underwriting rules resembled a slow, multi-stage cycle between business, IT, and legal. Every single change, even the smallest, required developers to translate business logic into code, followed by lengthy testing and deployment.

New technologies are changing this picture for good.

A configurator for change and innovation

More and more often, the industry is talking about product configurators that, powered by business rules engines (BRE), flip the traditional dynamics of product launches. Instead of waiting on IT, business teams can create and adjust product logic on their own. With intuitive, no-code or low-code graphical interfaces, users define every aspect of how a product works. They decide how pricing is calculated, which variants and options are available, who qualifies for a policy, and under what conditions. All those complex dependencies that used to be buried deep in the code are now transparent and fully configurable.

The fundamental shift is that these tools are designed with non-technical users in mind. Instead of writing complicated scripts, they define rules in decision tables, build calculation functions, or even model entire processes through visual diagrams.
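
As a rough sketch of how a decision table can drive product logic without code changes, here is a minimal rules evaluator. The product, field names, and multipliers are hypothetical, invented for illustration; a real configurator would expose this table through a graphical interface rather than source code.

```python
from typing import Any, Callable

# A decision table expressed as data rather than code: each row pairs
# condition columns with an outcome (here, a premium multiplier).
DecisionRow = tuple[dict[str, Callable[[Any], bool]], float]

# Hypothetical discount table for a motor product; rows are evaluated
# top-down and the first full match wins.
discount_table: list[DecisionRow] = [
    ({"driver_age": lambda a: a >= 25, "claims_last_3y": lambda c: c == 0}, 0.85),
    ({"driver_age": lambda a: a >= 25, "claims_last_3y": lambda c: c <= 2}, 1.00),
    ({"driver_age": lambda a: a < 25}, 1.20),
]

def evaluate(table: list[DecisionRow], facts: dict[str, Any]) -> float:
    """Return the outcome of the first row whose conditions all match."""
    for conditions, outcome in table:
        if all(test(facts[field]) for field, test in conditions.items()):
            return outcome
    raise LookupError("no matching rule")

print(evaluate(discount_table, {"driver_age": 40, "claims_last_3y": 0}))  # 0.85
print(evaluate(discount_table, {"driver_age": 22, "claims_last_3y": 1}))  # 1.2
```

Because the rules are data, changing a threshold or adding a row is a configuration edit, not a development cycle — which is the shift the configurator model promises.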

What can product configurators be used for?

One of the key roles of product configurators is speeding up time-to-market for new products and modifications. Business teams can also set up pre-defined benefit packages or dynamically segment customers to offer personalized terms.

In underwriting, configurators become the central tool for defining and updating risk assessment rules. Instead of relying on static guidelines, underwriters can continuously adjust logic to support both manual and fully automated processes. The same applies to pricing – all aspects of rating logic, from simple validations and discount/markup conditions to complex premium calculation algorithms, can be managed centrally and in real time.

Configurators also bring order to managing policy terms and conditions and integrating with policy administration systems (PAS). Mapping products and their rules into the core system becomes a straightforward, configurable process, ensuring consistency throughout the policy lifecycle. In addition, these tools often serve as a central repository for reference data such as address dictionaries, transaction codes, or vehicle classifications, ensuring data consistency across the organization and boosting operational efficiency.

How can you be sure this will work?

Traditionally, the guarantee that a solution would function as expected came from IT. When business takes on the role of product creator, there's a natural fear that something might go wrong.

However, modern configurators have built-in testing mechanisms. For example, an analyst creating a discount rule doesn't need to wait for a deployment cycle to verify it. They can instantly run single test cases or entire regression test suites to see how the change affects the entire product portfolio.
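
That built-in verification can be pictured as a stored suite of cases replayed against a candidate rule before it goes live. The rule, fields, and cases below are all hypothetical, invented for illustration:

```python
def discount_rule(age: int, claim_free_years: int) -> float:
    """Candidate version of a discount rule (returns a premium multiplier)."""
    if age >= 25 and claim_free_years >= 3:
        return 0.85
    return 1.00

# Saved regression cases: (facts, expected multiplier). In a configurator,
# these would be maintained alongside the rule itself.
regression_suite = [
    ({"age": 40, "claim_free_years": 5}, 0.85),
    ({"age": 40, "claim_free_years": 0}, 1.00),
    ({"age": 22, "claim_free_years": 5}, 1.00),
]

# Replay every case against the candidate rule and collect mismatches.
failures = [
    (facts, expected, discount_rule(**facts))
    for facts, expected in regression_suite
    if discount_rule(**facts) != expected
]

print(f"{len(regression_suite) - len(failures)}/{len(regression_suite)} cases passed")
```

An analyst gets this feedback in seconds, per change — the whole point of decoupling rule validation from the deployment cycle.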

Equally important are full version control and auditability. In insurance, being able to track, compare, and roll back changes when needed is essential. Configurators maintain a complete history of every modification, making it easy to manage multiple product versions: for instance, rolling out new terms on a specific date, or tailoring offers to different sales channels or customer segments. Detailed audit logs ensure complete transparency and regulatory compliance.

More than just speed

Using a product configurator should be seen as an investment that quickly pays off. The first benefit you'll notice is a dramatic reduction in time-to-market: from months down to days. That allows you to respond faster to competitor moves or regulatory changes.

You'll also gain independence from IT.

When a new product idea or modification can be tested and rolled out quickly, the organization becomes more agile and responsive.

Finally, automating manual processes directly reduces operational costs and minimizes the risk of human error.

What's next?

Analysts agree that the next stage of evolution for these tools is the integration of rule-based logic with predictive models and artificial intelligence. Imagine a system where the configurator not only executes defined rules but also leverages AI recommendations to optimize pricing in real time, automate underwriting decisions based on predictive analytics, or flag potential fraud attempts.

Personally, I can't wait to see this future unfold.


Piotr Biedacha

Piotr Biedacha is the CEO and head of delivery at Decerto.

A graduate of software engineering and postgraduate management studies, he has been working in the insurance industry for over 20 years.