
6 Pillars of Specialty Underwriting

Specialty underwriting demands precision over scale as market dislocation and complex risks reshape insurance landscapes.


Specialty insurance underwriting plays a critical role in markets shaped by dislocation, heightened uncertainty, or generally greater complexity. Typically, higher margins are required to compensate for higher levels of volatility, but navigating this volatility is no easy task.

What Is Specialty?

Around 1000 BC, David defeated the larger, better-armed Goliath with a sling and a stone, highlighting that battles can be won through scale (Goliath) or skill (David). In insurance, neither is inherently superior, and many companies use both scale and skill across teams, business units, or subsidiaries.

Specialty risks are those excluded from standard insurance. Take inland marine, which covers property that is in transit, is under construction, has high values, or has other idiosyncratic traits. This could run the gamut from medical equipment to infrastructure to bitcoin mining to fine art. These are all excluded from common property coverage, and each requires a highly bespoke solution.

There are four types of specialty insurance risk:

Expertise. These risks require a deep understanding of the exposure and underlying loss drivers, along with prior experience and a healthy dose of battle scars. Classic liability examples would be grain elevators, snow-plow operators, or liquor liability. Inland marine is the quintessential property example. Tax liability is a niche professional lines class, focused on unintended tax liability associated with transactions or other changes in tax treatment.

Structure. These are property and liability coverages with unique structural characteristics. The classic example is excess & surplus, where freedom of rate and form gives underwriters flexibility on terms and pricing. Alternative risk transfer, often for larger clients, similarly varies retention, limits, caps, coverage options, and more. Channel relationships (binding authority, MGAs) may also include variable, loss-sensitive performance features. 

Dislocation. For these, the demand for insurance exceeds the supply, resulting in excess rate. Often, the lack of supply is due to loss-driven distress, leading to the pullback of capacity. Cat-exposed property generally represents this risk at any hard-market point in the cycle.

Service. These risks require solutions in addition to risk transfer, which in turn requires non-insurance expertise. Examples include property engineering, cyber risk mitigation, or auto telematics. The intention could be to prevent or mitigate loss or provide some insight that allows an insurance carrier to have superior risk selection.

These archetypes aren't mutually exclusive, as some businesses can have several of these features. I tend to de-emphasize certain specialized risks, especially those with higher volatility across the insurance cycle such as terror or remote-return-period property, like earthquake or other non-peak zone perils. These can be profitable but (in my view) resemble picking up pennies in front of a steamroller. It works until it doesn't, and when it goes bad, the losses can be severe.

Specialty can also mean emerging risks with little track record and higher uncertainty, such as intellectual property or contingency, two of the more recent P&C market innovations – which also happen to be distressed insurance products where ultimate losses were underestimated.

Specialty Underwriting Requires Slingshot Precision

Specialty underwriting is about skill over scale. It requires more nimbleness, creativity, and precision than standard risk. There are six core pillars of great specialty underwriting:

1. Scale within the niche. Average line size needs to be balanced relative to the total portfolio. Losses inevitably will happen, and without scale there is less room for error. Balance is commonly measured by premium-to-limit ratios, to ensure there is enough depth to reasonably absorb loss when it happens.

2. Surgical underwriting thesis. Every specialty segment needs a clear rationale. The underwriter might have some unique edge or expertise. In any case, markets inevitably shift, and usually specialty niches become less attractive over time once the crowd catches on and there is more capital availability. Cycle management is a critical feature for any underwriting thesis.

3. Quantifying upside and downside. It's difficult to plan for precise outcomes, particularly over a short horizon. Underwriters need to understand the stochastic distribution of results – the probability of profit relative to the probability of loss. Underwriting and actuarial need to be deeply intertwined: underwriters must understand how the upside and downside are quantified, and actuaries must bring business judgment, so the quantification is neither mechanical nor superficial. (A simple simulation after this list illustrates the point.)

4. Street smarts. It's critical to understand when math might be wrong and avoid over-reliance on models. This applies to any catastrophe model, any probable maximum loss (PML) calculation, and any return on capital model with diversified capital. Street smarts means appreciating that models are directional at best.

5. Exceptional talent. Great specialty portfolios are built by talented and passionate underwriters. Not just technically strong but with market followership across all stakeholders: brokers, reinsurers, and other underwriters. Great underwriters are humble, appreciating what is unknown. The best underwriters have passion, which they exude when they talk about their business.

6. Portfolio balance. Given specialty's inherent volatility, it requires a portfolio of niches, ideally with non-stacking, non-correlating exposure. Diverse exposures lower the standard deviation of results, meaning the overall performance should be less volatile, as the sketch below also shows. Portfolio breadth also allows more flexibility to dial specific niches up or down in response to the market cycle. There is a critical caveat: the need to avoid "de-worsification." Every niche needs a strong thesis and favorable outlook, or it risks dragging on results.
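To make pillars 3 and 6 concrete, here is a minimal Monte Carlo sketch. The premiums, expense load, loss ratio, and lognormal volatility are invented purely for illustration, not figures from this article; it simply compares the distribution of results for one concentrated niche against four uncorrelated niches writing the same total premium.

```python
import numpy as np

rng = np.random.default_rng(42)
SIMS = 100_000

def niche_result(premium, expected_loss_ratio=0.55, sigma=0.8, expense_ratio=0.30):
    """Simulate one niche's annual underwriting profit (all inputs invented).

    Losses are drawn from a lognormal distribution centered on the expected
    loss ratio, a crude stand-in for whatever frequency/severity model an
    actuary would actually fit.
    """
    mean_loss = premium * expected_loss_ratio
    mu = np.log(mean_loss) - 0.5 * sigma**2        # keeps the simulated mean on target
    losses = rng.lognormal(mu, sigma, SIMS)
    return premium * (1 - expense_ratio) - losses  # profit for each simulated year

# One concentrated niche vs. four uncorrelated niches writing the same total premium.
single = niche_result(premium=10e6)
portfolio = sum(niche_result(premium=2.5e6) for _ in range(4))

for name, r in [("single niche", single), ("4-niche portfolio", portfolio)]:
    print(f"{name:>18}: mean {r.mean()/1e6:5.2f}m | std dev {r.std()/1e6:4.2f}m | "
          f"P(underwriting loss) {np.mean(r < 0):.0%} | P(loss > 2m) {np.mean(r < -2e6):.0%}")
```

Under these invented assumptions, the diversified book earns the same expected profit with roughly half the volatility and a materially thinner severe-loss tail, which is the quantified version of pillars 3 and 6.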

Conclusion

Like David defeating Goliath, specialty underwriting is all about precision and skill honed through practice. Success in specialty lines requires ensuring every line has a clear thesis for market success, a path to scale within the niche, and the right balance of risk and reward.


Ari Chester


Ari Chester is head of specialty at Argo Group.

He previously served as head of reinsurance for the U.S. and Canada at SiriusPoint. Prior to that, he was a partner at McKinsey, where he held several leadership roles in the insurance practice, focusing generally on commercial lines and specialty markets. 

Chester has a master of business administration from the Wharton School of the University of Pennsylvania and a bachelor of fine arts from New York University. He holds the CPCU and ARe designations.

Where Insurers Fall Short on CX

Fragmented data across legacy systems prevents insurers from delivering the seamless omnichannel experiences customers expect.


Customer experience (CX) has always been vital to the insurance industry, but fundamental aspects of it have changed. Historically, agents and customer service representatives were the main points of contact with consumers and clients, and they defined CX. But today, CX is distributed across a more complex, hybrid structure; customers interact with insurers through multiple digital channels as well as trusted intermediaries, meaning insurers must support both direct and agent-led experiences to give every client the best possible experience.

Many carriers fail to meet the demands of this multichannel CX environment due to outdated, batch-based processing, lack of access to real-time data, and aging or poorly designed systems that don't support digital-first engagement. A survey of 250 producers revealed that agents increasingly support multiple lines of business—life, annuity, and P&C—demonstrating the need for a unified view of the customer, free of today's inefficiencies and disjointedness.

Improving customer experience starts with addressing one of the biggest obstacles in insurance: data complexity.

Insurance data is complex, inconsistent and often redundant.

A single carrier can have 35,000 different data attributes in its life products alone. In addition to the natural complexity of the industry, legacy systems and decades of product layering have created overlap between data structures, making them extremely inconsistent. In some cases, a single data attribute is replicated 10 to 18 times across various internal systems.

The result of this overlap and inconsistency is that insurers lack a single source of truth when it comes to their customers. Holistic views are hard to assemble because data is spread across many systems and, in many cases, inaccessible. Business users struggle to find what they need, often using shadow systems and workarounds to piece together elements of a fragmented customer picture. Data modernization may feel harder to implement than system modernization, but it is just as important. Without a clean, unified data foundation, carriers struggle to deliver real-time transactions, enable intelligent automation, or personalize experiences in meaningful ways.
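Purely as an illustration of what unifying a replicated attribute can mean at the smallest scale (the system names, attribute, and survivorship rule below are invented, not drawn from this article), resolving an attribute that appears in several internal systems can start with a rule as simple as trusting the most recently updated source:

```python
from datetime import date

# The same customer attribute ("date_of_birth") replicated across
# hypothetical internal systems, each with its own last-updated stamp.
records = [
    {"system": "policy_admin", "date_of_birth": "1970-03-02", "updated": date(2019, 5, 1)},
    {"system": "claims",       "date_of_birth": "1970-02-03", "updated": date(2023, 11, 12)},
    {"system": "billing",      "date_of_birth": "1970-03-02", "updated": date(2021, 7, 30)},
]

def golden_value(records, attribute):
    """Naive survivorship rule: trust the most recently updated system.
    Real master-data-management rules would also weight source reliability,
    completeness, and manual stewardship decisions."""
    latest = max(records, key=lambda r: r["updated"])
    return latest[attribute], latest["system"]

value, source = golden_value(records, "date_of_birth")
print(f"golden date_of_birth = {value} (from {source})")
```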

And, if the picture can't be fully drawn, then how can a carrier build customer personas, map customer journeys, or take any of the other more advanced steps toward optimizing CX?

Solving the data problem isn't optional — it's the foundation for modern CX.

Unified data is essential for omnichannel success.

A single source of truth is essential for analytics, AI implementation, and optimized client service, but it remains elusive for many insurers. Legacy platforms create data silos, and multiple generations of products cause data to be inconsistently transformed and stored — determining the authoritative source at any given moment becomes a challenge. Traditional approaches to centralizing data often backfire, resulting in rigid structures that restrict access. Instead, carriers should focus on data fabrics, governance models that support usability, and democratized access. If CX platforms rely on outdated or conflicting data, any improvements will be short-lived.

True omnichannel experiences require more than channel availability; they demand consistent, connected service across every touchpoint. Agents and customer service representatives need visibility into all prior interactions, whether through digital self-service, a call center, or an in-person meeting. Agents should be able to see online transactions, even incomplete ones, along with what remains to finish them, so they can help clients pick up where they left off. Data governance across all channels is vital to making holistic CX possible.

New PAS technology helps insurers meet CX expectations.

Full-spectrum transparency requires modern policy administration systems (PAS) with real-time application programming interfaces (APIs), common data services, and unified interaction histories. Only then can the entire ecosystem of clients, agents, and employees operate efficiently to deliver a cohesive experience.

The latest PAS technology helps insurers enhance CX with a focus on modularity — like API-first design, microservices, and event-driven architecture. Modern PAS solutions support the real-time data flow critical for smooth, responsive customer experiences, allowing changes to propagate instantly across systems without replication.
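As a rough illustration of the event-driven idea (this is a toy in-memory bus with invented topic and payload names, not any vendor's actual PAS API), a change published once by the policy system can update every downstream channel immediately, with no overnight batch copy:

```python
from collections import defaultdict

class EventBus:
    """Tiny in-memory stand-in for a real event broker (Kafka, SNS, etc.)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()

# Downstream consumers keep their own read models in sync in real time.
bus.subscribe("policy.updated", lambda e: print(f"portal: refresh view for {e['policy_id']}"))
bus.subscribe("policy.updated", lambda e: print(f"agent desktop: flag change on {e['policy_id']}"))

# The policy admin system publishes once; every channel sees it immediately,
# with no nightly batch extract and no replicated copies of the data.
bus.publish("policy.updated", {"policy_id": "POL-1042", "field": "beneficiary"})
```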

Carriers are also embracing cross-system product bundling, intelligent workflows, advanced analytics, and, increasingly, agentic AI. These technologies reduce manual intervention, accelerate underwriting and claims, and enable dynamic, personalized engagement. Ultimately, the new generation of PAS empowers insurers to evolve with customer expectations — not just react to them.

Successful CX requires rethinking core technology.

Insurers that treat digital transformation as a front-end exercise will continue to struggle. True CX gains come from rethinking the core — modernizing policy admin systems, untangling data complexity — and embracing omnichannel strategies built on real-time, API-driven infrastructure.

In an age of automated processes, customers' expectations for a fast and responsive customer experience are only rising. The carriers that succeed will be those that can deliver seamless, data-driven, omnichannel experiences by aligning the right technology with a clear, execution-focused strategy.


Brian Carey


Brian Carey is senior director, insurance industry principal, Equisoft.

He holds a master's degree in information systems with honors from Drexel University and bachelor's degrees in computer science and mathematics from Widener University.

Unlocking the Power of Agentic AI in Insurance

Insurance enters the Agentic Age as autonomous AI systems redefine industry speed, precision, and competitive economics.


Insurance is entering the Agentic Age. Intelligent, autonomous systems that can perceive, reason, act, and learn are redefining how insurance stakeholders operate, compete, and grow. This is not simply automation taken a step further. It is a structural shift that changes the speed, precision, and economics of the entire industry.

Agentic AI consists of intelligent agents that can sense changing conditions, interpret context, make decisions, take action, and learn from results autonomously. These agents orchestrate complex processes, uniting data, enterprise logic, and contextual memory to improve continuously.
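A drastically simplified sketch of that sense-interpret-decide-act-learn loop might look like the following; the claim-triage task, scores, and threshold adjustment are invented for illustration and are far simpler than a production agent:

```python
import random

class TriageAgent:
    """Toy agent: perceives a claim, decides fast-track vs. referral,
    acts, then nudges its own threshold based on the observed outcome."""

    def __init__(self, fast_track_threshold=0.5):
        self.threshold = fast_track_threshold

    def perceive(self, claim):
        # In practice: documents, history, third-party data. Here: one score.
        return claim["complexity_score"]

    def decide(self, complexity):
        return "fast_track" if complexity < self.threshold else "refer_to_adjuster"

    def act(self, claim, decision):
        print(f"claim {claim['id']}: {decision} (threshold={self.threshold:.2f})")
        # Simulated outcome: 1.0 = handled well, 0.0 = leakage or rework.
        return random.random()

    def learn(self, decision, outcome):
        # Crude feedback: tighten the threshold after a bad fast-track call.
        if decision == "fast_track" and outcome < 0.3:
            self.threshold *= 0.9

agent = TriageAgent()
for i, score in enumerate([0.2, 0.7, 0.4, 0.9]):
    claim = {"id": f"C-{i}", "complexity_score": score}
    decision = agent.decide(agent.perceive(claim))
    agent.learn(decision, agent.act(claim, decision))
```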

Across the industry, scaled deployments of Agentic AI are beginning to deliver measurable results. In P&C, underwriting expense ratios will decline by 15 to 20%, and claims expense ratios by more than 15%. In life, underwriting costs will drop by more than 25%, with benefit expenses reduced by nearly 20%. Claims resolutions that once took weeks will be shortened to hours or less, and payment error rates will fall by more than 30%. These are not incremental gains but step-change improvements.

Agentic AI moves beyond workflow automation and analytics. It empowers systems to combine historical, contextual, third-party, and synthetic data with connected platforms to coordinate complex processes and make informed decisions. The result:

  • Faster cycle times: Underwriting processes cut by up to 75%
  • Improved retention: Customer loyalty increases by 10 to 20%
  • Higher productivity: Output per colleague more than doubles
  • Enhanced economics: Marginal cost trends toward zero while precision improves

Agentic AI enables firms to optimize price, product, experience, and operating economics simultaneously, at scale. This is something that was previously beyond reach.

Why it matters now: Markets are moving toward real-time, predictive, and adaptive operations. Firms that deploy Agentic AI early can capture structural advantages such as lower marginal cost, faster execution, and stronger retention that compound over time. Late adopters will struggle to close the performance gap and forgo learning curve effects.

However, many firms are not ready to capitalize on Agentic AI. Legacy technology, disconnected data, manual workflows, and fragmented governance can slow execution and block leverage. This capability debt will further widen the gap between leaders and laggards.

To help overcome such challenges, consider the following strategies:

  • Design connected systems: Modernize infrastructure with orchestration layers, application programming interfaces (APIs), and cloud extensions to connect legacy cores to agentic systems.
  • Rethink your operating model: Redefine roles, governance, and incentives to support enterprise-wide AI adoption.
  • Create consistency: Standardize workflows and embed business logic to enable intelligent orchestration from triage to resolution.

These strategies are supported by five enablers that ensure sustainable scale and impact:

  • Strategic alignment
  • Organizational readiness and performance management
  • Governance and risk management
  • Process and workflow design
  • Data and technology enablement

Agentic AI is not a future concept: It is here. The question for industry firms is whether you will lead or follow. This is a strategic decision, not a tactical one. Acting now will unlock superior economics, faster execution, and durable competitive advantage. Waiting means falling behind in a market that is rapidly accelerating away from traditional operating models. The time for decisive action is now.

For the full white paper this article is based on, click here.

What Life Insurers Can Learn From P&C

Life insurers lag behind P&C carriers in claims digitization, creating an unsustainable innovation gap in today's digital landscape.


The insurance industry has undergone rapid innovation over the last decade, but not all sectors within the industry have evolved equally.

Property & casualty (P&C) insurers, for example, have made impressive strides in digitizing and optimizing claims. Life insurance has been slower to modernize. This disparity has resulted in a significant innovation gap between life and P&C claims processing. It stems largely from fundamental differences in claim volume, complexity, and customer expectations. But as the gap grows, it becomes increasingly unsustainable in today's fast-moving, digitally driven world.

Outlined below are the key lessons life insurers can learn from P&C's digital transformation, as well as a road map for how life insurance carriers can accelerate claims modernization while preserving trust, empathy, and compliance.

Why the Innovation Gap Exists

P&C insurers handle millions of claims annually, many of which are low-severity, high-frequency events like fender-benders or storm damage. The need to keep pace with these claim volumes incentivized early investment in automation, AI, and self-service to optimize processes for both the insurer and the insured.

In contrast, life claims are often low-frequency but high-emotion, and manual processes were deemed the most acceptable approach for these emotionally charged events.

Consumer expectations have since shifted.

Wider industry pressures – demographic changes, labor shortages, the pervasiveness of AI, and more – are further catalyzing this transformation. Gartner predicts that by 2026, 30% of enterprises will automate more than half of their network activities.

The call to action is clear: Life insurers must digitize to meet modern expectations without compromising on a customer experience that balances empathy, accuracy, and compliance.

Leveraging AI for Automation

P&C insurers have already built a foundation of speed and efficiency by embracing digital-first operations. From first notice of loss (FNOL) to straight-through processing (STP) and digital documentation, many previously manual claims processes are now automated, rules-based, and augmented with AI. The result is better fraud detection and faster triage at scale.

Major auto and home insurance carriers already use automated workflows and proprietary mobile apps to resolve minor claims blazingly fast. Customers use these platforms to submit photos of vehicle or home damage, which AI tools instantly assess to estimate repair costs and process in real time. In some cases, claims are approved and paid within minutes.

By contrast, life claims remain labor-intensive and complex, with paper death certificates, manual policy validation, and disconnected systems leading to long delays. These laggard operations no longer align with customer expectations or enable operational sustainability.

The AI models that are widely used in P&C to assess claim complexity, detect anomalies, and flag fraud in real time can similarly be applied to life insurer workflows – flagging incomplete claims, triaging straightforward cases for fast-track approval, even personalizing communication based on behavioral or demographic data.
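To make that concrete, here is a minimal rules-only sketch of flagging incomplete claims and fast-tracking straightforward ones; the document list, face-amount threshold, and field names are invented assumptions, and a real workflow would layer model scores on top of rules like these:

```python
REQUIRED_DOCS = {"death_certificate", "claim_form", "beneficiary_id"}

def triage_life_claim(claim):
    """Return a routing decision for an incoming life claim.

    A real deployment would combine simple rules like these with a
    trained model score; the logic here is deliberately minimal."""
    missing = REQUIRED_DOCS - set(claim["documents"])
    if missing:
        return "request_documents", sorted(missing)
    if claim["face_amount"] <= 100_000 and not claim["contestable_period"]:
        return "fast_track", []
    return "manual_review", []

claims = [
    {"id": "L-1", "documents": {"claim_form"}, "face_amount": 50_000, "contestable_period": False},
    {"id": "L-2", "documents": REQUIRED_DOCS, "face_amount": 75_000, "contestable_period": False},
    {"id": "L-3", "documents": REQUIRED_DOCS, "face_amount": 900_000, "contestable_period": True},
]
for c in claims:
    decision, details = triage_life_claim(c)
    print(c["id"], decision, details)
```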

McKinsey anticipates that by 2030, more than half of claims activities will be automated.

Digital With a Human Touch

From mobile-first claims submission to real-time status update chatbots, many P&C carriers now offer seamless self-service options that keep customers happy and informed. This has reshaped customer expectations across all lines of insurance – and life insurance is no exception.

But the inherently emotional and often painful nature of life insurance claims makes clarity, transparency, and speed essential when adopting practices from P&C.

Rather than being a one-to-one template for life insurance innovation, P&C's use of customer journey mapping and design thinking offers life insurers a model for where to begin when modernizing their operations. By mapping the end-to-end life claims experience – from the beneficiary's first contact to final payout – insurers can uncover and address friction points such as multiple document requests, long silences, or poor communications.

The X-factor in implementing these changes is that, alongside automation and personalization techniques such as instant document upload and multi-channel status updates, life insurers must also build in swift human intervention at key moments. For life insurance, the human touch must never be far away.

Tech solutions must then strive to make the process easier without making it feel cold. Automation should never come at the expense of empathy.

Intelligence Through Data and Ecosystem Integration

Advanced data usage has long been a defining feature of P&C claims transformation. Carriers routinely use third-party data – weather reports, IoT and telematics data, government records – to populate claims automatically, assess risk, and identify anomalies in real time, resulting in faster decisions.

Life insurers can achieve the same effect by integrating with government databases, obituary application programming interfaces (APIs), health records, and even social media, to validate deaths quickly and securely.
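Conceptually, such an integration could look like the sketch below; the registry lookup, field names, and matching rule are hypothetical placeholders rather than any real government or vendor API, and the lookup is stubbed so the example runs on its own:

```python
def validate_death(claim, fetch_registry_record):
    """Cross-check a life claim against an external death-registry lookup.

    `fetch_registry_record` stands in for a real integration (government
    database, obituary API, etc.); everything here is illustrative."""
    record = fetch_registry_record(claim["ssn_last4"], claim["date_of_death"])
    if record is None:
        return "no_match_manual_review"
    if record["name"].lower() == claim["insured_name"].lower():
        return "validated"
    return "mismatch_manual_review"

# Stubbed lookup so the sketch runs without a live endpoint.
def fake_registry(ssn_last4, date_of_death):
    return {"name": "Jane Q. Example", "date_of_death": date_of_death}

claim = {"insured_name": "Jane Q. Example", "ssn_last4": "1234", "date_of_death": "2025-06-01"}
print(validate_death(claim, fake_registry))   # -> validated
```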

A Matter of Life and Death

The innovation gap between P&C and life insurance claims has finally become a solvable challenge, with the barrier to entry for automation, more empathetic customer experiences, and smarter, more connected data ecosystems lower than ever before.

By adopting the automation tactics honed by P&C insurers and anchoring them in the empathy that life insurance demands, life insurers can modernize claims in a way that enhances trust, improves efficiency, and delivers lasting value.

But life insurers must act now. Because reimagining life claims through a digital lens isn't just possible: It's imperative for long-term competitiveness and customer loyalty.


Gayle Herbkersman


Gayle Herbkersman is Sapiens’ head of property & casualty, North America, responsible for its software and services.

She has over 25 years’ experience working within the global insurance industry, holding insurance leadership roles in P&C software, professional services, and software-enabled business process outsourcing. Prior to Sapiens, Gayle held leadership positions at DXC Technology, CSC, and Capgemini.

The Customer Revolution in Insurance

Insurers sit on data goldmines yet fail to leverage customer insights like tech giants, missing trillion-dollar opportunities.


Today's digital giants didn't just change the game, they rewrote the rules. They turned customer insight into capital, behavioral data into billion-dollar products, and user experience into enduring brand loyalty. They've built trillion-dollar empires by knowing their customers better than the customers know themselves.

It's mad to think that there's only a handful of these ecosystem drivers that include the likes of Amazon, Alibaba, Apple, and Google. But that's not the craziest part. What's incredible to me is that these ecosystems don't exist in insurance. After all, what these established ecosystems do well is simply to maximize the value of a customer by maximizing their value to a customer. This is achieved through continuous, data-driven innovation and activated through a well-orchestrated ecosystem of partners.

Now consider insurance: an industry that holds more data than most tech platforms could dream of. Not just consumer data but also operational, behavioral, environmental, and risk data. To top it all off, even more data is only an arm's reach away, available from connected cars, smart home devices, wearables, and IoT systems.

The insurance industry collects fresh, high-value insights from millions of interactions every single day, yet most of it sits idle, trapped in outdated systems, fragmented across silos, and rarely used to its full potential. This is a massive missed opportunity, and it's not a stretch to say that the sector really does have the opportunity to emulate e-commerce's proven, multitrillion-dollar, customer-centric business model.

However, the issue is far more than a technology change. This shift and the huge commercial upsides that accompany it require a business model and mindset change. Rather than seeing customers as policyholders, insurers must recognize them as the central product. By harnessing the extensive data sets at their disposal, the sector can create hyper-personalized experiences, optimize pricing strategies, and drive entirely new revenue streams.

This customer-centric shift isn't just about meeting consumer demand for digital services; it's about fundamentally reshaping profitability by applying a successful, established approach.

There are many ways that these business models drive growth through value creation. Building around the customer means you integrate experiences, partners, products, and services around people. However, insurers are not typically built this way. Most are built on legacy policy administration systems, with data models sitting on top that try to abstract policy-focused data into customer cohorts, derive insight, and then reapply it to experiences.

This is far too slow. Like a hot sales lead, customer data is a perishable asset. Its value fades fast if not acted on in the moment. Customers want buying insurance to be fast and frictionless, without being dragged through a long list of opaque, hard-to-answer questions. When we are in a claims process, we need insurers to see and respond to our data in real time. Even when we are being sold new coverage, we ideally need it a few clicks away, or, better still, embedded and in context. And when we move to experiences centered on helping us understand, navigate or even mitigate risks, we need that real-time, too. Tomorrow is nearly always too late.

But this isn't just about speed or seamless claims. It's about making sure the cover we receive fits our individual needs, now and as they evolve. It's about removing stress from the claims experience, not adding to it. And it's about transforming renewal from a transactional moment into a meaningful interaction.

That means having the data to offer genuine advice, based on how a customer's life has changed, or, better still, reaching out when a change is detected through partner data. That's what it means to value a customer: using insight to anticipate their needs, build trust, and position the insurer as a true partner, not just a silent presence that reappears at renewal time with a price hike.

In essence, insurers must massively increase their knowledge of their customers, not just acquire their data. Insurers must then act on this knowledge through embedded, adaptive, and risk-mitigating propositions that meet the demand of dramatically changed demographics, economics and lifestyles.

This requires a business model change, enabled by a new technology foundation and driven by an evolving culture. Core technology built for insurers - especially when built on MACH foundations and designed to function like a true ecosystem driver - can only realize its full potential if it's matched by changes in mindset, structure, and culture.

  • No more silos. Everyone in your organization needs to be customer only, not just customer first. Teams must look and act more like agile software development squads than artificial clusters of mixed functions. Actuaries, developers, product owners, experience designers, data engineers, etc., must all work together constantly with clear goals and outcomes.
  • Change must become a constant, and roles must move from operational management to customer experience improvements. Claims handling becomes claims optimization.
  • Experimentation needs to rise dramatically; learning fast means no experiment is truly a failure. Where all data is mined as a perishable asset and acted on, that includes data on how to improve people's understanding, use, and experience, as well.
  • Technology must become an enabler of new, unimagined futures, not just an operating entity and IT constraint. Any line of insurance, and even complementary non-insurance products, needs to be managed and operated from one core platform. There can be no IT bottlenecks or downtime for any reason. Interoperating with partners isn't just about application programming interface (API) models; it's about how quickly those partnerships can be applied to experience outcomes.

All of this needs to happen in a business model where the time to generate value from new insights is attainable in minutes, not days.

There's a compelling commercial imperative behind all of this. Happy customers, who easily self-serve through digital interactions and access human support when it counts, are more satisfied and more loyal. Ultimately, they form more trusted relationships and will buy and do more with their insurer.

When your customers buy multiple things from you, their value over risk will start to look far more interesting. We aren't just talking about "multi-car" type propositions, as useful as they can be; we are talking about insurance portfolios or relationship products.

Take life insurance, a market set to transform over the next 18 months to five years. It suffers from increasingly low relevance and low penetration rates. Lifestyles, demographics and life stages have changed dramatically. The propositions this market offers should change as well, adapting as people's financial and health profiles change. Current products, sold once and then engaged when someone dies, need to give way to more holistic protection and life models.

Perhaps underwriting and actuarial roles will finally be fused with customer experience and analytics functions, creating holistic models that, when combined, stretch far beyond "policy" thinking.

However, as a result of this need for technological and business model shifts, insurers with their current legacy and modern legacy footprints will struggle. As it is, adaptivity is too slow and expensive. New insurers are emerging, and the market dynamics are now forcing legacy insurers to change -- from regulation asking them to treat customers better, all the way through to new digital and intelligently orchestrated experiences.

Insurance has a new battleground. Deeper relationships, more loved products and services, generating more value through more propositions. This has to replace price-led competition.

The value chain model is broken. Ecosystems aren't optional, and customers aren't things you bolt on to your technological core. They should sit at the center, and everything should interoperate around them.

The reality is that even if you want to operate as a "value chain" business, your best way of minimizing costs and maximizing distribution still lies in being able to value customers and service them in any channel, 24/7, in an increasingly intelligent and personalized manner.

This is the new commercial battleground for insurers. It seems most don't realize it yet. But emergent competitive forces are beginning to bite, and new pressures are acting on the industry from many directions. Shareholders will start to see this gap, along with capital investors who are already diving in.

We are in exciting times for an industry long held at a tipping point.


Rory Yates


Rory Yates is the SVP of corporate strategy at EIS, a global core technology platform provider for the insurance sector.

He works with clients, partners and advisers to help them jump across the digital divide and build the new business models the future needs.

September 2025 ITL FOCUS: Resilience and Sustainability

ITL FOCUS is a monthly initiative featuring topics related to innovation in risk management and insurance.


 

FROM THE EDITOR

Doing a major remodel on a home for the first time, I was struck by the builder’s comments when he saw the architectural drawings—comments along the lines of, “Oh, why did he specify this material, or take this design approach? If he had just done X or Y, he’d have saved you a lot of money.”

At that point, we could have gone back to the architect, but that would have meant more fees and caused a long delay as we restarted the approval process with the city, so we went with the original plans.

With our second major remodel, we knew better but were still trapped by the sequential nature of the process: An architect does the design, and then you put the project out to bid with builders. We finally succeeded in introducing cost into the design process the third time around, but only because I had formed a partnership with a builder to buy and remodel a home on spec. The builder would earn a share of the profits, so he happily dove into the design discussions.

In this month’s interview, Francis Bouchard, managing director of climate at Marsh McLennan, says efforts to make property more resilient in the face of escalating dangers must move toward the collaborative approach that worked in my third remodel. And, happily, he sees real progress.

Historically, someone built a building, a house or a community, then insurers came in and priced the risk. Instead, Francis says, the issue of “insurability” should be baked in from the beginning of the development of a property.

“Focusing on insurability allows us to enlist other critical players in the housing space to adopt this same, shared accountability approach,” he says.

“When you aggregate this approach across every player in the value chain, you create transformative results. You get architects incorporating resilience, developers considering wildfire protection, fully certified contractors who understand requirements, and properly prepared supplies that don't cause delays.”

He offers a long list of ways that the “insurability” conversation is taking hold. I think you’ll find it encouraging, even as we all see the headlines about soaring damages from natural disasters—perhaps especially as we all see those headlines.

Francis pointed me toward Nancy Watkins, a principal and consulting actuary at Milliman, who is building a “data commons” on what works and what doesn’t work when it comes to reducing risk in the wildland-urban interface (WUI), where so much of the risk from wildfire sits. Mitigating the risk for existing homes obviously has to be a huge part of any resilience effort.

She and her colleagues have completed the first two phases of the project (the report on Phase 2 is here) and are embarking on Phase 3, which will see them shepherd major mitigation efforts in 30 to 50 communities in as many as seven states. (She says she’s “trick or treating” for sponsors, so contact her if you’re interested in getting involved.)

I’m sure there will be lots of disappointments. As she noted to me, it’s not enough just to have the data on what works; you have to get it out to people and get them to act on it, both as individuals and as a community. And getting good data is hard enough.

But I’m more encouraged than I was before talking with Francis and Nancy and think you will be, too, once you read this month’s interview and check out the recent ITL articles I’ve shared on resilience and sustainability.

Cheers,

Paul

 
 

 

 

MORE ON RESILIENCE AND SUSTAINABILITY

Transforming CAT Modeling: The LLM Imperative

Large language models are transforming insurance risk management from reactive assessment to proactive, real-time catastrophe mitigation.

Managing Hyper-Volatility in the Modern Age

Climate change intensifies geopolitical risk. How can organizations protect themselves against extreme, rapid and unpredictable changes?

3 Key Steps for Climate Risks

83% of insurers view predictive analytics as "very critical" for the future of underwriting, but just 27% say they have the necessary capabilities. 

Lessons for Insurers From the LA Fires

California wildfire survivors battle insurers over systematic underinsurance while navigating complex recovery efforts.

Secondary Perils Are Now a Primary Threat

Outdated catastrophe classifications hinder insurers' ability to effectively manage escalating threats from all perils.


Role of ILS In Traditional Risk Transfer

The insurance-linked securities market reaches the $50 billion milestone as investors seek uncorrelated returns amid increasing catastrophic risks.



Insurance Thought Leadership


Insurance Thought Leadership (ITL) delivers engaging, informative articles from our global network of thought leaders and decision makers. Their insights are transforming the insurance and risk management marketplace through knowledge sharing, big ideas on a wide variety of topics, and lessons learned through real-life applications of innovative technology.

We also connect our network of authors and readers in ways that help them uncover opportunities and that lead to innovation and strategic advantage.

Cut Costs & Strengthen Security by Tackling Technical Debt 

Unify risk systems to reduce costs, boost resilience, and improve oversight. 


eBook | Is Technical Debt Holding Back Your Risk Strategy? 

 Is your organization weighed down by fragmented risk systems and rising IT costs? Origami Risk’s latest guide reveals how integrated risk management (IRM) can help you overcome technical debt, reduce your total cost of risk, and improve operational efficiency. 

Discover how leading organizations are:   

  • Consolidating risk, compliance, and audit tools
  • Reducing vendor complexity and licensing costs
  • Enhancing visibility and response times across the enterprise 

  Download the eBook to start building a scalable, secure, and cost-effective risk management strategy. 


 

Sponsored by: Origami Risk


Origami Risk


Origami Risk delivers single-platform SaaS solutions that help organizations best navigate the complexities of risk, insurance, compliance, and safety management.

Founded by industry veterans who recognized the need for risk management technology that was more configurable, intuitive, and scalable, Origami continues to add to its innovative product offerings for managing both insurable and uninsurable risk; facilitating compliance; improving safety; and helping insurers, MGAs, TPAs, and brokers provide enhanced services that drive results.

A singular focus on client success underlies Origami’s approach to developing, implementing, and supporting our award-winning software solutions.

For more information, visit origamirisk.com 

Additional Resources

ABM Industries

With over 100,000 employees serving approximately 20,000 clients across more than 15 industries, ABM Industries embarked on an ambitious, long-term transformation initiative, Vision 2020, to unify operations and drive consistent excellence across the organization.  


Webinar Recap: Leveraging Integrated Risk Management for Strategic Advantage

The roles of risk and safety managers have become increasingly pivotal to their enterprises' success. To address the multifaceted challenges posed by interconnected risks that span traditional departmental boundaries, many organizations are turning to Integrated Risk Management (IRM) as a holistic approach to managing risk, safety, and compliance. 


The MPL Insurance Talent Crisis: A Race Against Time

Managing Medical Professional Liability (MPL) policies has never been more complex — or more critical. With increasing regulatory demands, growing operational costs, and the ongoing talent drain, your team is expected to do more with less.  


MGA Market Dominance: How to Get & Stay Ahead in 2025

Discover key insights and actionable strategies to outpace competitors and achieve lasting success in the ever-changing MGA market. The insurance industry is transforming rapidly, and MGAs are at the forefront of this change. Adapting to evolving technologies, shifting customer needs, and complex regulatory demands is essential for staying competitive.


Automating the Garbage Can

Despite $30 billion to $40 billion in AI investment, 95% of organizations achieve zero return, MIT study finds.


MIT's NANDA Project—established to help drive AI integration in enterprise settings—recently released its mid-year report. The key finding is stark: Despite $30-$40 billion in enterprise investment into generative AI, 95% of organizations are getting zero return.

From the report: "The core barrier to scaling is not infrastructure, regulation, or talent. It is learning. Most GenAI systems do not retain feedback, adapt to context, or improve over time."

This admission departs sharply from the GenAI industry's long-held narrative that scale—more infrastructure, more training data—is the key to success. On the strength of that narrative, Big Tech has funneled over $500 billion into new AI datacenters over the past two years, betting that technical expansion alone would lead to better outcomes.

Blaming the technology and the technology alone for the 95% failure rate would be a mistake. Organizational realities must also be considered.

The Garbage Can theory—a seminal framework introduced by Michael D. Cohen, James G. March, and Johan P. Olsen in the early '70s—sees organizational decision-making as a random, chaotic process where problems, solutions, and decision-makers mix like garbage in a can. Decisions are often made not through linear analysis, but when a pre-existing solution (a technology, a pet project) goes looking for a problem to solve, and they connect at the right moment.

In "organized anarchies"—such as the insurance enterprise—decisions surface more from political realities, business urgencies, happenstance, and fragmented routines than from structured analysis.

MIT NANDA's findings reveal that AI pilots frequently reflect this "garbage can" environment. Rather than deploying disciplined, contextualized programs, organizations launch generic AI tools with unclear goals, disconnected stakeholders, and insufficient governance. High failure rates stem from this context vacuum: Solutions chase problems but lack clarity on objectives or pathways for integration.

Where measurable success emerges, automation is tightly linked to specific workflow tasks—especially in finance, HR, and operations. In these areas, context and routine enable AI to deliver quantifiable savings and efficiencies, making back-office automation a financial standout.

In contrast, customer-facing applications often attract investment due to hype but rarely deliver robust returns. These projects suffer most from the garbage can effect: fragmented pilot teams, fluctuating requirements, and poorly defined goals.

The lesson is not that AI lacks potential but that organizational learning and context are prerequisites for meaningful automation. The prevailing narrative in AI casts it as a source of algorithmic precision, promising to banish organizational mess. But the garbage can will abide. The deeper challenge of AI adoption is organizational, not technological.

Deployed naively, AI becomes just another item in the garbage can—an expensive tool in search of an application, championed by some departments and ignored by others. The outcome: fragmented initiatives and wasted investment.

The best results always come when humans and AI collaborate, with humans providing context and ethical nuance, and AI bringing data scale and pattern recognition. Ultimately, the strategic imperative is not simply to "implement AI" but to orchestrate the confluence of problems, solutions, and decision-makers. Consider these three recommendations:

  • Ask: "What does it improve, and by how much?" Focus on business outcomes before technology. Pick a metric and desired result, first.
  • Frame problems, not just solutions. Rather than asking "What can AI do?" define critical business problems, then determine how human-AI collaboration can address them.
  • Create deliberate choice opportunities. Design forums—cross-functional teams, innovation labs, strategy sessions—where problems and solutions connect intentionally, reducing randomness and supporting strategic adoption.

Human catalysts—those with fusion skill sets—are the drivers. Investments in training and culture change should always exceed spending on the technology itself.


Tom Bobrowski


Tom Bobrowski is a management consultant and writer focused on operational and marketing excellence. 

He has served as senior partner, insurance, at Skan.AI; automation advisory leader at Coforge; and head of North America for the Digital Insurer.   

Google's AI Nailed Its Hurricane Erin Forecast

Google's machine learning approach will likely keep improving hurricane forecasts, too.


For the longest time, the basic approach to developing an AI was for the humans to teach the machine everything they could, then have the software take it from there. That approach worked. It's how IBM's Deep Blue defeated world chess champion Garry Kasparov in a six-game match in 1997 and how Google DeepMind's AlphaGo defeated arguably the world's top Go player in a five-game match in 2016.

Then the scientists had a different idea: What if they let the AI learn entirely on its own, without regard for any human preconceptions, after just being given the rules of a game? That worked even better. By playing millions of games against itself, what DeepMind called AlphaGo Zero learned Go so well in three days that it defeated AlphaGo in 100 straight games.

DeepMind then went a step further and developed an AI that hadn't even been taught the rules of Go. It trounced AlphaGo Zero.

DeepMind is taking that sort of approach with hurricane forecasting. Rather than use the traditional approach — feeding massive amounts of data to supercomputers loaded with physics equations that spend hours and hours calculating forecasts for storms — DeepMind left out the physics equations piece, as well as all other guidance. Basically, DeepMind says: Here is all the data we have on hurricanes. You figure out what it means for future storms. 

The approach has shown promise with earlier storms, and DeepMind's AI just nailed the forecast for Hurricane Erin, outperforming both the official, supercomputer-based forecast and other commonly used models.  

Let's have a look at how far the AIs have come, so very fast, as well as where they can go from here. 

The promises of the deep learning approach first showed up on my radar not quite two years ago. In September 2023, I wrote a commentary lauding what advancements in supercomputing and satellite imagery were doing for forecasting. Just a month later, I found myself writing about AI models that, according to the Washington Post, had shown during that hurricane season that they "portend a potential sea change in how weather forecasts are made."

Now, Ars Technica reports that Google's AI outperformed the official forecast and many of the best physics-based models on both intensity and storm track, even after the other models were corrected for known biases.

The article notes that the outperformance occurred with predictions reaching out to as much as three days ahead, while the most important forecasts are those three to five days ahead, because that's when many key decisions about evacuations and other preparations are being made. 

"Nevertheless," Ars Technica says, "the key takeaway here is that AI weather modeling is continuing to make important strides. As forecasters look to make predictions about high-impact events like hurricanes, AI weather models are quickly becoming a very important tool in our arsenal.

"This doesn't mean Google's model will be the best for every storm. In fact, that is very unlikely. But we certainly will be giving it more weight in the future.

"Moreover, these are very new tools. Google's Weather Lab, along with a handful of other AI weather models, has already shown equivalent skill to the best physics-based models in a short time. If these models improve further, they may very well become the gold standard for certain types of weather prediction."

Let's hope that the AIs continue their remarkable progress and, if so, that the public comes to trust them. A lot of damage and injury could be avoided.

In the meantime, fingers crossed that this year's hurricane season stays relatively quiet. 

Cheers,

Paul

The New, Much-Needed Conversation on Resilience

As natural catastrophes intensify, Marsh's Francis Bouchard says the focus should shift away from how to price risk and toward "insurability." 


Paul Carroll

It was almost exactly a year ago that I attended a gathering you helped put together in Atlanta for a group that helps universities and insurers collaborate on research concerning climate risk, so this feels like a great time to catch up. What would you say are the major advances in the past year in making the world more resilient, and in the insurance industry’s efforts on that front?

Francis Bouchard

Things are starting to coalesce. As someone who's been active in this space almost exclusively for four years, I'm starting to see some real positive signs. Some of that is from insurers themselves, who are leading efforts on risk reduction opportunities, whether through IBHS [the Insurance Institute for Building & Home Safety] or other standards.

I see more industry activity—concrete, real activity—than I've seen at any other time in the last four years. Kudos to those companies that are really starting to look at these challenges in new and different ways. I see more and more non-insurers looking at insurance as a viable part of the solution and wanting to create an environment where homes and communities are insurable.

There are discussions happening with builders that weren't happening a year or two ago. There are discussions happening with architects that weren't happening a year or two ago. This system-level awareness that's growing is really encouraging because this is not an insurance problem—it's a risk problem and an insurability problem.

Many sectors are accountable for reducing risk before a home presents itself to an insurance company to be insured and priced. The fact that meaningful discussions are happening about what other players in the value chain could do to reduce the risk of these homes is wildly encouraging. Some of that's happening in the context of the California rebuild, while some is happening with organizations trying to coalesce stakeholders to pursue a national or larger-scale solution.

I'm encouraged because people are talking, more people are acting, and people are starting to see the connection points more clearly than perhaps they had before.

Paul Carroll

What other programs, similar to IBHS’s FORTIFIED, are making strides in promoting resilient construction?

Francis Bouchard

I'd point to the LA Delta Fund, dedicated to the 12,000 homes burned in the Eaton fires. It focuses on closing the gap between what insurance proceeds will pay for and what it takes to achieve a truly resilient construction level. We often debate who should bear this cost—consumers or insurers. This organization has found a way to attract both return-bearing capital and philanthropic capital to create a blended capital fund that pays the difference—the delta—between insurance proceeds and the cost of resilient construction. They are close to launching the fund and beginning to facilitate a much higher level of resilient reconstruction in LA following the fires.

This initiative is, in many ways, epic. It's never been done before, certainly not at this scale. The fact that they can raise money from markets indicates that the interest in ensuring resilient rebuilding extends well beyond the insurance sector.

Paul Carroll

Any other examples leap to mind?

Francis Bouchard

There's the Triple-I project with PwC in Dallas that is aligning stakeholders to facilitate the rebuilding or retrofitting of homes to the IBHS standards. This is another concrete example of insurers coalescing to change the risk profile of a community.

Then you have individual firms pushing the envelope. Milliman is doing an immense amount of work, with Nancy Watkins focusing on the WUI [wildland-urban interface], where the interaction between communities and wildfire is the most extreme.

Mercury Insurance is engaging with communities about what it takes to convince them to take steps that would make them insurable. We're starting to see a shift from thought leadership to community engagement.

Paul Carroll

What industry-academia research projects have generated the most interest, and where do they stand?

Francis Bouchard

Nothing has been launched yet, as we are still waiting on a funding announcement from the NSF [National Science Foundation] and corresponding funding from industry partners. We’re cautiously optimistic about the NSF and think industry funding will follow. 

The project that generated the most interest last September was a platform to facilitate dialogue between the atmospheric science community and the insurance underwriting community and help both sides better understand the value and use of available data sources. Considering the recent changes and, in some cases, wholesale dismantling of government departments or capabilities, this issue has become even more pressing and will likely appeal to numerous companies.

Dialogue is already occurring in multiple forums. We're hoping to coalesce these discussions and create a trusted pipeline of information flowing between federal data sources and the insurance sector.

Another well-received proposal focused on improving decision-making by narrowing uncertainties and addressing them differently. This proposal will likely garner attention from the insurance industry as companies seek to systematically understand and address uncertainties from weather, policy, and FEMA perspectives. The uncertainties simply accumulate.

The community-based catastrophe insurance project is another initiative we'll likely pursue. This topic is particularly ripe given the need for more innovative risk-bearing solutions.

Paul Carroll

What about developments at major insurance industry players?

Francis Bouchard

We [Marsh McLennan] recently announced our participation in a carbon trading mechanism to derisk the issuance of carbon credits. You're seeing more insurers and brokers focusing on this as a way to facilitate the projects that generate the credits.

There's also a more macro-level shift emerging—a growing awareness around shared accountability for the insurability of homes. The debate today typically centers on the technical nature of pricing risk. What we're trying to do is use this notion of insurability to reframe the conversation.

The right question isn't about pricing; it's about understanding the thousand decisions made that led to a home having its particular risk profile. We in the insurance industry are not the end-all, be-all. We are simply reflecting the thousand decisions made prior to receiving the submission.

Focusing on insurability allows us to enlist other critical players in the housing space to adopt this same, shared accountability approach. Non-insurance professionals often expect mind-numbing analytics and modeling. When you simply ask, "What can you do to reduce the risk that a house faces when it's finally built?", people respond with, "Oh, that's it? That's doable." And it should be doable.

When you aggregate this approach across every player in the value chain, you create transformative results. You get architects incorporating resilience, developers considering wildfire protection, fully certified contractors who understand requirements, and properly prepared supplies that don't cause delays.

When all these stakeholders understand their role in reducing risk, it makes our role significantly easier.

Paul Carroll

Thanks, Francis.

About Francis Bouchard


Francis Bouchard is an accomplished global public affairs professional who has served as an advisor, catalyst and contributor to a series of climate resilience and insurance initiatives. He is currently the managing director for climate at Marsh McLennan, and earlier served as the group head of Public Affairs & Sustainability for Zurich Insurance Group, where he focused on aligning the group’s government affairs, sustainability and foundation activities. He originally joined the insurance sector in 1989 and since has held a series of industry-focused advocacy, communications, sales, citizenship and public affairs roles, both in the U.S. and in Switzerland.

Francis also chairs the board of directors of SBP, a national non-profit focused on disaster resilience and recovery, serves on the board of the climate-focused insurtech incubator InnSure, and is a member of the advisory council of Syracuse University’s Dynamic Sustainability Lab.

