Tag Archives: spatialkey

How P&C Can Use Crime Data Better

For insurers that underwrite commercial properties, crime risk data can often be overlooked or under-prioritized in comparison with other hazard data. In the P&C industry, we’re often quick to note the insured or economic losses associated with catastrophes. But did you know that crime costs billions each year? The FBI reported nearly 7.2 million property crimes in the U.S. in 2018, with an estimated $16.4 billion in property crime losses, not including arson.

Although crime data can be spatially expressed just like flood, hurricane, hail and other perils, it is commonly an overlooked piece of the property risk puzzle, and therefore not often supported by underwriting guidelines. One of the factors contributing to the underuse of crime data is that it has generally lagged behind other hazard data in research and development. Now, however, leading data companies are creating models that account for the location where the crime actually occurred, allowing crime risk to be geographically represented.

Traditionally, crime data has not been detailed or accurate enough for underwriters to gain a comprehensive understanding of the true risk related to a property or portfolio. That’s because crime has historically been reported at the law enforcement agency level rather than at the precise location where it occurred. Data from Location, Inc. and Pitney Bowes, however, can be applied at the point of underwriting or portfolio management to help insurers assess the likelihood of violent crime, theft, vandalism and even behavior-based fire risk at the street level.

See also: Using High-Resolution Data for Flood Risk

Here are a few examples of how P&C insurers can apply crime risk data within a solution like SpatialKey for more informed underwriting and property risk assessment:

Use Case 1: Violent Crime & Property Crime

The image below shows a top-tier college campus with a moderate crime score according to Location, Inc.’s SecurityGauge crime data. A moderate score is the national average for crime; however, the area around this college’s location has an above-average crime rating.

As you can see, there is a clear delineation between the school’s campus (near average risk relative to the nation, shown in yellow) and the surrounding city (in orange/red), which has an elevated crime score. So while the school itself is at or below the national average for crime risk, knowing that there is some high crime nearby could raise a flag to ensure proper coverages and adequate premiums are in place when underwriting this risk.

Use Case 2: Arson

When underwriting property risk outside the U.S. and Canada (in the U.K., for example), Pitney Bowes crime data for England, Wales, Scotland and Northern Ireland can be used within SpatialKey to understand the factors driving overall risk, as shown below.

The location has a very high score (borderline extreme, in fact) for arson. When looking at this map, you can see that there is a bus depot across the street from the location in question. This information should factor into your risk assessment due to the flammable nature of the bus depot.

These use cases demonstrate how using expert crime risk data can help insurers:

  • Gain a more comprehensive view and reduce adverse selection by determining the overall crime rate for an area at the street level.
  • More accurately price the associated risk. For example, set a higher theft deductible if the crime data shows elevated crime in the area, or a lower premium if the property has on-premise security measures such as cameras and lobby security.
  • Determine which coverages to limit or even exclude based on the characteristics of a particular neighborhood/community (e.g. for an apartment complex, review the crime data to determine the level of property and violent crime in the area and limit/exclude coverages accordingly).
  • Evaluate concentrations of exposure, particularly for a schedule of risks, as crime scores can vary greatly within the boundaries of a single city.
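The pricing and coverage decisions above can be sketched as simple rules over a street-level crime score. This is an illustrative sketch only: the score scale, field names and thresholds are invented assumptions, not actual SecurityGauge or SpatialKey interfaces.

```python
from dataclasses import dataclass

@dataclass
class CrimeProfile:
    # Assumed scale where 100 = the national average (illustrative only).
    property_crime_index: float
    violent_crime_index: float

def underwriting_adjustments(profile: CrimeProfile) -> dict:
    """Return illustrative pricing/coverage adjustments for a location."""
    adjustments = {"theft_deductible_multiplier": 1.0, "refer_to_underwriter": False}
    if profile.property_crime_index > 150:
        # Elevated property crime: raise the theft deductible.
        adjustments["theft_deductible_multiplier"] = 1.5
    if profile.violent_crime_index > 200:
        # Well above-average violent crime: flag for manual review.
        adjustments["refer_to_underwriter"] = True
    return adjustments

# Mirrors Use Case 1: the campus itself is near average, the surrounding
# city is not, so the two locations get different treatment.
campus = CrimeProfile(property_crime_index=105, violent_crime_index=98)
surrounding_city = CrimeProfile(property_crime_index=180, violent_crime_index=230)
```

A rules engine like this is only as good as the data behind it; the point is that once crime risk is scored at the street level, it can drive underwriting guidelines the same way flood or hail scores do.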

See also: Fighting Fraud With Data Analytics

Check out this article for more information about how to use expert crime data within SpatialKey to inform your underwriting.

A Maritime Metaphor for Change in P&C

“The pessimist complains about the wind; the optimist expects it to change; the realist adjusts the sails.” – William Arthur Ward, Writer

New digital technologies, increased competition and changing customer demands are forcing 61% of insurance carriers and financial services firms to move away from traditional business models, according to a recent global study of C-suite insurance and financial services executives. While we don’t need a study to tell us that digital disruption is real, what’s mind-boggling is what the other 39% of organizations are (or aren’t) doing about it.

To use a maritime analogy, there are essentially three types of organizations out there. The pessimist says, “Humpf, this storm will surely wreck our ship, so we’re staying in port.” The optimist says, “We’ll stay the course and hope the storm passes.” The realist says nothing—and plots a new course.

Where does your organization fall? One thing is clear: The seas are changing, and the time has come to make some pivotal choices about where and how you’re going to steer your ship.

Ecosystems and interoperability are the waves of the future

All the waste and operational inefficiency that exists in the current P&C environment is simply not sustainable. By getting on board with a more open ecosystem, organizations can accelerate innovation and move our entire industry forward faster.

See also: Road to Success for P&C Insurers  

The bigger the ship, the harder it is to stop or turn on a dime. You understand the need for change, but inertia is keeping you from dealing with a host of challenges—from complex, inflexible legacy systems to regulatory considerations, sunk development costs and just plain skepticism about whether new solutions can deliver on their promises.

I understand and can empathize with all of these hurdles, having spanned the spectrum of the insurance value chain in my career, from broker to modeler and now solution provider. Change is fraught with risk. But staying the course is its own risk.

“Well, in insurance, we move slowly,” isn’t an argument you’ll hear from those in the 61%. Not when there are a host of practical technologies and platforms, such as innovations in data and analytics, that have been built to complement existing systems—and can be implemented right now, not years from now.

Pragmatic innovations, ecosystems and interoperability accelerate change

Scott McConnell, divisional president for NTT Data Services, which published the global C-suite study mentioned earlier, wrote in an Insurance Thought Leadership article:

“Modernization and core systems have been a conversation for years, but insurers no longer have to face the costly and time-consuming option of replacing legacy technology – or continuing on the same limited path. With a digital business platform (DBP), they can adopt and integrate new technologies with their existing core systems, allowing them to work with a global ecosystem of partners to become more nimble and customer-focused.”

No matter the size of your ship or the complexity of your systems, reaping the benefits of more pragmatic technologies means tapping into an ecosystem of partners that can accelerate change without disruption to your legacy and core systems.

But, while having a host of practical point solutions to assist in core workflows is necessary, it’s not sufficient. The ability to advance innovation and market efficiency hinges on improved connections between systems, or interoperability. To leverage practical solutions effectively, insurers, reinsurers, brokers and solution providers must invest in advancing interoperability among systems. This includes forming open data standards for transferring data in the marketplace, along with open modeling and data platforms that allow the market to leverage a best-of-breed view of risk across a multitude of expert providers. Likewise, open APIs are needed to facilitate seamless workflow integrations between in-house systems, technology providers and modelers/data providers.

Keep it smart and simple

It’s clear we’ve reached uncharted waters in our industry. Will you stay the course or brave new seas? Sure, there are regulatory and change management considerations along with competing priorities. All of this is true. The first step is acknowledging all of it and recognizing that you can’t just wait. There are practical innovations right in front of you that you can adopt now, ones that create momentum without heavy investment in time and resources and without completely redoing your legacy systems.

See also: Provocative View on Future of P&C Claims

I challenge you to think about how you can bring simplicity to some very complex problems. Look to your partners. Look to your customers. Look to pragmatic technologies. And then plot a course for change.

7 Keys for Automated Event Response

This is the fourth article in a four-part series. You can find the first part of this series here, the second here and the third here.

I’ve worked in the insurance industry for more than 35 years, and I’ve never been more energized about the possibilities before us than right now. Working in both personal and commercial lines, including excess property, I’ve seen how technology has enabled the practices of exposure management, underwriting and claims to evolve from manual processes and “pins on a map” to complex, computer-driven workflows that enhance an insurer’s ability to provide superior products and services to their customers.

At SpatialKey, we’ve been working diligently with several of our insurance clients to develop an automated event response solution that addresses key challenges:

  1. Meeting growing customer demands in a highly competitive insurance landscape.
  2. Driving cost-savings and increased profitability through more efficient event response and claims processes.
  3. Anticipating and preparing for more frequent and extreme peril events, particularly in parts of the U.S. that are more susceptible to climate change.

Some of the event response challenges I’ve heard directly from insurers, brokers and MGAs include:

  • “I need to know what happened when I was sleeping, traveling or working on something else—without having to jump through hoops to find out.”
  • “We’re dealing with time-sensitive situations, but the manual nature of exposure data collection, event monitoring, as well as data research and procurement, delays our ability to respond to events expeditiously.”
  • “I need a solution that not only focuses on events that I need to be concerned about but also allows me to filter out the noise from events that I don’t care about.”

These comments point to the pressing challenges insurers face during catastrophes—specifically around speed, efficiency, accuracy and how an automated solution can help to solve them.

See also: Moving Toward Prevention, With IoT  

Insurers are in a squeeze play to find places where they can drive operational efficiency and reduce expenses. Event response and claims automation is a great place to start. It doesn’t require large financial commitments or heavy investments in time and IT resources, and, better yet, the impact is immediate. I know first-hand that event response automation is on the “transformation radar” for many of the organizations I work with. They simply have to make it a priority to automate pieces of the event response process to meet growing business and customer demands.

So the question isn’t whether you should automate your event response operations, but rather: What are the key requirements of an automated solution? And can a solution meet my specific business needs by delivering on criteria that will set my team up for success?

Here are seven key questions an automated event response solution should answer for you:

  1. What are my current exposures? You can’t have an accurate understanding of an event’s impact without the most recent exposure data. A data import API ensures your data is updated regularly, and that you’ll always have a current snapshot of exposures to work from.
  2. What expert data is available? Streamlined and centralized access to trusted third-party hazard data as it becomes available is imperative. You shouldn’t have to procure and process expert data yourself. Likewise, you should be readily informed of new data sources as they become available.
  3. What happened? You should be the first to know about an event and its impact on your portfolio—so, when management looks to you for answers, you can be confident in your preliminary assessment. To achieve this, you need an automated system pushing you results so you don’t have to pull reports and analytics yourself. By clicking a single link in an email, you should be informed of the geography and severity of an event. This means analyses are executed automatically based on your latest exposure, as well as your predetermined financial and peril-specific thresholds.
  4. Do I need to care? Relevance is a critical asset because it prevents information overload during a time-sensitive process like event response. A custom approach to event notifications enables you to operationalize peril and exposure specific thresholds based on your company’s exposure knowledge and claims experience (e.g. $10 million in limit affected by hail that is two inches or greater). This filters out the noise by enabling you to define what’s important and then act expediently.
  5. What’s my financial impact? Instead of scrambling to manually pull information together for stakeholders, a pre-packaged report should be automatically generated for you. The ability to quickly assess financial impact, provide input on capital expectations and manage stakeholder expectations are all critical to your company’s preparedness and require a financial modeling engine that delivers the results that matter most.
  6. Where do I need to focus my outreach and service to affected insureds? To differentiate your business, customer outreach is imperative. By quickly pinpointing exact locations and accounts affected, you can serve your insureds—whether that means picking up the phone or putting boots on the ground. An event response solution should provide you with actionable information along with advanced analytics that enable you to further plan and communicate your strategic claims response.
  7. How can I dive deeper into the event? Because automation has saved you so much time answering the last six questions, now you can dig even deeper into the event. This requires an advanced analytics solution like SpatialKey that enables you to ask more questions of your data, analyze the event progression, pull in claims history and rate/premium information and average annual losses, etc.
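The threshold logic in question 4 can be sketched in a few lines: notify only when an event crosses both the peril-specific severity and the financial thresholds you have configured. The field names and values below are assumptions for illustration, using the article’s own example of $10 million in limit affected by hail of two inches or greater.

```python
def should_notify(event: dict, thresholds: dict) -> bool:
    """Return True only if the event's affected limit and severity both
    cross the user-defined thresholds configured for that peril."""
    rule = thresholds.get(event["peril"])
    if rule is None:
        return False  # no rule configured for this peril: filter out the noise
    return (event["affected_limit"] >= rule["min_affected_limit"]
            and event["severity"] >= rule["min_severity"])

# Example from the article: $10M in limit affected by hail of 2 inches or more.
thresholds = {"hail": {"min_affected_limit": 10_000_000, "min_severity": 2.0}}

minor = {"peril": "hail", "affected_limit": 2_500_000, "severity": 2.5}
major = {"peril": "hail", "affected_limit": 14_000_000, "severity": 2.25}
# should_notify(minor, thresholds) -> False: affected limit is below threshold
# should_notify(major, thresholds) -> True: both thresholds are crossed
```

The value of this kind of filter is less in the code than in the thresholds themselves, which should come from your company’s exposure knowledge and claims experience.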

You can think of the seven questions I’ve answered here as requirements for success in the new competitive landscape of P&C. Insurance organizations are facing greater scrutiny as catastrophic events become increasingly volatile. As such, how effectively you prepare for and respond to these events can either be an asset to your business or a detriment.

It’s time to move from “react and respond” to “prepare and serve”

A company’s ability to follow through on its commitments and service is a competitive differentiator. If your event response processes run more smoothly—if they’re built for performance—this translates to a more satisfactory customer experience. As one of my clients recently noted, “We’re not the cheapest coverage out there. So when it comes to shopping for insurance at renewal time, our service is what makes the decision to renew a no-brainer.”

See also: Natural Disasters and Risk Management

A solution like SpatialKey can modernize your event response operations without disruption or heavy investment, creating both operational efficiency and customer satisfaction. By moving from “react and respond” to “prepare and serve,” you are modernizing your processes to meet the growing demands and expectations of your customers and shareholders.

Technology will always be a moving target, and you may feel like you’ll never get ahead of the curve. But when you’re pursuing transformation initiatives, it’s important to consider your total investment. Automating your data and analytic operations shouldn’t require major service disruptions or heavy hardware spending.

How to Improve Event Response Workflow

This is the third in a series. The first two articles can be found here and here.

When catastrophes strike, you have no time. You’re under pressure to quickly understand the financial impact of an event and provide estimates to management. At the same time, you (and your team) are constantly tracking the event, processing hazard data, making sure exposure data is accurate, pulling reports and (let’s hope) beginning outreach to insureds. The last item—customer outreach—may suffer, though, when the other to-dos consume your time and resources.

Speed and quality of response following catastrophes can be an asset to your organization—and a key reason why your customers choose you over your competitors—but only if you can make your event response operations run like clockwork. This entails moving away from the status quo and integrating elements of automation into your event response processes. Let’s take a look at some of the challenges you may face and how to implement a more proactive approach for minimal cost and disruption.

Hurricanes, in particular, illustrate the problem of quickly deriving insight from data. For example, does the following scenario sound familiar?

Imagine a hurricane strikes…

…and it’s affecting Texas, Florida or the Carolinas (probably not too hard to imagine, actually). Management is asking for the estimated financial impact of this event, and your stress levels are rising. It’s all hands on deck!

1) Get event data
You go to the NOAA website, pull down wind datasets from the latest update and work to get them into a usable format.

2) Intersect with your portfolio
Now, it’s time to intersect the footprint with your portfolio data, which may take another hour or so.

3) Update portfolio
After you get everything set up, you realize your portfolio is six months old, which may over- or underestimate your actual exposure. Do you pull an updated snapshot of your exposures? Probably not, because there isn’t enough time!

4) Run financial model SQL scripts
With a manual intersection process, you are likely unable to easily assess the impact of policy terms and conditions, so you’ll need to run some financial model scripts to determine the actual exposure for this event.

5) Create and share reports
You finally get some financial numbers ready and format them into a nice report for management.

Then, you think about what you actually had on your to-do list for the day before the hurricane was in the picture…or, wait, maybe not…because just then you see that NOAA has published the next snapshot of the hurricane.

Rinse and repeat. It’s going to be a long night.
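Step 2 of the manual workflow above (intersecting the event footprint with your portfolio) reduces to a point-in-footprint test plus a sum of exposed limits. The sketch below uses a simple rectangular bounding box and invented locations purely for illustration; a real implementation would use actual footprint geometries (e.g., with shapely or GeoPandas) and then apply policy terms.

```python
def locations_in_footprint(portfolio, bbox):
    """Return the locations whose coordinates fall inside a
    (min_lon, min_lat, max_lon, max_lat) bounding box."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return [loc for loc in portfolio
            if min_lon <= loc["lon"] <= max_lon and min_lat <= loc["lat"] <= max_lat]

# Invented portfolio locations for illustration.
portfolio = [
    {"id": "A1", "lon": -80.1, "lat": 26.1, "limit": 5_000_000},
    {"id": "B2", "lon": -81.7, "lat": 30.3, "limit": 3_000_000},
    {"id": "C3", "lon": -97.4, "lat": 27.8, "limit": 8_000_000},
]

# Hypothetical wind footprint over south Florida.
footprint_bbox = (-81.0, 25.0, -79.5, 27.0)

affected = locations_in_footprint(portfolio, footprint_bbox)
exposed_limit = sum(loc["limit"] for loc in affected)  # gross limit in the footprint
```

Even this toy version shows why step 3 matters: the sum is only as current as the portfolio snapshot behind it.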

See also: How to Predict Atlantic Hurricanes  

Let’s face it: if you can’t extract insight from data fast enough to mitigate damage or provide a timely course of action, your operational efficiency and downstream customer satisfaction go downhill fast. And just think, this was for a single data source. Realistically, you have to perform these same steps across multiple sources (e.g., KatRisk, Impact Forecasting, JBA flood, NOAA probabilistic surge) to gain a complete understanding of the event.

What makes the process so inefficient?

  • You had to source the data yourself and operationalize it (i.e., get it into a usable format)
  • You had to navigate the complexity of the data, which can be exceptionally time-consuming (depending on the source, resolution and other variables)
  • You realized your portfolio data was out of date (this is a big problem because how can you determine actual financial impact against outdated information?)
  • You had to manually run a financial model after determining the exposures that could be affected by the event
  • And, of course, you had to manually pull the information together into a report for stakeholders

So what can you do?

Application programming interface (API) integrations help to solve these challenges by ensuring you always have the latest hazard data and portfolio snapshot available. If you invest just a few hours to get your data configured with a data import API like SpatialKey offers, you’ll always have the latest view of your exposures ready to analyze—without ever lifting a finger. You’ll save countless hours by investing just a few up front. This also enables quicker and more accurate analyses downstream because you won’t be over- or understating your exposures (not to mention making errors by scrambling at the last minute to get a refreshed snapshot).
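As a rough illustration of what such an integration involves, the sketch below serializes exposure records and posts them to a data import endpoint. The URL, authentication scheme and CSV layout here are hypothetical, not SpatialKey’s actual API; a real integration would follow the vendor’s documentation and run on a schedule (cron, Airflow, etc.) so the latest snapshot is always available.

```python
import csv
import io
import json
import urllib.request

FIELDS = ["id", "lat", "lon", "limit"]  # assumed exposure layout, for illustration

def exposure_csv(records):
    """Serialize exposure records into the CSV layout the import expects."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

def push_exposure_snapshot(records, api_url, api_key):
    """POST the latest exposure snapshot to a (hypothetical) import endpoint."""
    req = urllib.request.Request(
        api_url,
        data=exposure_csv(records).encode("utf-8"),
        headers={"Content-Type": "text/csv",
                 "Authorization": f"Bearer {api_key}"},
        method="POST",
    )
    # Network call: requires a live endpoint and valid credentials.
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The few hours of setup are mostly in mapping your internal fields to the import layout; once that mapping exists, the refresh itself is hands-off.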

Imagine another hurricane strikes…but this time you’re set with automation

Those couple hours that it took to get your portfolio data integrated and automation in place with a solution like SpatialKey are paying off (no deep breaths required).

Within moments of NOAA publishing an update, you receive an email notifying you of the financial and insured impact. With the click of a button, you’re in a live dashboard, investigating the event, your affected exposures and more.

You still have to get those numbers to management, but this time you can breathe easy knowing that your numbers are not only accurate, but that the whole process took a fraction of the time. Now when NOAA (or any other public or private data provider) pushes the next update, you’ll be set with a highly scalable infrastructure that enriches your data, calculates financial impact and produces a report within minutes.

Why was this process much more efficient?

  • Because you invested a couple hours up front to integrate API technology, your exposure data was up to date
  • You had access to pre-processed, ready-to-use hazard footprints as they became available
  • The event was monitored 24/7 so you didn’t have to constantly track it and pull reports to understand what changed
  • Custom filters and thresholds ensured you were never inundated with notifications and only received metrics that you care about
  • You saved a bundle of time because a financial report was auto-generated for you to pass along to upper management
  • You were able to quickly share reports across teams so claims could get a head start on their customer outreach

Now, you’ll never be a bottleneck in the process of understanding and communicating the impact of an event to your stakeholders. And, with all the time you’ve saved, you can use advanced analytics solutions to contextualize the event and dive deeper into the investigation.

Tick tock: It’s time to make your event response run like clockwork

It’s clear there’s a better way to tackle the growing challenge of deriving insight from data and quickly understanding the impact of an event. If you lack the ability to operationalize and extract insight from time-critical data, you’re operating in status quo when your management team and customers expect to know more about an event, and sooner.

Fortunately, automation doesn’t have to be a time-consuming or costly endeavor. There are simple ways to automate your manual processes, such as API integrations, that save time and steps along the way. “Automation” can carry with it preconceptions of disruption and heavy investment, but this is not true of a data enrichment and geospatial analytics solution like SpatialKey. Automating your event response operations can improve your customer retention and drive efficiencies now—not years from now.

Industry Demands an Open Ecosystem

Can you imagine a world where the open ecosystem dream is a reality? A world where our collective insurance platforms talk to each other? A world where the industry moves faster and better by working together? Oasis and Simplitium, along with a host of others, including SpatialKey, are on this path. While the dream feels idealistic, it is possible. Making data more portable between platforms—interoperability—is not something novel. It’s just fundamental and increasingly vital for long-term survival whether you’re a re/insurer, broker, MGA or solutions provider. We all have a stake in this conversation, and a responsibility to move our industry forward.

Industry demand for an open ecosystem is overwhelming. We increasingly depend on ecosystems, and we need greater interoperability to overcome inefficiencies and redundancies. Matthew Jones of Simplitium provides three key stepping stones we must embrace for greater interoperability:

  1. Avoid a monolithic “one system does all” approach
  2. Minimize the number of catastrophe risk modeling platforms, while maximizing choice in models across multiple vendors
  3. Design systems so that the possibility of change is embedded

Leading organizations are already heading down this path. Lloyd’s recently announced that after losing £1 billion in 2018 it’s looking to drive efficiencies, and one way is through “an ecosystem of products and services that all market participants have access to.” One size does not fit all—and a monolithic approach has proven unsuccessful time after time. Rapid innovation in risk management requires systems that are flexible, scalable, designed for change—and built in close collaboration with those who serve the industry.

See also: The Insurance Lead Ecosystem  

Interoperability drives efficiency

Across our industry, we need to find ways to drive efficiency gains by making data more portable between core systems. If premium is scarce, then finding ways to eliminate waste in the system is not just how you save money, but rather how you make it.

Consider this: How much time do analysts spend keying information into different systems of record? Or, underwriters for that matter. Now, think about how much that costs your business. According to McKinsey, underwriters spend 30% to 40% of their time on administrative tasks like rekeying data or manually executing analyses. It’s inefficient and redundant and increases the risk of error, yet it’s a standard in our industry across every insurance workflow. This creates a massive amount of waste.

Now, imagine if analysts could pass exposure data seamlessly from system to system —with just the push of a button. We work with clients to perform these types of integrations all the time at SpatialKey. Core systems must talk to each other so that insurers can reap efficiency gains while leveraging the best that each chosen provider has to offer. Modern technologies and well-designed solution architectures allow us to integrate disparate value-driving systems easily—and the only thing in our way is us!
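That “push of a button” hand-off presupposes an agreed interchange schema between the two systems. As a minimal sketch, the mapping below translates an internal exposure record into a shared, vendor-neutral format. The field names are invented for illustration; in practice the target would be an agreed industry standard (such as the Open Exposure Data schema) or a schema negotiated between the integration partners.

```python
import json

def to_shared_schema(internal: dict) -> dict:
    """Map internal field names onto the agreed interchange schema,
    so the receiving system never needs data rekeyed by hand."""
    return {
        "location_id": internal["loc_ref"],
        "latitude": internal["lat"],
        "longitude": internal["lon"],
        "total_insured_value": internal["tiv_usd"],
        "occupancy": internal["occ_code"],
    }

# An internal record in one system's native layout (illustrative).
record = {"loc_ref": "LOC-001", "lat": 40.7, "lon": -74.0,
          "tiv_usd": 12_000_000, "occ_code": "office"}

payload = json.dumps(to_shared_schema(record))  # ready to hand to the next system
```

Writing and maintaining one mapping per system is cheap; the 30% to 40% of analyst time lost to rekeying is not.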

The market is advocating cooperation for the greater good. There will be more commercial opportunity and innovation generated through “coopetition” than by trying to knock each other out of the market. Solutions providers must find ways to differentiate that aren’t in opposition to the industry they serve.

Interoperability is “perfectly possible”

You may think it’s not possible—that the type of interoperability I’m advocating for requires too much change. To quote Dickie Whitaker of Oasis: “Don’t think it’s impossible, because it is perfectly possible.” Speaking at a climate change conference last year, he went on to say: “What’s important in solving these big problems is not to be beholden to our existing culture. Our existing view. Our existing experience. We’ve got to look to others that may be able to reframe the problem in a way that actually gives us insight into solving [it].”

So, if you’re not leveraging or supporting creative partnerships and ecosystems, perhaps it’s time to consider that they present a “perfectly possible” path to interoperability.

See also: Building Ecosystems Requires Guts  

Let’s make the open ecosystem dream a reality

We’re in an era where your solutions are only as powerful as your connections. Interoperability is the name of the new game. We must make systems do a better job of talking to each other. Doing so is a step change for the industry. And, while an open ecosystem may appear to be a dream, it’s already well on its way to reality. Like we’re seeing with Lloyd’s and elsewhere, purposeful change happens when the status quo is no longer sustainable.

It’s time to reach out to your partners and tell them what you need to be successful. Discuss your requirements for interoperability. Drive change that inspires innovation. Edward de Bono, an authority on creative and “lateral” thinking, said, “The system will always be defended by those countless people who have enough intellect to defend but not quite enough to innovate.”

Will you defend the status quo or innovate the future? The choice is yours.