In recent years, heightened customer demand for a seamless user experience, combined with the rise of purpose-built insurtechs, has forced insurers to accelerate their innovation cycles. Data-driven processes such as marketing, underwriting and portfolio monitoring require a deeper understanding of the customer while requesting as few data points as possible from that customer. Insurers are also seeking opportunities to customize user journeys to improve satisfaction with onboarding and claims processing.
In response to these challenges, insurers are developing world-class innovation teams, expanding their data science and technology capabilities and partnering with purpose-built MGAs to “test the waters” on a limited scale. While some insurers pursue a combination of these strategies, almost all are placing an emphasis on improving data usage, particularly in the context of external data.
In the last decade, the market for external data has grown tremendously, with large bureaus such as Experian, Equifax and CoreLogic offering more data products than ever and hundreds of new data vendors emerging on the scene. Insurers were already consuming vast amounts of external data to run decision-making functions on risk, but now they are inundated with thousands more products to choose from across many more data types.
Consider the following examples. Aerial imagery data can provide more accurate property details than tax assessor data, while also automating certain claims around roof damage. Footfall data can be used to verify the hours a business is open and whether the property is regularly hosting large events or gatherings outside of normal business hours, suggesting the presence of additional risks. And digital intent data can provide clues about customer satisfaction, helping insurers reduce churn through preventative methods.
As insurers innovate across this new data landscape, they face a series of challenges. First, not all data vendors are equal, and careful attention must be paid to data quality, collection methodologies and usage restrictions. Second, onboarding a single data vendor for testing can take anywhere from a few months to a year, depending on whether customer data must be enriched for an accurate test. Third, with so much new data available, enterprises need tools to help them “sift through the noise.”
In this context, an External Data Platform (EDP) is a valuable tool for accelerating data discovery. EDPs provide rapid access (via API or data share) to a curated collection of external data products from hundreds of upstream providers, along with data dictionaries and due diligence certifications covering usage restrictions and collection methodologies. When it comes to testing, EDPs also structure data to feed automated machine learning platforms that can test thousands of attributes and hundreds of model variations, identifying the most predictive attributes across a sea of data sources.
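The attribute-screening step described above can be sketched in a few lines. This is an illustrative example only: the attribute names (roof age, footfall index) and the simulated loss outcome are invented, and a real platform would run far richer model searches, but the core idea — rank candidate external attributes by their standalone predictive power against an outcome — looks roughly like this:

```python
# Hedged sketch: rank candidate external attributes by univariate
# predictive power against a loss outcome. All attribute names and
# data here are simulated for illustration, not real vendor fields.
import random

random.seed(42)

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

# Simulated policy records enriched with three candidate attributes.
n = 500
roof_age = [random.uniform(0, 30) for _ in range(n)]
footfall = [random.uniform(0, 1) for _ in range(n)]
noise = [random.uniform(0, 1) for _ in range(n)]

# Loss outcome driven mostly by roof age, slightly by footfall.
loss = [0.8 * r / 30 + 0.2 * f + random.gauss(0, 0.05)
        for r, f in zip(roof_age, footfall)]

candidates = {"roof_age_years": roof_age,
              "footfall_index": footfall,
              "unrelated_attr": noise}

# Rank attributes by absolute correlation with the outcome.
ranked = sorted(candidates,
                key=lambda k: abs(pearson(candidates[k], loss)),
                reverse=True)
print(ranked)
```

In practice the screening is done by automated ML tooling rather than simple correlations, but the output is the same in spirit: a shortlist of external attributes worth carrying into a full model build.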
EDPs are also helpful during solution deployment, as they provide access to production-grade data via APIs or full files delivered to in-house data lakes. Moreover, they make it possible to procure external data from multiple vendors simultaneously through a single API. In effect, these platforms offer hundreds of sources under a single API and a single contract.
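The single-API pattern is worth making concrete. The sketch below is a minimal, hypothetical client: the class, vendor names and returned fields are all invented, and real vendor calls would go over the network, but it shows the shape of the idea — one enrichment request fans out to multiple upstream sources and comes back as a single merged record:

```python
# Hedged sketch of a single-API, multi-vendor enrichment call.
# The class, vendor names and fields below are hypothetical.

class ExternalDataPlatform:
    """Stands in for an EDP's single API over many vendor sources."""

    def __init__(self, vendors):
        # vendors: name -> callable(address) -> dict of attributes
        self.vendors = vendors

    def enrich(self, address, sources):
        """One call, many sources: merge each vendor's attributes
        into a single record keyed by the input address."""
        record = {"address": address}
        for name in sources:
            record.update(self.vendors[name](address))
        return record

# Stubbed vendor responses; in production these would be live feeds
# reached through the platform, not local lambdas.
edp = ExternalDataPlatform({
    "aerial_imagery": lambda addr: {"roof_condition": "fair"},
    "footfall": lambda addr: {"avg_daily_visits": 120},
})

result = edp.enrich("100 Main St", ["aerial_imagery", "footfall"])
print(result)
```

The design point is that the insurer writes one integration and one contract; swapping or adding an upstream vendor changes the platform's configuration, not the insurer's code.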
For insurers seeking to accelerate their analytical innovation cycles and move ideas “out of the lab,” data discovery tools such as EDPs are vital. Insurers are in a competitive arms race with insurtechs and other carriers as they seek to leverage the best technologies, data and people to maintain and grow their businesses. Those that fail to innovate will almost certainly lose market share, and pushing the bounds of analytics will be the primary mechanism for innovation in the coming years.