
Bringing Innovation From Australia to the U.S.

Severe environmental threats, together with Australia’s regulatory landscape, have catalyzed insurers to adopt new technology quickly.


KEY TAKEAWAYS:

--A combination of advanced technologies, including satellite imaging, allows for significant mitigation of risks.

--After a storm does hit, damage is assessed on the fly — so people can get the help they need faster, without having to wait for assessors to hit the ground or policyholders to call their claim in over the phone.

---------- 

Natural disasters cost insurers approximately $120 billion last year, according to multinational insurance company Munich Re. For an industry that’s valued in the trillions, this number may not concern most people; however, what affects the insurance industry affects the economy, public policy, climate change regulations and policyholders — to name a few.  

In Australia, climate change-induced weather events have cost each household an average of $888 over the past 10 years, according to the Insurance Council of Australia, which says the figure will soar to $2,500 a year by 2050. The continent is especially vulnerable to climate change, from extreme heat and bushfires to floods and cyclones. The Climate Council estimates that, by 2030, one in every five Australian properties will be uninsurable.

Lessons Learned From Oz 

Those environmental threats, together with Australia’s regulatory landscape, have catalyzed insurers to embrace and adopt new technology more quickly than we’ve seen in the U.S.

For starters, in Australia we’ve reimagined traditional event response. Given the pace and number of severe weather events, we don’t have the luxury of relying on reactive tactics. If insurers continue waiting until storms clear to initiate recovery and assessment efforts, the cost to them, policyholders and the economy will continue to tick upward.

At Arturo, our approach is proactive. We use AI-powered models that draw on strategic partnerships with ICEYE and Vexcel Imaging and integrate weather, proximity and other key geographic information, bridging disparate data sets to unlock insights on property. We combine past, present and future-modeled insights and environmental impacts so carriers can warn people of potential threats and recommend steps to safeguard property.

After a storm does hit, we’re able to assess damage on the fly — so people can get the help they need faster, without having to wait for assessors to hit the ground or policyholders to call their claim in over the phone. 

We’ve worked with the two largest insurance carriers in Australia to make the claims process faster and shorten the time it takes to repair properties. Carriers benefit, too, because we’ve seen assessors shave off an average of 30 minutes per claim — that’s nearly nine years saved if a carrier is processing over 152,000 claims annually.
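
For readers who want to check that figure, here is a quick back-of-the-envelope calculation in Python (the 30-minute and 152,000-claim numbers come from the paragraph above; the conversion assumes continuous 24-hour days):

```python
# Rough check of the time savings quoted above.
minutes_saved_per_claim = 30
claims_per_year = 152_000

total_hours = minutes_saved_per_claim * claims_per_year / 60   # 76,000 hours
total_years = total_hours / 24 / 365                           # ~8.7 calendar years

print(f"{total_hours:,.0f} hours saved ≈ {total_years:.1f} years per year of claims")
```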

See also: How Blockchain Enhances Reliability, Speed

Pervasive Threats Require Innovation and Reimagination  

Carriers cannot underestimate the systemic impacts of climate change. The increasing frequency of catastrophic events and regulatory pressures will require insurers to rethink their business models. We saw this firsthand throughout our work in Australia as more stakeholders, from policyholders to regulators, demanded that carriers go beyond assessing risk and implement mitigation measures.

Catastrophic events can be truly devastating for a community that has to contend with lost lives and take on the overwhelming task of figuring out how to rebuild. That’s why the proactive approach is so powerful for people: it gives them one less thing to worry about.

The U.S. insurance industry has an opportunity to reshape the way carriers approach event response and risk mitigation and can learn from Australia. The time to refocus on the policyholder experience is now. 


Greg Oslan

Greg Oslan is the CEO of Arturo.

Before joining Arturo, Oslan was chairman or CEO of multiple companies, including Narus, Fixmo, Risksense and Wireless Online. He also led the commercial efforts in standing up both the Defense Innovation Unit in Silicon Valley and the Joint Artificial Intelligence Center at the Pentagon. He currently serves on the board of directors of the National Cyber Security Center and Atreides and on multiple advisory boards.

He has a track record of leading high-growth businesses from inception through exit in the AI/ML, cybersecurity and information technology markets. He's raised hundreds of millions of dollars in capital from various venture, private equity and debt sources.

Oslan received his MBA from the Kellogg School of Management at Northwestern University and a bachelor of science degree from Indiana University.

Claims Leaders Face a Paradox

They need to challenge their assumptions and expectations, redefine business optimization and draw up a new map.


It’s almost impossible to turn on the news and not hear about inflation affecting everything from food, gasoline and building materials to housing costs and automobile parts.    

Nowhere is the impact of these disruptive winds felt more than in the claims operations of insurance carriers. Today’s claims leaders face a paradox: Parts to fix vehicles cost more and are harder to get; building materials to repair damaged homes are also in short supply and rising dramatically in cost. While the time and cost of the claims journey are both increasing, the expectations of customers are running in the opposite direction—people want everything faster for less.

Time for a New Map

Over the past two years, we’ve worked with clients who share a similar story: Our claims counts are down; our staffing has remained steady, yet claims morale is lower; and customer satisfaction scores are declining. What is going on?

  • It all starts with the process: While everyone is familiar with the headlines about the business environment, carriers have been slow to change their processes to adjust. In some cases, they have invested in administration systems that were designed in a pre-pandemic economy. Want a better journey? Start with a new map.
  • People still make the difference: While insurers have implemented tools and technology that can triage and assign claims to the right resources for faster processing and settlement, claimants still want to talk to someone who can solve their problem or explain what is happening with their claim and when they should expect to be made whole. Self-service portals and chatbots do not lend themselves to these types of inquiries. The adjuster is still the key figure in creating a great customer experience for both the claimant and the policyholder. Have you adjusted your staffing models to match the environment they are now operating in? Were your models designed in the same pre-pandemic time frame as your systems? If so, it’s time for a new staffing model, as the people aren’t the problem.
  • Technology can be an enabler but not the answer: New technology needs to do more than just improve the customer user interface. It needs to help the adjuster manage the current environment and efficiently communicate with the customer. How are the investments you’ve made in technology improving the journey for both your customers and your people? Not sure? What were your expected outcomes? Were your expectations wrong, or did your assumptions reflect a different landscape?

See also: Disparate Systems Kill Response Time

As the forces shaping the claims journey continue to evolve, leaders need to challenge their assumptions and expectations, redefine business optimization and chart a new path to satisfying stakeholder demands amid a changing environment.


Raymond Mazzotta

Ray Mazzotta is an accomplished insurance professional with more than 35 years of industry experience. He has in-depth knowledge and experience in underwriting, operations effectiveness, profit optimization, turnarounds and brand and reputation building.


Richard Vonesh

Rich Vonesh is a senior management consultant at Nolan Co.

Prior to joining Nolan, Vonesh was director of casualty claims at Nationwide Insurance. His experience also includes claim director, express at CNA, where he helped build and lead a commercial P&C organization in the handling of high-frequency, low-severity losses; claim director, service center at CNA, where he led a standard line commercial and excess and surplus P&C claim organization; regional operations consultant at CNA, responsible for organizational governance and controls; vice president of operations at AmeriClaim Group, responsible for an independent adjusting operation; and leadership and technical claim roles at AIG Claim Services, Providence Washington Insurance, Commercial Union Insurance and Farmers Insurance Group.

'Digital Twins': The Race Is On

The concept is widely adopted in manufacturing and supply chain. Insurers that integrate digital twins will significantly out-compete rivals.


KEY TAKEAWAYS:

--Digital twins are already letting auto insurers use telematics to model the behavior of drivers, assess risk and provide feedback on how they can improve their performance. 

--Soon, digital twins will allow us to increase the inspection rate of commercial properties from 10% to 100%. Digital twins will let us monitor real-time data from IoT sensors in insured assets such as homes or vehicles and identify potential risks or issues before they occur, allowing insurers to mitigate them. 

--The possibilities are endless. And the race begins in earnest now because digital twins will dominate the insurance lifecycle by 2035.

----------

As the world becomes increasingly digitized, the insurance industry must take advantage of the rapid advances in the availability of IoT data, sophisticated analytics and generative AI. One technology in particular, digital twins, holds immense transformational promise.

Digital twins are virtual representations of physical assets, systems and processes that are used to monitor and analyze performance in real time.

The digital twin concept is widely adopted in manufacturing and supply chain – and is expected to gain broad adoption in the insurance industry within the decade. Insurers that integrate this technology will significantly out-compete rivals in the evolving insurance landscape.

The use of such real-time data and analysis is expected to help insurers create a more precise and current understanding of risk. AI and machine learning models will be applied to these digital twins to produce highly accurate predictive and prescriptive analytics.

This is not some distant future vision. The insurance industry has long relied on data, with vast amounts of information being gathered throughout the rating, underwriting and claims processes. Digital twins are now capable of leveraging that data in specific insurance applications. Automobile telematics is a familiar example. Here, digital twins enable insurers to assess the risk of drivers and provide feedback to the drivers on how to improve performance.

Some may think that digital representations are inferior to their real-world counterparts. There are many examples to prove that this is not the case. The P&C insurance industry has long used parametric securitization as an efficient and rapid way to access post-catastrophe capital. In these structures, the real-world insurance losses are secondary to the modeled parametric loss that actually triggers the bond payment.

Another example of digital twins is Swiss Re’s collaboration with Microsoft to establish its novel Digital Market Center that focuses on integrating connected vehicles, industrial manufacturing and natural catastrophe data. While insurers can find immediate value in initiatives such as these, we are only beginning to scratch the surface.

In the near future, digital twins will be applied across the entire insurance life cycle, providing real-time data and insights into customers’ insured assets – including data reported by IoT and smart sensors from their vehicles, property structures and businesses.

For example, in underwriting, if we expand the definition of “inspection” to include digital twins, we can rapidly increase the inspection rate of commercial properties from 10% to 100%. While it may seem daunting to digitize so much of the world, generative AI can be used to produce up to 85% of digital twin content.

See also: Good, Bad and Ugly of Going Digital

In risk management, their application could look like Allianz’s use of predictive maintenance for wind turbines. Predictive maintenance uses cloud-connected smart sensors on the turbines, whose data is fed into analytical models and monitored by AI. Analyzing past performance and known characteristics identifies patterns, anticipates failures and allows maintenance to be carried out before a breakdown occurs.

More broadly applied, digital twins could be used to monitor and analyze real-time data from IoT sensors in insured assets such as homes or vehicles and identify potential risks or issues before they occur, allowing insurers to mitigate them. Digital twins could also be applied in monitoring the cyber risk environment and informing a customer’s cybersecurity organization of new potential viruses, attack vectors and risks.
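
To make the idea concrete, here is a minimal sketch (not any particular vendor’s implementation) of a digital twin for an insured home that ingests hypothetical IoT sensor readings and raises alerts before a loss occurs; the sensor names and thresholds are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PropertyTwin:
    """Toy digital twin of an insured home, fed by hypothetical IoT sensors."""
    policy_id: str
    readings: dict = field(default_factory=dict)   # latest value per sensor
    alerts: list = field(default_factory=list)

    # Illustrative thresholds only; a real model would be calibrated per property.
    THRESHOLDS = {
        "basement_moisture_pct": 60.0,   # possible water leak
        "roof_wind_gust_kph": 90.0,      # wind damage risk
    }

    def ingest(self, sensor: str, value: float) -> None:
        """Record a new reading and flag emerging risks for proactive mitigation."""
        self.readings[sensor] = value
        limit = self.THRESHOLDS.get(sensor)
        if limit is not None and value > limit:
            self.alerts.append(f"{sensor}={value} exceeds {limit}: notify policyholder")

twin = PropertyTwin(policy_id="HOME-001")
twin.ingest("basement_moisture_pct", 72.5)   # triggers a moisture alert
twin.ingest("roof_wind_gust_kph", 45.0)      # below threshold, no alert
print(twin.alerts)
```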

The possibilities are endless. And the race begins in earnest now. It is reasonable to expect that digital twins will come to dominate the insurance lifecycle by 2035.

How can you prepare and help drive your business toward this future?

For starters, embrace the idea of digital twins. Expect to see the insurance industry drive the adoption of IoT devices, potentially connecting to 25% of all devices by 2035. An insurance company that ingests and discerns insights from that much data looks fundamentally different from today’s insurers. How do we get that data in the first place? By encouraging IoT connectivity via new policies, greater limits and preferential renewal terms.

See also: Digital Future of Insurance Emerges

Recognize that virtual digital technology has the capability to manage physical environments. Digital twins will regularly intervene in the physical world to mitigate risks and drive preferred outcomes. So, build this potential into your vision for customers.

One very promising application: as digital twins become sufficiently realistic, they will increasingly allow remote operation of dangerous machinery and operation in dangerous conditions, dramatically reducing accidents in industries like mining and oil extraction.

Understand that digital twin technology will complement insurers’ traditional role of indemnifying policyholders against losses. So, prepare to use digital twins to more robustly assess and actively manage risks associated with insured assets, to monitor performance and to take actions to mitigate those risks.

Digital twin technology holds immense promise.


Roger Arnemann

Roger Arnemann serves as the general manager and senior vice president of analytics at Guidewire Software.

He has over 20 years of expertise in technology solutions, spanning catastrophe modeling, insurance analytics, cyber risk and fintech.

He holds bachelor of arts, bachelor of science and master of science degrees from Stanford University.

Quantum Technologies, Cybersecurity and the Change Ahead

Uncover insights about the transformative potential of quantum technologies and cybersecurity.


In the 1980s, one of the pioneers of quantum computing, Nobel physicist Richard Feynman[i], observed that “Nature isn’t classical, dammit, and if you want to make a simulation of nature, you’d better make it quantum mechanical.”[ii]

According to a study by Precedence Research[iii], the global quantum computing market is projected to reach around $125 billion by 2030, growing at a CAGR of 36.98%.

What if you could envisage a world where millions of entangled, energised atoms are connected across the planet and into space, all working in parallel at the speed of light to compute a solution to a specific problem, say a cancer cure, a new battery or carbon emission reduction?

This paper is a primer on quantum computing[iv], which experts say is 10 years away, but there is a need to understand and act now because:

  • The recent adoption of generative AI[v] has greatly accelerated the race to quantum computing.
  • The cybersecurity consequences of quantum computing need to be addressed today.
  • Hybrid computers interfacing classical computers with quantum computers are now available and can address some key world issues.
  • The urgency of climate change requires faster development of quantum computing.
  • Error correction, which challenges quantum computing today, is fast being addressed.

Quantum technologies have existed for some time. The transformational spotlight of this paper is the shift from classical to quantum computing. This aligns computing with physics, chemistry, biology and the laws of nature, moving from a binary transistor state to a world as envisaged by Richard Feynman. The explosion of data and smart devices pushes classical computing to its limits, hence the evolution of quantum computers. This evolves data mining, which collects data and extracts patterns, into data farming, which grows and cultivates data for user-designed computational experiments analysed with quantum computing for forecasting and modelling, to obtain insight into complex problems. It is likely that only quantum computing technology can address and solve the climate change crisis.

Quantum computing combines computer science, physics and mathematics, using quantum mechanics to solve complex problems faster than classical computers can. Applications where quantum computers can provide such a catalyst now include machine learning, optimization and the simulation of physical systems. The following diagram shows the diversity of quantum technologies. In physics, the term “quantum” also refers to the smallest possible unit of a physical property at the atomic level.

[Diagram: the diversity of quantum technologies]

Countries and corporates set ambitious targets for reducing emissions at the 2021 United Nations Climate Change Conference (COP26)[vi]. Pledges amount to roughly $4 trillion in annual investment by 2030 for measures that would limit warming to between 1.7°C and 1.8°C by 2050, falling short of the 1.5°C level needed to avoid catastrophic climate change. To meet the net-zero emissions goal, advances in climate technology are required that today’s powerful supercomputers cannot deliver. Quantum computing is the one technology that could develop climate solutions capable of abating an additional 7 gigatons of carbon dioxide a year by 2035, with the potential to align the world with the 1.5°C target[vii].

At the atomic level, different rules of physics apply, and quantum rules determine how atoms interact and entangle. This understanding enables advanced and rapid calculations. Quantum computers achieve an exponential increase in processing capacity by considering all possible outcomes to a problem simultaneously rather than sequentially. The result is a computer that delivers atomic-level speed, millions of times faster than anything today.

Explanation of the Quantum Ecosystem

Quantum mechanics is the area of physics that studies the behaviour of atomic and subatomic particles. Quantum computers take advantage of these behaviours to perform computations. Quantum phenomena are not new; they have been the basis of lasers and semiconductors since the 1950s. Emerging quantum technologies support the development of quantum computers. Foundationally, the technologies use the quantum mechanical principles of entanglement[viii] and superposition[ix] (how atoms work together when energised) to share information. This is not possible with classical computers, which work on binary bits, where a bit is an electronic signal that is either off or on (0,1). The value of this switch makes computers follow specific logic, making them reliable for mathematical calculations but unsuitable for intangible problems such as climate change and medical breakthroughs, where situational awareness of all the data is required.

The best analogy for quantum computing is a coin toss. For classical computers, it is binary: either a heads or a tails state. Quantum can be envisaged as a spinning coin, where the heads and tails states are both active while it spins. When the coin stops spinning, it falls either heads or tails, reverting to the binary state and losing the quantum state. At that point the result is measured: the system has explored every possibility while the coin was spinning, and the result is stored back in the binary state. Superposition is akin to the spinning coin, giving quantum computers inherent parallelism and allowing millions of operations simultaneously.
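
To make the coin-toss analogy concrete, the short sketch below simulates a single qubit with plain linear algebra (no quantum hardware or vendor SDK): a Hadamard gate puts the qubit into an equal superposition of 0 and 1, and measurement collapses it to one outcome with probabilities given by the squared amplitudes.

```python
import numpy as np

# A qubit is a length-2 complex vector of amplitudes; |0> = [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# Hadamard gate: puts |0> into an equal superposition (the "spinning coin").
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                    # amplitudes [1/sqrt(2), 1/sqrt(2)]
probs = np.abs(state) ** 2          # measurement probabilities, ~[0.5, 0.5]

# Measurement collapses the superposition; repeated runs give ~50/50 outcomes.
rng = np.random.default_rng(seed=0)
outcomes = rng.choice([0, 1], size=1000, p=probs)
print("P(0), P(1) =", probs, "| observed counts:", np.bincount(outcomes))
```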

Qubit

Quantum computers substitute bits with quantum bits, known as qubits, representing quantum particles at the atomic and subatomic levels. Superposition can be achieved using photons[x] (packets of light) to create an interference pattern in which the photon travels every path at once, causing multiple states to occur and allowing complex calculations at scale and in parallel.

Superposition shows why solutions that take years of trial and error in laboratories can take hours with quantum technology. This can also be illustrated by programming a mouse out of a maze. Classical computers work through all possibilities to find the quickest route out by trial and error, but quantum computers simultaneously plot all routes, then stop and measure to find the quickest one. When a quantum state is measured, the wavefunction collapses and the state is read as either a zero or a one, known as the deterministic state, where the qubit acts as a classical bit and reveals the best path out of the maze.

Because of entanglement, qubits can connect anywhere: multiple qubits are pushed into the same state and can connect over large distances. Quantum processors can draw conclusions about one particle by measuring another to solve complex problems faster. Superposition and entanglement work in parallel. The effect is exponential: the more qubits, the more possibilities can be stored. The optimization of this process is known as quantum annealing[xi] and is the most promising quantum technology for shorter-term delivery, as it is easier to build stable processors and qubits. As the number of qubits increases, the likelihood of noise[xii] or heat generation grows and needs to be addressed by quantum error correction (QEC). Noise in the quantum state is known as decoherence, where the system degrades, losing entanglement and data, so it needs to be fault-tolerant.

To reinforce this explanation, Scientific American has a good video, noted in the reference section.[xiii]

High-Level Applications of Quantum Computing

Computational capabilities via quantum technologies allow supply chain optimizations, molecule/vaccine discovery, discovery of mineral deposits, climate change solutions, cures for cancer and more. Governments know this, hence the large national investments.

There is no need to wait for mainstream adoption to start addressing these issues, because a small error tolerance is acceptable: quantum computers deal in probabilities. Even though error correction is required to correct decoherence, techniques exist now to do so by using logical qubits (clusters of physical qubits that perform error adjustment), building Noisy Intermediate-Scale Quantum computers (NISQ)[xiv] and finding qubits that are not sensitive to noise.

This spawns hybrid quantum computing[xv], available now for commercial use, including investment predictions, logistics route optimization, smart city planning and energy distribution. Hybridization provides access to quantum computing for business problems in partnership with classical computers (GPU/CPU) and Web3 technology.

Science

To assess quantum commercialisation, it behoves us to look at quantum advantage, the threshold where a quantum system can perform operations that no classical computer can simulate in reasonable time. Currently, no quantum computer can perform a meaningful task faster, cheaper or more efficiently than a classical one, but that will soon change. Quantum sensors will be commercialised before quantum computers, providing a catalyst.

There are four important areas, listed below, where classical computers will fail and where, using superposition, all viable options can be generated, bypassing long, non-scalable trial-and-error experiments in labs. The emission of carbon dioxide into the atmosphere and its reversal through clean fuels and energy storage are functions of nature, battery improvement is aligned with chemistry and healthcare with biology, all of which can be seen through a quantum lens and encapsulated by the laws of quantum mechanics.

  • Climate Change – carbon capture, nuclear fusion, renewable sources, decarbonizing the ammonia process, removing methane from agriculture, making cement production emissions-free, lowering the cost of hydrogen to replace fossil fuels and energy grid analysis.
  • New Battery Development – aligning quantum chemistry with battery chemistry, creating longer-life, faster-charging batteries, halving the cost of grid-scale storage.
  • Healthcare – precise medical imaging, new drug discovery, early disease detection, cell-level intervention with nano quantum sensors and combining healthcare life sciences with quantum technologies aligned with biology.
  • Space Exploration – more precise navigation systems, communication networks, quantum clocks and putting energy collectors in space.

Cybersecurity Issues

Quantum technologies promise much but could disrupt a country’s national security. They are complex and diverse technologies with varying levels of technical readiness, so mitigation is required to avoid any quantum cyber surprises. Cryptography experts say that a classical computer would take millions of years to find the two prime factors of the large numbers used in encryption today, but a quantum computer could crack that in less than an hour, putting data security and privacy at risk as quantum computing advances. This needs to be on all risk radars. Bad actors can harvest data now and decrypt it once quantum computing advances. Quantum-safe cryptography[xvi] is the process of securing and transmitting data in a way that cannot be hacked by quantum or classical computers. A safe transition is achieved through technology lifecycle management. The urgency to initiate and complete the transition to quantum-safe cryptography differs for individual organizations. EvolutionQ’s “Quantum Threat Timeline 2022”[xvii] explains how three simple parameters can evaluate this:

  • shelf-life time: the number of years data should be protected.
  • migration time: the number of years needed to safely migrate the systems protecting that data.
  • threat timeline: the number of years before relevant threat actors can potentially access cryptographically relevant quantum computers.

Organizations will not protect their assets from quantum attacks if the quantum threat timeline is shorter than the sum of the shelf-life and migration times. The quantum threat to cybersecurity could become apparent sooner than many expect.
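
These three parameters combine into a simple inequality, often credited to Mosca: if shelf-life plus migration time exceeds the threat timeline, data protected today will still be sensitive when it can be decrypted. A minimal sketch of that check, with illustrative numbers only:

```python
def quantum_risk(shelf_life_years: float,
                 migration_years: float,
                 threat_timeline_years: float) -> str:
    """Mosca-style check: data is at risk if protection must outlast the threat timeline."""
    exposure = shelf_life_years + migration_years - threat_timeline_years
    if exposure > 0:
        return f"AT RISK: roughly {exposure:.0f} years of exposure; start migrating now."
    return f"OK for now: about {-exposure:.0f} years of slack, but reassess as estimates change."

# Illustrative numbers only, not a real assessment.
print(quantum_risk(shelf_life_years=10, migration_years=5, threat_timeline_years=12))
```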

Quantum computing improves modelling and predictive analysis through AI, simulation and optimisation. The explosion of data has increased the need to process and store information far beyond today’s computing capabilities, increasing cybersecurity risk. Data scientists say that public-key cryptography systems in use today, such as RSA[xviii], can be broken by quantum computing. The world has entered a post-quantum phase in which strategies are being developed to protect the encryption and confidentiality of data. The United States is pioneering this phase by releasing post-quantum standards in 2024, backed by government investments. This assumes mainstream adoption of quantum computing within a 5-10-year timeframe, now accelerated by third-wave AI and increasing natural catastrophes.

Given quantum-enabled cyberattacks, no public-key encryption in use today is safe. Quantum key distribution (QKD)[xix] is available but not yet at the smartphone level. Classical computers can use the prime factors of a large number as an encryption key: the receiver must know the prime factorization to decrypt the information, a secret between sender and receiver. With QKD, an eavesdropper can only see the composite number, and a feature of quantum mechanics means that any interloper would need to measure the quantum states, so interception is easily detected and hackers cannot cover their tracks. By partnering with organizations that can provide resources and training on quantum computing and notify them when new standards are released, companies can update their technology with patches to keep their data secure.


The quantum threat timeline can be mitigated by deploying new cryptographic tools that are resistant to quantum attacks. The transition to quantum-safe cryptography requires the development and deployment of hardware and software, quantum-immune solutions, the establishment of standards and legacy migration. A milestone will be demonstrating “quantum supremacy,” where a quantum computer outperforms the most powerful supercomputer. This will signify control over a large number of physical qubits, necessary for quantum computing. That milestone could easily be passed in the next couple of years, so these cybersecurity mitigation preparations need to start now. The graph below, from Mosca & Mulholland (2017), assesses the overall risk from the threat timeline and the migration timeline.

[Graph: assessing quantum risk from threat and migration timelines (Mosca & Mulholland, 2017)]

Enterprise-level risk assessments all need revision. The ecosystem shows the quantum technologies interoperating and how quantum communication networks link to space technology. QKD is physically realized using photonic quantum technology, which requires fiber optics, so wireless connectivity is needed to reach smartphones. The post-quantum phase will exist until wireless connectivity can transmit qubits. When QEC overcomes decoherence, that will be a tipping point for quantum computers. Distance problems are addressed by connecting ground stations with satellites, enacting QKD by transferring a group of keys to a satellite and then using those keys to secure communication. The ability to distribute entanglement across oceans from space makes the quantum internet real, securing data between countries’ cell phones from space and beyond, as in the diagram below.[xx]

[Diagram: satellite-enabled quantum key distribution and the quantum internet]

Today the largest quantum computer has about 1,000 qubits[xxi], but capability grows exponentially with qubit count, so a machine with 1 million qubits would be capable of breaking current encryption systems. Quantum computers have already run Peter Shor’s algorithm[xxii] on small numbers, proving that current encryption can in principle be broken, so the threat has to be taken seriously, though not so as to cause a Y2K-like panic.
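
To see why factoring matters, the toy example below builds an RSA-style key pair from two tiny primes and shows that anyone who can factor the public modulus can reconstruct the private key and read the message. Factoring real key sizes is infeasible classically, which is exactly what Shor’s algorithm changes on a sufficiently large quantum computer; the primes here are absurdly small and purely for illustration.

```python
def factor(n: int) -> tuple[int, int]:
    """Naive trial division: fine for a toy modulus, hopeless at real RSA key sizes."""
    f = 2
    while n % f:
        f += 1
    return f, n // f

# --- Key generation with toy primes (real RSA uses primes of 1,024+ bits) ---
p, q = 61, 53
n = p * q                       # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                          # public exponent, coprime with phi
d = pow(e, -1, phi)             # private exponent, kept secret

message = 42
ciphertext = pow(message, e, n)          # anyone can encrypt with (n, e)

# --- Attack: factor n, rebuild the private key, decrypt ---
p2, q2 = factor(n)                       # trivial here; this is the step Shor speeds up
d_recovered = pow(e, -1, (p2 - 1) * (q2 - 1))
print(pow(ciphertext, d_recovered, n))   # -> 42, recovered without the private key
```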

Standards

In 2016, the National Institute of Standards and Technology (NIST)[xxiii] requested submissions for algorithms that could replace today’s public-key encryption. NIST has been working on post-quantum cryptography for five years, and it will take about the same amount of time to reach the critical mass needed to adopt the standard. NIST is also the body behind the current, still-secure Advanced Encryption Standard (AES)[xxiv]. While NIST has already released one quantum-resistant signature standard, it announced its final algorithm choices and draft standards in 2022, with final standards to be announced in 2024[xxv]. These techniques are more resilient against advances in classical computing than the current public-key algorithms and offer more overall security. Organizations must be crypto-agile, and AES is a safe default standard for today. Cloud companies such as Microsoft, Google and Amazon are all developing quantum-safe encryption algorithms and software.

The transition to post-quantum algorithms via standards is a priority to protect critical digital and hardware assets from future cyber threats. The cybersecurity risk posed by cryptographically relevant quantum computers is systemic. The algorithms that support encryption today are still considered safe for e-commerce because, while quantum computing is real, the technology is at an early stage and will not be a threat until computers have 1 million qubits, hence the need to track the threat timeline hand in hand with the standards.

Preparation means that security certificates and current trust anchors[xxvi] must be updated, and the key sizes for public-key algorithms will change. Organizations need a transition plan and must decide whether to migrate to post-quantum cryptography via hybrid certificates or create two parallel environments with single certificates. Institutions must use standards to become “quantum resilient or immune” through security by design. Manufacturers must include hybrid certificates in devices at production time; a hybrid certificate is a digital certificate that features a classical crypto algorithm, like RSA, alongside a post-quantum crypto algorithm. A single certificate allows communicating securely with algorithms that NIST has listed. A hybrid digital certificate gives customers a preview of how a post-quantum crypto algorithm can work without infrastructure change. Quantum resilience mitigates the effects of cyber vulnerabilities, ensuring that sensitive data is broken into smaller, encrypted pieces and stored in different places.
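
As a conceptual illustration only (the field names and the verification policy are invented, not any standard’s actual encoding), a hybrid certificate can be thought of as one identity bound by two independent signatures:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class HybridCertificate:
    """Conceptual sketch: one subject and key pair of public keys, signed twice by the CA."""
    subject: str
    classical_public_key: bytes     # e.g. an RSA or ECDSA key
    pq_public_key: bytes            # e.g. a lattice-based post-quantum key
    classical_signature: bytes      # CA signature using the classical algorithm
    pq_signature: bytes             # CA signature using the post-quantum algorithm

def verify(cert: HybridCertificate,
           verify_classical: Callable[[HybridCertificate], bool],
           verify_pq: Callable[[HybridCertificate], bool],
           require_both: bool = True) -> bool:
    """During migration a relying party may accept either signature;
    once post-quantum roots are deployed it can require both."""
    if require_both:
        return verify_classical(cert) and verify_pq(cert)
    return verify_classical(cert) or verify_pq(cert)
```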

Regulation

The development of quantum technologies raises questions for policy makers and regulators in the same vein as AI: ensuring that the technology is used responsibly and assessing its societal and ethical implications. With such a transformational technology, regulators should sit inside the quantum ecosystem, as a node on the network in real time, to build their understanding of wider market innovation, competition and national developments. The cybersecurity challenges are clear, the development of standards is underway and changes in policy need to be reflected in cybersecurity laws. Anticipatory governance should aim at responsible quantum technology to achieve socially desirable outcomes. “Quantum technologies” is a broader term than quantum computing alone. There have been many hype cycles in AI, which have led to moratoriums, fear and confusion. The cybersecurity, societal and privacy risks posed by quantum-enabled technologies, on the assumption that they will decrypt currently used encryption systems, must be addressed, even with penalties if deadlines are not met. To maximize the opportunities for quantum, we must also push for greater inclusion and diversity to avoid accidental bias within AI training and the unintended consequences of leaving it unregulated.

Regulatory bodies want to enforce quantum compliance over sensitive parts of the infrastructure, including supercomputers and the cloud. Encryption affects everyone, so an increase in computational power does mean that some people can exploit security more than others. Access to quantum technology today is not community-based, so it is important to move it to the community level early. Governments must protect their citizens by working out how to regulate a new quantum internet and by recognizing secrets that remain valuable for more than, say, 10 years (such as child healthcare records). There is a long gap from lab to production to early adopters to mass-market adoption. Quantum technologies are sovereignty technologies: countries are developing their own with significant funding, so there is a need to avoid a scientific and technological gap between countries causing a quantum divide. Sovereignty technologies carry strategic interests, and export controls on quantum processors already exist, even though these machines are at an early stage of growth with limited qubit capacity. Regulators must deal with hype, and while the hype cycle is still there, the reality cycle is starting to grow, as shown by Gartner.

[Graph: Gartner hype cycle for quantum technologies]

Generative AI

Generative AI[xxvii] (such as ChatGPT) is at an early stage of development, with one challenge being the amount of data needed to train the models. Generative AI produces content similar to the work of human intelligence, where the human is the curator and the machine the creator. Quantum computing, as established, can process vast amounts of data at lightning speed and will therefore be a serious contributor to generative AI, producing more accurate and complex outputs through quantum machine learning. The integration of these two fast-moving technologies over time will transform computing and AI. The consumer packaged goods sector is a key recipient of AI and quantum, unlocking deeper insights into the consumer to see relationships and information that protect brands. AI research intersects with quantum research to become a new computing concept for the sector. This is where hybrid quantum computers can solve contingency problems using heuristics, as no exact answer is required, just approximations based on probability theory.

The key question being asked: if generative AI can quickly scrape the internet for information on quantum, could it generate quantum algorithm code? If the answer is yes, the implications are big. At this transitory stage, it is probable that generative AI is not ready to generate quantum algorithms that solve worldwide problems, as confidentiality and cybersecurity are paramount and more training is required. However, as these technologies mature and quantum AI emerges, the fusion of these technologies will be highly significant, as long as they remain within the bounds of responsible and ethical usage.

Technology Considerations

Classical computers are approaching a technology barrier. According to Moore’s Law[xxviii], the number of transistors in an integrated circuit doubles approximately every two years. Today’s transistors are rapidly approaching the size of atoms, hence the quantum shift.

It is important to discuss quantum clocks, as they complement atomic clocks, which are needed for precise position measurement in GPS (Global Positioning System)[xxix] and for time-critical internet applications, and are the most accurate time standards known. Based on atomic physics, these clocks operate at temperatures near absolute zero, where atoms slow down. Every GPS satellite carries atomic clocks, which are used in mobile phones, space navigation, aviation programs and digital television. Banks use them to guarantee the time and date stamps of high-frequency transactions. Reducing these clocks to chip size is key for applications with no GPS signal. Quantum clocks are part of the quantum sensing toolkit.

Quantum computer node networks all use super cooling to slow down the atoms (qubits) so they can do calculations using superposition and entanglement. Quantum repeaters keep the superposition, entanglement and measurement going to share sensitive data across large distances. For cost reasons, these networks will initially be accessed through the cloud as a distributed quantum computer network and eventually brought to the smartphone level, where low-orbit satellites with wireless links are used for last-mile connectivity.

Deployment of IoT[xxx] (Internet of Things) is greatly enhanced by quantum devices and sensors, leading to massive increases in data volumes that can be ingested by quantum computing to improve risk management. Quantum computing will enable a change in the security, protection and transfer of data within the IoT ecosystem through the use of QKD. The quantum network layer is responsible for channeling and monitoring data transfers through quantum and classical systems within local networks and the internet. IoT systems require rapid results, as they generate heavy volumes of data; quantum computing processes that complex data faster within the IoT system. These computation capabilities also integrate quantum technologies into digital twin simulations: quantum simulation and quantum machine learning can be used to build a quantum (digital) twin[xxxi]. Digital twins are used in critical infrastructure and autonomous transport to run industrial simulations. In the manufacturing sector, when quantum algorithms are integrated into digital twin simulations across manufacturing plants, connecting multiple machines with multiple devices, the impact will be exponential. The internet is enhanced, not replaced, by the arrival of the quantum internet, which transfers encoded qubit information through optical fiber networks and satellites.

Thermodynamics of Quantum Computing and Energy Consumption

When computers overheat, they underperform or crash. For quantum computers, heat is a crucial interference factor and must be measured at speed, as changes to a quantum state take only a millionth of a second. Fault-tolerant quantum computers[xxxii] are being built with QEC to fend off noise and heat. This comes with an energy trade-off.

The qubit environment needs to be close to absolute zero (-273°C) to ensure no interference. QEC preserves quantum information but incurs a high energy cost by adding qubits for error detection. There is a correlation between error rate and energy in quantum computing that will allow the design of energy-efficient computers and the calculation of the minimal energy cost. Even if a quantum logic operation consumes more energy than a classical logic operation, the smaller number of quantum logic operations means that the quantum computer will ultimately be more energy-efficient. Quantum computers should consume less energy because they quickly solve problems that would take supercomputers eons. Recent experiments comparing bitcoin mining[xxxiii] on entry-level hybrid quantum computers showed strong energy efficiency gains, indicating that even less powerful quantum computers can solve computations comparable to supercomputers with less energy.

Insurance implications

Quantum computing is considered a long-tail risk (an unknown), implying that claims may not be settled until long after a policy period expires and may even require commutation. The potential of quantum computing is not obvious to many in the insurance industry, though there are several early adopters, and new entrants with foundations in the technology are likely. Financial siblings such as securities and banking are already using quantum algorithms and software, which are gaining traction in risk analysis, portfolio optimization and credit risk. Early adopters of quantum computing for risk analysis will have the vision to future-proof the insurance industry. Monte Carlo simulation, which uses random sampling to estimate numerical quantities that are difficult to calculate outright, is a prime candidate for quantum speedups.
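
For reference, here is a minimal classical Monte Carlo sketch of the kind of aggregate-loss estimate that quantum amplitude estimation is usually cited as accelerating; the frequency and severity distributions and their parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n_sims = 100_000

# Toy aggregate-loss model: Poisson claim counts, lognormal claim severities.
claim_counts = rng.poisson(lam=3.0, size=n_sims)
annual_losses = np.array([
    rng.lognormal(mean=10.0, sigma=1.0, size=k).sum() for k in claim_counts
])

expected_loss = annual_losses.mean()
var_99 = np.quantile(annual_losses, 0.99)    # rough 1-in-100-year loss

print(f"Expected annual loss: {expected_loss:,.0f}")
print(f"99th percentile loss: {var_99:,.0f}")
```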

Emphasis should be on sharing data and improving risk modelling. Insurers cannot be complacent about the systemic threat of emerging risks such as cyber or climate change. Frequent catastrophic events, combined with evolving regulatory requirements, challenge company business models and can make risk unaffordable. Quantum technologies offer an exponential increase in computing power, precise measurements (quantum sensing), high-performance computing (quantum computing) and tamper-proof communications (quantum communication) to assess and quantify environmental risk, acting as an early warning signal for insurers. Quantum computers can handle precisely those computational tasks that form the basis for solving actuarial problems. Risks could be identified at an early stage by calculation, and losses avoided. The analytics required for the development of products and services could be calculated faster and more accurately than was previously possible.

Taken together with AI, quantum computing will deliver improved pricing and risk models within the underwriting process, using benchmarking and baselining with causal or explanatory AI[xxxiv], leading to better exposure management, reserve adequacy, reinsurance risk transfer, fraud detection, claims management and better explanations to regulators. The ability to accurately simulate climate systems delivers significant improvements to catastrophe modelling and property insurance, benefiting pricing, reserving and the setting of policy limits. The modelling of other aggregate risks, such as supply chain interruption, liability risks or cyber, also benefits from quantum computing capabilities.

The insurance industry has many legacy systems holding data, even as insurers implement the security measures needed to protect against quantum threats while maintaining data integrity. Data hygiene, such as using data exchange platforms as backup, mitigates certain cyber-attacks. The cloud is the foundation for the successful application of quantum computing. Insurers need to strengthen cloud adoption and optimize the way they leverage the cloud to collect data in preparation for using quantum computing. The more cloud adoption, the more opportunities to collect data produced by workflows happening across the digital world.

A lack of experts applying quantum mechanics and quantum computing in actuarial mathematics is a barrier to entry, as the use of quantum mechanics principles in insurance research is very recent. However, most hurdles will be overcome within 10 years, and education, in universities and inside companies, is paramount now.

Global progress

The quest for a quantum computer is a ‘quantum race,’ with competition among nations as well as global companies. It has intensified in recent years, with new private entrants, grants from governments and the emergence of start-ups backed by venture capital. As scientists, engineers and PhDs race to access the calculating power of quantum computers, cryptographers are in parallel developing new encryption standards to protect against quantum computers falling into the hands of bad actors for malware purposes. The history of quantum computing is umbilically tied to quantum physics. Forty years ago, physicists were finding that simulations calculated on classical computers were ineffective, leading to the conclusion that an accurate simulation of quantum mechanics would require a processor that works on the basis of quantum physics, closer to the laws of nature.

Since the 1990s, scientists globally have been working on quantum computing development. Germany has its own Quantum Alliance[xxxv], and in mid-June 2021 the first German quantum computer went into operation[xxxvi]. German multinational reinsurance company Munich Re is one of the founding members of the country’s Quantum Technology & Application Consortium.[xxxvii]

The US and China dominate government spending and private investment in quantum technologies, followed by Germany, France, the UK, Canada, India and Israel, but the Netherlands is a leading investor in quantum as a percentage of gross domestic product. Total global government investment is estimated at the equivalent of 36 billion U.S. dollars for 2023, compared with 1.6 billion U.S. dollars in 2015[xxxviii]. NATO has developed an incubator and accelerator for the use of quantum in defense at the Niels Bohr Institute[xxxix] in Copenhagen. The Quantum Alliance Initiative was launched in the US in 2018[xl] to establish clear thought leadership in quantum computing as a crucial area of information technology for mankind. QURECA[xli] has an excellent map showing the quantum effort worldwide.

[Map: quantum initiatives worldwide (QURECA)]

A 21st-century technology race has evolved between the West and China over emerging technologies. Geopolitical risks straining global supply chains have prompted China to seek technological sovereignty in semiconductors, quantum technology, AI and blockchain. Chinese state funding in quantum is double the EU commitment and four times that of the US.

[Graph: public investment in quantum technologies by country]

Different economic models underpin China and the US: China has significantly higher public spending beyond research and development, while in the US private investment in quantum technologies, research and start-ups is much higher. China’s public spending on quantum exceeds that of the US fourfold, and China accounts for over 50% of the estimated global public investment in quantum allocated to research with quantum companies. Because of the rising geopolitics of the technology war, nations, including the US, are turning protectionist in their national strategies as the race to quantum supremacy commences. This dashes ambitions of collaboration to speed up quantum research and to address the talent shortage in the field.

The UK has a national quantum technologies program, with a National Quantum Computing Centre[xlii] plus a Regulatory Horizons Council[xliii] to set standards and be quantum-ready. Work is in progress to develop a quantum clock at the size and scale required for a net-zero smart grid and the first scalable, noise-adjusted quantum computer, using public-private partnerships with a government contribution of 2.5 billion pounds sterling.

The Australian government has released a national quantum strategy to fund the development of quantum technologies. The Centre for Quantum Software and Information[xliv] at the University of Technology Sydney is working with DARPA[xlv] in the US on its quantum benchmarking program, which assesses the performance of quantum computing algorithms. Australia is a founding partner of the Entanglement Exchange[xlvi].

Canada launched a National Quantum Strategy in January 2023[xlvii]. As a large country with low population density, where the distance between cities can exceed 300 kilometers (outside QKD limits), it needs trusted network nodes using satellite technology and quantum repeaters to ensure entanglement, measurement and data integrity.

The Quantum Internet Alliance[xlviii] is a European consortium formed to build a clock-synchronized quantum internet in the cloud, enabling quantum communication between any two points on Earth based on QKD, where sensitive information is protected from eavesdropping even if the eavesdropper has a quantum server in the cloud.

A year after Google AI Quantum announced that Sycamore[xlix] had achieved 53 qubits and declared quantum supremacy, a claim that was hotly debated, China announced it had overtaken Google with two quantum computers, Zuchongzhi[l] (a 66-qubit superconducting quantum computer) and Jiuzhang 2.0[li] (a photonic quantum computer). An important side note: Google’s Sycamore consumes 26 kilowatts of electrical power, less than a supercomputer, and runs a quantum algorithm in seconds. Although this is good progress, it will be several decades before quantum computers are available to everyone. However, prototypes are running in research facilities at quantum pioneer D-Wave Systems[lii], IBM, Google, Microsoft, Honeywell and Alibaba, proving that the theory of quantum computing works in practice. Different manufacturers of quantum computers take different technical approaches. Performance is judged by the number of qubits, the error rate and the extent of entanglement. All processors are error-prone and must work in isolation from environmental interference; however, quantum computing power is already available in the cloud via the internet.

Conclusions 

Quantum computing is a moving target. Success may hinge on collaboration to create awareness of what quantum technologies can achieve. Quantum computers capable of solving complex problems far beyond the capacity of classical computers are conservatively 10-15 years away, but this timeframe will surely shrink. Post-quantum cryptography has triggered a race to revamp classical encryption in preparation for quantum computers. Our digital societies must be able to withstand a quantum computer threat. Algorithms must be available that make classical computers resistant to quantum hacking without requiring enterprises to replace their classical encryption infrastructure. One recent report[liii] estimates that the market for post-quantum security technology will rise from around $200 million today to $3.8 billion as the quantum threat develops, as shown below.

[Chart: projected revenue for post-quantum security technology]

The claim that quantum computing is years away from being useful is not accurate. Although most quantum hardware is still in the labs, hybrid models allow applications to be hosted by quantum software and simulators in the cloud as a service. Algorithms for optimization, simulation and machine learning are already available. Error correction will bring the utility of quantum computing sooner than anticipated. From carbon sequestration to the electrolysis of water and the invention of new batteries, quantum computing has the potential to harness nature to help reverse climate change. Communities need to be educated early so they can reach quantum competency from different angles and contribute, such as by building applications and web interfaces and taking part in development via open source.

Barriers to entry are shown below, all of which are being addressed today:

  • Quantum error correction and environmental sensitivity.
  • Post-quantum cryptography is a national security concern.
  • Quantum-powered AI could create unintended consequences.

Quantum is a deep technology, so long-term thinking is a must. Straight out of the starting blocks are use cases for consumer engagement and commercial access. First, quantum advantage needs to be achieved: the demonstrated ability of a quantum computer to outperform a classical computer by an order of magnitude, achieving results in minutes that would otherwise take millions of years. A million qubits are needed for quantum advantage, and so far only about 1,000 have been achieved. The next stages move to higher qubit counts, which are expensive, requiring super cooling and error correction to control the quantum particles (qubits). Second, quantum utility needs to be reached, which shows an improved outcome over the status quo through the application of quantum technology, reaching a state of heterogeneity where quantum accelerator technology sits alongside classical computers. This will make quantum computing more accessible by aggregating access to quantum compute capability, delivering tools for software developers and researchers, and providing a platform with access to a catalog of quantum solutions and AI. This needs to stay coupled, in hybrid fashion, to classical compute devices such as an NVIDIA[liv] GPU; quantum computers then become general-purpose computing devices in the same way as the GPU. Finally, the state of quantum supremacy is sought by all nations and large companies from the point of view of computing power and the ability to calculate faster.

While maintaining healthy competition, it is hoped that collaboration will occur, making quantum supremacy a shared success, because climate change, new drugs, new energy sources and supply chain optimization need this technology. It would be ironic to create a quantum divide before the digital divide, which still exists in emerging and rural communities, has been closed, though leapfrogging could be beneficial. The number of quantum algorithms is growing, and material innovations are critical to enable practical applications. With advanced materials, quantum technologies offer functionalities such as superconductivity and the manipulation of quantum information. Promising host materials for quantum systems include silicon, diamond, rare-earth minerals and many more.

As data volume increases, quantum computing can make better predictions about where markets are going. Quantum computing is already used for risk assessment in the financial industry for sales forecasting and financial market behaviour. Quantum computing will change the way we use data, adding exponential value to the data that’s already being collected through cloud-based technology. Quantum software as a service is available in the cloud to keep the costs down and make technology available to all.

Quantum computing is a revolutionary technology that could allow precise molecular-level simulation and a deeper understanding of nature’s basic laws. McKinsey estimates the quantum computing market to be worth anywhere between $9 billion and $93 billion, quantum sensing $1 billion to $7 billion and quantum communications $1 billion to $6 billion from an investment perspective[lv]. The market opportunity is high, as is the technical risk. Those who do not understand how quantum computers work because of the in-depth physics will soon appreciate that they solve very complex, known problems in a short amount of time with high accuracy, and they will be using quantum computing in their daily lives under the hood without knowing the technology, just as with the telephone and the internet in the past.

 

REFERENCES


[i] https://en.wikipedia.org/wiki/Richard_Feynman
[ii] https://www.nature.com/articles/nphys2258
[iii] https://thelephant.io/the-quantum-computing-market/
[iv] https://aws.amazon.com/what-is/quantum-computing/#:~:text=Quantum%20computing%20is%20a%20multidisciplinary,hardware%20research%20and%20application%20development.
[v] https://generativeai.net/
[vi] https://www.un.org/en/climatechange/cop26
[vii] https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/quantum-computing-just-might-save-the-planet#:~:text=Quantum%20computing%20could%20be%20a,the%201.5°C%20target.
[viii] https://scienceexchange.caltech.edu/topics/quantum-science-explained/entanglement
[ix] https://scienceexchange.caltech.edu/topics/quantum-science-explained/quantum-superposition#:~:text=When%20an%20electron%20is%20in,in%20two%20places%20at%20once.
[x] https://www.symmetrymagazine.org/article/what-is-a-photon?language_content_entity=und
[xi] https://docs.dwavesys.com/docs/latest/c_gs_2.html
[xii] https://iopscience.iop.org/article/10.1088/0034-4885/76/7/076001/meta
[xiii] https://www.youtube.com/watch?v=uLnGp1WTNFQ
[xiv] https://thequantuminsider.com/2023/03/13/what-is-nisq-quantum-computing/
[xv] https://ionq.com/resources/what-is-hybrid-quantum-computing
[xvi] https://www.ibm.com/cloud/blog/what-is-quantum-safe-cryptography-and-why-do-we-need-it
[xvii] https://www.evolutionq.com/
[xviii] https://www.techtarget.com/searchsecurity/definition/RSA#:~:text=RSA%20is%20a%20type%20of,is%20used%20to%20decrypt%20it
[xix] https://www.techtarget.com/searchsecurity/definition/quantum-key-distribution-QKD
[xx] https://onlinelibrary.wiley.com/doi/abs/10.1002/sat.1392
[xxi] https://spectrum.ieee.org/amp/ibm-condor-2658839657
[xxii] https://en.m.wikipedia.org/wiki/Shor%27s_algorithm
[xxiii] https://www.nist.gov/
[xxiv] https://www.techtarget.com/searchsecurity/definition/Advanced-Encryption-Standard#:~:text=The%20Advanced%20Encryption%20Standard%20(AES)%20is%20a%20symmetric%20block%20cipher,cybersecurity%20and%20electronic%20data%20protection
[xxv] https://www.nist.gov/news-events/news/2022/07/nist-announces-first-four-quantum-resistant-cryptographic-algorithms
[xxvi] https://en.wikipedia.org/wiki/Trust_anchor
[xxvii] https://www.techtarget.com/searchenterpriseai/definition/generative-AI
[xxviii] https://en.m.wikipedia.org/wiki/Moore%27s_law
[xxix] https://www.gpsworld.com/the-role-of-atomic-clocks-in-data-centers/#:~:text=The%20atomic%20clock%20time%20transmitted,transmitted%20time%20is%20not%20available
[xxx] https://encyclopedia.pub/entry/34723
[xxxi] https://www.linkedin.com/pulse/why-we-need-quantum-digital-twins-ian-gordon
[xxxii] https://www.osti.gov/servlets/purl/1640593?ref=blog.taurushq.com
[xxxiii] https://cointelegraph.com/news/quantum-miners-would-yield-massive-energy-savings-for-blockchain-study
[xxxiv] https://www.internationalinsurance.org/insights_cyber_embedded_artificial_intelligence_in_financial_services
[xxxv] https://www.quantum-alliance.de/
[xxxvi] https://www.allaboutcircuits.com/news/ibm-accelerates-germany-as-a-quantum-hub-with-eus-first-quantum-computer/
[xxxvii] https://www.qutac.de/?lang=en
[xxxviii] https://qureca.com/quantum-initiatives-worldwide-update-2023/
[xxxix] https://phys.org/partners/niels-bohr-institute/
[xl] https://www.hudson.org/policy-centers/quantum-alliance-initiative
[xli] https://qureca.com/
[xlii] https://www.nqcc.ac.uk/
[xliii] https://www.gov.uk/government/groups/regulatory-horizons-council-rhc
[xliv] https://www.qcaustralia.org/
[xlv] https://en.wikipedia.org/wiki/DARPA
[xlvi] https://entanglementexchange.org/
[xlvii] https://www.canada.ca/en/innovation-science-economic-development/news/2023/01/government-of-canada-launches-national-quantum-strategy-to-create-jobs-and-advance-quantum-technologies.html
[xlviii] https://quantum-internet.team/
[xlix] https://en.m.wikipedia.org/wiki/Sycamore_processor
[l] https://tadviser.com/index.php/Product:Zuchongzhi_(quantum_computer)
[li] https://www.scmp.com/news/china/science/article/3223364/chinese-quantum-computer-180-million-times-faster-ai-related-tasks-says-team-led-physicist-pan
[lii] https://www.dwavesys.com/
[liii] https://www.globenewswire.com/news-release/2019/01/10/1686297/0/en/Market-for-Post-Quantum-Cryptography-Software-and-Devices-to-Reach-3-9-billion-by-2028.html
[liv] https://www.nvidia.com/en-us/
[lv] https://www.mckinsey.com/~/media/mckinsey/business%20functions/mckinsey%20digital/our%20insights/quantum%20computing%20funding%20remains%20strong%20but%20talent%20gap%20raises%20concern/quantum-technology-monitor.pdf

David Piesse

 

About the Author:
David Piesse is CRO of Cymar. David has held numerous positions in a 40-year career, including Global Insurance Lead for Sun Microsystems, Asia Pacific Chairman for Unirisx, United Nations Risk Management Consultant and Canadian government roles, having started his career in Lloyd's of London and the associated market. David is an Asia Pacific specialist, having lived in Asia for 30 years, with an educational background at the British Computer Society and the Chartered Insurance Institute.

 


ITL Partner: International Insurance Society


IIS serves as the inclusive voice of the industry, providing a platform for both private and public stakeholders to promote resilience, drive innovation, and stimulate the development of markets. The IIS membership is diverse and inclusive, with members hailing from mature and emerging markets representing all sectors of the re/insurance industry, academics, regulators and policymakers. As a non-advocative organization, the IIS serves as a neutral platform for active collaboration and examination of issues that shape the future of the global insurance industry. Its signature annual event, the Global Insurance Forum, is considered the premier industry conference and is attended by 500+ insurance leaders from around the globe.


Additional Resources

 


Claims Perspectives

swiss re

Thought-provoking insights on the evolving insurance claims landscape. The first issue of Swiss Re's biannual magazine shares the trends shaping the claims environment, with perspectives on both global outlooks and local realities.

Learn more


Advanced analytics in insurance: Utilizing building footprints derived from machine learning and high-resolution imagery


Using geospatial imagery, property insurers can identify unique exposures, such as roof characteristics and tree overhang, that help to improve risk determinations.

Read More


Embedded Insurance for Telcos

bolttech

bolttech has launched a new whitepaper with GSMA and Mobile World Live, exploring how telcos can capture the next wave of growth through embedded insurance.

Read more


How P&C insurers can successfully modernize core systems


Updating technology is essential but perceived as costly and difficult. Here’s our guide for property and casualty carriers.

Read More


The Power of Lifecycle Marketing

With an integrated, data-driven approach, companies can optimize their strategies based on customers' needs at all lifecycle stages.

Three people standing all huddled around and looking at a tablet

KEY TAKEAWAYS:

--Insurance companies should be prepared to nurture leads for months—sometimes years—before finally securing a sale.

--First impressions matter, but so do second ones. Make sure your onboarding process is easy to follow and as frictionless as possible.

--Client success should be at the forefront of your engagement efforts.

--The average retention rate in the industry is high, at 84%. If your business doesn't hit that number, there is room for improvement.

--Regularly soliciting feedback through surveys, interviews or focus groups provides insights into which measures work well and which need improvement in terms of customer lifecycle marketing.

----------

Retaining insurance clients can be a daunting task, and customer churn tends to be expensive. As a result, nurturing qualified leads has become the primary driver for client retention strategies.

This article aims to address customer lifecycle marketing. It will offer actionable tips on retaining clients and empowering providers to thrive in this ever-evolving ecosystem.

Understanding the Customer Lifecycle

An essential aspect of any successful strategy is understanding its foundation. In this case, we need to define the concept of a customer lifecycle as it relates to insurance clients. This involves meticulously mapping out the journey from lead generation to long-term retention and beyond. 

The Stages in the Customer Lifecycle

Examining each stage in detail will provide valuable insights that insurance businesses can use to optimize their client efforts. 

  1. Acquisition: At this stage, your main goal is to attract potential clients who are interested in your offerings. You can do so through targeted marketing efforts such as digital campaigns or industry events.
  2. Onboarding: Once these clients choose you as their provider, smooth onboarding experiences should follow. For example, prompt communication and comprehensive educational materials set the stage for successful relationships.
  3. Engagement: Next, engage with your customers through personalized interactions that meet their unique needs and preferences. Such interactions could involve regular check-ins or sending relevant industry news updates.
  4. Retention: Now comes an equally important step—retaining satisfied customers, who become loyal advocates for your organization, providing long-term, sustained success.
  5. Assessment: Feedback collection is crucial. Measure performance metrics and identify areas where improvements can be made within your organization's client efforts. The goal is continuous optimization.

Consistent tracking and leveraging of data is pivotal in informing decision-making across the customer lifecycle.  

Acquisition Stage

The insurance industry has notoriously long sales cycles. Insurance companies should be prepared to nurture leads for months—sometimes years—before finally securing a sale.

Strategies for Attracting Potential Insurance Clients

For many, the key to success lies in crafting compelling content that addresses prospect pain points. You could talk about anything from the benefits of particular insurance policies to personalized coverage options.

Use the digital channels where potential clients are likely to gather; McKinsey found that 80% of policyholders begin their search there. Effective tactics that come into play here include:

  • providing incentives for referrals
  • offering discounts and promotions
  • using online advertising techniques
  • leveraging social media platforms
  • working on search engine optimization

Leveraging Targeted Marketing Campaigns

For B2B companies, platforms such as LinkedIn are well suited to lead generation. Their targeting options let you reach the audience most likely to be interested in what you have to offer.

Segmenting your marketing campaigns based on demographic data, pain points and goals gives you an edge over the competition. For instance, targeting high-net-worth individuals with personalized offers on premium products could yield higher conversion rates than generic outreach campaigns.

Use a mix of email campaigns and targeted ad placements or paid search campaigns. This ensures your message reaches potential clients via multiple avenues—increasing its overall impact.

Importance of Personalized Messaging and Value Proposition

Creating tailored value propositions caters directly to specific prospect profiles. You can then forge stronger connections.

See also: 'Virtualizing' Your Customer Service

Onboarding Stage

The onboarding stage is where you start to work on retaining clients for the long term. 

Welcoming New Clients

First impressions matter, but so do second ones. Remind your client why they chose you in the first place. Make sure your onboarding process is easy to follow and as frictionless as possible.

Showcasing your company's soft skills, such as strong communication abilities and empathy, can go a long way in winning client trust. Your problem-solving skills should be put on full display by setting up customer support solutions that are useful and easy to access.

An efficient onboarding process also saves clients time and demonstrates that you value their decision to work with your company.

Educating Clients About Their Insurance Policies and Benefits

Your clients must fully understand their policies' details and benefits, no matter how complex. Educational materials can take the form of brochures, easy-to-read guides or explainer videos that break down complex information into digestible chunks.

Being available to answer questions or offer clarifications helps establish a foundation of trust. 

Engagement Stage

Once you've successfully onboarded new clients, try to keep them engaged with your insurance products and services. Clients who feel cared for are more likely to remain loyal.

Establishing Effective Communication Channels

Client success should be at the forefront of your engagement efforts. Offer support via phone, email or live chat systems. Being available 24/7 removes any obstacles that could prevent customers from reaching out for help.

Regular status check-ins or updates can also help maintain an open line of communication while keeping clients engaged.

Leveraging Technology and Data Analytics to Understand Client Needs and Preferences

Take advantage of data analytics and customer segmentation to better understand the preferences of each individual.

Gather information from various touchpoints like websites or surveys using analytical tools. These allow companies to create personalized offerings.

Retention Stage

As we move along the customer lifecycle, the focus shifts toward retention. The average retention rate in the industry is high, at 84%. If your business doesn't hit that number, there is room for improvement.

Strategies for Fostering Long-Term Client Relationships

Many business owners will tell you that communication is different with clients than with customers. A customer is often seen as making a one-time purchase, while a client has a continuing relationship. In the insurance industry, you want to focus on the latter.

Some strategies for fostering long-term relationships can include:

  • Offering rewards programs
  • Offering loyalty incentives
  • Providing timely customer support
  • Offering discounts for multiple policies or services
  • Sending out newsletters/updates

Identifying and Addressing Client Pain Points and Concerns

To maintain existing clients, it's essential to proactively identify areas of dissatisfaction or concern within each relationship. Addressing these issues promptly helps keep clients satisfied. They will feel their concerns are valued, reducing the likelihood of their seeking a different provider.

For example, let's say clients have a portal to access their accounts where they can make payments or update information. If the portal is slow or unreliable, customers may become frustrated and switch to a different provider.

Client Outreach and Personalized Offers to Encourage Loyalty

Don't take your clients for granted after onboarding. While letting your insurance business run on autopilot can be tempting, outreach is needed to increase loyalty. This could include regularly scheduled emails or cross-selling calls with your clients.

Recognizing milestones or anniversaries in the relationship helps show appreciation toward a longstanding partnership while motivating clients to remain committed patrons.

See also: Where Small Commercial Insurers Are Investing

Assessment Stage

Lastly, insurance businesses should evaluate the effectiveness of their customer lifecycle marketing efforts. This process can involve various methods, such as key performance indicator (KPI) evaluations.

Key Metrics for Assessing Client Retention and Satisfaction

Numerous KPIs, like customer lifetime value and Net Promoter Score, help businesses quantify how effectively they retain customers; a simple calculation sketch follows the list:

  • Customer Lifetime Value: A prediction of how much revenue a single customer will generate over time, which aids in gauging the net contribution of marketing initiatives.
  • Net Promoter Score: Measures overall satisfaction levels based on whether a customer would recommend you to others.
  • Retention Rate: The percentage of customers who remain loyal to your business over a period.
  • Customer Complaint Resolution Rate: A reflection on how well you handle customer issues.
  • Repeat Purchases or Renewals: Measures how often customers come back to purchase new products or renew existing ones.
  • Customer Referrals: An indication of how likely customers are to recommend your business to others.
  • Engagement Rate: Measures how much your customers are engaging with your emails, campaigns and other initiatives.
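
As referenced above, here is a simple calculation sketch in Python for three of these metrics, using made-up inputs; definitions vary by business, so treat the formulas as a starting point rather than a standard.

  # Illustrative KPI calculations with hypothetical inputs.

  def retention_rate(customers_at_end, new_customers, customers_at_start):
      # Share of the starting book still on the books, excluding newly acquired customers.
      return (customers_at_end - new_customers) / customers_at_start

  def net_promoter_score(survey_scores):
      # NPS = % promoters (9-10) minus % detractors (0-6), on a -100 to 100 scale.
      promoters = sum(1 for s in survey_scores if s >= 9)
      detractors = sum(1 for s in survey_scores if s <= 6)
      return 100 * (promoters - detractors) / len(survey_scores)

  def customer_lifetime_value(avg_annual_premium, gross_margin, avg_years_retained):
      # A simple CLV approximation; more refined models discount future years.
      return avg_annual_premium * gross_margin * avg_years_retained

  print(retention_rate(930, 80, 1000))                       # 0.85, i.e. 85% retention
  print(net_promoter_score([10, 9, 8, 7, 6, 10, 9, 3, 10]))  # roughly 33
  print(customer_lifetime_value(1200, 0.25, 7))              # 2100.0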

Regularly soliciting customer feedback through surveys, interviews or focus groups provides valuable insights into which measures work well and which need improvement.

Conclusion

Customer lifecycle marketing plays an indispensable role in retaining insurance clients. With an integrated, data-driven approach, companies can optimize their strategies based on customers' needs at all lifecycle stages. It can be 5 to 25 times more expensive to acquire a new customer than to retain an existing one. As such, businesses should work hard to ensure that they keep their customers loyal and satisfied.

With the right KPIs in place and access to customer feedback, businesses can identify areas of improvement and maintain strong relationships with their clients.


Michael Meyer


Michael Meyer is the growth strategist at Leads at Scale.

His main areas of expertise are communication and business growth.

More Dishonesty About Honesty

A behavioral economist allegedly faked data about honesty on insurance applications — the second author of the same "landmark" paper to be accused.


With the insurance industry so focused on improving the customer experience, many have turned to behavioral economists for insights into how customers actually think, as opposed to how we think they think or how we want them to think. But... wow... behavioral economists sure aren't doing much for their credibility.

Two years ago, evidence surfaced that the data had been faked in the most famous study yet done with insurance customers. The study, conducted with 13,500 actual customers and published in a paper in 2012, supposedly showed that people were more truthful on their applications for auto insurance if they were simply asked to attest to their honesty at the beginning of the application rather than after they had completed it. But challenges eventually led to the publication of the raw data, which was obviously and rather clumsily faked, and the "landmark" paper was retracted in 2021. Blame fell on one of the four co-authors, Dan Ariely, a best-selling author who became chief behavioral officer at Lemonade in 2015 and held that position until 2020.

Now, another of those co-authors has been accused of faking the data in one of the two other studies cited in that 2012 paper and has been placed on leave by Harvard Business School. The much-cited researcher, Francesca Gino, has been accused of manufacturing the data for other studies, too.

Is it too much to ask for some honesty about honesty?

For good measure, the bloggers who exposed the fraud in Ariely's data, and whose recent post at Data Colada raised the concerns about the data in Gino's work, say they suspect the third study cited in the 2012 paper is also based on made-up data. For now, they conclude:

"That’s right: Two different people independently faked data for two different studies in a paper about dishonesty."

Ariely has always denied faking his results, though neither he nor anyone else has ever explained why The Hartford, which let him conduct the test with its customers, would have fed him false data. As I noted in this Six Things when the initial fraud was exposed, I also found it odd that he waited so long to publish his results — he told me, personally, about the study in 2008 when we met at a conference where we both spoke, yet he didn't publish until 2012.

On the recent allegations, Gino declined to comment for this story in the New York Times, which details the challenges to her work.

Where does that leave us? Do we abandon behavioral economics as a tool for improving the customer experience and for encouraging people to buy insurance?

I suggest a distinction: We should rely less on behavioral economists but still think in terms of behavioral economics. 

As the Times article shows, it's easy for behavioral economists to fake the data, and it seems they have incentives to do so. Behavior changes when fame and fortune beckon, even for the best-intentioned — as these economists should know better than anyone. So, I'll be more skeptical of the specifics in many of the studies they cite, even as I'll continue to read books like "Nudge" to provoke my thinking.

Even as we perhaps wean ourselves from the pronouncements of the famous behavioral economists, though, we can all see how behavioral psychology shapes consumers. Just think about how social media companies manipulate us to keep control of our eyeballs. They have made a fortune in the "attention economy" by continually tweaking their algorithms to make us mad or sad or intrigued enough to click on just one more thing or to just keep scrolling.

Certainly, the federal government thinks consumers can be manipulated. The Federal Trade Commission won an $18.5 million settlement from Publishers Clearing House this week on the grounds that it used "dark patterns" to trick customers into paying for products or giving up their data. Last week, the FTC sued Amazon, also for allegedly using dark patterns, to trick customers into signing up for Prime and to keep them from dropping the service. (Amazon denies doing anything nefarious.)

There's also a rock solid basis for behavioral economics, anchored in the work of Daniel Kahneman on cognitive biases, that makes clear that we have to take humans as they are, not as the perfectly rational actors that traditional economists want them to be. Nobody actually thinks based on utility functions and indifference curves, even though they look so clean on a whiteboard.

So, insurers should continue to lean in to behavioral psychology as they try to figure out what bothers customers throughout the insurance lifecycle and as they try to find ways to motivate customers to buy the protection they need. Insurers should just do it primarily on their own, using A/B testing. 

Did that email trigger a response? How about that one? Or that one? If we change X about the claims process for half our customers, what does that do to customer satisfaction? Does an increase in satisfaction translate into retention? Cross-selling? Up-selling? And so on and on and on, 24/7/365.
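
As a sketch of what answering "did that email trigger a response?" can look like statistically, the snippet below runs a standard two-proportion z-test on invented open counts for two email variants; the sample sizes, counts and interpretation are illustrative, not a recommendation.

  # Two-proportion z-test for an email A/B test (hypothetical counts).
  from math import sqrt, erfc

  def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
      p_a, p_b = successes_a / n_a, successes_b / n_b
      pooled = (successes_a + successes_b) / (n_a + n_b)
      std_err = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
      z = (p_b - p_a) / std_err
      p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value
      return z, p_value

  # Variant A: 420 responses out of 10,000 emails; variant B: 505 out of 10,000.
  z, p = two_proportion_z_test(420, 10_000, 505, 10_000)
  print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the lift is not just noise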

You might even try having people attest to their honesty at the start of an application. Just don't expect the results you were promised in that 2012 paper.

Cheers,

Paul

A Cautionary Lesson on ESG Ratings

Based on ESG ratings in isolation, Silicon Valley Bank appeared to be a sound choice of a more sustainable investment. It wasn't.

Low angle photo of a tall glass building against a blue sky that's reflective

KEY TAKEAWAYS:

--A leading data provider gave SVB a good overall ESG rating and an upper-end score for governance--demonstrating the potential limitations of relying too much on single sources of outsourced, off-the-shelf ESG data.

--Increasingly, using ESG data blindly can be avoided. Insurers can supplement or cross-check rating scores with the expanding range of data available to validate and set ESG strategies.

----------

During the pandemic, Silicon Valley Bank (SVB) saw a huge increase in customer deposits, which nearly tripled from pre-pandemic levels. Some of these deposits were used to provide loans to other customers, but the bank also invested them in U.S. Treasury bonds and other bonds deemed high-quality by rating agencies.

When the Federal Reserve began raising interest rates, the U.S. yield curve shifted upward, putting downward pressure on the price of these bonds. Ordinarily, this in itself would not be an issue, as the bank doesn’t have to sell these lower-value assets at their new valuation, unless it requires liquidity to meet demand for withdrawals. Yet, that is exactly what happened with SVB: Customer withdrawals rose more than expected, and, to meet the need for liquidity, the bank had to sell these bonds at a significant loss.
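
To put rough numbers on that dynamic, here is a small Python example, using illustrative figures rather than SVB's actual holdings, that reprices a 10-year, 1.5% coupon bond when market yields rise from 1.5% to 4%; the mark-to-market drop of roughly 20% only becomes a realized loss if the bond has to be sold for liquidity.

  # Bond repricing example with illustrative numbers (not SVB's actual book).

  def bond_price(face, coupon_rate, market_yield, years):
      # Present value of the annual coupons plus the principal repaid at maturity.
      coupons = sum(face * coupon_rate / (1 + market_yield) ** t for t in range(1, years + 1))
      principal = face / (1 + market_yield) ** years
      return coupons + principal

  price_at_purchase = bond_price(100, 0.015, 0.015, 10)  # ~100: bought at par
  price_after_hikes = bond_price(100, 0.015, 0.04, 10)   # yields rise to 4%
  loss = price_at_purchase - price_after_hikes
  print(f"Price falls from {price_at_purchase:.1f} to {price_after_hikes:.1f} ({loss:.1f} points of face value)")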

SVB, in turn, announced that it would need to raise capital. Investors sold off the stock, and customers made a run on the bank for their deposits, trying to withdraw $42 billion on March 9. The following day, the Federal Deposit Insurance Corporation took over SVB’s assets. Similar concerns were raised with other banks, including Signature Bank in New York.

Green appearances can be deceptive

One less-reported aspect of SVB's failure (which included the sale of its U.K. subsidiary to HSBC) is that the bank was well-rated from an environmental, social and governance (ESG) perspective. A leading data provider gave SVB a good overall ESG rating and an upper-end score for governance.

What happened with SVB demonstrates the potential limitations of relying too much on single sources of outsourced, off-the-shelf ESG data. As important a source as third-party ESG ratings and scores are, SVB’s collapse shows that insurers must take greater ownership of the data on which they rely for setting a more sustainable investment strategy.

Beware ‘black boxes’

Currently, a lot of the insurers we speak to still treat their third-party ESG data as "black boxes," using the data provided without analyzing the methodology used to create it. SVB was rated well because of its focus on creating initiatives to advance inclusion and opportunity in the innovation economy and its investments in clean energy solutions. SVB was seemingly a sound ESG diversification bet.

Increasingly, using ESG data blindly can be avoided. Insurers can supplement or cross-check rating scores with the expanding range of data available to validate and set ESG strategies. For insurers, there are clear parallels to the rationale for why and what the Prudential Regulation Authority and the Lloyd’s Market already expect with regard to validating the output of external models (e.g., economic scenario generators and CAT models). Stronger ESG investment controls will likely have capital management benefits in the future.

Ways forward could include using more than one data vendor to get different perspectives. This could also go hand in hand with a company developing an approach to sourcing its own data that aligns more accurately with its specific ESG beliefs, ambitions and targets, such as achieving net zero by a certain date.

The fundamental goal should be to establish a sense check and validate primary data. Other avenues to explore could include periodic deep dives into specific sectors of investment interest to understand what’s driving overall portfolio scores.
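
One simple way to operationalize that sense check is sketched below in Python: put each vendor's scores on a common scale and flag the holdings where the vendors disagree materially, which are the names worth a manual deep dive. The vendor scales, scores and threshold here are hypothetical.

  # Flag holdings where two (hypothetical) ESG vendors disagree materially.

  def normalize(score, scale_low, scale_high):
      # Map a vendor's native scale onto 0-100 so scores are comparable.
      return 100 * (score - scale_low) / (scale_high - scale_low)

  holdings = {
      # issuer: (vendor A score on a 0-10 scale, vendor B rating mapped to 1-7)
      "Bank X": (8.2, 3),
      "Utility Y": (6.1, 6),
      "Insurer Z": (7.5, 7),
  }

  DIVERGENCE_THRESHOLD = 25  # points on the common 0-100 scale; tune to risk appetite

  for issuer, (a_raw, b_raw) in holdings.items():
      a, b = normalize(a_raw, 0, 10), normalize(b_raw, 1, 7)
      if abs(a - b) > DIVERGENCE_THRESHOLD:
          print(f"Review {issuer}: vendor A = {a:.0f}, vendor B = {b:.0f}")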

See also: The Return of the Regulators

5 key takeaways

The main points for insurers to consider are:

  • Insurers should take greater ownership of data they’re using to inform investment decisions and manage ESG-related risks.
  • It’s also important for insurers to clearly articulate an ESG strategy for their business and create a clear link to how this will apply to their investment strategy, particularly with regard to the implications and potential trade-offs when using ESG data.
  • Insurers must validate and understand the ESG data they’re using, as opposed to relying on a black box approach. 
  • Insurers should perform due diligence on ESG data when using it to inform portfolio allocation decisions, and there should be greater oversight of how their asset managers make stock selection decisions. Such due diligence might have brought to light some of the governance concerns relating to SVB. 
  • Insurers are increasingly seeking support on which validation approaches to consider when using third party ESG data and establishing governance processes around data used to inform strategic decisions.

Using Facial Analytics in Underwriting

Life and health insurance can improve the underwriting experience using AI-driven facial analytics … and a simple selfie photo.

Blonde woman looking at the camera with red lines across her face

KEY TAKEAWAYS:

--The insurance industry needs to adapt to digital natives who are tech-savvy and connected and prioritize convenience.

--Facial analytics can predict an individual's risk of illness or disease with remarkable accuracy.

--A simple selfie photo helps insurance carriers instantly triage applicants into refined risk pools.

----------

The insurance industry has long been associated with a traditional approach to doing business, often relying on face-to-face interactions and paper-based processes. However, as digital natives enter the workforce, insurance companies must adapt to their preferences and expectations to remain competitive. 

Underwriting the old way

In conventional underwriting, details such as age and gender are gathered, and survival projections are created by using tables to categorize individuals into risk categories with specific premiums. 

Traditional life insurance companies don't directly evaluate personal health and lifespan but follow established guidelines to sort individuals into risk categories based on demographic features and warning signs like smoking, obesity and pre-existing health issues. 

Candidates must then answer multiple questions about their family background and medical status. Depending on the policy, they might need to go to a clinic to provide blood and urine samples and have their blood pressure, weight and height checked.

Some insurers also look further into personal backgrounds by using independent sources that offer details on prescription medication usage and driving histories. This method is lengthy, often taking 30 to 45 days. It is expensive, due to the numerous individuals involved in collecting and analyzing the information. Additionally, it's invasive, as it requires the gathering of bodily fluids and the use of what appears to the customer to be unrelated data like driving records. 

For clients, particularly the younger generation, the life insurance underwriting process is not enjoyable. So how can we improve it?

See also: Beware the Dark Side of AI

What better way to engage with a tech-savvy generation than with a selfie photo?

A selfie?

When examining a photo featuring two individuals, the human eye can easily recognize which person appears older or younger by observing signs of age like wrinkles, age spots and lines. 

Computers, using the science of facial analytics, can mimic humans' ability to assess a face but with even greater accuracy.

During my Ph.D. work 30 years ago, I began researching the relationship between facial features and health outcomes. As computers became more powerful through the use of graphics processing units (GPUs) and advances in memory density, they propelled AI, and more specifically deep learning. These advancements in deep machine learning made it possible to use AI for health intelligence. 

Now AI, powered by deep machine learning, can identify dozens of health-related signals such as body mass index, biological age, senescing rate, physical stress, heart rate, blood pressure, genetic diseases and more. Soon, we will be able to predict an individual's risk of illness or disease with remarkable accuracy.

The future of health intelligence is in preventive healthcare: the ability to leverage facial analytics to provide signals of health from any connected mobile device. By identifying early warning signs of disease or illness, this technology could help insurers and healthcare providers intervene quickly, ultimately improving patient outcomes and reducing healthcare costs. This technology will have major impacts in every region of the world, especially in low-income and remote communities.

Using facial analytics for underwriting

Facial analysis can now be incorporated into underwriting by a simple face scan, a selfie, of a potential customer. This helps insurance carriers instantly validate self-reported and external data and triage applicants into more refined risk pools -- without needing body fluids or physician assessments. Not only is the technology more efficient and quicker for customers, but with continual improvement and training for the algorithms, it will become a more accurate and efficient method of assessing risk, which could lead to reduced premiums for policyholders, improved financial stability for insurance companies and, ultimately, better health outcomes for all. 
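
As a purely hypothetical illustration of that triage step (not the author's actual model, signals or thresholds), the sketch below takes the kinds of outputs a facial-analytics model might return and maps an applicant into a coarse risk pool that an underwriter could then refine.

  # Hypothetical triage rules mapping facial-analytics outputs to coarse risk pools.
  # Real models, thresholds and signal sets differ; this only shows the shape of the step.

  def triage(signals):
      bmi = signals["estimated_bmi"]
      age_gap = signals["biological_age"] - signals["chronological_age"]
      smoker_prob = signals["smoker_probability"]

      if smoker_prob > 0.7 or bmi > 35 or age_gap > 8:
          return "refer to full underwriting"
      if smoker_prob > 0.4 or bmi > 30 or age_gap > 4:
          return "standard"
      return "preferred"

  applicant = {
      "estimated_bmi": 24.5,
      "biological_age": 41,
      "chronological_age": 44,
      "smoker_probability": 0.08,
  }
  print(triage(applicant))  # -> preferred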

Crucially, facial analytics does not entail facial recognition and cannot be used for identification or tracking; instead, it concentrates on identifying characteristics associated with risk factors and lifespan. 

See also: In Race to AI, Who Guards Our Privacy?

Benefits for the insurance sector

Facial analytics can transform the insurance industry in three key ways:  

  1. As an instant verification instrument to guarantee precise reporting of key health intelligence metrics like BMI and health conditions. This will enable immediate, accelerated processing, forever changing the insurance buying experience. 
  2. As an indicator of life expectancy, aiding in documenting individuals with longer lives, determining their expected lifespan and assisting with their financial planning solutions. 
  3. As a means to offer customized health information and uncover risk aspects for debilitating or fatal illnesses, encouraging client health support – and ultimately saving lives. 

I was fortunate to have the opportunity to talk about this transformation and demonstrate facial analytics at InsureTech Connect Asia in Singapore recently. To lead this change, insurers will need to get on board with facial-analytics-based AI technology. If leaders step up and take this opportunity, they will enable better pricing, detect and minimize fraud and, importantly for younger applicants, offer faster, more individualized underwriting decisions for the new digital generation of customers.


Karl Ricanek


Karl Ricanek, Ph.D., is co-founder, CEO and chief AI scientist at Lapetus Solutions. 

Dr. Ricanek has spent decades researching AI and machine learning algorithms and the relationship between facial features and health outcomes, leading him to develop a groundbreaking facial analysis system that can predict an individual's risk of illness or disease. Dr. Ricanek holds multiple patents and has over 80 refereed articles and book chapters on his work. He is an adviser on AI to the U.S. National Association of Insurance Commissioners.

How AI Is Shaking Up Insurance

“I think you are going to start to see CEOs who are hired for their ability to use AI in the very near future."

A Woman with Green Hair Looking at the Camera with a dark background and white code text across her face

KEY TAKEAWAYS:

--As the language models improve, the ability to reduce reliance on call centers may be coming sooner than later.

--The mundane work of auto filling applications, claim forms, coverage certificates, renewal correspondence or really any repetitive and predictable task is something ready built for an AI. AI would also be very capable at comparing coverage and policy language quickly.  As the technology evolves, AI could quickly move into writing briefs and coverage opinions. 

--AI tools will be force multipliers to make all work faster and more efficient.

----------

Whether artificial intelligence will help the insurance industry work smarter, or whether it will mean massive job losses, or maybe represent something in between is yet to be seen. But what is for certain is that the dawn of artificial intelligence has already come and that nearly every facet of the insurance industry will soon be reckoning with what it means and how it will fit in its future. 

Large language models, such as ChatGPT, and image-generation AI, such as DALL-E, have wowed audiences over the past few months, but in many respects, machine learning and algorithms have been playing a role in many aspects of the insurance industry for years already. 

Simple website chatbots and many underwriting tools already rely on the baseline techniques behind the new artificial intelligence splashing the headlines. What is remarkable is the speed with which these tools are evolving and their potential to jump quickly into parts of the industry that have not yet seen AI's influence. 

“I think you are going to start to see CEOs who are hired for their ability to use AI in the very near future,” said Bill Holden, senior vice president of executive perils for Liberty Company Insurance Brokers. “I don’t know if they are asking candidates about it now, but in the back of their minds I know that all the boards are thinking about it.” 

Customer-Facing

With the rapid evolution of the large language models, the obvious first line of potential for AI’s application in the insurance industry is with the point of contact with the customer. 

Updated web or app chatbots are certainly on the horizon, as are more intuitive phone chatbots that can move beyond simple call routing and into the realm of solving customer service problems and answering coverage questions. Front-desk receptionist robots could even conceivably replace humans, presuming there is still a role for brick-and-mortar offices in that future.

But anyone who has spent time dealing with an automated customer service agent could be forgiven for doubting whether AI will completely replace the human touch in customer-facing roles. Still, as the language models improve, the ability to reduce reliance on call centers may be coming sooner rather than later. 

See also: Lessons Learned on Insurance Apps

Paperwork Heroes

Moving away from the immediate customer-facing role, AI could very easily slip into an effective role just behind the scenes helping customers, and really anyone who needs to spend any time with paperwork. 

AI doesn’t get fatigued, so the mundane work of auto filling applications, claim forms, coverage certificates, renewal correspondence or really any repetitive and predictable task is something ready built for an AI. 

AI would also be very capable at comparing coverage and policy language quickly. 

Coupling the large language models with image recognition could also allow the technology to auto populate things like claims information based on uploaded photos and help underwriters make initial coverage decisions and claims settlements based on a trove of automatically generated data points. AI can easily and instantaneously interface with sensors and images and data in ways humans just can’t. 

Imagine an AI assistant assessing damages for every policyholder in a community post-disaster based on drone-captured, satellite-downloaded and customer-uploaded photography, all in a fraction of the time it would have taken a team of humans with boots on the ground. 

As the technology evolves, AI could quickly move into writing briefs and coverage opinions. 

And the ultimate use case would be using AI coupled with predictive analytics to prevent claims in the first place, and then taking it a step further and using it in a fraud detection role by analyzing patterns in claims data and applications that might have otherwise slipped past humans. 
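
As one hedged illustration of that fraud-detection role, the sketch below uses scikit-learn's IsolationForest to flag claims whose combination of amount, time-to-report and prior-claim count looks unusual relative to the rest of the book; the features, numbers and contamination rate are invented, and anything flagged would go to a human investigator rather than being denied automatically.

  # Unsupervised anomaly detection on hypothetical claims features with scikit-learn.
  import numpy as np
  from sklearn.ensemble import IsolationForest

  # Columns: claim amount ($), days between loss and report, prior claims in 3 years.
  claims = np.array([
      [2_400, 2, 0],
      [3_100, 1, 1],
      [2_800, 3, 0],
      [2_950, 2, 1],
      [48_000, 45, 6],  # the pattern a human reviewer might miss in a large book
      [2_600, 4, 0],
  ])

  model = IsolationForest(contamination=0.15, random_state=0).fit(claims)
  labels = model.predict(claims)  # 1 = looks normal, -1 = anomalous
  for row, label in zip(claims, labels):
      if label == -1:
          print("Flag for review:", row)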

Bias and Discrimination

AI and machine learning can move faster than humans, but unfortunately it is impossible to see inside their black box and understand what is driving their decisions. Once they are trained on their data sets, they make their decisions independently. That independence is their strength, but when it comes to questions of bias and discrimination, it is potentially also a major weakness.

In insurance, bias and discrimination are obviously illegal, but without knowing what is driving the decisions made by AI, there is the potential to amplify implicit bias that is already in the data. 

“AI just doubles down on what it thinks it knows,” said Bob Gaydos, CEO of Pendella Technologies. 

With the potential for harmful assumptions to get amplified if AI gets involved in underwriting, regulators will likely take a close look at any automation that has even a whiff of potential for bias and discrimination to be introduced. 

“Bring in AI, [and] it is going to be questioned,” Gaydos said. “If we open that door, we have to be ready for that discussion.” 

Gaydos warns that today’s laws are insufficient to regulate AI underwriting, and a new age of artificial intelligence is likely to usher in an intense wave of political and regulatory scrutiny that the industry may not be ready for or anticipating as it embraces AI. 

See also: 20 Issues to Watch in 2023

The Future

There is no doubt that AI is in the door in the insurance world.

While technically there could be the potential to remove humans from every insurance process, agents, assessors and underwriters are a long way from being replaced by the current generation of AI. More likely, AI job losses will be felt most acutely among front-line workers doing tedious work — work that has often already been outsourced to call centers. And for the more advanced work, AI tools will be force multipliers that make the work faster and more efficient. 

Now, what will the market look like decades from now? Perhaps an AI analyst will be able to give us an assessment.

Life Insurance Brokers Need Better Tech

If brokers are going to continue to provide exceptional customer service, they need the technology and resources that can back them up.

A Laptop Near Documents and Post it Paper on a White Table

KEY TAKEAWAY:

--11% of carriers say it takes more than 60 days to pay their brokers, which can fray relations. More often than not, back-office vendors are needed to provide transparency in the commission process. Vendors' agency management systems receive carriers’ commission feeds, accurately complete the commission accounting and even pay the commission directly. They can support all commission structures and custom compensation types, meaning they can handle complex hierarchy structures to process payouts--and greatly reduce delays.

----------

Despite the crucial role that brokers play in the life insurance industry, they are underserved and under-supported by carriers. 

Recent research from Equisoft reveals that the issue is exacerbated by a lack of technology. According to the study, three of the top four challenging aspects of the broker-carrier relationship are the lack of tooling, inability to track compensation and performance in real time and lack of digital capabilities.

If brokers are going to continue to provide exceptional customer experience and customer service — something that is more difficult to do with changing customer expectations and the presence of easily accessible, on-demand products and services in other industries — they need to have the technology and resources that can help back them up. This includes the use of online appointment scheduling software, mobile applications, voice agents, SMS/text, digital service portals and many others. 

Issues also arise with the commission accounting process. While base commissions and brokers' First Year Override aren’t particularly complicated, commissions can become complex when there are differing compensation agreements for different products, when commissions are split and when people retire, to list a few examples. 
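
To make that complexity concrete, here is a small Python sketch with invented rates, a two-way split and a single override level, showing how one policy's premium fans out into several payouts; real agency management systems handle many more compensation types, but the accounting pattern is similar.

  # Hypothetical split commission plus a first-year override on one policy.

  def pay_commissions(annual_premium, base_rate, splits, override_rate):
      # splits: {broker_name: share of the base commission}; shares sum to 1.0.
      base_pool = annual_premium * base_rate
      payouts = {broker: round(base_pool * share, 2) for broker, share in splits.items()}
      payouts["managing agency (override)"] = round(annual_premium * override_rate, 2)
      return payouts

  payouts = pay_commissions(
      annual_premium=3_000,
      base_rate=0.80,  # 80% first-year base commission
      splits={"Broker A": 0.6, "Broker B": 0.4},
      override_rate=0.10,  # first-year override paid up the hierarchy
  )
  print(payouts)  # {'Broker A': 1440.0, 'Broker B': 960.0, 'managing agency (override)': 300.0}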

Compounding the problem, 11% of carriers say it takes more than 60 days to pay their brokers, according to the study. In most other industries, employees are paid on a regular weekly, biweekly or monthly basis. This consistency, predictability and, most importantly, timeliness enables workers to budget and create financial plans for themselves. Brokers should be afforded that same respect and be paid promptly.

When it takes too long for them to be paid, they may start talking to other brokers about their situation — potentially influencing which carriers brokers pursue relationships with and for whom they will advocate.

While these issues stem from outdated or missing technology, they can also be fixed with technology — specifically through digital transformation and by updating legacy systems.

See also: Breathing Life Into Life Insurance

For brokers relying on aging agency management systems (AMS), effective, efficient and accurate management of commissions may be difficult. 

While the study reported that 75% of brokers use an AMS to manage their relationships with carriers, 43% of respondents indicated that they planned to update their AMS in the next 12 months. 

Brokers expect that, as the industry evolves, the way they are compensated will evolve, too. The challenge is the lack of transparency between carriers and their brokers about the commission process. 

More often than not, back-office vendors are needed to provide transparency in the commission process. Vendors' AMS receive carriers’ commission feeds, accurately complete the commission accounting and even pay the commission directly. Additionally, they can support all commission structures and custom compensation types — meaning they can handle complex hierarchy structures to process payouts.

Updating these systems and offering more digital sales and service enablement solutions leads to better broker experiences. Instead of spending time working through manual processes or worrying about when their check is going to arrive, brokers can focus on what matters most: delivering value to policyholders and providing exceptional customer experience.


Brian Carey


Brian Carey is senior director, insurance industry principal, Equisoft.

He holds a master's degree in information systems with honors from Drexel University and bachelor's degrees in computer science and mathematics from Widener University.