Insurers' Flawed Understanding of ROI

The industry’s traditional assessments of “returns” are not suited to the insurance company of the future.

KEY TAKEAWAYS:

--Insurers often don't fully load their costs, so they miscalculate the "I" (investment) in ROI.

--Companies also need to take a sophisticated look at all their potential returns: on the loss ratio, on retention, on sales and marketing, on the expense ratio, on ratings and on any costs that a new initiative will replace.

----------

I’m fortunate to be constantly engaging with established insurance carriers that are seeking to incorporate technology into their product offerings. Rightly, most carriers seek to understand the potential return on investment (ROI), or the cost-benefit, of new technologies such as our platform, prior to signing contracts and implementing. This is where conversations can get really exciting. However, as a former investment banker and CFO, I can confidently say that the industry’s traditional assessments of “returns” are not suited to the insurance company of the future.

We need to redefine ROI for insurance companies.

Historically, the entire ROI discussion has revolved around reduction of loss – i.e., frequency or severity. This makes sense because insurance companies are broadly run by actuaries, and this is their foundation. I am not here to poke holes in actuarial analytics, nor am I suggesting these concepts get pushed aside. What I am saying, however, is that this is far too narrow -- and there is much more to it.

When GE considered the ROI of a new jet engine factory, it didn't try to justify the ROI of the loading dock out back. It understood that a loading dock (with a certain throughput) is needed to accept delivery of all the components of the engines that are to be built inside and sold to airlines for a profit. Similarly, front-end data gathering and customer engagement tools that enable an improved customer relationship and more frequent interactions with insureds are necessary to deliver preventative devices, provide advice, drive awareness and so on. So, carriers must ask whether this is a stand-alone capability to be evaluated separately or an enabler of the larger prevention program that is better analyzed holistically.

If I were CFO of an insurance carrier today, I would be looking at the following:

First off: I would fully load my costs. I need to completely understand the denominator of my ROI calculation ("I" = Investment). There is the obvious contracted cost of the solution, device or service. Then there might be technology implementation and integration costs. (Our upfront integration is de minimis, so that one is easy for us.) Next, the new service or solution needs to be deployed -- properly and effectively. This requires marketing and communication with insureds and prospects.

There are challenges:

  • Carriers often have email addresses for only 15% to 30% of their customers!
  • Carriers historically stink at communicating with insureds.
  • Communications often require a level of coordination with their agent network.
  • Rigid, legacy or third-party platforms can pose challenges when a carrier tries to implement changes.

Add in any regulatory or filing requirements, or any incentives -- shared or otherwise -- needed to drive adoption, and you will be looking at a fully loaded cost to implement a new program.
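
To make that concrete, here is a minimal sketch of what fully loading the "I" might look like, written in Python. Every cost category and dollar figure is a hypothetical placeholder rather than a benchmark; a carrier would substitute its own contract, integration, marketing, regulatory and incentive numbers.

    # Hypothetical sketch of fully loading the investment ("I") for a new program.
    # All figures are illustrative placeholders, not benchmarks.
    cost_components = {
        "contracted_solution_fee": 250_000,    # annual contract for the device, service or platform
        "implementation_integration": 40_000,  # IT work to connect to policy and claims systems
        "marketing_and_communication": 60_000, # campaigns to insureds, prospects and agents
        "agent_coordination": 25_000,          # training and coordination with the agent network
        "regulatory_and_filings": 15_000,      # any filing or compliance work required
        "adoption_incentives": 80_000,         # shared or carrier-funded incentives to drive uptake
    }

    fully_loaded_investment = sum(cost_components.values())
    print(f"Fully loaded investment: ${fully_loaded_investment:,}")  # $470,000, not just the $250,000 contract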

Now the fun part! Creating a clear view of reasonable, expected “returns” can (in our case) compel a finance team to consider six key areas:

  1. Loss Ratio Impact - Data & Analytics Value
  2. Retention Rate Impact - Net Promoter Score Improvement
  3. Sales & Marketing Impact - Selling & Ecosystem Monetization
  4. Expense Ratio Impact - Efficiency Improvement
  5. Overall Rating Impact - Innovation, Improvement in Reinsurance Costs
  6. Any Replacement or Offsetting of an Existing Cost
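
A purely illustrative tally of these six areas against the fully loaded investment from the earlier sketch might look like the following. Every figure is a made-up placeholder that a finance team would replace with its own estimates.

    # Illustrative only: weighing the six return areas against the fully loaded investment.
    expected_annual_returns = {
        "loss_ratio_impact": 300_000,      # reduced frequency/severity plus data and analytics value
        "retention_impact": 150_000,       # retained customers valued at avoided acquisition cost
        "sales_marketing_impact": 75_000,  # cross-sell/up-sell and ecosystem monetization
        "expense_ratio_impact": 50_000,    # efficiency gains in claims and servicing
        "rating_impact": 25_000,           # innovation credit and reinsurance cost improvement
        "replaced_existing_costs": 40_000, # spending the new program makes unnecessary
    }

    fully_loaded_investment = 470_000  # from the cost sketch above

    roi = (sum(expected_annual_returns.values()) - fully_loaded_investment) / fully_loaded_investment
    print(f"Expected first-year ROI: {roi:.0%}")  # about 36% on these made-up numbers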

See also: The Promise of Predictive Models

First: Loss ratio

A large part of ROI analysis revolves around the impact on loss ratio – reduction in frequency and severity of claims – over time. I won’t spend too much time here because this is the core competence of insurance carriers.

A corollary to this is: How much are new sources of data and analytics worth when they help improve the accuracy of rating and pricing risk? Carriers love data, and new sources of data and analytics are immensely valuable. But how should carriers value them?

Our analytics, for example, provide insights into homeowner behaviors (e.g., who is most risky, who is safer, who is engaged with the home, etc.) and certain property conditions, features and attributes (who has braided metal hoses, who has GFCI outlets installed, who has an emergency plan for their family, etc.). This is new and differentiated information. Together, we work toward establishing actuarial evidence and quantifying and correlating it to the impact on loss experience.

Second: Customer retention

The retention rate, typically 86% to 92% for homeowner carriers, is a critical metric for insurance companies. To assess retention rate impact, carriers must understand, “How much value does XYZ technology (or service, or solution) provide to our insureds?”

A “retained” customer is one less customer that needs to be “acquired.” However, do carriers really know their customer acquisition cost (CAC)? My suspicion is that most carriers do not, because they are not fully loading their expenses, and often these costs are shared with distribution partners. CAC for insurance companies has been estimated at $750 to $1,000, and this number is likely increasing. Importantly, the impact of this understatement of CAC is that carriers would undervalue new tools that enable customer acquisition.

In the same way, do carriers really know their customer lifetime value (CLV)? In a world where customers are arguably becoming less “sticky,” insurance companies must reassess the value of retaining a certain customer (or type of customer) or the cost or impact of losing one.
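
As a rough sketch of how these pieces fit together, consider what a single point of retention improvement might be worth. The CAC range is the one cited above; the book size, per-policy margin and average tenure are assumptions for illustration only.

    # Hypothetical sketch: the value of a one-point retention improvement with a fully loaded CAC.
    policies_in_force = 100_000
    retention_lift = 0.01             # one percentage point, e.g. from 88% to 89%
    fully_loaded_cac = 875            # midpoint of the $750-to-$1,000 range cited above
    annual_margin_per_policy = 120    # assumed margin per retained policy
    avg_customer_lifetime_years = 7   # assumed average tenure

    customers_saved = policies_in_force * retention_lift
    avoided_acquisition_spend = customers_saved * fully_loaded_cac
    incremental_clv = customers_saved * annual_margin_per_policy * avg_customer_lifetime_years

    print(f"Customers retained instead of replaced: {customers_saved:,.0f}")
    print(f"Avoided acquisition spend: ${avoided_acquisition_spend:,.0f}")  # $875,000
    print(f"Incremental lifetime value retained: ${incremental_clv:,.0f}")  # $840,000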

How about brand impact from the new initiative? What is the ROI of an NPS point? Of increased brand awareness in a target market? Of brand loyalty? What is the value of a happy or satisfied customer? To begin to assess this, carriers will need to first think more about their brand in the marketplace, potential threats and opportunities. Many carriers are also grappling with their own role in representing and protecting their brand in the eyes and minds of consumers, versus turning again to their agent network to represent their brand publicly and letting the brand speak through service and claims experience.

Third: What is the value of facilitating cross-sell/up-sell opportunities?

This one is particularly tricky for carriers because few have assessed the value of new (direct) digital distribution channels.

Fourth: Expense ratio and efficiency

What is the impact on efficiency? Carriers have many different work streams and processes, so there are many places to quantify impact and cost savings: improving cycle times, increasing processing accuracy, reducing cycles (such as the absolute number/frequency of claims) and more. Improving the claims experience has a dual impact: It improves the overall customer experience and likely drives NPS, while also shortening and streamlining the internal processes for the claims handling department. Win-win.

Fifth: Providing a rating lift

What is the longer-term AM Best ratings impact of a new technology, service or initiative? Will a new technology be expected to improve the underlying business fundamentals, reduce risk profile and loss ratio over time, strengthen customer acquisition growth or drive retention rates?

AM Best began awarding rating points for innovation initiatives three years ago. What is the ROI on innovation? Or of being a learning organization? AM Best's initiative was invaluable to the industry (my opinion) because it so strongly acknowledged -- and put a microscope on -- the critical importance of innovation and of continuing to evolve the traditional insurance business model. The industry seems to have pushed back rather aggressively, unfortunately, against the ratings methodology. However, industry transformation is now clearly underway.

What is the reinsurance impact of improved rating? To enable this, we (and carrier partners) need to also educate the reinsurance industry (and brokers) on the benefits and effectiveness of insurtech solutions. We are doing this gradually, and the appreciation is growing. This will be a longer game.

See also: How to Deliver the ROI From AI

Sixth: Does the new technology or solution replace an existing cost already being incurred?

In our case, we provide top-tier digital content from nationally recognized experts – eliminating the need for our partners to source or create content to share with insureds in order to drive their own engagement, demonstrate a level of care and professionalism toward their insureds, run email campaigns/social outreach and maintain their brand awareness in the market.

Lastly, what is the ROI of standing still?

Risk analysis is the fundamental, core competence of the insurance industry – above all else. So why are two-thirds of insurance carriers not able to acknowledge the risk of standing still in a world that is increasingly, and rapidly, digitally transforming?


Geoff Martin

Geoff Martin is president & co-founder at vipHomeLink. He is a growth-oriented, acquisitive and entrepreneurial business leader. In 2010, Martin shifted his career focus to the exciting world of technology, following almost two decades in corporate finance.  As a senior investment banker, Martin had executed and advised clients on over $10.5 billion in corporate transactions, principally involving mergers and acquisitions, corporate partnerships and capital raising. Now, as a team leader with significant operating, business development and CFO experience, Martin is leading his third early-stage technology company.

In 2018, Martin partnered with former IBM innovation expert and friend, Alfred Bentley, to launch vipHomeLink. The two founders identified an opportunity to leverage mobile technology, AI, behavioral science and big data to equip and empower homeowners to digitally manage and maintain their homes. As a content-rich, interactive app solution, vipHomeLink helps make homes safer, more efficient and more valuable while transforming the experience of homeownership.

Today, as president of vipHomeLink, Martin develops B2B SaaS partnerships across the insurance industry, and with smart-home device providers and other related residential companies, where vipHomeLink provides core strategic value by enabling partners to significantly increase customer engagement and retention, while improving loss ratios by preventing claims for homeowners.

Martin received an MBA from New York University's Stern School of Business and a bachelor's degree from The Pennsylvania State University. When unplugged, he enjoys playing tennis and spending time at home with his wife and three children.

The Experience/Efficiency Paradox

Insurers must move from the manufacturing era (efficiency through administrative scale) to the ecosystem era (maximizing the value of a relationship).

KEY TAKEAWAYS: 

--Insurance, like most industries, often sees a clash between the desire to be more efficient and the desire to provide a great experience for customers and employees. But there doesn't have to be a tradeoff.

--Rethinking how data is updated, for instance, can create great efficiencies for the insurer while providing a much better experience. Why make customers and employees update three policies with a single insurer when one update could flow into all three?

--The challenges to resolving the efficiency/experience paradox used to be technological but now boil down to having the right mindset.

----------

Broadly speaking, a paradox is a statement that leads to a self-contradictory or illogical conclusion despite valid reasoning from true premises.

For example, experience and efficiency should go hand-in-hand. In theory, greater efficiencies through automation free human capital to focus on creating better experiences. However, in most sectors, experience and efficiency can appear at odds. Improving one typically comes at the expense of the other. 

Take self-service checkouts in retail. While they are aimed at creating efficiencies for the supermarket and the shopper, it's questionable whether they have improved the consumer experience, at least consistently. Equally, the early use of chatbots may have introduced cost efficiencies and economies of scale, but they often hurt service.

This efficiency/experience paradox often occurs when focusing on the two independently. When you focus only on one, you miss the opportunity to derive value from the other. 

Insurance, however, finds itself in an unusual position where experience and efficiency aren’t mutually exclusive but symbiotic. 

Improving Efficiency and Experience in Insurance

Look at the transformation of claims handling. The work is traditionally seen as isolated from the rest of the business, and the transformation is conceived and run by a separate team. But the efficiencies that can be gained from a data-driven model that's centered on the customer get missed when claims efficiencies are addressed in isolation. For example, when all customer data is up to date and automatically integrated into the claims experience, it becomes a far more intelligent capability.

The data integration immediately creates a greater focus on how the experience can benefit the customer. Policy details and changes in circumstances appear in real time when the claim is raised. There's no need for policy changes to be a "change request" for the claims team to act on. Making a product change becomes much more efficient across every customer interaction and experience throughout the business.

This works both ways. Consider a protection business that ensures changes in its customers’ lives are tracked and acted upon. Those updates can highlight a need to address changes in coverage, while also delivering sought-after efficiencies. For example, a customer moving to a new house creates an opportunity to reassess their coverage, build a deeper relationship and demonstrate a keenness to be of service.

Rather than running data-driven communications as a project or change request, you have a continuous and adapting relationship. This relationship makes sure every change in a customer's life is understood in the product context. 

From using open banking or employee records to identify changes in financial circumstances to reacting to a change of address, these aren't only new opportunities to create vastly better experiences, but they can also drive huge efficiencies. 

Automating data integration stands in stark contrast to many current customer experiences. A colleague recently told me they had three policies with a provider who couldn't see them collectively and recognize that relationship. This inability made a simple address change a triplicate exercise.
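
To illustrate the underlying idea, here is a minimal sketch of a single customer view in which one address change flows to every policy. The class and field names are purely illustrative and are not any vendor's actual data model.

    # Minimal sketch: one customer record, many policies, one update.
    from dataclasses import dataclass, field

    @dataclass
    class Policy:
        policy_id: str
        product: str

    @dataclass
    class Customer:
        customer_id: str
        address: str
        policies: list[Policy] = field(default_factory=list)

        def update_address(self, new_address: str) -> None:
            # One update on the customer record; every policy reads the address from here,
            # so there is no triplicate change request.
            self.address = new_address

    customer = Customer("C-001", "1 Old Road", [
        Policy("P-HOME", "home"), Policy("P-AUTO", "auto"), Policy("P-LIFE", "life"),
    ])
    customer.update_address("2 New Street")
    print([(p.policy_id, customer.address) for p in customer.policies])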

Putting the insurer by the side of the customer, and completely changing the paradigm for the product and how the business works, can turn large and often costly business processes into continuous and seamless customer experience outcomes.

See also: Claims Leaders Face a Paradox

Organizational Change Is Essential

This type of holistic view is essential for today’s insurers, whose enterprise designs are having to shift in parallel with a need to massively increase their knowledge of the customer. Further, with customer expectations changing, regulatory demands increasing and rapidly evolving macro-environments forcing this transformation at pace, organizational change is needed to drive an insurer's ability to act on this shift in customer insight. 

This shift is about moving from the manufacturing era of efficiency and gaining administrative scale to the ecosystem era of maximizing the value of a relationship. In this shift, there's a strong correlation between more efficient, automated and intelligent insurance businesses and those providing the best and most differentiated experiences.

It's about shrinking the time between seeing a change in risk through underwriting and the moment that change materially alters what's offered to the customer, which is when it really counts.

Take electric vehicles (EVs). Underwriters must incorporate real-time changes in risk data into pricing models for these relatively new vehicles. The cost of repairs is only now being understood.

Currently, the ability to factor in these changes might only force a change in price. Instead, acquiring a better understanding of garages' and repair networks' capability to fix these vehicles might significantly reduce the cost of a claim and remove the need to raise prices -- while creating better, more seamless experiences for EV owners.

By bringing together often separate parts of the insurance offering and making sure the customer is never lost in the search for efficiency, and vice versa, this paradox can be turned into dramatically improved use cases for transformation.

Mindset Is Now the Biggest Obstacle to Change

At one time, not too long ago, making the shift was a technology challenge, but, with the right foundations, this is no longer the case. It is now a mindset challenge. 

IT and the insurance business are still not interwoven. The idea that the working model overall is akin to an agile software development approach seems way off. An enterprise design change is needed that makes sure the symmetry between efficiency and experience is clear and sits at the heart of change. 

It's critical that things like fraud detection, decision making and any actions that need to be taken combine to deliver a seamless customer experience, whether the customer is purchasing a policy, making a claim or exploring additional services. The result is a stress-free experience for "good actors," while making sure "bad actors" are identified, exposed and dealt with.

See also: The Evolution of Frictionless Payments

Critical Factors for Success in Balancing Experience and Efficiency

Insurance is complex. However, that complexity isn't predominantly in the product's core. It's often in product-centric business architecture. As a result, policy-based business design becomes a constraining factor in building new connected experiences for employees and customers that are easy to change and configure.

The critical factors for transformation success must be predicated on a true sense of an enterprise design change. This is about transforming the business. 

This new age of insurance needs more efficient insurers building fully customer-centric operations that make the most of every customer experience through data-driven customer knowledge. 

Let’s have our cake and eat it!


Rory Yates

Rory Yates is the SVP of corporate strategy at EIS, a global core technology platform provider for the insurance sector.

He works with clients, partners and advisers to help them jump across the digital divide and build the new business models the future needs.

A Turning Point for Offshore Wind

The scale and scope of the global offshore wind rollout is epic but needs to be fast-tracked by financial institutions, corporates and governments.

The potential of offshore wind as a viable source of clean power for the energy transition is indisputable. Investment in the sector is growing rapidly around the world, the power capacity of installations is ramping up and technological innovations are proliferating – from multi-purpose windfarms and floating installations to next-generation connectivity and drone-based maintenance.

But developers and their insurers need to manage a range of risks to successfully scale offshore wind globally, among them prototypical technology, economic pressures, more extreme weather conditions, cable damage and collision perils, as well as environmental concerns. 

In its new report, A turning point for offshore wind, Allianz Commercial, as a leading insurer of renewable energy and low-carbon technology solutions, highlights growth opportunities, tech innovations, risk trends and loss patterns for the offshore wind industry as the sector prepares for global growth. 

China has overtaken Europe as biggest market

More than 99% of the total global offshore wind installation is in Europe and Asia-Pacific today, but the U.S. is investing heavily in this sector, and China has overtaken Europe as the world’s biggest market, with half of the world’s offshore wind installations in 2023 expected to be in the country. In 2022, 8.8GW of new offshore wind capacity was added to the grid, with global installed capacity reaching 64.3GW. Around 380GW of offshore capacity is expected to be added across 32 markets over the next 10 years, according to the Global Wind Energy Council.

While growth ambitions are huge, all is not clear sailing for developers, according to the report. Spiraling costs have halted major wind projects recently, and the industry is affected by inflation, capital expenses, rising interest rates and geopolitical instability. The costs of materials and vessel hire have risen, while the supply of materials and access to contractors remain challenging. Supply chain bottlenecks, lengthy permitting procedures and delays to grid connections are also exerting pressure.

The scale and scope of the global offshore wind rollout is epic. It requires the expansion of manufacturing footprint, port facilities and infrastructure. And it needs to be fast-tracked by all stakeholders in a joint effort – financial institutions, corporates and governments. 

See also: September ITL Focus: Resilience and Sustainability

Cables top cause of claims 

Both the energy sector and the insurance industry have considerable expertise when it comes to managing the perils of offshore wind activities. In one of its largest offshore wind insurance markets, Germany and Central Eastern Europe, Allianz Commercial has seen 53% of offshore wind claims by value from 2014 to 2020 relate to cable damage, followed by turbine failure as the second major cause (20%). From the loss of entire cables during transport to the bending of cables during installation, cable incidents have led to multimillion-dollar losses in offshore wind, as cable failure can potentially put a whole network of turbines out of commission.

Cable risk is critical, and therefore the quality of service is vital. Contractors need to provide assurance that they have the required expertise to remedy incidents and that they can source replacement components quickly to contain losses incurred during downtime.  From an underwriting perspective, with subsea cabling work, insurers pay close attention to the type of cabling used, the kind of vessels involved, the communication between client and contractor and how often qualified risk engineers will make site visits to oversee proceedings.

Tech innovations breaking the mold

The sector has to carefully manage the deployment of emerging technologies at scale. Novel approaches include so-called "energy islands," which share power among grids and nations, and multi-purpose wind farms that produce green hydrogen or house battery storage facilities. Pilot projects such as the Offshore Logistics Drones from German utility company EnBW explore the deployment of drones for the maintenance and repair of turbines, reducing the reliance on helicopters and humans. While most offshore wind power is currently "fixed-bottom," the development of leading-edge floating wind technologies in deeper ocean waters is poised for commercialization.

Managing the increasing size of wind turbines is another key challenge. In the last 20 years, they have nearly quadrupled in height – from around 70m to 260m – almost three times taller than the Statue of Liberty in New York. Rotor diameters have increased fivefold in the past 30 years. Wind turbines with capacities of 8 or 9MW are common, and newer models reach 14 to 18MW, with a wind farm project in Australia recently announcing plans to use 20MW turbines.

Availability of specialist vessels and collision incidents also pose challenges

Another pressing problem identified in the report is the availability of specialist vessels. A bigger global fleet, including installation, jack-up and support vessels, is needed beyond Europe, the current primary location. Meanwhile, vessel collisions with turbines and offshore infrastructure can also result in significant losses, with an uptick in incidents seen in recent years. Although, to date, these collisions have typically involved smaller vessels, often as a result of human error, there have also been a number of incidents involving larger vessels, an increasing concern given that 2,500 wind turbines are due to be installed in the North Sea alone before 2030.

See also: Managing New Age of Construction Risks

Navigating harsher environments

Although the offshore sector in Europe has significant expertise in managing operations in hazardous marine environments, as it expands around the world there will be new developments farther from shore in territories prone to different types of weather conditions and natural catastrophes. On the East Coast of the U.S. or Taiwan, for example, wind speeds and wave action will be much more significant. It remains to be seen whether climate change will heighten the risk, as rising sea surface temperatures can intensify the strength of hurricanes. 

Despite its invaluable contribution to the net-zero transition, the offshore wind industry needs to be mindful of responsible development and environmental stewardship, the Allianz report points out. The concerns include managing the impact on biodiversity and marine wildlife or the sourcing of required raw materials, such as rare earth elements or lithium.

Allianz is supporting some of the most exciting offshore developments, whether as an investor or insurer. In its recently launched Net-Zero Transition Plan, Allianz Commercial committed to revenue growth of 150% for renewable energy and low-carbon technology by 2030. In addition, Allianz committed to €20 billion in additional investments for climate and clean-tech solutions.

As an investor, the company is contributing to about 100 wind farm and green energy projects, such as Hollandse Kust Zuid in the Netherlands, He Dreiht (Germany) and NeuConnect (UK/Germany). Allianz Commercial provides insurance coverage solutions across all stages of offshore wind development, construction and operations and is the insurer of many developments, among them Revolution Wind (U.S.), Dogger Bank Wind Farm (U.K.), NeuConnect (U.K./Germany) and Jeonnam 1 (South Korea).


Adam Reed

Adam Reed is global leader, offshore renewables and upstream energy, at global insurer Allianz Commercial.

Emerging Technologies That Streamline Claims

The convergence of AI and human intelligence can streamline claims processing, detect fraud and rewrite the narrative of settlements.

In an era defined by technological advancements, industries worldwide are harnessing the power of artificial intelligence (AI) to transform and enhance their operations. The insurance sector, known for its extensive web of paperwork and meticulous claims processing, is no exception. As insurers aim to expedite claims approvals and payouts while maintaining accuracy, the spotlight has turned toward emerging technologies.

Amid the buzz surrounding AI, an exploration of its practical applications and limitations becomes vital. To make substantial investments in AI technologies during a time of economic uncertainty, leaders need to be sure that any budget expenditure today can be backed up by tangible ROI in the not-so-distant future.

Looking at processes that have historically been manual and are draining resources is the best place to start when exploring the potential benefits of automation. 

Unveiling the Time-Efficiency Conundrum

Claims professionals invest a staggering 12% of their workdays navigating records retrieval. This totals nearly six weeks every year. Such a burden not only complicates the lives of attorneys, insurance brokers and claims experts but also prolongs the resolution of settlements for those directly affected.
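
For context, the six-week figure follows directly from the 12% share, assuming roughly 250 working days in a year:

    # Quick arithmetic check of the figure above (assumes about 250 working days per year).
    working_days_per_year = 250
    share_on_records_retrieval = 0.12

    days_per_year = working_days_per_year * share_on_records_retrieval
    print(f"{days_per_year:.0f} days, or about {days_per_year / 5:.0f} work weeks per year")  # ~30 days, ~6 weeks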

It is against this backdrop that the marriage of insurance and AI has gained momentum as a possible way to reduce fraud, accelerate settlements and revolutionize the industry.

Pioneering Efficient Claims Approvals Through Technology Integration

The journey toward seamless and prompt claims approvals requires both technology and human oversight. Insurers are exploring how emerging technologies can not only reduce processing times but also enhance the accuracy of assessments.

The pivotal role of human oversight cannot be overstated, particularly in quality control. Even as AI streamlines processes, human experts remain integral in guaranteeing the precision and fairness of claims evaluations. By combining the speed of AI with the discernment of human intelligence, insurers forge a path toward efficient yet responsible claims processing.

See also: Embedded Artificial Intelligence (AI) in Financial Services

AI's Investigative Prowess: Countering Fraudulent Claims

One of AI's most lauded capabilities lies in its ability to uncover fraudulent activities that may otherwise evade scrutiny. This is where techniques like medical and social canvassing come into play, acting as deterrents against fraudulent claims by scanning geographic locations for medical treatment and retrieving social information to strengthen cases. 

Innovation within the insurance landscape extends beyond immediate efficiency gains. The rise of generative AI introduces a new dimension to claims processing, possibly transforming medical canvassing, claims assessments and more. As this technology evolves, insurers, lawyers, employers and patients are expected to experience a shift in the way claims are handled.

Navigating the Uncertain Terrain Ahead

In a world where every moment matters in the aftermath of an insurable event, the insurance industry finds itself at a crossroads. The convergence of AI and human intelligence offers the potential to streamline claims processing, detect fraud and rewrite the narrative of settlements. However, this journey must be embarked upon with eyes wide open – acknowledging both the promises and the pitfalls that come with integrating emerging technologies into a domain built on trust and responsibility.


Vince Cole

Vince Cole is the CEO of Ontellus, the nation's largest tech-enabled records retrieval and claims intelligence company.

Prior to Ontellus, he served as CEO of Charles Taylor US, a leading provider of claims solutions to the U.S. and global insurance markets. Previously, Cole was CEO, Americas, and global chief strategy officer at Crawford & Company, a claims management solutions business. Before Crawford, he held executive positions at Activa Medical and Genworth Financial. He also spent 10 years at General Electric, serving in senior leadership roles in GE Financial, GE Plastics and GE Capital.

Cole holds a BS degree in engineering from Montana State University.

Continuous Improvement Comes to Insurance

Process intelligence tools let operations leaders “see” digital products being built, enabling use of statistical process control techniques.

KEY TAKEAWAY:

--Continuous Improvement in the production of digital goods and services is coming fast. What’s really going on in that sea of cubicles? What’s really working in work from home? As there is one best way to mount a transmission or pick a pallet, there is one best way to underwrite an insurance policy and adjust a claim. Continuous Improvement is about empowering people with the right tools to find that way.

----------

I recently visited Walmart’s distribution facility in Brooksville, Florida, which features end-to-end automation. Driverless forklifts are a weird sight.  

The facility was built in ’97, so people have lived through the cultural transformation from manual to automated. Ask hourly workers what they think, and many like monitoring screens and robots from a stationary position, while others loathe the new setup for the lack of physical activity and 20 pounds they’ve gained. Management values reduced process variability and lower error rates. And Walmart executives and shareholders love 2x to 3x the throughput on the same square footage and employee base. 

A fourth constituency, absent on the day of my visit -- customers -- is driving all this. Customers unrelentingly demand more selection, faster delivery and lower prices, and 2x to 3x productivity gains contribute to all three. The demand is: "Deliver productivity, or else." There's always Amazon.

Walmart's focus isn't on automation, per se, but on Continuous Improvement, incorporating a blend of best practices from Lean, Six Sigma and Total Quality Management (TQM) that were pioneered in manufacturing. Just as you can monitor a car being built along an assembly line, you can monitor boxes and packages moving through a logistics operation without the need to ask a worker what they're doing or thinking.

Continuous Improvement in the production of digital goods and services is coming, and coming fast. What’s really going on in that sea of cubicles? What’s really working in work from home? Process intelligence tools, such as ours at Skan, let operations leaders “see” digital products being built. This new visibility enables statistical process control techniques to eliminate waste, improve products and drive defect rates toward zero.
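
As a very rough sketch of what statistical process control can look like when applied to digital work, consider cycle times for an underwriting task. The data and limits below are entirely hypothetical; real process intelligence tooling would derive them from observed event data.

    # Minimal statistical process control sketch on hypothetical cycle times (hours).
    from statistics import mean, stdev

    baseline = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2, 4.1, 3.7, 4.3, 4.0]  # an assumed in-control period
    center = mean(baseline)
    sigma = stdev(baseline)
    upper_limit = center + 3 * sigma
    lower_limit = max(center - 3 * sigma, 0.0)
    print(f"Center {center:.2f}h, control limits [{lower_limit:.2f}h, {upper_limit:.2f}h]")

    new_observations = [4.2, 3.9, 6.8, 4.1]  # hypothetical new cycle times captured by process mining
    for i, t in enumerate(new_observations, start=1):
        if not lower_limit <= t <= upper_limit:
            print(f"New observation {i} ({t}h) breaches the limits; investigate for special-cause variation")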

As there is one best way to mount a transmission or pick a pallet, there is one best way to underwrite an insurance policy and adjust a claim. Continuous Improvement is about empowering people with the right tools to find that way. Automation can eventually execute the path without fail—but process optimization and standardization come first.

What’s going on here in insurance, with serious financial, social and medical inflation, feels like more than just another cycle. Insurance leaders find themselves in a new world where linear cost management techniques, such as budget cuts, fail to meet matrixed operational challenges and, worse, sap employee morale.

Budget cuts tend to imply people are part of the problem. A culture of Continuous Improvement believes people, working in concert, are the source of all solutions.  

We’re all drawn to quick fixes, magic pills (and shots) and fad diets. Continuous Improvement represents a lifestyle change with guaranteed positive results—even if they take time. As the saying goes, “A culture of Continuous Improvement is the best long-term solution to all short-term problems.”


Tom Bobrowski

Tom Bobrowski is a management consultant and writer focused on operational and marketing excellence. 

He has served as senior partner, insurance, at Skan.AI; automation advisory leader at Coforge; and head of North America for the Digital Insurer.   

How to Plan for Armed Intruders

Mass shootings have people scared, and they want action. Here are four ways organizations can make their facilities more secure.

If you look at news reports of mass shootings over the past 20 years, it can seem no place is safe from the possibility of an armed assailant. Tragic attacks have taken place in houses of worship, entertainment venues, places of business, movie theaters and other places where people gather.

The result is people are scared, and they feel someone should do something about it. According to Church Mutual’s new “Risk Radar Report — Safety in America,” more than half (54%) of Americans say their top safety concern while attending events is an armed intruder or physical violence. That percentage has increased from 45% in Church Mutual’s first survey in 2019. Meanwhile, only 27% of those surveyed feel their organization is prepared for an armed intruder event.

Insureds are looking for direction to help make their facilities more secure. Here are some ways they can do that:

1. Perform a security self-assessment.

Before taking action, organizations need to know just how secure their facility is. Plenty of tools are available to do just that, including Church Mutual’s security self-assessment, which provides an easy-to-follow checklist of steps facilities should take to prepare for the possibility of an armed intruder.

Some of the most important steps organizations can take include:

  • Making sure their grounds are well-lit.
  • Conducting background checks on all people who are involved with security functions or money handling.
  • Partnering with a local law enforcement agency to identify security concerns.
  • Creating a key control policy so those who leave the organization do not retain their keys. Also, anyone who knows the location of the safe and key should undergo a background check.
  • Controlling access to entrances during events.

2. Conduct an armed intruder tabletop drill. 

Planning a full-scale armed intruder drill can be time-consuming and expensive. But there is a much easier, quicker way to determine whether an organization truly is prepared for such an incident—a tabletop drill.

During the drill, the organization gathers a group of no more than 15 people in a room. None of the people should have any prior knowledge of the scenario being used. After reading the scenario, the group can discuss how they might handle the situation, and who should take charge.

Church Mutual provides an armed intruder tabletop drill worksheet your customers can use. Of course, they can change individual details as needed, but this worksheet gives them a good start on preparing this exercise.

3. Decide on an approach to security. 

Security teams come in all shapes and sizes and often depend on the size of the organization. When a customer develops a strategy for security, they should take into account the different risk levels associated with each approach.

  • Low risk – Unarmed volunteer security team: In this option, you organize volunteers or employees and ask them to watch for suspicious behavior, de-escalate non-violent incidents and alert people gathering in your facilities to danger. This involves minimal exposure to risk and liability.
  • Medium risk – Hired local law enforcement or private security contractors: These options provide highly trained security experienced in handling a weapon in high-intensity situations, while still following reasonable use of force standards. When hiring private contractors, your contract must ensure the contractor will assume liability for their actions. You must also thoroughly vet the contractor to verify training standards comply with applicable laws.
  • High risk – Armed volunteer security team: This option typically results in the greatest risk, as the organization will generally bear responsibility and liability for the actions of the team. A significant amount of planning, training and management is required.

Any organization that selects an armed security option must contact its insurer to discuss its plans and ensure the appropriate insurance coverages are in place.

4. Look for warning signs of possible violence. 

Not every armed intruder incident comes out of the blue. A potential armed intruder may tell others about their plans ahead of time or exhibit some of the classic warning signs of violence, including these categories:

  • Behavioral – Acts of insubordination, poor hygiene or appearance and possession of firearms.
  • Psychological – Having delusional thoughts, suffering from a mood disorder and having violent fantasies.
  • Social – Name calling, making threatening statements on social media and using abusive language.
  • Urgent – Displaying a weapon, stalking or cyber-stalking and destroying property. 

An organization can monitor social media sites and enable anonymous reporting on its website. That way, if a person does broadcast their intentions, the organization is more likely to find out about them.

Preparing for an armed intruder is not easy or comfortable, but it is necessary. Every organization should have a plan in place.


Eric Spacek

Eric Spacek is assistant vice president, risk control, at Church Mutual. He has more than 15 years of insurance risk control experience.

Spacek earned a bachelor's degree in English from Eastern University in St. David's, Pennsylvania, and his juris doctor degree from American University.

He earned the Associate in Risk Management (ARM) designation and has also received the Cambridge Certificate in Risk Management for Churches and Schools. 

Cyber Insurance at Inflection Point

What happens next will depend on how clearly underwriters, brokers and insurance buyers commit to building resilience.

KEY TAKEAWAYS:

--Irresponsible competition, often driven by a desire to boost market share, is forcing prices down and softening terms and conditions for cyber policies. A softening market seems like good news for insurance buyers but inevitably leads to volatility in insurance rates and constrictions in coverage. This kind of rubber-band effect, with pricing that stretches and snaps back, destabilizes the market and removes risk transfer options for buyers and their risk advisers.

--What buyers, as well as carriers and brokers, should work toward is stability in rates and certainty on coverage, through a focus on improving cyber hygiene and increasing resilience.

----------

The impact of supply and demand on product pricing is a well-established economic principle – when supplies are high and demand is reduced, prices tend to fall. When it comes to cyber insurance coverage, this principle also applies, but there are good reasons that it shouldn’t.

Irresponsible competition, often driven by a desire to boost market share, is forcing prices down and softening terms and conditions for cyber policies. This is classic behavior that causes global market cycles in property and casualty insurance, and it has played out repeatedly in the past three decades. But this behavior ignores a bigger problem: Cyber is not a cyclical risk.

Businesses and the insurance industry find themselves at a turning point in the evolution of cyber risk management. What happens next will depend on how clearly underwriters, brokers and insurance buyers around the world see the risk that cyber events pose, and how committed they are to building resilience against this threat, thus ensuring a stable supply of coverage for the long term.

Why this turning point matters now

A softening market, in which prices fall and coverage terms relax, seems like good news for insurance buyers. This kind of market is especially welcomed by organizations that have experienced a market correction, which occurred in cyber insurance in 2020 and 2021 as ransomware attacks surged and loss ratios soared. Rate relief and easy capacity after a few years of steep increases can seem like a gift to buyers.

Unfortunately, the joy of short-term gain is almost always followed by longer-term pain. A soft market ultimately hurts policyholders because it inevitably leads to volatility in insurance rates and constrictions in coverage. This kind of rubber-band effect, with pricing that stretches and snaps back, destabilizes the market and removes risk transfer options for buyers and their risk advisers. It also isn’t limited to only one geography; this cyclical activity occurs in the U.S., Canada, the United Kingdom and across Europe.

Insurance pricing is intended to reflect the risks insurers assume in offering coverage. When risk is accurately priced, buyers gain valuable protection and insurers can achieve profit, which helps to keep the marketplace stable. It’s difficult for risk managers and cybersecurity professionals to explain to their executive teams why insurance costs and availability go up and down, and even more challenging to budget for that volatility.

In a world of cyber risk, stability and certainty are better for everyone. But irresponsible pricing and a lack of underwriting discipline undermine stability. Cyber risk remains intense, as the NetDiligence Cyber Claims Study 2022 and Resilience’s own 2022 Claims Report demonstrate. Since 2018, NetDiligence has found that the average recovery expense following a ransomware or malware attack has steadily increased for both small and medium-size enterprises (SMEs) as well as large companies.

An analysis of claims received by Resilience shows three major trends carrying forward from 2022 into 2023: the resurgence of ransomware; inadequate attention to common critical points of failure that lead to loss, such as phishing; and an increased focus on financial transfer fraud and third-party vendors instead of extortion-based cybercrime. In fact, Resilience saw a 300% increase in ransomware claims from the last two quarters of 2022 to the first quarter of 2023. 

If cyber risk is not declining, why should underwriters weaken their pricing, terms and conditions? The risk landscape in cyber suggests they should be doing the opposite.

See also: Cybersecurity Standards for Insureds Are a Must

What the industry should do next

When insurance underwriters, brokers and the customers they serve arrive at an inflection point, they face a choice. They can decide to think and act strategically or opt for short-term results that probably won't last. What the industry should do next, therefore, is take the following steps:

  • Reassess cyber risks and exposures. Some organizations have greatly improved their cybersecurity and thus enhanced their risk profile, so they might well merit a reduction in rates or access to greater coverage limits.
  • Maintain responsible pricing, terms and conditions that align with the customer’s risk. This approach puts the client’s interest ahead of short-term gains, which can lead to strong, long-term business relationships.
  • Focus on building cyber resilience. Effective cyber resilience requires quantifying an organization's cyber risk (a simple quantification sketch follows this list) and then implementing a combination of good cyber hygiene, protection and insurance that aligns to the risk. Connecting organizational silos in finance and security is foundational to building effective long-term resilience to cyber threats.
  • Change the mindset about cyber exposure. The cyber insurance marketplace has the tools, talent and data to shift its mindset from “price and pay” incident claims to “predict and prevent” cyber events. Resilience’s 2022 Claims Report found that despite reports of new threat actors and vulnerabilities, practicing cybersecurity fundamentals with cyber resilience as an investment strategy leads to significantly better outcomes for organizations and their insurers.
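
As one simple illustration of what quantifying cyber risk can mean in practice (referenced in the resilience point above), here is a sketch of an annualized loss expectancy calculation. The scenarios, frequencies and severities are hypothetical placeholders, not Resilience's methodology.

    # Minimal sketch of annualized loss expectancy (ALE): expected events per year times loss per event.
    scenarios = {
        # scenario: (expected events per year, expected loss per event in USD)
        "ransomware": (0.30, 1_200_000),
        "business_email_compromise": (0.50, 250_000),
        "third_party_vendor_outage": (0.20, 400_000),
    }

    total_ale = 0.0
    for name, (annual_rate, loss_per_event) in scenarios.items():
        ale = annual_rate * loss_per_event
        total_ale += ale
        print(f"{name}: ALE of about ${ale:,.0f}")
    print(f"Total ALE of about ${total_ale:,.0f}")  # a starting point for aligning hygiene, controls and coverage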

The current inflection point in cyber doesn’t have to destabilize the risk transfer market. Instead, it can be a turning point for greater partnerships – especially cooperation and collaboration between government and private-sector entities. It can be an opportunity to improve customer engagement and value and ease capacity restraints that deprive organizations of adequate coverage.

Most of all, this turning point can lead to a deeper commitment to cyber resilience.


Mario Vitale

Mario Vitale is president of Resilience, a cyber risk solution company.

Resilience was founded in 2016 by experts from across the highest tiers of the U.S. military and intelligence communities and augmented by prominent leaders and innovators from the insurance and technology industries. 

The Crisis in Flood Insurance

We may finally see consumers start to change their behaviors, either leaving risky areas or fortifying their homes and businesses.  

The flood insurance crisis in the U.S., which has been described as a "slow-moving hurricane," has made landfall, hitting Louisiana especially hard.

Rising premiums, to reflect soaring claims from natural catastrophes, are now hitting consumers hard enough that Louisiana and nine other states have sued to block increases in national flood insurance rates. Those increases are limited to "only" 18% a year but could eventually total more than 700% for many homeowners and businesses and cause an exodus from southern Louisiana, according to testimony at a hearing last week. 
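
As a rough illustration of how a capped 18% annual increase compounds, consider the sketch below. It assumes the full cap is applied every year, a worst case for a given property rather than a forecast.

    # How capped 18% annual increases can still compound past 700% cumulatively.
    rate_cap = 0.18
    premium_multiple = 1.0  # starting premium indexed to 1.0

    for year in range(1, 14):
        premium_multiple *= 1 + rate_cap
        if year in (5, 10, 13):
            increase_pct = (premium_multiple - 1) * 100
            print(f"Year {year}: cumulative increase of about {increase_pct:.0f}%")
    # Year 5: about 129%; Year 10: about 423%; Year 13: about 760%, past the 700% mark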

Politicians can be expected to use regulation to protect consumers -- also known as voters -- as long as possible, but, beyond some short-term issues, insurers can't be forced to lose money. Government officials may also decide to subsidize homeowners' insurance policies, but that isn't a long-term strategy, either. Those taxpayers whose flood insurance premiums stay the same or even decline will resist subsidizing those who choose to live with greater risks.

Something has to give. We may finally see consumers start to change their behaviors, either leaving risky areas or fortifying their homes and businesses.  

The stats show why push has finally come to shove. Swiss Re reports, "From 2017 onwards, average annual insured losses from natural catastrophes have been over USD 110 billion, more than double the average of USD 52 billion over the previous five-year period.... In the coming decade, hazard intensification will likely play a bigger role as well as higher loss frequency and severity due to climate change." The number of $1 billion natural disasters (adjusted for inflation) has increased steadily each decade, from 3.3 a year in the 1980s, to 5.7 in the 1990s, 6.7 in the 2000s, 13.1 in the 2010s and 20 a year thus far in the 2020s. 

As a result, according to the New Orleans Times-Picayune, "The average Louisiana community is projected to see 134% increases for single-family homes, but communities especially prone to flooding will see much steeper hikes. One ZIP code in Plaquemines Parish is projected to see the highest increase in the nation, at 1,098%."

Those kinds of increases will get your attention.

An article in the Atlantic does a nice job of describing the tension that results. The headline is: "What Your Insurer Is Trying to Tell You About Climate Change." The subhead is even more to the point: "Insurers are trying to send a message. The government is trying to suppress it."

The article says the federal government generally discourages using its aid "to fundamentally alter how individuals behave, let alone how local and state governments function. In addition, after the largest disasters, Congress will typically approve multibillion-dollar relief funds, as it recently did after Hurricane Ian in Florida.... Protecting people in harm’s way is, I would argue, an essential part of the government’s job. But public officials are also shirking their responsibility to not leave communities vulnerable again and again."

In the short to medium term, insurers will catch loads of grief for raising premiums and canceling policies, and regulators and legislators may not be a lot of help. In California, for instance, legislators recently tried to tackle two obvious problems that have caused many insurers to stop writing homeowners insurance in the state -- insurers aren't allowed to include the cost of the reinsurance they purchase and can only use historical data for underwriting even as natural catastrophes increase in frequency and intensity -- but couldn't agree on a solution before adjourning until January.

In the long run, I think the real consequences will be borne by consumers and, to a lesser extent, taxpayers (groups that obviously overlap quite a bit). Insurers have to be able to price policies based on the risk involved, so consumers either need to reduce the risk or government needs to subsidize those risks if premiums are to remain affordable.

Consumers, in particular, can do a lot to reduce risk -- but not inexpensively. With new construction, it's fairly easy to avoid high-risk areas or to build on higher ground, to raise the elevation of the living quarters in a house, to build with materials that resist wildfire and high winds, to keep flammable landscaping farther from structures and so on. But the vast majority of homes, apartments and office buildings aren't new. They're decades old, often many decades old. So retrofitting involves a complicated calculation based on the cost and on the benefits from the reduction in risk.

Where insurers can help is by providing information. Insurers know a lot about risk, but, at the moment, they give policyholders pretty blunt feedback. We offer to renew, or we won't renew. We will renew, but with a premium increase of XX%. 

To the extent possible, insurers should tell policyholders, "We aren't renewing because...." Better yet, "Your premium is increasing XX% because... but you can reduce that increase if you do X, Y and Z." 

The lead example in the piece in the Atlantic has expertise in environmental issues and took an extensive series of measures that meant her house near Yosemite National Park passed "defensible space" inspections recommended by the state fire department -- but Allstate still canceled her policy and didn't tell her why.

Imagine if insurers could coach homeowners, and their communities, about specific things they could do to reduce their risk and, in the process, lower premiums.

That wouldn't be a panacea. There will still be loads of short-term issues as consumers and their protectors in government try to minimize increases in premiums and as insurers pull out of markets or decline to renew policies, based on prudent business discipline. Some of the recommendations would cost more than consumers or taxpayers are willing to pay. But at least we could help consumers and governments understand the realities that we're all facing because of climate change and could help define a realistic future. 

Cheers,

Paul

 

A Secret Weapon Against Claims Inflation

An active, efficient accident management program can save hundreds of dollars per claim and potentially cut days off a claim’s cycle time.

KEY TAKEAWAY:

--An accident management expert can help expedite the collision claims management process to mitigate financial impacts, such as storage fees, secondary tows, rental costs and more.

----------

Managing the costs associated with an accident claim has never been more important, considering today’s challenging economic climate. Amid inflation, supply-chain disruptions and a labor shortage, auto insurers also face pressure from surging storage costs and an increasingly complex process of matching vehicles needing repair with facilities that have both the capacity and the capability to fix them. These challenges result in higher costs for insurance carriers and lengthier wait times for policyholders. 

The unfortunate truth is that accident frequency and severity have been increasing over the past several years, driving up loss costs. The cost impact from this trend is even greater when policyholders delay reporting the loss to their carrier, and only 9% to 13% of policyholders report first notice of loss (FNOL) from the accident scene, according to Agero.

While some costs are unavoidable, ensuring that carriers have an active and efficient accident management program can help them save hundreds of dollars per claim and potentially cut days off a claim’s cycle time. Leveraging the assistance of specialized accident management experts can provide insight into how to improve efficiency and performance to reduce (or even eliminate) costs coming down the pike.

Let’s take a look at how an accident management expert can help improve claims management in the face of myriad industry challenges.

Navigating pitfalls and macro challenges 

First and foremost, an accident management expert can help expedite the collision claims management process to mitigate financial impacts, such as storage fees, secondary tows, rental costs and more. To lessen these impacts, an accident management expert focuses on the following three areas: 

1. Recovering the vehicle from the accident scene

Recovering the vehicle from the accident scene is critical to minimizing loss costs and expediting claims cycle times. However, reporting an accident to an insurer from the scene can be incredibly difficult for policyholders. The time immediately following an accident is particularly challenging, as drivers may have to manage multiple high-stress situations simultaneously. These include assessing damage to their vehicle and any other vehicles involved, triaging potential injuries and navigating what might be a dangerous situation. Given the potentially chaotic nature of the moments following accidents, it is not surprising that most drivers fail to notify their insurance carriers while still at the accident scene. The delay costs their carriers an extra $800 to $1,025, on average, per claim, according to Agero’s analysis of secondary tow costs from the first half of 2023.  
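To put that cost in perspective, here is a rough, back-of-the-envelope calculation. The claim volume and the improvement in scene reporting below are hypothetical assumptions; only the per-claim delay cost range comes from the Agero figures above.

```python
# Back-of-the-envelope estimate of savings from moving more FNOL reports
# to the accident scene. Claim volume and adoption rates are hypothetical;
# the $800-$1,025 per-claim delay cost comes from the Agero figures cited above.

tow_claims_per_year = 10_000          # hypothetical annual tow-required claims
scene_report_rate_today = 0.10        # roughly in the 9%-13% range reported by Agero
scene_report_rate_target = 0.30       # hypothetical improvement after a digital FNOL rollout
avg_delay_cost = (800 + 1_025) / 2    # midpoint of the per-claim delay cost range

claims_shifted = tow_claims_per_year * (scene_report_rate_target - scene_report_rate_today)
estimated_savings = claims_shifted * avg_delay_cost

print(f"Claims shifted to scene reporting: {claims_shifted:,.0f}")
print(f"Estimated annual savings: ${estimated_savings:,.0f}")
# Roughly 2,000 claims shifted, or about $1.8 million in avoided delay costs.
```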

See also: 5 Ways Generative AI Will Transform Claims

2. Mitigating downstream costs

Unfortunately, capturing vehicles at the accident scene is not always an option, despite best efforts. As a result, insurers and policyholders assume the additional costs of storage and secondary tows that can increase claims by hundreds of dollars above the cost of simply performing one tow from the accident scene to the desired repair shop or salvage yard. Mitigating these downstream costs is where an accident management expert can make a meaningful difference.

An expert can recommend steps to streamline the FNOL process, such as by identifying and recommending a digital FNOL option for immediate accident reporting. Digital solutions can benefit both the insurer and the policyholder. For instance, mobile telematics can automatically detect and report a crash, saving insurers hundreds of dollars in loss costs and helping policyholders report their losses from the accident scene.
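As a simplified illustration of how such a telematics trigger might work, consider the sketch below. The threshold value and the notification hook are hypothetical placeholders, not any vendor's actual detection logic.

```python
# Simplified illustration of how a mobile telematics app might flag a possible
# crash from accelerometer readings and trigger an FNOL notification.
# The threshold and notify_insurer() hook are hypothetical placeholders.

CRASH_G_THRESHOLD = 4.0  # hypothetical: sustained force above ~4g suggests a collision


def detect_possible_crash(g_force_samples: list[float]) -> bool:
    """Return True if any accelerometer sample exceeds the crash threshold."""
    return any(g > CRASH_G_THRESHOLD for g in g_force_samples)


def notify_insurer(location: tuple[float, float]) -> None:
    """Hypothetical hook: send an automatic FNOL alert with the crash location."""
    print(f"FNOL alert sent for crash near {location}")


# Example: a spike to 6.2g in the sample window triggers an automatic report.
samples = [0.9, 1.1, 6.2, 3.4, 1.0]
if detect_possible_crash(samples):
    notify_insurer((42.3601, -71.0589))
```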

An accident management expert will manage their own curated network of tow operators to deliver quality service at a reasonable cost, especially when compared with inflated retail or police tow rates. As a bonus, the expert can provide real-time updates so the policyholder isn’t left in the dark about when the tow truck will arrive.

3. Combatting body shop refusals

Body shop refusals have been a growing problem since the start of the pandemic, increasing by 187% in 2022 over the prior year, and they are on track to increase an additional 34% in 2023, per Agero research. These refusals have resulted in additional tows because, once a vehicle is towed to a shop and refused, it must be towed back to storage until another shop has availability and then towed again to the new repair destination. This leads to extended storage time and costs and increases the time that a policyholder is without their vehicle.

An accident management expert can provide critical relief by working with insurers on how they interact with their body shops to mitigate these challenges and break the cycle of refusals. For instance, an accident management expert can help insurers analyze body shop refusal rates, identify regional trends and manage direct repair program shops that refuse vehicles in violation of their carrier agreements.

Findings may show that a few pre-dispatch confirmations by an insurance associate can help avoid a refusal. This information can inform best practices for confirming a shop’s capacity and determining which jobs they’re capable of repairing. As a result, agents can select the best body shop for the job and reduce the chances of a vehicle being refused, resulting in quicker repairs for the policyholder. 

See also: Insurtech Is at an Inflection Point

There’s never been a better time to partner with an accident management expert

An accident management expert can serve as a vital resource by helping to streamline the process, identify unknowns (such as the impact of rising primary tow rates) and foster collaboration among all parties.

The confluence of industry challenges from inflation and rising costs, labor and parts shortages and increased volume of accidents makes it incredibly difficult for insurers to independently address the impact these issues have on collision claims. However, by doing their part to keep costs down and wait times low, an accident management expert can help make these events more seamless for all parties involved.


Ben Zatlin


Ben Zatlin serves as vice president and general manager of Agero's accident management business, a role he began in September 2021.

Previously, Zatlin led Agero's digital transformation to Swoop, its next-generation dispatching platform. Prior to Agero, Zatlin was a management consultant at professional services firm Deloitte and an operations engineer at life sciences company Abbott Laboratories.

He holds a bachelor’s in biomedical engineering from the University of Southern California and an MBA from Harvard Business School.

A Rush of Cyber Attacks in Australia

The attacks show how the nation is broadly unprepared -- but also how a series of tools could have prevented or at least mitigated the attacks. 

A large green computer chip against a black background

Over the past year, Australia has been the target of numerous successful cyber-attacks. These attacks have affected a significant percentage of the country’s population of roughly 26 million people -- with some individuals affected in multiple breaches. According to the Australian Cyber Security Centre’s Annual Cyber Threat Report 2021-2022, there were a staggering 76,000 cybercrime reports from July 2021 to June 2022 -- a 13% increase from the previous financial year.

In September 2022, Australian telecommunications giant Optus was hit in one of the largest data breaches in Australian history. The Optus attack constituted the first incident in a series of devastating, large-scale cyber attacks that exposed significant flaws in Australia’s national cyber resilience. 

On Sept. 23, 2022, Optus released a statement on its website and social media confirming a “significant” cyberattack against its systems. Personally identifiable information belonging to approximately 10 million current and former Optus customers -- around 40% of Australia's population -- was compromised, including names, birth dates, home addresses, phone numbers, emails, passport numbers and driver’s license numbers.

The breach sent shockwaves through the nation, and the circumstances surrounding it quickly became a subject of debate. A company insider claimed human error had accidentally exposed the company’s application programming interface (API) on a test network, providing the entry point for the attack. Optus rejected this claim, asserting that a highly complex and sophisticated attack had occurred, in which the attacker used advanced techniques to scrape a portion of the company’s consumer database, leaving open questions about the motives and depth of the breach. On Oct. 6, the Australian federal government announced an emergency regulation allowing Optus to share customer information with banks and government agencies to detect and prevent identity fraud in the aftermath of the attack.

With headlines about the Optus attack still dominating the Australian news cycle, another public statement, this time regarding a second potential cyber attack, shocked the nation on Oct. 13. Medibank, Australia’s largest private health insurance provider, alerted the Australian Securities Exchange (ASX) that it had detected “unusual activity” on its networks, emphasizing that there was no evidence that sensitive data, including customer information, had been compromised. Medibank retracted this claim one week later, confirming that customer data had indeed been compromised in the attack.

See also: Say Goodbye to Cyber's 'Dating Profile'

On Oct. 26, Medibank revealed the scope of the customer data compromised, admitting that hackers had full access to three primary customer data categories -- AHM customer data, international customer data and Medibank customer data. On Nov. 7, Medibank announced that 9.7 million customers were likely to be affected. Customers were informed that Medibank would not pay the US$10 million ransom, despite the hackers’ threats to publish the stolen data on the dark web. The investigations that followed the breach revealed that Medibank’s systems had been accessed using a compromised login credential (username and password) belonging to an unnamed third-party IT services provider.

In March of this year, malicious actors once again leveraged compromised credentials from a third-party vendor, this time to breach the systems of Latitude, an Australian financial services provider. Data from 14 million customers in Australia and New Zealand was stolen. Again, the data included names, addresses, emails, phone numbers, birth dates, driver's license numbers and passport numbers. Some records dated back to 2005, drawing scrutiny over why the company kept customer data beyond the required seven years. The company is now under investigation to determine whether it took sufficient measures to prevent the attack, and a class-action lawsuit has been filed against Latitude over its failure to protect customer data.

The Latitude breach was swiftly followed by another attack, this time against HWL Ebsworth. In April, the Russian-linked ransomware gang Alphv attacked the major Australian law firm, publishing 1.1 terabytes of the 3.5 terabytes stolen from HWL Ebsworth’s systems on its dedicated dark web leak site. At least four Australian banks were implicated in the breach -- with Westpac, NAB, the Commonwealth Bank and ANZ among the many public and private sector entities that may have had data stolen. Further, an estimated 60 government departments and agencies have used HWL Ebsworth’s services, including the Defence Department, Home Affairs, the Australian Federal Police, the Department of the Prime Minister and Cabinet, Services Australia and the Fair Work Ombudsman.

Many other attacks have made the headlines, targeting schools and universities, hospitals and healthcare providers, government entities (including the Tasmanian government) and more. This series of large-scale attacks has led to sharp criticism of Australian government officials for their lack of cohesive cybersecurity policy. As a result, Australian Cyber Security Minister Clare O’Neil publicly admitted that Australia had been in “a cyber slumber,” falling at least five years behind other developed nations on cybersecurity and data privacy. O’Neil, who is overseeing the overhaul of the national cybersecurity strategy, said the high-profile Optus, Medibank, Latitude and HWL Ebsworth data breaches are only the “tip of the iceberg” of the cyber threats facing Australia. She has invited Australians to join the “whole-of-nation effort” to bolster the country’s cyber resilience.

Potential Causes of Concern

Several factors have been cited as contributing to Australia's relative cyber unreadiness compared with other countries.

  1. Lack of appropriate regulations and mandatory cybersecurity standards for companies holding large amounts of personal data. Unlike Europe, Australia has no overarching data protection or privacy laws with strict security and breach response requirements. The existing regulations set minimum standards that companies can meet without necessarily achieving strong security. This allows some organizations to underinvest in their cybersecurity programs and infrastructure. There are also no cybersecurity licensing requirements or mandatory external assessments of controls to encourage best practice. Furthermore, the enforcement of existing frameworks, such as the Notifiable Data Breaches (NDB) scheme, is perceived as lax, with few consequences for noncompliance, while critical infrastructure operators face limited oversight and have discretion over how they meet security obligations. Experts argue that prescriptive security-focused laws, properly enforced through auditing and penalties, are urgently needed to lift industry standards across the board in Australia. 
  2. Underinvestment in cyber defenses. Budgets allocated to cybersecurity programs by both government agencies and private organizations have fallen short of what experts recommend based on evolving threats and expanding attack surfaces. This underfunding has resulted in insufficient resources dedicated to basic but critical defensive controls like encryption, multi-factor authentication, regular security testing, patching and logging/monitoring. Australia has struggled to meet its own cybersecurity strategy target of investing 2% of GDP in cyber defenses due to inadequate budget appropriations over time. This target itself is also considered insufficient, as comparable nations spend significantly more. This wide-scale underinvestment has created shortages in defensive capabilities that are ripe for adversarial exploitation.
  3. Shortage of cybersecurity skills and talent. Despite the rapid escalation of global cyber threats, Australia has failed to produce enough skilled professionals to match the growing demand across both government and private sector organizations. Cybersecurity occupations are consistently listed in national skilled occupation shortages, yet efforts to boost the talent pool through education and training have been insufficient. Those universities and vocational programs that do offer cyber courses struggle to attract students due to a lack of industry engagement and the perceptions of limited career opportunities in Australia. Immigration pathways for global talent have also been limited, preventing firms and agencies from easily supplementing the domestic cyber workforce. 
  4. Widespread use of outdated legacy IT systems. Many large organizations, and government agencies in particular, still rely on digital infrastructures and systems that are decades old, using obsolete software and technologies no longer supported by vendors. These legacy architectures were not built with security as a primary consideration, relying on outdated protocols and lacking basic security controls. Upgrading such sprawling legacy estates is an immense logistical and budgetary challenge for organizations, due to the complex interfacing of old and new. Delaying these upgrades, however, leaves serious security vulnerabilities and exposures that attackers can readily exploit through unpatched backdoors.
  5. A misplaced focus on data sovereignty. Australia’s focus on data localization (requiring data to be stored in Australia) has discouraged offshore cloud adoption where security is generally stronger. These local data storage requirements have placed significant cost burdens on enterprises, taking funding away from cybersecurity programs and skills development. In reality, the most significant attacks typically target people/processes rather than infrastructure or location. Accordingly, these overly protectionist policies provided a false sense of security while slowing digital transformation, leaving some organizations with outdated legacy systems that are hard to defend. In today's connected digital ecosystems, where organizations increasingly leverage multiple cloud platforms for flexibility and resilience, true data sovereignty is impossible. Rather than mandating unachievable data storage models, priority should be placed on establishing robust encryption, access controls and response obligations wherever Australian data is accessed or processed.

Earlier this year, O’Neil stated that Australia must prepare for a “dystopian future” in which increasingly digitally connected cities may be “held hostage through interference in everything from traffic lights to surgery schedules.” When addressing the Sydney Dialogue conference in April 2023, she said that Australia “faced a scale and intensity in the threat landscape that far outstrips the recent cases we have seen.” 

O’Neil called out state-sponsored attackers, financially motivated cyber actors and extortionists as public enemy number one. To combat these nefarious groups and individuals, she put together a new cyber strategy, including a series of national exercises focused on protecting critical infrastructure, and aims to make Australia “the world’s most cyber-secure country by 2030.” 

Boosting Australian Cyber Resilience With Cyber Threat Intelligence Solutions 

A crucial part of O’Neil’s strategy is building a team of 100 cybersecurity specialists who will be “permanently focused on hunting down people seeking to hack our systems, and hacking back.” As with any organization’s threat-hunting efforts, rich cyber threat intelligence (CTI) that sheds light on threat actors’ activities and targets, as revealed on millions of deep and dark web sites and forums, will be paramount to Australia’s threat-hunting mission. Armed with such intelligence, the Australian government and business community can understand threat actors’ tactics, techniques and procedures (TTPs) and benefit from early warnings regarding the very first indications of potential risk -- before an attack materializes. By monitoring their attack surface and preemptively implementing necessary defensive measures to block cybercriminal efforts, Australian companies will be better equipped to manage and reduce their overall organizational threat exposure and protect their systems from attack.

How these high-profile attacks could have been prevented

Optus: Although the cause of the attack remains disputed, for the purposes of this discussion we will examine the incident on the assumption that an unsecured API was the source of the breach. In that case, a solution such as External Attack Surface Management (EASM) could have helped detect and mitigate the exposure before it was weaponized.

EASM solutions continuously discover an organization's digital assets and footprint across its external attack surface, including public IP addresses, domains and APIs. They scan from an external perspective to understand how attackers view, and could potentially access, an organization's systems through exposed assets connected to its network.
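As a rough sketch of the idea, and not a description of any particular EASM product, an external scan might probe an organization's known public endpoints and flag any API that answers without authentication or encryption. The domain and endpoints below are hypothetical.

```python
# Minimal sketch of the external-scan idea behind EASM: probe known public
# endpoints from outside the network and flag any API that answers without
# authentication or TLS. The domain and endpoints are hypothetical examples;
# real EASM platforms do far more (asset discovery, fingerprinting, monitoring).

import requests

PUBLIC_ENDPOINTS = [
    "http://api.example-telco.com/customers",     # hypothetical test API
    "https://api.example-telco.com/v2/accounts",  # hypothetical production API
]

for url in PUBLIC_ENDPOINTS:
    findings = []
    if url.startswith("http://"):
        findings.append("no TLS (traffic sent in the clear)")
    try:
        resp = requests.get(url, timeout=5)
        # A 200 response to an unauthenticated request is a red flag.
        if resp.status_code == 200:
            findings.append("responds to unauthenticated requests")
    except requests.RequestException:
        findings.append("unreachable from the internet (may be intentional)")
    status = "; ".join(findings) if findings else "no obvious external exposure"
    print(f"{url}: {status}")
```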

Had Optus implemented EASM:

  • The API exposed to the open internet would have been discovered during external scans.
  • Its configuration without proper authentication or encryption would have been identified as a security weakness ripe for cybercriminal exploitation.
  • Optus could then have corrected the issue by reconfiguring the API with valid credentials or HTTPS to reduce the attack surface.
  • EASM monitoring would have ensured that any new APIs deployed externally were also appropriately protected.
  • Valuable metadata about Optus' digital properties and dependencies would have been collected, helping to discover additional high-risk vulnerabilities and exposures.

By mapping the external attack surface and identifying misconfigurations, EASM gives organizations visibility into gaps that threat actors could exploit from the internet before attacks occur. This could have helped Optus avoid such a significant breach.

Medibank: The Medibank breach was the result of compromised credentials used by a trusted third-party IT services provider. Real-time cyber threat intelligence from the deep and dark web could have helped to identify this exposure and prevent the attack.

  • Initial access brokers actively trade stolen access credentials (usernames and passwords, remote desktop protocol access, etc.) on dedicated deep and dark web forums and markets.
  • Real-time deep and dark web cyber threat intelligence continuously monitors these underground platforms to identify compromised credentials the moment they are listed for sale.
  • Had Medibank harnessed cyber threat intelligence from initial access broker markets, it likely would have detected the third party's admin credentials being leaked/sold soon after theft occurred.
  • Most initial access trading happens within days or weeks of a breach. Faster detection is possible through combining Attack Surface Management solutions with CTI to receive immediate alerts of potentially compromised organizational access.
  • Once alerted, Medibank could have rapidly contacted the third party to validate, check login logs, reset credentials and reduce organizational exposure.
  • With the admin credentials changed before the attacker could purchase, leverage and weaponize the compromised access, data exfiltration may have been stopped or limited.

Early warnings of credential compromise through deep and dark web monitoring of organizational assets provide a critical window to contain breaches before significant damage is done. By monitoring the organizational attack surface in real time across the deep and dark web -- in particular, across initial access broker marketplaces -- Medibank may have been able to detect this exposure and prevent its weaponization before cybercriminals exfiltrated sensitive data belonging to approximately 10 million Medibank customers.
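To make the matching step concrete, here is a minimal sketch of how credentials appearing in an underground feed might be checked against an organization's own and its vendors' email domains. The feed entries and domains are hypothetical; a real CTI platform collects and enriches this data continuously.

```python
# Simplified sketch of the matching step behind credential-leak monitoring:
# compare credentials seen in underground listings against the organization's
# own email domains and those of trusted third-party vendors. The sample feed
# and domains are hypothetical; a real CTI platform supplies this data.

WATCHED_DOMAINS = {"medibank-example.com.au", "it-vendor-example.com"}

# Hypothetical entries as they might arrive from a dark-web monitoring feed.
leaked_credential_feed = [
    {"username": "admin@it-vendor-example.com", "source": "initial-access forum", "listed": "2022-10-01"},
    {"username": "jane@unrelated-company.com", "source": "combo list", "listed": "2022-09-28"},
]

def alerts_for(feed, watched_domains):
    """Yield feed entries whose email domain belongs to a watched organization."""
    for entry in feed:
        domain = entry["username"].split("@")[-1].lower()
        if domain in watched_domains:
            yield entry

for hit in alerts_for(leaked_credential_feed, WATCHED_DOMAINS):
    print(f"ALERT: possible compromised credential {hit['username']} "
          f"listed on {hit['source']} ({hit['listed']}) -- reset and investigate")
```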

Latitude: The source of the Latitude data breach has not yet been confirmed publicly. While official investigations continue, cybersecurity experts analyzing the case reportedly believe the attacker(s) gained initial access either through credential theft via a phishing attack targeting Latitude employees or by exploiting an unpatched vulnerability in an internet-facing Latitude application or service. If this were a case of compromised credentials, the steps Medibank could have taken would also apply here. If the breach instead stemmed from the exploitation of an unpatched vulnerability, vulnerability exploit intelligence would likely have given Latitude the insight needed to prioritize treatment before the exposure was weaponized in an attack.

Had Latitude implemented vulnerability exploit intelligence:

  • Continuous scoping and discovery of their organizational attack surface, coupled with CPE-CVE matching, would have alerted Latitude to an unpatched, exposed vulnerability within their asset inventory.
  • Effective vulnerability exploit intelligence would then have helped determine the real-time risk of exploitation, considering critical factors such as the availability of exploit kits and POCs, instances of exploitation in the wild and heightened cybercriminal discussions surrounding the vulnerability.
  • With insight into cybercriminal discourse and activity across the deep, dark and clear web, and a real-time understanding of the likelihood of exploitation, Latitude would have been equipped with the early warning they needed to recognize this as an urgent, high-risk threat to their organization.
  • This preemptive intelligence would have allowed Latitude to accurately prioritize treatment, immediately patching the vulnerability or isolating the unpatched asset to mitigate the damage before the vulnerability was weaponized in an attack.

Armed with comprehensive visibility into their organizational threat exposure, Latitude could have likely uncovered and addressed the vulnerability much sooner -- before data theft occurred.
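For illustration, a minimal sketch of the CPE-CVE matching and exploit-intelligence prioritization described above might look like the following. The asset inventory, CVE record and risk attributes are hypothetical placeholders supplied here only to show the shape of the data.

```python
# Minimal sketch of CPE-to-CVE matching plus exploit-intelligence prioritization.
# The asset inventory, CVE record and risk attributes below are hypothetical;
# a real vulnerability exploit intelligence feed supplies them continuously.

asset_inventory = [
    {"asset": "customer-portal", "cpe": "cpe:2.3:a:vendorx:webapp:3.1"},
    {"asset": "hr-system", "cpe": "cpe:2.3:a:vendory:hrsuite:9.0"},
]

cve_feed = [
    {
        "cve": "CVE-0000-00001",  # placeholder identifier
        "affected_cpe": "cpe:2.3:a:vendorx:webapp:3.1",
        "exploit_kit_available": True,
        "exploited_in_wild": True,
        "underground_chatter": "high",
    },
]

def risk_score(cve):
    """Crude illustrative scoring that weights real-world exploitation signals."""
    score = 0
    score += 3 if cve["exploited_in_wild"] else 0
    score += 2 if cve["exploit_kit_available"] else 0
    score += {"low": 0, "medium": 1, "high": 2}[cve["underground_chatter"]]
    return score

for asset in asset_inventory:
    for cve in cve_feed:
        if cve["affected_cpe"] == asset["cpe"]:
            print(f"{asset['asset']}: {cve['cve']} matched, "
                  f"priority score {risk_score(cve)}/7 -- patch or isolate now")
```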

HWL Ebsworth: The cause of the HWL Ebsworth data breach has not yet been officially confirmed publicly. However, the usual modus operandi of notorious ransomware gang Alphv, which claimed responsibility for the attack and leaked data from it, suggests that Alphv infiltrated the law firm's network via a targeted phishing email campaign. Alphv is known to use personalized phishing lures containing malware payloads disguised as legitimate files or links. The goal of these phishing emails is to install info-stealing malware on corporate devices to extract login credentials and other initial access vectors -- similar to the Medibank case.

See also: Partners and Cyber: To Affinity and Beyond!

As discussed, cyber threat intelligence can detect stolen corporate credentials offered for sale on initial access broker sites, providing early warning of exposure before the access vector is purchased and weaponized. Cyber threat intelligence can also help organizations preemptively block info-stealing malware before it has infected a corporate endpoint and compromised access to the network. 

  • Initial access broker listings typically note the stealer that was used to compromise the machine. Continuous monitoring of these and other deep and dark web sources can provide critical insight into the indicators of compromise (IOCs) associated with credential theft malware.
  • By integrating real-time, context-rich IOC intel into their security tools, HWL Ebsworth could have preemptively blocked indicators associated with known access compromise threats at the network/endpoint level before user exposure via phishing lures.
  • Intelligence on keyloggers, info-stealers, remote access Trojans and other post-intrusion tools advertised for sale on the cybercriminal underground -- including contextual attributes such as source, threat actor, malware family and confidence score -- delivers critical insight into attacker techniques to identify blind spots and harden the attack surface before exploitation.

Timely integration of a comprehensive, continuously updated feed of indicators of compromise from both open and underground sources into HWL Ebsworth's security infrastructure would have enabled the firm to preemptively block these known access compromise threats at the network and device level, denying the vectors before employees were ever exposed to the phishing lures.
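As a simplified sketch of what integrating context-rich indicator intelligence can look like in practice, consider the example below. The sample indicators, confidence threshold and blocking hook are hypothetical; in a real deployment, the feed comes from a CTI provider and blocking is handled by a firewall, secure email gateway or EDR platform.

```python
# Simplified sketch of feeding IOC intelligence into a blocklist. The sample
# indicators, confidence threshold and push_to_blocklist() hook are hypothetical.

CONFIDENCE_THRESHOLD = 80  # hypothetical cut-off for automatic blocking

ioc_feed = [
    {"type": "domain", "value": "malicious-lure.example", "malware_family": "info-stealer",
     "source": "underground forum", "confidence": 92},
    {"type": "sha256", "value": "e3b0c442...", "malware_family": "keylogger",
     "source": "open-source feed", "confidence": 55},
]

def push_to_blocklist(indicator):
    """Hypothetical hook: forward a high-confidence indicator to security controls."""
    print(f"Blocking {indicator['type']} {indicator['value']} "
          f"({indicator['malware_family']}, confidence {indicator['confidence']})")

for ioc in ioc_feed:
    if ioc["confidence"] >= CONFIDENCE_THRESHOLD:
        push_to_blocklist(ioc)
    else:
        # Lower-confidence indicators go to monitoring/alerting rather than blocking.
        print(f"Monitoring {ioc['type']} {ioc['value']} (confidence {ioc['confidence']})")
```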

Conclusion

The series of high-profile cyber attacks over the past year has shaken confidence in Australia's cyber resilience, highlighting the need to reassess security strategies across all sectors. At the same time, the attacks have provided important lessons for improvement.

Moving forward, Australia must reevaluate the outdated focus on data sovereignty, recognizing the borderless nature of the cyber threat landscape. A comprehensive, nationwide cybersecurity strategy that embraces innovation is critical, and a paradigm shift in the way that Australia conceptualizes cybersecurity is central to success. Taking inspiration from allies such as the U.S., Australia must mandate minimum security standards for companies and critical infrastructure, regularly assess compliance and strictly enforce consequences for breaches. Cybersecurity budgets must be significantly boosted to address workforce gaps and equip security teams with the tools they need to defend their systems in the face of increasingly sophisticated cybercriminals. Cyber threat intelligence and attack surface management solutions should be adopted to preemptively hunt down threats and identify weaknesses before they are exploited.

Equipped with insight into the epicenter of cybercriminal activities and discourse, security teams can confidently bolster their defenses based on a real-time understanding of threat actors, their tactics, tools, techniques and procedures and likely vectors for attack. With the right skills, resources and oversight in place, Australian businesses and government entities can substantially reduce their risk of becoming the next headline cyber incident. Most importantly, they will be better able to safeguard Australians' personal data and digital security.

By learning from these events and taking a preemptive, intelligence-led approach, Australia has a chance to emerge stronger. Now is the time for decisive action that constructs a robust security architecture for the country -- one that can withstand the cyber challenges of tomorrow.


Delilah Schwartz


Delilah Schwartz is Cybersixgill's cybersecurity strategist.

She boasts expertise in the fields of extremism, internet-enabled radicalization and the cybercriminal underground.