Promise, Pitfalls of Cyber Insurance

Cyber insurance is a potentially huge but still largely untapped opportunity for insurers and reinsurers. We estimate that annual gross written premiums will increase from around $2.5 billion today to $7.5 billion by the end of the decade. Many insurers and reinsurers are looking to take advantage of what they see as a rare opportunity to secure high margins in an otherwise soft market.
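
For a sense of what that forecast implies, here is a quick back-of-the-envelope growth-rate check. The $2.5 billion and $7.5 billion endpoints come from the estimate above; the five-year horizon (roughly "today" to the end of the decade) is our assumption:

```python
# Implied compound annual growth rate of the cyber premium forecast.
# Endpoints are from the text; the five-year horizon is an assumption.
start, end, years = 2.5e9, 7.5e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} a year")  # about 24.6%
```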

However, wariness of cyber risk is widespread. Many insurers don’t want to cover it at all. Others have set limits below the levels their clients seek and have imposed restrictive exclusions and conditions – such as state-of-the-art data encryption or 100% updated security patch clauses – that are difficult for any business to maintain. Given the high cost of coverage, the limits imposed, the restrictive terms and conditions attached and the restrictions on claims, many companies question whether their cyber insurance policies provide real value.

Insurers are relying on tight policy terms and conditions and conservative pricing strategies to limit their cyber risk exposures. But how sustainable is this approach as clients start to question the value of their policies and concerns widen about the level and concentration of cyber risk exposures?

The risk pricing challenge

The biggest challenge for insurers is that cyber isn’t like other risks. There is limited publicly available data on the scale and financial impact of attacks, and threats are rapidly changing and proliferating. Moreover, the fact that cyber security breaches can remain undetected for several months – even years – creates the possibility of accumulated and compounded future losses.

See Also: Better Way to Assess Cyber Risks?

While underwriters can estimate the cost of systems remediation with reasonable certainty, there isn’t enough historical data to gauge further losses resulting from impairment to brands or to customers, suppliers and other stakeholders. And, although the scale of potential losses is on par with natural catastrophes, cyber incidents are much more frequent. Moreover, many insurers face considerable cyber exposures within their technology, errors and omissions, general liability and other existing business lines. As a result, there are growing concerns about both the concentrations of cyber risk and the ability of less experienced insurers to withstand what could become a rapid sequence of high-loss events. So, how can cyber insurance be a more sustainable venture that offers real protection for clients, while safeguarding insurers and reinsurers against damaging losses?

Real protection at the right price

We believe there are eight ways that insurers, reinsurers and brokers could put cyber insurance on a more sustainable footing while taking advantage of the opportunities for profitable growth.

  1. Clarify risk appetite – Despite the absence of robust actuarial data, it may be possible to develop a reasonably clear picture of total maximum loss and match it against risk appetite and tolerances. Key inputs include worst-case scenario analysis. For example, if your portfolio includes several U.S. power companies, then what losses could result from a major attack on the U.S. grid? What proportion of claims would your business be liable for? What steps could you take now to mitigate losses, from reducing risk concentrations in your portfolio to working with clients to improve safeguards and crisis planning? (A minimal scenario sketch appears after this list.) Asking these questions can help insurers judge which industries to focus on, when to curtail underwriting and where there may be room for further coverage. Even if an insurer offers no stand-alone cyber coverage, it should gauge the exposures that exist within its wider property, business interruption, general liability and errors and omissions coverage. Cyber risks are increasingly frequent and severe, loss contagion is hard to contain and risks are difficult to detect, evaluate and price.
  2. Gain broader perspectives – Bringing in people from technology companies and intelligence agencies can lead to more effective threat and client vulnerability assessments. The resulting risk evaluation, screening and pricing process could be a partnership between existing actuaries and underwriters who focus on compensation and other third-party liabilities, and technology experts who concentrate on data and systems. This is similar to the partnership between chief risk officer (CRO) and chief information officer (CIO) teams that many companies are developing to combat cyber threats.
  3. Create tailored, risk-specific conditions – Many insurers currently impose blanket terms and conditions. A more effective approach would be to make coverage conditional on a fuller and more frequent assessment of the policyholder’s vulnerabilities and agreement to follow advised steps. This could include an audit of processes, responsibilities and governance within a client’s business. It also could draw on threat assessments by government agencies and other credible sources to facilitate evaluation of threats to particular industries or enterprises. Another possible component is exercises that mimic attacks to test both weaknesses and plans for response. As a result, coverage could specify the implementation of appropriate prevention and detection technologies and procedures. This approach can benefit both parties. Insurers will gain a better understanding of and greater control over risks, reduce their exposures and price more accurately. Policyholders will be able to secure more effective and economical protection. Moreover, the assessments can help insurers forge a closer, advisory relationship with clients.
  4. Share data more effectively – More effective data sharing is the key to greater pricing accuracy. For reputational reasons, many companies are wary of admitting breaches, and insurers have been reluctant to share data because of concerns over loss of competitive advantage. However, data breach notification legislation in the U.S., which is now set to be replicated in the E.U., could help increase available data volumes. Some governments and regulators have also launched data-sharing initiatives (e.g., MAS in Singapore and the U.K.’s Cyber Security Information Sharing Partnership). In addition, data pooling on operational risk, through ORIC, provides a precedent for more industrywide sharing.
  5. Develop real-time policy updates – Annual renewals and 18-month product development cycles will need to give way to real-time analysis and rolling policy updates. This dynamic approach could be likened to security software updates or to the way credit insurers dynamically manage limits and exposures.
  6. Consider hybrid risk transfer – Although the cyber reinsurance market is relatively undeveloped, a better understanding of evolving threats and maximum loss scenarios could encourage more reinsurers to enter the market. Risk transfer structures likely would include traditional excess of loss reinsurance in the lower layers and capital market structures for peak losses; the layer mechanics are sketched after this list. Possible options might include indemnity or industry loss warranty structures or some form of contingent capital. Such capital market structures could prove appealing to investors looking for diversification and yield. Fund managers and investment banks could apply reinsurers’ or technology companies’ expertise to develop appropriate evaluation techniques.
  7. Improve risk facilitation – Considering the complexity and uncertainty surrounding cyber risk, there is a growing need for coordinated risk management solutions that bring together a range of stakeholders, including corporations, insurance/reinsurance companies, capital markets and policymakers. Some form of risk facilitator – possibly brokers – will need to bring together all parties and lead the development of effective solutions, including the cyber insurance standards that many governments are keen to introduce. Evaluating and addressing cyber risk is an enterprise-wide matter – not just one for IT and compliance.
  8. Enhance credibility with in-house safeguards – If an insurer can’t protect itself, then why should policyholders trust it to protect them? If the sensitive policyholder information that an insurer holds is compromised, then it likely would lead to a loss of customer trust that would be extremely difficult to restore. The development of effective in-house safeguards is essential in sustaining credibility in the cyber risk market, and trust in the enterprise as a whole.
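
To make the scenario analysis in point 1 concrete, here is a minimal sketch of estimating a total maximum loss figure for a concentrated portfolio and comparing it with risk appetite. Everything in it – the insureds, policy limits, loss distribution and appetite figure – is hypothetical and purely illustrative:

```python
import random

random.seed(7)

# Hypothetical portfolio of U.S. power companies: (insured, policy limit, $m).
portfolio = [("PowerCo A", 50.0), ("PowerCo B", 75.0), ("PowerCo C", 40.0)]

def grid_attack_loss():
    """One simulated 'major attack on the grid' scenario: every insured is
    hit at once, each losing a random fraction of its policy limit."""
    return sum(limit * random.betavariate(2.0, 3.0) for _, limit in portfolio)

# Estimate the 99th-percentile scenario loss and compare it with appetite.
trials = sorted(grid_attack_loss() for _ in range(100_000))
worst_case = trials[int(0.99 * len(trials))]
risk_appetite = 90.0  # hypothetical tolerance for this scenario, $m

print(f"99th-percentile loss ${worst_case:.0f}m vs appetite ${risk_appetite:.0f}m")
```

A real model would add event frequency, correlation across scenarios and silent exposures in other lines, but even a toy version like this forces the questions point 1 raises.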
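
For the hybrid risk transfer described in point 6, the standard excess-of-loss recovery calculation shows how a single loss could be split between reinsurers and a capital market instrument. The attachment points, limits and loss amount below are invented for illustration:

```python
def layer_recovery(loss, attachment, limit):
    """Excess-of-loss recovery: the part of the loss above the attachment
    point, capped at the layer limit (all amounts in $m)."""
    return min(max(loss - attachment, 0.0), limit)

# Hypothetical structure: reinsurers take a $40m xs $10m layer; a capital
# market instrument (e.g., an industry loss warranty) covers $50m xs $50m.
gross_loss = 80.0
reinsurance = layer_recovery(gross_loss, attachment=10.0, limit=40.0)      # 40.0
capital_markets = layer_recovery(gross_loss, attachment=50.0, limit=50.0)  # 30.0
retained = gross_loss - reinsurance - capital_markets                      # 10.0
print(f"Reinsurance {reinsurance}, capital markets {capital_markets}, retained {retained}")
```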

See Also: The State of Cyber Insurance

Key questions for insurers as they assess their own and others’ security

From the board on down, insurers need to ask:

  • Who are our adversaries, what are their targets and what would be the impact of an attack?
  • We can’t defend everything, so what are the most important assets we need to protect?
  • How effective are our processes, assignment of responsibilities and systems safeguards?
  • Are we integrating threat intelligence and assessments into active cyber defense programs?
  • Are we adequately assessing vulnerabilities against the tactics and tools perpetrators use?

Implications

  • Even if an insurer chooses not to underwrite cyber risks explicitly, exposure may already be part of existing policies. Therefore, all insurers should identify the specific triggers for claims, and the level of potential exposure in policies that they may not have written with cyber threats in mind.
  • Cyber coverage that is viable for both insurers and insureds will require more rigorous and relevant risk evaluation informed by more reliable data and more effective scenario analysis. Partnerships with technology companies, cyber specialist firms and government are potential ways to augment and refine this information.
  • Rather than simply relying on blanket policy restrictions to control exposures, insurers should consider making coverage conditional on regular risk assessments of the client’s operations and the actions they take in response to the issues identified in these regular reviews. This more informed approach can enable insurers to reduce uncertain exposures and facilitate more efficient use of capital while offering more transparent and economical coverage.
  • Risk transfer built around a hybrid of traditional reinsurance and capital market structures offers promise to insurers looking to protect balance sheets.
  • To be credible in this market, insurers need to ensure the effectiveness of their own cyber security. Because insurers maintain considerable amounts of sensitive data, any major breach could severely damage their credibility, both in the cyber risk market and elsewhere.

Are ‘Best Practices’ Really Best?

Best practices are supposed to help companies gain a competitive advantage, but the opposite is often true. There are various reasons for this; in our experience, three major problems stand out when companies implement what they perceive to be best practices.

  • First, the benefits are elusive. They are difficult to measure, with no baseline or true comparison, and the costs of implementation are often excessive and poorly understood. The result is significant implementation cost and effort with few tangible results.
  • Second, best practices exist in the rearview mirror. By the time you have adopted them, business conditions and the right ways to implement them will have changed.
  • Third, once adopted, these practices can be very difficult to change. The organization has invested emotion, credibility, time and money, and it’s very difficult to abandon a practice even if it doesn’t work or needs to be adapted.

What’s often missing is clarity on the best practices that are most relevant to the business.

Companies naturally want to be competitive, and many seek inspiration outside their own industry in their search for the best-of-the-best. Companies tend to reach broadly, embracing a great number of potential best practices. Conversely, some companies narrowly benchmark themselves against only their peers. Following either extreme often results in missed opportunity for real improvement. In addition, if implementation occurs in silos, then best practices tend to compete with one another and increase complexity and overhead.

Although the benefits from deftly applied best practices can be real and demonstrable, they often are more elusive than anticipated and can result in frictional costs, impasses, noise in the system and, ultimately, few concrete benefits to the bottom line. Moreover, implementation costs can be high but may not be visible until the cost of adoption or compliance becomes evident throughout the organization. Finally, benefits can be hard to specify, measure and capture.

Our observations

Best practices tend to be selected and defined in isolation. Once they have a mandate, individual business and functional areas, centers of excellence and shared services tend to implement new practices without giving sufficient consideration to downstream implications, such as costs. New best practices come with costs, and those intended to deliver higher service levels often come with higher-than-average costs.

Accordingly, the application of new practices requires balance and compromise. They may reduce expenses in part of the organization but may increase frictional costs and place an extra burden on people, process and technology elsewhere.

A common problem is that many companies attempt to implement too many new practices in too many places. Most organizations’ ability to adapt to change is limited, and change tends to be undermanaged, especially when new best practices are required by new control mandates.

Many companies also tend to overestimate the opportunities that standardization offers. We have seen some companies try to standardize everything and inevitably encounter challenges they had not anticipated. New best practices need to be capable of changing and evolving over time, as well as adaptable to local differences and requirements.

Lastly, many companies fail to conduct a cost/benefit analysis when instituting new best practice mandates. If a best practice is central to the business strategy’s success, then frictional costs are an acceptable risk. However, excessive application of best practice improvements can waste resources (e.g., a need for excess staff to perform the work, complex policies and procedures, standardization for its own sake and demands for “unnatural acts of cooperation” that hinder the business’ ability to respond to changes in the marketplace).

What should companies do differently?

Many companies habitually rely on practices that are not appropriate for them and therefore fail to execute their desired strategy effectively. To help prevent these problems, we suggest using a success framework that prioritizes improvement efforts and sharpens the company’s focus on them. This framework should have the following six characteristics:

Each element of the success framework is a mechanism:

  1. Prioritization – to identify what’s most important and to align implementation effort with strategy.
  2. Proportion – to confirm that the implementation effort is proportionate to the practice’s perceived value.
  3. Readiness – to assess organizational appetite and readiness.
  4. Implementation – to assign authority, accountability, responsibility and appropriate resources.
  5. Impact – to track impact, including both intended and potentially unintended consequences.
  6. Change – to drive continuous improvement and to authorize a full stop if warranted.


It is critical to first identify which best practices are worth the effort, through prioritization.

Implementing leading practices can cost money, but there may or may not be tangible benefits or related savings. The question to ask is: How good is good enough? Moreover, when implementations of best practices compete with each other for time and focus, frictional costs further erode the expected benefits. Being cognizant of frictional costs and avoiding them is critical to optimizing investment and benefit realization; a simple net-value calculation of this kind is sketched below.
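
As a minimal sketch of that kind of cost/benefit prioritization, candidate practices can be ranked by expected benefit net of implementation and frictional costs. Every practice name and figure below is invented for the example:

```python
# Illustrative only: rank candidate practices by expected net value,
# charging implementation and frictional costs against the benefit.
# All names and figures are hypothetical (in $k).
candidates = [
    # (practice, expected benefit, implementation cost, frictional cost)
    ("Close-cycle reporting automation", 400.0, 150.0, 60.0),
    ("Global process standardization", 500.0, 350.0, 250.0),
    ("Shared-service intake triage", 220.0, 80.0, 30.0),
]

def net_value(benefit, implementation_cost, frictional_cost):
    return benefit - implementation_cost - frictional_cost

for name, *figures in sorted(candidates, key=lambda c: net_value(*c[1:]), reverse=True):
    print(f"{name}: net ${net_value(*figures):.0f}k")
```

A negative net value is the framework’s cue to scale back or stop, per the “Change” mechanism above.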

What we’ve concluded

Best practices are about performing better and therefore adding strategic and operational value.

Accordingly, because of the highly subjective nature of “best,” we suggest the term “value added practice” (VAP) instead. By putting value at the center of practice improvement efforts, a company can better plan and implement new practices. Frameworks for investment and continuous improvement are key, especially at larger organizations where budgets, controls and approvals tend to be complex.

Before embarking on a new best practices initiative, a company should perform a quick self-diagnosis. Are you trying to implement a best practice for its own sake, or are you clearly focusing on the value you hope to realize?

If you plan to invest in new capabilities without tying them to specific business objectives, then you should step back and determine just how implementing new best practices will benefit the company.