
What Should the Future of Regulation Be?

It is of course much easier to look back and second-guess regulatory actions. It is far more difficult to propose a way forward and to do so in light of the emerging hot-button issues, including data and the digitization of the industry, insurtech (and regtech), emerging and growing risks, cyber, the Internet of Things (IoT), natural catastrophes, longevity and growing protectionism. The way forward requires consideration of the primary goals of insurance regulation and raises critical questions regarding how regulators prioritize their work and how they interact with one another, with the global industry and with consumers.

We offer below some thoughts and suggestions on these important questions and on how regulation might best move forward over the next 10 years.

Establish a reasonable construct for regulatory relationships.

Relationships matter, and it is imperative that careful consideration be given to how regulators organize their interactions and their reliance on each other. We have some examples in the form of the Solvency II equivalence assessment process, the NAIC’s Qualified Jurisdiction assessment process (under the U.S. credit for reinsurance laws), the NAIC’s accreditation process for the states of the U.S., the U.S.-E.U. Covered Agreement, ComFrame, the IAIS and NAIC’s memorandum of understanding and the IMF financial sector assessment program (FSAP). Each of these provides varying degrees of assessment and regulatory cooperation/reliance.

These processes and protocols, however, have largely emerged on an ad hoc, unilateral basis and in some cases have had a whiff of imperial judgment about them that may not be justified – and certainly is off-putting to counterparties. We would urge regulators to give careful consideration to the goals, guiding principles and the process for achieving greater levels of cooperation and reliance among global regulators.

We hope these efforts would include an appreciation that different approaches/systems can achieve similar results and that no jurisdiction has a monopoly on good solvency regulation. There must also be respect for and recognition of local laws and a recognition that regulatory cooperation and accommodation will benefit regulators, the industry and consumers. Most importantly, regulators need to work together to develop confidence and trust in one another.

The IAIS first coined the phrase “supervisory recognition” in 2009. In March of that year, the IAIS released an “issues paper on group-wide solvency assessment and supervision.” That paper stated that:

“To the extent there is not convergence of supervisory standards and practices, supervisors can pursue processes of ‘supervisory recognition’ in an effort to enhance the effectiveness and efficiency of supervision. Supervisory recognition refers to supervisors choosing to recognize and rely on the work of other supervisors, based on an assessment of the counterpart jurisdiction’s regulatory regime.”

See also: Global Trend Map No. 14: Regulation  

The paper noted the tremendous benefits that can flow from choosing such a path:

“An effective system of supervisory recognition could reduce duplication of effort by the supervisors involved, thereby reducing compliance costs for the insurance industry and enhancing market efficiency. It would also facilitate information sharing and cooperation among those supervisors.”

This is powerful. We urge global insurance regulators to take a step back and consider how they can enhance regulatory effectiveness and efficiency by taking reasonable, prudent steps to recognize effective regulatory regimes, even where those regimes are based on different (perhaps significantly different) rules and principles, provided they have a demonstrated track record of effectiveness.

As noted above, we have seen some efforts at supervisory recognition. These include Solvency II’s equivalence assessment process, the NAIC’s accreditation process for the U.S. states, the NAIC’s “Qualified Jurisdictions” provisions for identifying jurisdictions that U.S. regulators will rely on for purposes of lowering collateral requirements on foreign reinsurers, the E.U.-U.S. Covered Agreement and the IAIS’s memorandum of understanding. Some of these processes are more prescriptive than others and risk demanding that regulatory standards be virtually identical before they are recognized. This should be avoided.

One size for all is not the way to go.

The alternative approach to recognition of different, but equally effective systems is the pursuit of a harmonized, single set of regulatory standards for global insurers. This approach is much in vogue among some regulators, who assert the “need for a common language” or for “a level playing field” or to avoid “regulatory arbitrage.” Some regulators also argue that common standards will lead to regulatory nirvana, where one set of rules will apply to all global insurers, which will then be able to trade seamlessly throughout all markets.

There are, however, a variety of solvency and capital systems that have proven their effectiveness. These systems are not identical; indeed, the systems deployed in the E.U. (even pre-Solvency II), the U.S., Canada, Japan, Bermuda, Australia, Switzerland and elsewhere have some profoundly different regulatory structures, accounting rules and other standards. Attempting to impose a single system or standard ignores commercial, regulatory, legal, cultural and political realities.

Moreover, we question some of the rationale for pursuing uniform standards, including the need for a common language. We suggest that what is really needed is for regulators to continue to work together, to discuss their respective regulatory regimes and to develop a deep, sophisticated knowledge of how their regimes work. From this, trust will develop, and from that a more effective and efficient system of regulation is possible. The engagement and trust building can happen within supervisory colleges. We have seen it emerge in the context of the E.U.-U.S. regulatory dialogue. We saw it in the context of the E.U.-U.S. Covered Agreement. No one, however, has made a compelling case for why one regulatory language is necessary to establish a close, effective working relationship among regulators.

Similarly, the call for a level playing field sounds good, but it is an amorphous, ambiguous term that is rarely, if ever, defined. Does the “playing field” include just regulatory capital requirements? If so, how about tax, employment rules and social charges? How about 50 subnational regulators versus one national regulator? Guarantee funds? Seeking a level playing field can also be code for, “My system of regulation is heavier and more expensive than yours, so I need to put a regulatory thumb on the scales to make sure you have equally burdensome regulations.” This argument was made for decades in the debate surrounding the U.S. reinsurance collateral rules. We hear it now regarding the burdens of Solvency II. It must be asked, however, whether it is the responsibility of prudential regulators to level playing fields or whether their focus should be solely on prudent regulatory standards for their markets.

Finally, the dark specter of regulatory arbitrage is often asserted as a reason to pursue a single regulatory standard, such as the development of the ICS by the IAIS. But one must ask whether there is really a danger of regulatory arbitrage today among global, internationally active insurers. Yes, a vigilant eye needs to be kept out for weak links in the regulatory system, something the IMF FSAP process has sought to do, supervisory colleges can do and the IAIS is well-equipped to do. But using regulatory arbitrage as an argument to impose the same standards on all insurers does not seem compelling.

Proportionality is required.

Often, regulators roll out new regulatory initiatives with the assurance that the new rules will be “proportionate” to the targeted insurers. Too often, it seems this principle receives only lip service. Rarely is it defined; it is tossed out in an attempt to say, “Do not worry, the new rules will not be excessive.” Greater debate about, and greater commitment to, this principle are needed. Clearly a key component must be a careful cost/benefit analysis of any proposed new standard, with a clear articulation of the perceived danger to be addressed (including the likelihood and severity of impact) and then a credible calculation of the attendant costs, economic and otherwise, to industry and to regulators. In October 2017, the U.K. Treasury Select Committee published a report criticizing the PRA for its excessively strict interpretation of Solvency II and its negative effect on the competitiveness of U.K. insurers. The report concluded that the PRA had enhanced policyholder protection at the expense of increasing the cost of capital for U.K. insurers, which hurt their ability to provide long-term investments and annuities. Although the PRA emphasized its mandate of prudential regulation and policyholder protection, the Treasury Committee reiterated its concern with how the PRA interpreted the principle of proportionality.

Simplicity rather than complexity.

Over the past 10 years, there has been a staggering increase in proposed and enacted regulatory requirements, many of which are catalogued above. There is a danger, however, that increasingly complex regulatory tools can create their own regulatory blind spots and that overly complex regulations can create a regulatory “fog of war.”

Andrew Haldane, an executive director at the Bank of England, delivered a paper titled “The Dog and the Frisbee” at the Federal Reserve Bank of Kansas City’s economic policy symposium in August 2012. He vividly laid out when less is really more by describing two ways of catching a Frisbee: One can “weigh a complex array of physical and atmospheric factors, among them wind speed and Frisbee rotation,” or one can simply catch the Frisbee, the way a dog does. Complex rules, Haldane said, may cause people to manage to the rules for fear of falling afoul of them. The complexity of the rules may induce people to act defensively and focus on the small print at the expense of the bigger picture.

Focusing on the complexity of the banking world, Haldane compared the 20 pages of the Glass-Steagall Act to the 848 pages of Dodd-Frank together with its 30,000 pages of rulemaking, and compared the 18 pages of Basel I to the more than 1,000 pages of Basel III. The fundamental question is whether that additional detail and complexity really adds greater safety to the financial system or has just the opposite effect and significantly increases the cost. Haldane’s analysis provides compelling evidence that increasing the complexity of financial regulation is a recipe for continuing crisis. Accordingly, Haldane calls for a different direction for supervisors with “…fewer (perhaps far fewer), and more (ideally much more) experienced supervisors, operating to a smaller, less detailed rule book.”

Although Haldane’s analysis and discussion focus on the banking system, his assessment and recommendations should be considered carefully by global insurance regulators. The sheer volume and complexity of rules, models and reports that flood into regulatory bodies raise real questions: Who reviews this information? Who really understands it? And, worst of all, does a mountain of detailed information create a false confidence that regulators have good visibility into the risks, particularly the emerging risks, that insurers are facing? A real danger exists of not seeing the forest for the trees.

See also: To Predict the Future, Try Creating It  

Regulation should promote competitiveness rather than protectionism.

At a time when competition has been growing not only from within the established companies but also, more importantly, from outside the traditional companies, protectionism will only inhibit growth and stifle better understanding of risk in a rapidly changing business environment. The goal must be to make the industry more competitive, to encourage the transfer of innovation and to create better ways to address risk, product distribution and climate change. Protectionism will only limit the industry’s growth potential and is both short-sighted and self-defeating.

Recognition of the importance of positive disruption through insurtech, fintech and innovation.

The consensus is that the insurance industry is ripe for disruption because it has been slow (but is now working hard) to modernize in view of an array of innovative and technological advancements. Equally, regulators are trying to catch up with the rapid changes and are trying to understand the impacts through sandbox experiments and running separate regulatory models. The pace is fast and presents challenges for the regulators. Solvency and policyholder protection remain paramount, but cybersecurity, data protection, artificial intelligence and the digital revolution make advancements every day. Where this will lead is not clear. But changes are happening and regulators must work to understand the impact and need to calibrate regulatory rules to keep up with the industry and encourage innovation.

Regulation must be transparent.

Too often, regulation is drafted in times of crisis or behind closed doors by regulators believing they know best how to protect policyholders and how to prevent abuse of the system. As we have said, getting it right matters. A strong and healthy industry is the best way to protect consumers and policyholders. Industry engagement is essential, and acknowledging and actually incorporating industry’s views is critical. This is particularly true given the dramatic changes in the insurance sector and the need to adapt regulation to new economics, business practices and consumer needs and expectations.

This is an excerpt from a report, the full text of which is available here.

How to Speed Up Product Development

The traditional product development cycle in property and casualty insurance moves at a snail’s pace. Drafts, approvals, revisions, verifications of key details and other steps place months between the moment a product is envisioned and the day it becomes available to customers.

As technology speeds the pace of daily life and business, the traditional product development cycle continues to represent a drag on P&C insurers’ efficiency and bottom line. Here, we discuss some of the biggest pain points in the product development cycle and ways to boost speed without sacrificing quality.

Cycle Slowdown No. 1: Outdated Processes

During the last few decades of the 20th century and into the 21st, speeding up the product development cycle wasn’t on most P&C insurers’ to-do lists, Debbie Marquette wrote in a 2008 issue of the Journal of Insurance Operations. Using the fax and physical mail options of the time kept pace with the as-needed approach to product development.

Marquette noted that in previous decades, product development not only involved a team, but it often involved in-person meetings. “It was difficult to get all the appropriate parties together for a complete review of the product before the filing,” Marquette wrote, “and, therefore, input from a vital party was sometimes missed, resulting in costly mistakes, re-filing fees and delays in getting important products to market before the competition.”

In the 1990s, the National Association of Insurance Commissioners (NAIC) realized that the rise of computing required a change in the way new insurance products were filed and tracked. The result was the System for Electronic Rate and Form Filing (SERFF).

SERFF’s use rose steadily after its introduction in 1998, and use of the system doubled from 2003 to 2004 alone, according to a 2004 report by the Insurance Journal. By 2009, however, SERFF’s lack of full automation caused some commentators, including Eli Lehrer, to question whether the system needed an update, an overhaul or a total replacement.

Property and casualty insurers adapted to SERFF and the rise of other tech tools such as personal computing, word processors and spreadsheets. Yet adaptation has been slow. Today, many P&C insurers are still stuck in the document-and-spreadsheet phase of product development, requiring members of a product development team to review drafts manually and relying on human attention to detail to spot minor but essential changes.

The result? A product development process that looks remarkably similar to the process of the 1980s. The drafts and research have migrated from paper to screens, but teams must still meet physically or digitally, compare drafts by hand and make decisions — and the need to ensure no crucial detail is missed slows the product development process to a crawl.

See also: P&C Core Systems: Beyond the First Wave  

Cycle Solution No. 1: Better Systems

The technology exists to reduce the time spent in the development process. To date, however, many P&C insurers have been slow to adopt it.

Electronic product management systems streamline the process of product development. The “new-old” way of using email, spreadsheets and PDFs maintains the same walls and oversight difficulties as the “old-old” way of face-to-face meetings and snail mail.

In a system designed for product development, however, information is kept in a single location, automated algorithms can scan for minute differences and track changes, and built-in tracking and alerts keep everyone on schedule.
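To make that concrete, here is a minimal Python sketch (hypothetical, not tied to any vendor’s product) of the kind of automated comparison such a system might run to flag even small wording changes between two drafts:

```python
# Hypothetical sketch: flag wording changes between two drafts of a policy form.
# Assumes drafts are available as plain text; a real product management system
# would work on richer document formats and metadata.
import difflib

def draft_changes(old_draft: str, new_draft: str) -> list[str]:
    """Return only the added/removed lines between two drafts."""
    diff = difflib.unified_diff(
        old_draft.splitlines(),
        new_draft.splitlines(),
        fromfile="draft_v1",
        tofile="draft_v2",
        lineterm="",
    )
    return [line for line in diff
            if line.startswith(("+", "-"))
            and not line.startswith(("+++", "---"))]

old = "Coverage applies to losses reported within 60 days.\nDeductible: $500."
new = "Coverage applies to losses reported within 30 days.\nDeductible: $500."
for change in draft_changes(old, new):
    print(change)   # the 60-day vs. 30-day difference is flagged for review
```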

By eliminating barriers, these systems reduce the time required to create a P&C insurance product. They also help reduce errors and save mental bandwidth for team members, allowing them to focus on the salient details of the product rather than on keeping track of their own schedules and paperwork.

Cycle Slowdown No. 2: Differentiation and Specificity

Once upon a time, P&C insurers’ products competed primarily on price. As a result, there was little need to differentiate products from other products sold by the same insurer or from similar insurance products sold by competitors. During product development, insurers allowed differentiation to take a backseat to other issues.

“Prior to the mid-1990s,” Cognizant notes in a recent white paper, “insurance distributors held most of the knowledge regarding insurance products, pricing and processes — requiring customers to have the assistance of an intermediary.”

Today, however, customers know more than ever. They’re also more capable than ever of comparing P&C insurance products based on multiple factors, not only on price. That means insurance companies are now focusing on differentiation during product development — which adds time to the process required to bring an insurance product to market.

Cycle Solution No. 2: Automation

Automation tools can be employed during the product development cycle to provide better insight, track behavior to identify unfilled niches for products and lay the foundation for a strong product launch.

As Frank Memmo Jr. and Ryan Knopp note in ThinkAdvisor, omnichannel software solutions provide a number of customer-facing benefits. A system that gathers, stores and tracks customer data — and that communicates with a product management system — provides profound insights to its insurance company, as well. When automation is used to gather and analyze data, it can significantly shorten the time required to develop insurance products that respond to customers’ ever-changing needs.

“An enterprise-wide solution enables workflow-driven processes that ensure all participants in the process review and sign off where required,” Brian Abajah writes at Turnkey Africa. “Subsequently, there is reduction in product development costs and bottlenecks to result in improved speed-to-market and quality products as well as the ability to develop and modify products concurrently leading to increased revenue.”

The Future of Development: Takeaways for P&C Insurers

Insurtech has taken the lead in coordinating property and casualty insurers with the pace of modern digital life. It’s not surprising, for example, that Capgemini’s Top Ten Trends in Property & Casualty Insurance 2018 are all tech-related, from the use of analytics and advanced algorithms to track customer behavior to the ways that drones and automated vehicles change the way insurers think about and assess risk.

It’s also not surprising, then, that companies using technology from 1998 find themselves stuck in a 20th-century pace of product development — and, increasingly, with 20th-century products.

See also: How Not to Transform P&C Core Systems  

As a McKinsey white paper notes, the digital revolution in insurance not only has the potential to change the way in which insurance products are developed, but also to change the products themselves. Digital insurance coverages are on the rise, and demand is expected to increase as the first generation of digital natives begins to reach adulthood.

Alan Walker at Capgemini recently predicted that in the near future property and casualty insurance product development will become modular. “Modular design enables myriad new products to be developed quickly and easily,” Walker says.

It also allows insurers to respond more nimbly to customers’ demands for personalized coverage. And while the boardroom and paperwork approach to development is ill-equipped to handle modular products, many product development and management systems can adapt easily to such an approach.

“Insurance products embody each insurance company’s understanding of the future,” Donald Light, a director at Celent, wrote in 2006. “As an insurance company’s view of possible gains, losses, risks and opportunities change, its products must change.”

Twelve years later, Light’s words remain true. Not only must insurance company products change, but so must the processes by which companies envision, develop and edit those products.

Just as the fax machine and email changed insurance in previous decades, the rise of analytics and big data stands to revolutionize — and to speed up — the product development process.

Time for E-Signatures, Doc Management

If you want to know why insurance companies need electronic signatures and document management, you must first look at the regulatory landscape.

In the past 10 years, this climate has changed considerably, and most insurance companies are struggling to do two things to handle these changes: 1) make internal policies to comply with these changes without sacrificing profitability; and 2) find creative ways to outpace competitors looking for the same solutions to these problems.

Neither is an easy feat.

The National Association of Insurance Commissioners (NAIC) has even devoted a large portion of its industry report to addressing one of the myriad ways insurance companies are striving to transcend regulatory difficulties—through the efficiency of the internet.

This is a major reason why insurance companies need both electronic signatures and document management. Used separately, they are ineffective at delivering the solutions insurance companies need. Together, their interplay makes navigating regulatory changes easy, especially those administered and upheld by the Federal Insurance Office (FIO) and NAIC.

Understanding E-Commerce and Insurance Sales Problems

Most states in the U.S. require those applying for insurance services over the internet to complete an electronic signature, whether it is used as a standalone technology or integrated with document management technologies. Although the approach may seem like common sense, its advent does away with the use of a witness or notary and brings into question the legitimacy of signatures.

See also: The Most Valuable Document That Money Can Buy  

Despite digital signatures being more efficient (after all, if e-signatures had existed in 1776, all 56 U.S. delegates could have signed the document on the day our nation was founded; instead, it took roughly a month to collect all the signatures), they require additional authentications. This can be automated by document management tools.

Legitimizing Electronic Insurance Applications

ACORD, the Association for Cooperative Operations Research and Development, achieved this automation by making digital forms available on its domain. Electronic signature technology embedded in document management solutions then just needs to be applied during the final stages of the process.

Why the Need Is Paramount

Above all else, the following features create an effective interplay between document management technologies and electronic signatures.

Authentication Procedures

Inclusion of a knowledge-based authentication (KBA) challenge question helps authenticate the digital signature process. This ensures that the party attempting to sign a document is who he or she claims to be.

IP Address Verification

IP address verification is an extra layer that can bolster the legitimacy of a signed document if a legal dispute over its authenticity ever arises.
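As a simplified illustration of how these two layers might be captured together during an e-signing session, the Python sketch below records the result of a KBA challenge and the signer’s IP address as signature evidence; the field names and checks are assumptions for illustration, not drawn from any particular product:

```python
# Hypothetical sketch: capture a KBA challenge result and the signer's IP address
# as evidence attached to an e-signature event. Field names are illustrative.
import hmac
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SignatureEvidence:
    signer_email: str
    kba_passed: bool      # outcome of the knowledge-based challenge question
    ip_address: str       # retained in case the signature is later disputed
    signed_at: str

def verify_kba(answer: str, expected: str) -> bool:
    """Compare the signer's answer with the stored answer in constant time."""
    return hmac.compare_digest(answer.strip().lower(), expected.strip().lower())

def record_signature(signer_email: str, answer: str, expected: str,
                     request_ip: str) -> SignatureEvidence:
    return SignatureEvidence(
        signer_email=signer_email,
        kba_passed=verify_kba(answer, expected),
        ip_address=request_ip,
        signed_at=datetime.now(timezone.utc).isoformat(),
    )

evidence = record_signature("jane@example.com", "Maple Street",
                            "maple street", "203.0.113.42")
print(evidence.kba_passed, evidence.ip_address)
```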

Form Fill Automation

There are new and exciting ways to automate form filling for recurring client-based and document-related processes. Zonal OCR makes this possible, eliminating manual processes and reducing document workload to a bare minimum.
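The zonal OCR idea can be sketched in a few lines of Python using the widely available Pillow and pytesseract libraries (which require the Tesseract engine to be installed); the zone coordinates and field names below are illustrative assumptions, since a real deployment would calibrate zones for each form template:

```python
# Hypothetical sketch of zonal OCR: read only predefined regions of a scanned
# form instead of the whole page. Zone coordinates below are made up.
from PIL import Image
import pytesseract

# (left, upper, right, lower) pixel boxes for each field on one form template.
ZONES = {
    "insured_name": (100, 200, 600, 240),
    "policy_number": (100, 260, 400, 300),
    "effective_date": (420, 260, 700, 300),
}

def extract_fields(scan_path: str) -> dict[str, str]:
    page = Image.open(scan_path)
    fields = {}
    for name, box in ZONES.items():
        region = page.crop(box)                        # isolate the zone
        fields[name] = pytesseract.image_to_string(region).strip()
    return fields

# Usage (illustrative): extract_fields("acord_application_page1.png")
```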

See also: E-Signatures: an Easy Tech Win  

Bar Code Authentication

Although bar code authentication in an electronic signature should never be a standalone safeguard, it does add a layer of legitimacy. A bar code is a stamp of individuality that reveals its purpose and origins quite clearly.

Ensuring Data in Documents is Unaltered

It becomes obvious that electronic signatures are more useful if applied through document management technologies, as these technologies ensure documentation is not altered.

What’s more, the role-based user permissions of a document management system can trace who changed what within a system, ensuring that those who alter data without authorization can be held accountable for their actions.
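A minimal sketch of these two mechanisms, a content hash to detect alteration and a role-based audit trail of who changed what, might look like the following; the roles, actions and field names are assumptions for illustration only:

```python
# Hypothetical sketch: detect alteration with a content hash and keep a
# role-based audit trail of who changed what. Roles and actions are illustrative.
import hashlib
from datetime import datetime, timezone

class ManagedDocument:
    EDITOR_ROLES = {"underwriter", "compliance_officer"}

    def __init__(self, content: bytes):
        self.content = content
        self.audit_log = []
        self._record("system", "created")

    def fingerprint(self) -> str:
        """SHA-256 of the current content; stored alongside the signature."""
        return hashlib.sha256(self.content).hexdigest()

    def is_unaltered(self, signed_fingerprint: str) -> bool:
        """True if the content still matches the hash taken at signing time."""
        return self.fingerprint() == signed_fingerprint

    def update(self, user: str, role: str, new_content: bytes) -> None:
        if role not in self.EDITOR_ROLES:
            self._record(user, "unauthorized change attempt")
            raise PermissionError(f"{user} ({role}) may not alter this document")
        self.content = new_content
        self._record(user, "content updated")

    def _record(self, user: str, action: str) -> None:
        self.audit_log.append(
            (datetime.now(timezone.utc).isoformat(), user, action, self.fingerprint())
        )
```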

NAIC’s New Rules: Challenges, Solutions

For security and compliance professionals, the announcement of new regulatory standards can be a stark reminder that the to-do list is long and the day is short. But with careful preparation and concerted, coordinated efforts to mature governance, risk management and compliance (GRC) activities, compliance and security teams can face new rules and standards with confidence.

After many iterations and comment periods, the National Association of Insurance Commissioners (NAIC) announced the adoption of the Insurance Data Security Model Law in October 2017. The model law — which encompasses rules for licensed entities about data security and data breach investigations and notifications — establishes more rigorous guidelines for the insurance industry. It shares many similarities with the New York State Department of Financial Services (NYDFS) cybersecurity requirements for financial services companies, currently considered to be the highest bar — and a best practice — so the NAIC’s model law is likely to be adopted by many states as the governing standard.

The NAIC’s rules specify that information security programs should be based on “an ongoing risk assessment, overseeing third-party service providers, investigating data breaches and notifying regulators of a cybersecurity event.”

In particular, take a close look at Section 4: Information Security Program. It details implementing a program and the requirements for assessments, reporting, audits, policies and procedures. It sounds straightforward on the surface but grows in complexity the more you read; you need to not only identify internal and external threats but also assess the potential damage and take active, concrete steps to manage the threats. Section 4 also calls for more accountability when it comes to protecting data — each insurer must submit an annual statement by February 15 certifying compliance with Section 4 or identifying areas that need improvement, as well as remediation plans.

See also: Insurance Is Not a Magazine Subscription

It is important to note that the insurance industry has unique challenges around internal risk, third parties and intricately collaborative processes. Many entities and individuals are involved in a single claim: brokers, dealers, agents, actuaries, adjustors and claims processors. This creates more room for error, more potential gaps in security coverage and more difficulty managing contributors. Comprehensive procedures supported by integrated risk management technology solutions will help weave a tighter web.

Renewed Focus on Third Parties

As is the case with many of the major cyber security and data privacy frameworks (e.g., HIPAA, NYDFS, GDPR), the NAIC’s model law gives special attention to required oversight of third-party providers. Licensed entities are responsible for ensuring that third parties implement administrative, technical and physical measures to protect and secure the information systems and nonpublic information they hold or have access to.

Meeting these requirements means licensed entities need to conduct assessments to ensure third parties are following security, privacy and notification guidelines. Section 4.c (Risk Assessment) stipulates identifying threats through an ongoing assessment and an annual review of systems, controls, processes and procedures.
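As a simple illustration of how a licensee might track that annual-review requirement across its third-party providers, consider the hypothetical Python sketch below; the vendor records and the 365-day review period are assumptions for illustration, not requirements taken from the model law:

```python
# Hypothetical sketch: flag third-party providers whose annual security review
# is overdue under an ongoing-assessment program. Vendor data is illustrative.
from datetime import date, timedelta

REVIEW_PERIOD = timedelta(days=365)

vendors = [
    {"name": "ClaimsProcessorCo", "last_review": date(2024, 6, 1), "holds_npi": True},
    {"name": "PrintAndMailCo", "last_review": date(2023, 11, 15), "holds_npi": True},
]

def overdue_reviews(vendor_list: list[dict], today: date) -> list[str]:
    """Names of vendors whose last review is older than the review period."""
    return [v["name"] for v in vendor_list
            if today - v["last_review"] > REVIEW_PERIOD]

print(overdue_reviews(vendors, date(2025, 3, 1)))   # ['PrintAndMailCo']
```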

Developing a comprehensive and streamlined system for vendor risk management is an increasingly critical component of both security and compliance programs — especially for large enterprises and those with complex partnership and outsourcing structures.

Incident Response is Key

The NAIC’s model law also specifies requirements for incident investigations and mandates that breaches are reported to the commissioner within 72 hours. In this notification, insurers must provide as much information as possible, including: the date of the breach; how the information was exposed; the types of information exposed; the period during which the system was compromised; planned remediation efforts; a copy of the company’s privacy policy; and more. Additionally, licensees must notify consumers of the breach as their state’s data breach notification law requires.
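The notification contents listed above map naturally onto a structured record. The sketch below is a hypothetical illustration of how an insurer might assemble that information and check the 72-hour window; the field names are ours, not the model law’s:

```python
# Hypothetical sketch: assemble the notification details listed above (breach
# date, exposure method, data types, compromise window, remediation plan,
# privacy policy) and check the 72-hour reporting window. Field names are ours.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class BreachNotification:
    breach_date: datetime
    discovered_at: datetime
    exposure_method: str
    data_types_exposed: list[str]
    compromise_window: tuple[datetime, datetime]
    remediation_plan: str
    privacy_policy_url: str
    notified_commissioner_at: Optional[datetime] = None

    def within_72_hours(self) -> bool:
        """True if the commissioner was notified within 72 hours of discovery."""
        if self.notified_commissioner_at is None:
            return False
        return self.notified_commissioner_at - self.discovered_at <= timedelta(hours=72)
```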

It will be nearly impossible to meet these demands if your security information is outdated, incomplete or difficult to pull together. Prompt incident response can have a significant effect on outcomes. If you can quickly coordinate clear, accurate communications to regulators, third parties and customers about a breach or cyber attack, you can contain reputational damage, protect end users and prove negligence was not a factor.

See also: It’s Time to Act on Connected Insurance

How to Become Prepared — and Stay that Way

While some of the specific requirements of NAIC’s new model law might cause alarm, most insurance businesses already have well-defined processes and controls. The need to keep sensitive customer data secure and private isn’t new, and high-profile data breaches (e.g., Equifax, Anthem, Aetna) keep a spotlight on the consequences of failing to do so.

Licensed entities are most likely to be challenged by the outer ends of the integrated risk management spectrum — the granular details of controls, policies and procedures on one end as well as the development of a sustainable security culture on the other. Both can be enhanced and reinforced through an enterprise-wide, technology-driven approach to GRC efforts.

By implementing a centralized integrated risk management platform, insurance organizations can move away from fragmented manual processes (spreadsheets and email) and toward higher degrees of automation and analytics.

The difficulty of meeting the NAIC’s requirements depends on the maturity of a company’s security and compliance program.

Companies that are already using an integrated risk management platform will easily be able to identify gaps and efficiently make the changes needed to achieve compliance. Those that do not have mature programs in place will have a longer path, from reviewing the requirements and identifying compliance gaps to the challenging goal of creating a culture of security.

Interview with Nick Gerhart (Part 3)

I recently sat with Nick Gerhart to discuss the regulatory environment for U.S. insurance carriers. Nick offers a broad perspective on regulation based on his experience: after roles at two different carriers, Nick served as Iowa insurance commissioner and currently is chief administrative officer at Farm Bureau Financial Services.

Nick is recognized as a thought leader for innovation and is regularly called on to speak and moderate at insurtech conferences and events. During our discussion, Nick described the foundation for the state-based regulatory environment, the advantages and challenges of decentralized oversight and how the system is adapting in light of innovation.

This is the last installment of a three-part series. The first focused on the regulatory framework insurers face (link). In the second part (link), Nick provided the regulator’s perspective, with a focus on the goals and tactics of the commissioner’s office. Here we discuss the best practices of the insurers in compliance reporting as well as future trends in compliance reporting.

From my experience in speaking with carriers, I’ve been struck by the challenges of reporting data in various reports to so many different entities. A lot of carriers struggle just with the process, and the quality of the data reported suffers. So, to dive into the quality of the filings for a moment, what are you looking for?

Garbage in, garbage out, obviously.

The most obvious issues start with the outliers. And it would come back to the state catching the company filing some bad data. So, for instance, on the life and annuity side, how you define “replacement” can trigger a percentage up or down that maybe you shouldn’t have in there.

If you think about it, from the company side, a lot of MCAS data is probably gathered in an Excel spreadsheet, in SharePoint or on a shared drive, and it’s someone’s job to pull the data. And he or she is often not the subject-matter expert for the report to be filed.

Overall, companies make a commendable effort in terms of timeliness and accurate data. But, to the extent that a carrier does not pay close attention to what’s going into the file, it can be a problem. You really don’t see the output very well from a 30,000-foot view; a carrier is far more likely to have issues unless it has a really solid data entry process in place or someone who owns it on the executive team who actually knows what is going into the report.

Any examples you can share?

One that comes to mind was a company that reported an unbelievably high replacement ratio. And when we dove into it, we realized they had pulled the wrong file to calculate the rate. Now, it worked itself out, and the ratio was actually much lower, which is a good thing, but again I think companies need to pay more attention to how they are filing this data and where they’re pulling it from.

And that’s where every company could do a little bit better job. I’ve had roles in three insurance companies now, and you can look at something as a check-the-box exercise, or hey-let’s-do-it-right. In my view, if you’re a bigger company, all of this does build into your ORSA filing in some respect.

See also: Why Risk Management Certifications Matter  

Your Own Risk and Solvency Assessment is just a picture of where you are on a risk basis. But a lot of your risks are related to market issues. Every company can probably do a little bit better job of making sure the data you submit is timely, relevant and the right data.

And, when you’re looking at specific data with a report, the replacement rate within MCAS, for instance, how do you come up with that benchmark data? Are you looking at trending analysis in the context of industry benchmark data or trending within the company?

That’s a really good question. It’s more art than science; there isn’t one right way to do it. If you had a 75% replacement ratio, but you only sold four annuities, that may or may not mean anything. If you have a 75% replacement ratio, and you sold 25,000, that’s a different issue.

You start to look at it from a benchmarking of industry, a standard across the industry. Whether you can get that data from LOMA, LIMRA or WINK. Regulators have all of those same data points and benchmark studies, so you have a gut feel for what is an industry number.

Then beyond that, to your point, you’d have to dig down for context. For example, Transamerica sells a lot more life insurance and annuities than EMC National Life. A benchmark is a benchmark, but it doesn’t differentiate from a small mutual carrier or small stock carrier.

This is why context is really important. If you see a disturbing relationship or ratio develop on complaints, you have to look at the line of business, how much business they write, whether or not it’s an agent issue, or a producer issue, or home office issue, or a misunderstanding issue. You really have to dig in. Benchmarking is a start, and it’s certainly helpful.

Iowa has 216 carriers, and the vast majority are small or midsize, sometimes just county mutual carriers. You have to look at each carrier on its own, as well. The benchmark helps, but it’s not the end all and be all.

Did you look at consistency of data? For instance, premiums written is a component, in some form, of the financial reporting, market conduct and premium tax filings.

Certainly. Our team would look for consistency of data across filings. Our biggest bureau at the division was on the financial side. And that’s really where I spent a lot of my time to develop staff.

If we start to realize that a premium tax number doesn’t line up with premiums written, we start to ask questions. And sometimes there are good answers, and, other times, it’s a miss. And so, again, it’s data consistency and quality across all the reporting to make sure we have a clear picture.

Because oftentimes, it’s something we didn’t understand, or the carrier filed but didn’t pull the right number. The sophistication of the models that the companies use – as well as the sophistication of the reporting – varies greatly from small carriers to big carriers. Some have home-grown systems; some have ad hoc processes. It’s all done differently.

Do you have a sense – both from your time in industry as well as your role as insurance commissioner – how feasible it is to have a meaningful review process? To put this question in concrete terms: If you’re the CFO, you’re signing off on a lot of reports. Based on the volume of reports you’re signing, are you truly reviewing the data that’s being reported?

That’s a great question.

You’ve got reporting requirements for Sarbanes-Oxley if you’re public. You’ve got other reporting requirements under corporate governance at the state level. It’s impossible to dig into every single report for every single data point. So, you do have to rely on your staff, on your auditors and your chief accounting officer. And that’s why you have those controls in place leading up the reporting structure of those organizations.

That being said, a CFO would want to have a clear picture from a benchmarking dashboard. There are a lot of tools for people in the C-suite for tracking and visualizing data that call attention to a metric that is out of place or not reported.

The CFO relies on the team and the controls in place for the data to be correct in order to sign off. But, having a snapshot that showed what is filed, and when, and different data points and sources would be of immense help.

What are the consequences, from a regulator’s standpoint, of poor quality or inconsistent data? Is it reputational? Does it add to question marks around a company?

There are several things. Yes, it’s possibly reputational. But that’s in the longer term. Most immediately, the carrier is going to have to commit resources to resolve the issue.

If a commissioner’s office is asking questions, it has found something. You’ve got to commit resources to adjudicate and resolve the issue. And it could very well lead to a targeted exam, which, in turn, could end up as a full-blown market conduct exam.

It could also create a number of other issues during the triennial exam or the five-year deeper-dive exam, which would require additional resources. These exams can cost quite a bit of money. And so, that’s a hard dollar cost. But there is also the soft dollar cost of staff time, resources expended and opportunity cost, in that it kept the carrier from doing something more productive.

How does this work in practice?

I can think of once or twice when I was commissioner that we had targeted exams based on filings that ultimately led us to say, “Okay, there is a problem here.” Both times involved out-of-state companies.

To your point earlier, you can call an exam on any company that is doing business in your state, certainly on the market side. On the financial side, you’re going to have more deference. But, on the market side, every commissioner’s office is reviewing the data, as well.

Often for us, we would start with the complaints that are coming in, and then identify a trend with a carrier. And if you start to see a number of complaints, then you pull the data.

Some insurers have a cynical view of regulators, particularly in some states. I’ve heard them refer to this as “the cost of doing business.” They feel that, if you’re going to write policies in some states, you’re going to get fined from time to time. And then, if you get fined by one state, then you’re going to see fines from other states as well. How does this work in practice?

A carrier has an obligation to report a fine in all states in which it’s licensed. On top of that, there is this thing called the internet. When a state issued a fine – Commissioner Jones or Director Huff was famous for this – it would be followed by a press release, as well.

So, there is some truth to the idea that if an insurer has trouble in one state, it might have it in multiple states. And there is some justification for a level of cynicism. There are some states where you’re much more prone to be fined. Whether this is a cost of doing business is a decision for that management team. But, if there is a fine in one state, the chances of fines in multiple states are high.

Our view of the world, in the Iowa division, was not necessarily to gang-tackle but rather to figure out how to resolve the issue in our state. If there was a problem, we asked, “Did you make customers whole?” I would look at a systems issue with billing differently from an issue in which someone was ripped off. We tried to use judgment and look at the issues based on the facts and circumstances.

Currently, data flows from carriers to commissioners in a defined cadence. What do you think of the promises of regtech – the concept that software and system automation will allow for data to flow to regulators seamlessly, in real time and without the need for insurers to prepare and curate data for filings?

Right now the NAIC is the hub of a lot of this. And the idea that a state would get this directly from the insurer is a stretch.

What about through the NAIC?

Through the NAIC, I could see it happening. They’ll go to a cloud-based system, I’m guessing. As they make that shift, could that happen? Possibly.

I always joke that for the state of Iowa, and most states, you have the best technology from 1985. Some states are ‘95. It is a stretch to think that this could happen without the NAIC leading.

See also: The Current State of Risk Management  

The NAIC really is the hub. If you’ve been to Kansas City, you’ve seen how impressive their system is, and their folks are. NIPR, for instance, I would always joke, is a technology firm. It’s not a producer licensing firm. The NAIC has tremendous resources. Their CTO has ideas on how to streamline it further. I could see this happening in 10 years or less. The reality is that a state could never do this.

So, a state has to rely on the NAIC. Going back to why this system works, well it works because you have an association – the NAIC – that has the ability to upgrade and transform quicker than any state ever could.

Is it possible that the states could innovate on their own, outside the NAIC?

It would be hard, at best. If you think about the state-based system, if Iowa doesn’t transform as quickly as California, or Montana as Wyoming, that starts to be a problem.

The NAIC can take care of that in one fell swoop and we, as state regulators, all benefit from that work.

I could see data delivery and reporting becoming quicker, more meaningful, real-time. I could even see, down the road, machine learning processes put in place to help with policy form review and financial review. I think you could get there. I don’t know if it’ll be five years, 10 years or 15 years, but it will certainly happen within my career, and it’s going to be a continuously improving process.

Leveraging the NAIC’s technology and personnel is the best way for regulators to keep up with these demands.