The Race Is on for ‘Post-Quantum Crypto’

Y2Q. Years-to-quantum. By most estimates, we're 10 to 15 years from the arrival of quantum computers capable of solving complex problems far beyond the reach of classical computers.

PQC. Post-quantum cryptography. Right now, the race is on to revamp classical encryption in preparation for the coming of quantum computers. Our smart homes, smart workplaces and smart transportation systems must be able to withstand that threat.

Put another way, future-proofing encryption is crucial to avoiding chaos. Imagine waiting for a quantum computer or two to wreak havoc before companies commence a mad scramble to strengthen encryption that protects sensitive systems and data; the longer we wait, the bigger the threat gets.

The tech security community gets this. One recent report estimates that the nascent market for PQC technology will climb from around $200 million today to $3.8 billion by 2028 as the quantum threat takes center stage.

I had the chance to visit with Avesta Hojjati, head of research and development at DigiCert, at RSA 2019. The world’s leading provider of digital certificates is working alongside other leading companies, including Microsoft Research and ISARA, to gain endorsement from the National Institute of Standards and Technology (NIST) for breakthrough PQC algorithms, including Microsoft’s “Picnic” and ISARA’s qTESLA.

See also: Cybersecurity for the Insurance Industry  

Hojjati outlined the challenge of perfecting an algorithm that can make classical computers resistant to quantum hacking — without requiring enterprises to rip and replace their classical encryption infrastructure. For a full drill-down on our discussion, give a listen to the accompanying podcast. Below are excerpts edited for clarity and length.

LW: What makes quantum computing so different from what we have today?

Hojjati: The main difference is that a classical computer digests a single value (a single bit) at a time: either a zero or a one. A quantum computer stores information in quantum bits, or “qubits,” which can represent 0, 1 or a superposition of both 0 and 1. That’s where its performance excels.

Just how fast a quantum computer can perform is based on the number of qubits.  However, whenever you’re increasing the number of qubits, you introduce the possibility of error, so what you actually need is stable qubits. Another problem is that quantum computing produces a lot of heat. So the problems of errors and heat still need to be solved.
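
To make the superposition point concrete, here is a minimal sketch in plain Python/numpy (a classical simulation, not quantum hardware) of a single qubit in an equal superposition and its measurement probabilities:

```python
import numpy as np

# A qubit state is a unit vector a|0> + b|1> with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Equal superposition of 0 and 1 (what a Hadamard gate produces).
psi = (zero + one) / np.sqrt(2)

# Born rule: measurement probabilities are squared amplitude magnitudes.
p0, p1 = np.abs(psi) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each
```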

LW: How close are we to a quantum computer that can break classical encryption?

Hojjati: To break a 400-bit RSA key, you would need to have a 1,000-qubit quantum computer, and the closest one that I have seen today is Google’s, which has around 70 qubits. That’s not enough to break RSA at this point. That being said, we’re in a transition period, and we shouldn’t wait around for quantum computers to be available to transition to post-quantum crypto.

LW: What’s the argument for doing this now?

Hojjati: It takes some forward thinking from the customer side. Do you really want to wait for quantum computers to be available to change to post-quantum crypto? For example, are you willing to distribute 10,000 IoT sensors today and then pay the cost down the line when a quantum computer is there to break the algorithm? Or are you willing to push out hybrid (digital) certificates into those devices, at the time of production, knowing they’re going to be safe 20 or 30 or 40 years from now?

LW: Can you explain a “hybrid” certificate?

Hojjati: A hybrid solution is a digital certificate that features a classical crypto algorithm, like RSA or ECC, alongside a post-quantum crypto algorithm — both at the same time. It’s a single certificate that, by itself, carries two algorithms: one that allows you to communicate securely today, and another that NIST currently has under review.

Picnic, for instance, was submitted by Microsoft Research and is one of the post-quantum crypto algorithms under NIST review; another is qTESLA, which was submitted by ISARA Corp. A hybrid digital certificate gives customers the opportunity to see how a post-quantum crypto algorithm can work, without changing any of their infrastructure.
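
DigiCert’s implementation details aren’t published here, so the sketch below is purely illustrative: a Python model of the hybrid idea, one certificate carrying both a classical and a post-quantum signature over the same data. The class and function names are hypothetical, not DigiCert’s or NIST’s API.

```python
from dataclasses import dataclass
from typing import Callable

# (public_key, signature, signed_data) -> bool
Verifier = Callable[[bytes, bytes, bytes], bool]

@dataclass
class HybridCertificate:
    subject: str
    classical_public_key: bytes   # e.g., an RSA or ECC key today's TLS stacks understand
    pqc_public_key: bytes         # e.g., a Picnic or qTESLA key
    classical_signature: bytes
    pqc_signature: bytes

def verify(cert: HybridCertificate, data: bytes,
           verify_classical: Verifier, verify_pqc: Verifier,
           pqc_aware: bool) -> bool:
    """Legacy clients check only the classical signature; PQC-aware
    clients check both. One certificate serves both worlds."""
    ok = verify_classical(cert.classical_public_key, cert.classical_signature, data)
    if pqc_aware:
        ok = ok and verify_pqc(cert.pqc_public_key, cert.pqc_signature, data)
    return ok
```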

See also: Global Trend Map No. 12: Cybersecurity  

LW: So you take one big worry off the table as numerous other complexities of digital transformation fire off?

Hojjati: Absolutely. This is one of the elements of security-by-design. When you’re designing a device, are you thinking about the threats that are going to happen tomorrow? Or are you considering the threats that are going to happen 10 or 20 years from now? Solving this problem is actually doable today, without changing any current infrastructure, and you can keep costs down, while keeping the security level as high as possible.

The Problems With Encryption

Newly released findings from the Ponemon Institute and A10 Networks reveal that nearly half of cyber attacks in the past 12 months used encryption to evade detection and distribute malicious software. These findings challenge how we think about the powerful technology we use to protect privacy, security and authenticity. They also demonstrate very effectively how this security technology has been subverted into a powerful weapon for cyber criminals.

This research is another damning piece of evidence that a significant chunk of enterprise security spending is not effective. Possibly half, or even more, of our security technology is doing little to effectively identify bad guys hiding within encrypted traffic. And because the increasing regulations around encryption will continue to drive a dramatic increase in the volume of encrypted traffic, the number of opportunities for bad guys to hide in plain sight is increasing exponentially. We’re fixing one illness but creating a new disease.

See also: The Costs of Inaction on Encryption

Transport Layer Security (TLS) and its predecessor, Secure Sockets Layer (SSL), encrypt traffic. TLS and SSL turn on the padlock in our web browsers—they are the most widely relied upon indicators for consumers that a transaction is “secure.” This technology is used to hide data traffic from would-be hackers, but it also hides data from the latest, hot-selling security tools.
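
As a small illustration of the machinery behind the padlock, this standard-library Python sketch (example.com is just a placeholder host) opens a TLS connection and reads the certificate a browser would validate:

```python
import socket
import ssl

hostname = "example.com"  # any HTTPS site
context = ssl.create_default_context()  # verifies the chain against trusted CAs

with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print("Negotiated:", tls.version())   # e.g., TLSv1.3
        cert = tls.getpeercert()              # the certificate behind the padlock
        print("Subject:", cert["subject"])
        print("Expires:", cert["notAfter"])
```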

Because businesses are now required to turn on encryption by default, encryption keys and certificates are growing at least 20% year over year—with an average of 23,000 TLS/SSL keys and certificates now in use at the typical Global 2000 company.

Volume overwhelms security efforts

As enterprises add more keys and certificates and encrypt more traffic, they are increasingly vulnerable to malicious encrypted traffic. Administrators simply do not have the tools to keep up with the growing number of keys and certificates. Venafi customers reported finding nearly 16,500 unknown TLS/SSL keys and certificates. This discovery represents a huge volume of encrypted traffic on their own networks that organizations don’t even know about.

Sadly, enterprise spending on next-generation firewalls, sandboxing technologies, behavior analytics and other sexy security systems is completely ineffective at detecting this kind of malicious activity.

What does a next-generation firewall or sandbox system do with encrypted traffic? It passes the traffic straight through. A cyber criminal who hides activity inside encrypted traffic is given a free pass by a wide range of sophisticated, state-of-the-art security controls.
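
A toy demonstration of why, using the pyca/cryptography library: once a payload is encrypted, a signature-matching control scanning the bytes on the wire finds nothing.

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()
f = Fernet(key)

payload = b"malicious-shellcode-signature"
ciphertext = f.encrypt(payload)

# A pattern-matching control sees only opaque bytes in transit.
print(b"malicious" in ciphertext)             # almost surely False: invisible in ciphertext
print(b"malicious" in f.decrypt(ciphertext))  # True: visible only after decryption
```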

Inspection a formidable task

The hard work of SSL/TLS inspection is at the core of today’s cybersecurity dynamics, but it remains largely overlooked in most enterprises. The challenge of gaining a comprehensive picture of how encryption is being used across enterprises and then gathering the keys and certificates that turn on HTTPS is daunting for even the most sophisticated organizations.

See also: How Safe Is Your Data?  

Throw in the challenge of keeping keys and certificates updated as they are renewed and replaced, and most enterprises can’t keep up. Even if multiple full-time employees are applied to the problem, they won’t be able to move at a pace that will enable them to identify bad guys hiding in encrypted traffic.

Unfortunately, as an industry we continue to ignore this gaping blind spot. For example, when the federal government’s chief information officer issued requirements for protecting all government websites with HTTPS by Dec. 31, 2016, no guidance was provided on how to defend against cyber crime that uses encryption as an attack vector.

As an industry, we’ve got to acknowledge and eliminate this blind spot. We need to be able to inspect traffic and automate the secure issuance and distribution of keys and certificates. The technology is available to solve these problems so we can use encryption safely.

But before we can solve any problem we first need to admit that we have one.

This article was written by Kevin Bocek and originally appeared on ThirdCertainty.

The Costs of Inaction on Encryption

Alarm systems have a long and varied history — from geese in ancient Rome, to noise makers that announced the presence of an intruder, to present-day electronic sensors and lasers. Originally, the creation of alarms was driven by the psychological need all humans have to establish a safe environment for themselves. Today, that same need exists, but it has been extended to include other concerns, such as valued personal possessions, merchandise and intellectual property. In the cyber realm, security is as important as it is in the physical world because people must be able to feel secure in their ability to store sensitive, high-value data. Without that sense of security, the cyber realm would lose almost all of its relevance.

Cybersecurity is established by various hardware and software components, but none of the components are more essential than strong encryption. It is such encryption that keeps bank transactions, online purchases and email accounts safe. However, there is a disturbing worldwide governmental trend to weaken encryption, which was exemplified in the legal disagreement earlier this year between Apple and the U.S. government. While there are definite aspects of the dispute that fall outside of the professional insurance sphere, there is an undeniable part of the battle for strong encryption that the professional insurance sector must not fail to acknowledge and address. The outcome of this struggle will be felt well into the 21st century, and, perhaps, at least in the business arena, the outcome will be borne most keenly by cyber liability and technology E&O insurers.

With global attempts to reduce the effectiveness of encryption, no insurer can claim it lacks a part in the effort for resilient and ever-evolving encryption and cybersecurity measures. The Chinese government is not a supporter of privacy, and it has even hacked Google’s Gmail service and the Dalai Lama’s email account to gain access to information it has deemed disruptive. It also has been stepping up its “investigations” into products produced by U.S.-based technology companies. Furthermore, after both the 2015 attack in Paris and the 2016 attack in Brussels, the debate regarding whether encryption should be allowed was re-ignited in Europe and the U.K. Recently, the French, Hungarian and British governments have made various attempts at weakening or removing encryption. Faced with this global challenge, insurers must be fully aware of what is at risk for them, and they must help pave a path forward that balances the profitability of products (like cyber liability and technology E&O) with the protection those products should afford any insured.

See also: Best Practices in Cyber Security

Apple, perhaps, serves as the best example of how governmental interference with cybersecurity is an issue that requires direct and immediate intervention from insurers. There are thousands of businesses around the world that rely on the iPhone and iPad for productivity purposes — and almost all of those businesses also rely on the security that those devices provide, both from a hardware and a software standpoint. Recently, the U.S. government attempted, in different judicial battles, to force Apple to write code that would give the government a master key to the data of any iPhone. The U.S. government is also pursuing a legislative avenue: a law that would force U.S. companies to give the government unfettered retrieval of any data on which it sets its sights.

To provide such access would almost always require companies to write software code that is purposefully compromised from a security standpoint. It would be extremely unwise for professional insurance companies to assume this disagreement is only between the technology sector and world governments. If the outcome favors the U.S. government, it will have direct and immediately negative effects on insurers that offer cyber liability and technology E&O insurance in the U.S., and it will set a dangerous precedent that emboldens other governments to justify similar breaches of what should be secure data.

From a cyber liability standpoint, any vulnerability in software code gives hackers another way to compromise a victim’s computers and network. If a company like Apple (which has thousands of businesses depending on it to keep them safe) has to create a master key, then all of the businesses that use Apple products will be vulnerable to attack. The U.S. government has a long history of being unable to keep its own data safe, which means, in time, hackers will be able to figure out what entrance point was created and then exploit it. The most worrisome entities that might access the backdoor would be non-democratic nation-states because they have the most to gain from exploiting any vulnerabilities in U.S.-based companies. However, such companies are not the only ones that use Apple products, which means companies located anywhere would also be vulnerable. Additionally, if world governments put restraints on encryption to make it illegal or to limit the ways data can be encoded, then, again, that gives power to those entities that would exploit weak encipherment to the detriment of the private sector.

From a technology E&O standpoint, any U.S. government demand that weakens products produced by an insured creates a breach of contract, which will drive claims against technology E&O policies. If Foxconn, which builds the iPhone for Apple, were forced to alter firmware used in the iPhone to allow at least one software flaw, then Apple could sue Foxconn for breach of contract were Apple to learn that Foxconn had obeyed a government order to create a security bypass in the firmware code. Worse yet would be a company like FireEye being forced to reduce the effectiveness of the virtual execution engines at the heart of its malware analysis appliances. FireEye and other cybersecurity companies are what often stand between a hacker and its victims. Should a cybersecurity company ever be forced to obey such a government order, little would stand between a hacker and its potential victims. Moreover, all of the companies that depend on the products of a cybersecurity company would also be in a position to bring claims against the insured organization, which would certainly be detrimental to technology E&O insurers.

To defend itself and its products from government interference, Apple is implementing a security feature that removes its ability to bypass the iPhone’s security. While such a method works from a simplicity standpoint, it will not work for a majority of technology companies, with cybersecurity and cloud providers being two examples of where such a solution would not work. Additionally, if a law were passed that forced a company, by way of a court order, to decrypt information on its products, then the company so ordered would be put into a bind. Cyber liability and technology E&O insurers could also add exclusions to policies that would void insurance contracts if an insured organization complied with a governmental request to create a backdoor.

However, it would be extremely difficult for an insurer to prove the backdoor was created deliberately, and, ultimately, such exclusions would be ethically ambiguous given they would punish an insured firm for obeying the rule of law. Companies could also contest each governmental request, assuming no law makes it illegal to deny a government request, but not all companies have the time or financial resources with which to fight a government. The only reasonable avenue to rein in disruptive governmental orders, then, is for insurers, technology companies and others to unite and block any legislative attempt to pass a law that would force any technology company to create a security gap. Moreover, the resistance movement will also need to fight against any attempt to weaken or make illegal any type of encryption.

See also: Paradigm Shift on Cyber Security

Currently, the relationship that exists between the insurance and technology sectors is that of provider and client, but that relationship must now evolve into a partnership. The technology sector cannot afford to go without cyber liability and technology E&O insurance because almost every company needs to offset technological risk now that we are in a globally connected and highly litigious age. Insurers also need to continue offering cyber liability and technology E&O policies because they have the clout and financial strength to help protect companies — especially small- and medium-sized ones — from an ever-changing technological landscape. Then, too, whichever insurer develops a realistic understanding of the intersection of risk and technology will be in a position to enrich itself.

The path forward, then, is to create a coalition whose first goal would be to stay on top of both pending and current judicial cases and bills being drafted or voted on in any legislature worldwide that would degrade the security strength of any member’s product. The U.S. government has recently tried to force Apple to create a master key to one of its product lines, and there is no reason to believe that it will not force other companies (like cloud providers) to build similar backdoors into their products. To work against such actions, the coalition might be composed of two representatives from each sector’s main representative organization. For instance, for the professional insurance sector that would be PLUS, and for technology companies that would be IEEE.

Furthermore, the coalition might also include members from automotive manufacturers, educators and telecommunication firms. The coalition’s protective approach, then, would be to identify cases or bills and bring all resources forward to eliminate or mitigate the offending threat. A recent judicial example of a case that would have threatened the putative coalition is Apple v. the U.S. government in the Central District of California, Eastern Division. A current legislative threat is the Burr-Feinstein anti-encryption draft, which seeks to allow courts to order a company to decrypt information it has encoded, such as the way the iPhone protects a user’s data.

In a judicial case, the main measure could be filing amicus curiae briefs on the part of the aggrieved organization, but another measure might be ensuring the defendant is crafting the most reasonably persuasive anti-governmental-interference arguments and appealing unfavorable rulings. On the legislative front, measures might include lobbyists but, more importantly, ought to involve the unity achieved by the existence of the coalition, working with an organization like the EFF and even creating public relations campaigns to appeal for the support of the world populace. In the rare instances when a government attempts to work with the private sector to understand the concerns that it has — for instance, as the U.S. government is trying to do with the proposed “Digital Security Commission” — then the coalition would need to support such efforts as much as possible.

It is true that the coalition’s efforts in countries like China and Russia might be limited, and they will also be limited when a country feels that a criminal act, like terrorism, is better dealt with by eroding encryption and cybersecurity measures. In an instance concerning China, insurers could consider increasing the amount of reinsurance that they purchase on their cyber liability and technology E&O portfolios to offset the damage from increased claims. Insurers will also need to be extremely cautious when providing cyber liability and technology E&O coverage to organizations that have close relationships with non-democratic governments (like the Chinese government) or ones that produce products that have a high likelihood of being the result of IP theft, such as any mid- to high-end binary processor.

The pursuit of the best encryption and cybersecurity measures needs to be unencumbered by the efforts of any government, just as alarm systems have been free to evolve over the past two or three millennia. This can only be achieved, though, through the unified actions and vigilance of a coalition. Encryption and resilient cybersecurity frameworks are the essential and irreplaceable elements in a safely connected world. To limit, in any way, the efforts to perfect those elements or to purposefully reduce their effectiveness is irresponsible, regardless of whether the reason is national security or the pursuit of breaking a criminal enterprise. Lloyd’s, and other organizations involved with cyber liability and technology E&O insurance, see a future where insurers are able to achieve healthy profits from those two products. However, if insurers do not responsibly oppose governmental attacks on encryption and cybersecurity, that profitable future will give way to a future of excessive claims, damaging losses and very little profit.

How Safe Is Your Data — Really?

The number and the potential severity of cyber breaches are increasing. A recent PwC survey found that nearly 90% of large organizations suffered a cyber security breach in 2015, up from 81% in 2014. And the average cost of these breaches more than doubled year-on-year. With more connected devices than ever before—a total expected to reach 50 billion by 2020—there are more potential targets for attackers, and there is more potential for accidental breaches.

What’s more, as of late 2015, companies are, for the first time, listing their information assets as nearly as valuable as their physical assets, according to the 2015 Ponemon Global Cyber Impact Report survey, sponsored by Aon.

So, how do you keep your organization’s data—and that of your clients and customers—safe?

It’s not just a matter of investing in better technology and more robust systems, according to Aon cyber insurance expert Stephanie Snyder Tomlinson, who says, “A lot of companies find that the weakest link is their employees. You need to train employees to make sure that if they get a phishing email, they’re not going to click on the link; that they don’t have a Post-It note right next to their monitor with all of their passwords on it. It’s the human error factor that companies really need to take a good hard look at.”

From intern to CEO: Simple steps everyone can take

It’s easy for individuals to become complacent about data security, says Aon’s global chief privacy officer, Brad Bryant. But, with cyber threats increasing, it’s more important than ever to be aware of seemingly innocent individual actions that can potentially lead to serious cost and reputational consequences for your organization.

According to Bryant, there are four key things that everyone can do to help protect themselves and their organizations from the rising cyber threat:

  • Be alert to impersonators. Hackers are becoming increasingly sophisticated at tricking people into giving away sensitive information, from phishing to social engineering fraud. You need to be more vigilant than ever when transmitting information. Are you certain they are who they say they are?
  • Don’t overshare. If you give out details about your personal life, hackers may be able to use them to build a profile to access your or your company’s information. From birthdays to addresses, small details build up.
  • Safely dispose of personal information. A surprising amount of information can be retained by devices, even after wiping hard drives or performing factory resets. To be certain that your information is destroyed, you may need to seek expert advice or device-specific instructions.
  • Encrypt your data. Keeping your software up to date and password-protecting your devices may not be enough to stop hackers, should your devices fall into the wrong hands. The more security, the better, and, with the growing threat, encryption should be regarded as essential; a minimal sketch follows this list.
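
To make that last point concrete, here is a minimal sketch of encrypting sensitive data with the pyca/cryptography library’s AES-GCM primitive (key management is deliberately glossed over; in practice the key belongs in a KMS or HSM, never beside the data):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

key = AESGCM.generate_key(bit_length=256)  # in practice, store in a KMS/HSM
aesgcm = AESGCM(key)

nonce = os.urandom(12)  # must never repeat for the same key
secret = b"customer SSN: 000-00-0000"
ciphertext = aesgcm.encrypt(nonce, secret, b"record-42")  # third arg: associated data

# Decryption fails loudly if the ciphertext or associated data is tampered with.
assert aesgcm.decrypt(nonce, ciphertext, b"record-42") == secret
```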

Key approaches for organizations to better protect data

To protect your information and that of your customers and clients, investing in better cyber security is one element. But data breaches don’t just happen through hacks, or even employee errors. At least 35% of cyber breaches happen because of system or business process failures, so it’s vital to get the basics right.

Prevention is key, says Tom Fitzgerald, CEO of Aon Risk Solutions’ U.S. retail operations. There are four key strategies he recommends all organizations pursue to limit the risk and make sure they’re getting the basics right:

  • Build awareness. Educate employees on what social engineering fraud is, especially those in your financial department. Remind employees to be careful about what they post on social media and to be discreet at all times with respect to business-related information.
  • Be cautious. Always verify the authenticity of requests for changes in money-related instructions, and double-check with the client or customer. Do not click on random hyperlinks without confirming their origin and destination.
  • Be organized. Develop a list of pre-approved vendors and ensure employees are aware. Review and customize crime insurance—when it comes to coverage or denial, the devil is in the details.
  • Develop a system. Institute a password procedure to verify the authenticity of any wire transfer requests, and always verify the validity of an incoming email or phone call from a purported senior officer. Consider sending sample phishing emails to employees to test their awareness and measure improvements over time.

Much of this advice is not new, but the scale of the threat is increasing, making following this advice more important than ever. Fitzgerald warns, “Social engineering fraud is one of the greatest security threats companies can encounter today. … This is when hackers trick an employee into breaking an organization’s normal digital and physical security procedures to access money or sensitive information. It can take many forms, from phishing for passwords with deceptive emails or websites, to impersonating an IT engineer, to baiting with a USB drive.”

How governments are driving data protection

The potential consequences of inadequate data security are becoming more serious, and courts and regulators are focusing on this issue globally.

The European Union is finalizing the General Data Protection Regulation (GDPR), which will replace the Data Protection Directive adopted in 1995. The expected result is a measure that sharpens the focus on protecting customer data. Similarly, an October 2015 ruling by the European Court of Justice invalidated the Safe Harbor framework that had governed the transfer of customer data between the E.U. and U.S.

Bryant warns: “Regardless of where a company is located, the provision of services to E.U. customers and the collection or mere receipt of personal data from European citizens may potentially subject companies to E.U. jurisdiction. … Failure to comply could present unprecedented risk for companies, including fines of up to 4% of a company’s total global income.”

Changing E.U. rules aren’t the only thing that could affect your business. Internet jurisdiction and organizational operations are increasingly cross-border. This global patchwork of Internet rules and regulations helps explain why only 24% of cyber and enterprise risk professionals are fully aware of the possible consequences of a data breach or security exploit in countries outside their home base of operations.

Why getting the basics right is critical

As the Internet of Things continues to grow, the number and range of potential targets for cyber attack is only going to increase. While eliminating all cyber risk may be impossible, getting the basics right is becoming more important than ever.

Bryant says, “Given the large scope and impact of the various changes in data protection law—coupled with the drastic increase in fines—becoming educated on how to protect our data is more business-critical now than ever before.”

5 Practical Steps to Get to Self-Service

To participate in the new world of customer self-service and straight-through processing, many insurance carriers find themselves having to deal with decades of information neglect. As insurers take on the arduous task of moving from a legacy to a modernized information architecture and platform, they face many challenges.

I’ll outline some of the common themes and challenges, possible categories of solutions and practical steps that can be taken to move forward.

Let’s consider the case of Prototypical Insurance Company (PICO), a mid-market, multiline property/casualty and life insurance carrier, with regional operations. PICO takes in $700 million in direct written premiums from 600,000 active policies and contracts. PICO’s customers want to go online to answer basic questions, such as “what’s my deductible?”; “when is my payment due?”; “when is my policy up for renewal?”; and “what’s the status of my claim?” They also want to be able to request policy changes, view and pay their bills online and report claims.

After hearing much clamoring, PICO embarks on an initiative to offer these basic self-service capabilities.

As a first step, PICO reviews its systems landscape. The results are not encouraging. PICO finds four key challenges.

1. Customer data is fragmented across multiple source systems.

Historically, PICO has been using several policy-centric systems, each catering to a particular line of business or family of products. There are separate policy administration systems for auto, home and life. Each system holds its own notion of the policyholder. This makes developing a unified customer-centric view extremely difficult.

The situation is further complicated because the level and amount of detail captured in each system is incongruent. For example, the auto policy system has lots of details about vehicles and some details about drivers, while the home system has very little information about the people but a lot of details about the home. Thus, choices for key fields that can be used to match people in one system with another are very limited.
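
To illustrate why so few shared fields make matching hard, here is a toy record-linkage sketch in standard-library Python (the names, dates and the 0.8 threshold are arbitrary illustrations):

```python
from difflib import SequenceMatcher

auto_system = [{"name": "Jon Q. Smith", "dob": "1970-03-02"}]
home_system = [{"name": "John Smith",   "dob": "1970-03-02"}]

def same_person(a, b, threshold=0.8):
    # DOB must match exactly; names are compared fuzzily because each
    # system captured them differently.
    if a["dob"] != b["dob"]:
        return False
    return SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= threshold

for a in auto_system:
    for h in home_system:
        print(a["name"], "<->", h["name"], "match:", same_person(a, h))
```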

2. Data formats across systems are inconsistent.

PICO has been operating with systems from multiple vendors. Each vendor has chosen to implement a custom data representation, some of which are proprietary. To respond to evolving business needs, PICO has had to customize its systems over the years. This has led to a dilution of the meaning and usage of data fields: The same field represents different data, depending on the context.

3. Data is lacking in quality.

PICO has business units that are organized by line of business. Each unit holds expertise in a specific product line and operates fairly autonomously. This has resulted in different practices when it comes to data entry. The data models from decades-old systems weren’t designed to handle today’s business needs. To get around that, PICO has used creative solutions. While this creativity has brought several points of flexibility in dealing with an evolving business landscape, it’s at the cost of increased data entropy.

4. Systems are only available in defined windows during the day, not 24/7.

Many of PICO’s core systems are batch-oriented. This means that updates made throughout the day are not available in the system until after-hours batch processing has completed. Furthermore, while the after-hours batch processing is taking place, the systems are not available, neither for querying nor for accepting transactions.

Another aspect affecting availability is the closed nature of the systems. Consider the life policy administration system. While it can calculate cash values, loan amounts, accrued interest and other time-sensitive quantities, it doesn’t offer these capabilities through any programmatic application interface that an external system could use to access these results.
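
A common workaround, sketched below on the assumption that the closed system can at least export a nightly file, is a thin read-only façade that loads the latest extract into its own store and answers self-service queries around the clock. The file layout and field names here are hypothetical.

```python
import csv
import sqlite3

def load_nightly_extract(csv_path: str, db_path: str = "ods.db") -> None:
    """Refresh a read-only copy of policy data from the nightly batch extract."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS policy "
                 "(policy_no TEXT PRIMARY KEY, status TEXT, renewal_date TEXT)")
    with open(csv_path, newline="") as f:
        rows = [(r["policy_no"], r["status"], r["renewal_date"]) for r in csv.DictReader(f)]
    conn.executemany("INSERT OR REPLACE INTO policy VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()

def policy_status(policy_no: str, db_path: str = "ods.db"):
    """Answer a self-service question 24/7, even while the core system runs batch."""
    conn = sqlite3.connect(db_path)
    row = conn.execute("SELECT status, renewal_date FROM policy WHERE policy_no = ?",
                       (policy_no,)).fetchone()
    conn.close()
    return row

# Nightly: load_nightly_extract("policy_extract.csv")
# Anytime: policy_status("POL-123")
```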

These challenges will sound familiar to many mid-market insurance carriers, but they’re opportunities in disguise. The opportunity to bring to bear proven and established patterns of solutions is there for the taking.

FOUR SOLUTION PATTERNS

There are four solution patterns that are commonly used to meet these challenges: 1) establishing a service-oriented architecture; 2) leveraging a data warehouse; 3) modernizing core systems; and 4) instituting a data management program. The particular solution a carrier pursues will ultimately depend on its individual context.

1. Service-oriented architecture

SOA consists of independent, message-based, contract-driven and, possibly, asynchronous services that collaborate. Creating such an architecture in a landscape of disparate systems requires defining:

  • Services that are meaningful to the business: for instance, customer, policy, billing, claim, etc.
  • Common formats to represent business data entities.
  • Messages and message formats that represent business transactions (operations on business data).
  • Contracts that guide interactions between the business services.

Organizations such as Object Management Group and ACORD have made a lot of headway toward offering industry-standard message formats and data models.
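
As a tiny sketch of what “common formats” and “contracts” can look like in code (the field names are illustrative, loosely in the spirit of such standards, not an actual ACORD schema):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class CustomerMessage:
    """Canonical business entity exchanged between services,
    regardless of which admin system the data came from."""
    customer_id: str
    full_name: str
    date_of_birth: str  # ISO 8601

@dataclass
class GetCustomerRequest:
    """A business transaction: 'look up a customer'. The contract is that
    the customer service answers with a CustomerMessage or an error."""
    customer_id: str

def to_wire(msg) -> str:
    # Services exchange a serialized, system-neutral representation.
    return json.dumps({"type": type(msg).__name__, "body": asdict(msg)})

print(to_wire(GetCustomerRequest(customer_id="C-1001")))
print(to_wire(CustomerMessage("C-1001", "John Smith", "1970-03-02")))
```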

After completing the initial groundwork, the next step is to enable existing systems to exchange defined messages and respond to them in accordance with the defined contracts. Simple as it might sound, this so-called service-enablement of existing systems is often not a straightforward step. Success here is heavily dependent on how well the technologies behind the existing systems lend themselves to service enablement. An upfront assessment would be entirely warranted.

Assuming service enablement is possible, we’re still not in the clear. SOA only helps address issues of data format inconsistencies and data fragmentation. It will not help with issues of data quality and can offer only limited reprieve from unavailability of systems. Unless those can be addressed in concert, this approach will only provide limited success.

2. Data warehouse

A data warehouse is a data store that accumulates data from a wide range of sources within an organization and is ultimately used to guide decision-making. While using a data warehouse as the basis of an operational system (such as customer self-service) is an option, it is really a false choice, for two reasons.

    • Building a data warehouse is a big effort. Insurers usually can’t wait for its completion. They have to move ahead with self-service now.
    • Data warehouses are meant to power business intelligence, not operational systems. If the warehouse already exists, there’s a 50% chance that it was built on a dimensional model. A dimensional model does not lend itself to serving as a source for downstream operational systems. On the other hand, if it’s a “single version of truth” warehouse, the company is well on its way to addressing the data challenges under discussion.

3. Modernizing core systems

Modern systems make self-service relatively simple. However, unless modernization is already well underway, it, too, cannot be waited for, because implementation timeframes are so long.

4. Instituting a data management program

A data management program is a solution that deals with specific data challenges, not the foundational reasons behind those challenges. To overcome the four challenges mentioned at the beginning of the article, a program could consist of a consolidated data repository implemented using a canonical data model on top of a highly available systems architecture leveraging data quality tools at key junctions. Implementing such a program would be much quicker than the previous three options. Furthermore, it can serve as an intermediate step toward each of the previous three options.

As an intermediate step, it has a risk-mitigation quality that’s particularly appealing to mid-sized organizations.

PRACTICAL STEPS

Here are the practical steps that a carrier can take toward instituting its own data management program that can successfully support customer self-service. The program should have the following five characteristics:

1. A consolidated data repository

The antidote to data fragmentation is a single repository that consolidates data from all systems that are a primary source of customer data. For the typical carrier, this will include systems for quoting, policy administration, CRM, billing and claims. A consolidated repository results in a replicated copy of data, which is a typical allergy of traditional insurance IT departments. Managing the data replication through defined ETL (extract, transform, load) processes will often preempt the symptoms of such an allergy.
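
A minimal sketch of one such defined ETL step, with hypothetical source extracts: each system’s shape is transformed onto a single consolidated record format.

```python
# Hypothetical extract-transform-load step: each source system has its own
# shape; the transform maps all of them onto one consolidated customer record.
auto_extract = [{"DRIVER_NM": "JOHN SMITH", "DRIVER_DOB": "03/02/1970"}]
life_extract = [{"insured": {"name": "John Smith", "birth": "1970-03-02"}}]

def from_auto(row):
    m, d, y = row["DRIVER_DOB"].split("/")
    return {"name": row["DRIVER_NM"].title(), "dob": f"{y}-{m}-{d}", "source": "auto"}

def from_life(row):
    return {"name": row["insured"]["name"], "dob": row["insured"]["birth"], "source": "life"}

repository = [from_auto(r) for r in auto_extract] + [from_life(r) for r in life_extract]
print(repository)
```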

2. A canonical data model

To address inconsistencies in data formats used within the primary systems, the consolidated data repository must use a canonical data model. All data feeding into the repository must conform to this model. To develop the data model pragmatically, simultaneously using both a top-down and a bottom-up approach will provide the right balance between theory and practice. Industry-standard data models developed by organizations such as the Object Management Group and ACORD will serve as a good starting point for the top-down analysis. The bottom-up analysis can start from existing source system data sets.

3. “Operational Data Store” mindset — a Jedi mind trick

Modern operational systems often use an ODS to expose their data for downstream usage. The typical motivation for this is to eliminate (negative) performance impacts of external querying while still allowing external querying of data in an operational (as opposed to analytical) format. Advertising the consolidated data repository built with a canonical data model as an ODS will shift the organizational view of the repository from one of a single-system database to that of an enterprise asset that can be leveraged for additional operational needs. This is the data management program’s equivalent of a Jedi mind trick!

4. 24/7/365 availability

To adequately position the data repository as an enterprise asset, it must be highly available. For traditional insurance IT departments, 24/7/365 availability might be a new paradigm.

Successful implementations will require adoption of patterns for high availability at multiple levels. At the infrastructure level, useful patterns include clustering for failover, mirrored disks, data replication, load balancing and redundancy.

At the software development lifecycle (SDLC) level, techniques such as continuous integration, automated and hot deployments and automated test suites will prove necessary. At the integration architecture level (for systems needing access to data in the consolidated repository), patterns such as asynchronicity, loose coupling and caching will need to be followed.
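
To illustrate two of those integration-level patterns, here is a sketch of retries with exponential backoff plus caching; the repository call is a stand-in, not a real client library.

```python
import time
from functools import lru_cache

def fetch_policy_from_repository(policy_no: str) -> dict:
    # Stand-in for a real call to the consolidated data repository.
    return {"policy_no": policy_no, "status": "active"}

def with_retries(fn, attempts: int = 3, base_delay: float = 0.5):
    """Retry a flaky call with exponential backoff instead of failing fast."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))

@lru_cache(maxsize=10_000)
def cached_policy_lookup(policy_no: str) -> dict:
    # Cache hot lookups so repeated self-service queries don't hammer the repository.
    return with_retries(lambda: fetch_policy_from_repository(policy_no))

print(cached_policy_lookup("POL-123"))
```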

5. Encryption of sensitive data

Once data from multiple systems is consolidated into a single repository, the impact of a potential breach in security will be amplified several-fold – and breaches will happen; it’s only a matter of time, be they internal or external, innocent or malicious. To mitigate some of that risk, it’s worthwhile to invest in infrastructure level encryption (options are available in each of the storage, database and data access layers) of, at a minimum, sensitive data.
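
As a sketch of encryption at the data-access layer (one of the three layers mentioned), using the pyca/cryptography library’s Fernet recipe and an in-memory SQLite table: the sensitive column is ciphertext before it ever reaches storage.

```python
import sqlite3
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()  # in practice, held in a KMS/HSM, not in code
f = Fernet(key)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, ssn BLOB)")

# The sensitive column is ciphertext at rest; a stolen database file reveals nothing.
conn.execute("INSERT INTO customer (name, ssn) VALUES (?, ?)",
             ("John Smith", f.encrypt(b"000-00-0000")))

row = conn.execute("SELECT ssn FROM customer WHERE name = ?", ("John Smith",)).fetchone()
print(f.decrypt(row[0]))  # b'000-00-0000'
```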

A successful data management program spans several IT disciplines. To ensure coherency across all of them, oversight from a versatile architect capable of conceiving infrastructure, data and integration architectures will prove invaluable.