
7 Ways Your Data Can Hurt You

Your data could be your most valuable asset, and participants in the workers’ compensation industry have plenty of it, having collected and stored data for decades. Yet few analyze that data to improve processes and outcomes or to take action in a timely way.

Analytics (data analysis) is crucial to every business today: it yields insights into product and service quality and business profitability, and it measures the value being contributed. But the processes by which data is collected, analyzed and reported need to be examined. Begin by examining these seven ways data can hurt or help.

1. Data silos

Data silos are common in workers’ compensation. Individual data sets are used within organizations and by their vendors to document claim activity. Without interoperability (the ability of a system to work with other systems without special effort on the part of the user) or data integration, the silos naturally fragment the data, making it difficult to gain a full understanding of the claim and its multiple issues. A comprehensive view of a claim includes all its associated data.

2. Unstructured data

Unstructured documentation, in the form of notes, leaves valuable information on the table. Notes sections of systems contain important information that cannot be readily integrated into business intelligence. The cure is to incorporate data elements such as drop-down lists to describe events, facts and actions taken. Such data elements provide claim knowledge and can be monitored and measured.
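The difference is easy to see in miniature. The sketch below assumes a hypothetical set of drop-down event codes (the names `ClaimEvent`, `claim_log` and the codes themselves are illustrative, not from any real claims system): once events are captured as enumerated values rather than free-text notes, they can be counted and measured directly.

```python
from collections import Counter
from enum import Enum

# Hypothetical event codes standing in for a claim system's drop-down list.
class ClaimEvent(Enum):
    INITIAL_CONTACT = "initial_contact"
    MEDICAL_REVIEW = "medical_review"
    RETURN_TO_WORK = "return_to_work"

# Structured entries, unlike narrative notes, feed straight into reporting.
claim_log = [
    {"claim_id": "C-1001", "event": ClaimEvent.INITIAL_CONTACT},
    {"claim_id": "C-1001", "event": ClaimEvent.MEDICAL_REVIEW},
    {"claim_id": "C-1002", "event": ClaimEvent.INITIAL_CONTACT},
]

event_counts = Counter(entry["event"] for entry in claim_log)
print(event_counts[ClaimEvent.INITIAL_CONTACT])  # 2
```

The same two facts buried in a notes field ("called claimant today…") would be invisible to this kind of measurement.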

3. Errors and omissions

Manual data entry is tedious work and often results in skipped data fields and erroneous content. When users are unsure of what should be entered into a data field, they might make up the input or simply skip the task. Management has a responsibility to hold data entry staff accountable for what they add to the system. It matters.

Errors and omissions can also occur when data is extracted via optical character recognition (OCR), the automated recognition of printed or written text characters by a computer. OCR interpretation should be reviewed regularly for accuracy and to confirm that the entire scope of content is being retrieved and added to the data set. Changing business needs may result in new data requirements.
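One lightweight way to review OCR output is to validate extracted fields against expected patterns before they enter the data set. The field names and formats below are illustrative assumptions, not from any particular system; the point is that a typical OCR misread (a letter "O" in place of a zero) is caught mechanically.

```python
import re

# Hypothetical required fields for a scanned claim form, with
# illustrative validation patterns.
REQUIRED_FIELDS = {
    "claim_number": re.compile(r"^C-\d{4,}$"),
    "date_of_injury": re.compile(r"^\d{4}-\d{2}-\d{2}$"),
}

def audit_ocr_record(record):
    """Return the fields that are missing or fail validation."""
    problems = []
    for field, pattern in REQUIRED_FIELDS.items():
        value = record.get(field, "")
        if not pattern.match(value):
            problems.append(field)
    return problems

# OCR has misread the leading "0" in the year as a letter "O".
flagged = audit_ocr_record({"claim_number": "C-1001",
                            "date_of_injury": "2O14-01-05"})
print(flagged)  # ['date_of_injury']
```

Records that fail the audit can be routed to a human reviewer instead of silently corrupting the data set.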

4. Human factors

Other human factors also affect data quality. One is intimidation by IT (information technology). Usually this is not intended, but remember that people in IT are not claims adjusters or case managers. The things of interest and concern to them can be completely different, and they use different language to describe those things.

People in business units often have difficulty describing to IT what they need or want. When IT says a request will be difficult or time-consuming, the best response is to persist.

5. Timeliness

Critical information found in current data needs to be reported in a timely, appropriate way. The data can often reveal important facts that can be reported automatically and acted upon quickly to minimize damage. Systems should continually monitor the data and report on it, thereby gaining workflow efficiencies. Time is of the essence.
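Automated monitoring of this kind can be as simple as a scheduled rule over the claims data. The sketch below is a minimal illustration under assumed names: the 14-day threshold, the claim fields and the idea of flagging "stale" open claims are all hypothetical examples of the sort of rule a system might run.

```python
from datetime import date

# Illustrative rule: flag open claims with no activity in the last 14 days.
STALE_DAYS = 14

def stale_claims(claims, today):
    """Return the IDs of open claims whose last activity is too old."""
    return [c["claim_id"] for c in claims
            if c["status"] == "open"
            and (today - c["last_activity"]).days > STALE_DAYS]

claims = [
    {"claim_id": "C-1001", "status": "open", "last_activity": date(2015, 5, 1)},
    {"claim_id": "C-1002", "status": "open", "last_activity": date(2015, 5, 20)},
    {"claim_id": "C-1003", "status": "closed", "last_activity": date(2015, 4, 1)},
]
print(stale_claims(claims, today=date(2015, 5, 22)))  # ['C-1001']
```

Run daily, a rule like this surfaces claims drifting toward trouble while there is still time to act.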

6. Data fraud

Fraud finds its way into workers’ compensation in many ways, even into its data. The most common data fraud is found in billing—overbilling, misrepresenting diagnoses to justify procedures and duplicate billing are a few of the methods. Bill review companies endeavor to uncover these schemes.

Another, less obvious means of fraud is through confusion. A provider may use multiple tax IDs or NPIs (national provider identifiers) to obscure the fact that a whole set of bills is coming from the same individual or group. The system will consider the multiple identities as different and not capture the culprit. Providers can achieve the same result by using different names and addresses on bills. Analysis of provider performance is made difficult or impossible when the provider cannot be accurately identified.
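A basic countermeasure is identity resolution: group bills by a normalized name and address and look for entities billing under more than one identifier. The sketch below is a deliberately crude illustration with made-up data; production matching would use fuzzier comparison than simple string normalization.

```python
from collections import defaultdict

def normalize(name, address):
    """Crude normalization key; real matching would be far fuzzier."""
    clean = lambda s: s.lower().replace(".", "").replace(",", "").strip()
    return (clean(name), clean(address))

bills = [
    {"npi": "111", "name": "Smith Clinic",  "address": "12 Main St."},
    {"npi": "222", "name": "SMITH CLINIC",  "address": "12 Main St"},
    {"npi": "333", "name": "Jones Imaging", "address": "9 Oak Ave"},
]

# Collect every NPI seen for each normalized name/address pair.
by_identity = defaultdict(set)
for b in bills:
    by_identity[normalize(b["name"], b["address"])].add(b["npi"])

# Entities billing under more than one NPI deserve a closer look.
suspicious = {k: v for k, v in by_identity.items() if len(v) > 1}
print(suspicious)  # {('smith clinic', '12 main st'): {'111', '222'}}
```

Here the two "different" providers collapse into one entity once the cosmetic differences in capitalization and punctuation are stripped away.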

7. Data as a work-in-process tool

Data can be used as a work-in-process tool for decision support, workflow analysis, quality measurement and cost assessment, among other initiatives. Timely, actionable information can be applied to work flow and to services to optimize quality performance and cost control.

Accurate and efficient claims data management is critical to quality, outcome and cost management. When data accuracy and integrity are overlooked as an important management responsibility, it will hurt the organization.

Unstructured Data: New Cyber Worry

Companies are generating mountains of unstructured data and, in doing so, unwittingly adding to their security exposure.

Unstructured data is any piece of information that doesn’t get stored in a database or some other formal data management system. Some 80% of business data is said to be unstructured, and that percentage has to be rising. Think of it as employee-generated business information—the sum total of human ingenuity that we display in the workplace, typing away on productivity and collaboration software and dispersing our pearls of wisdom in digital communications.


Unstructured data is all of the data that we are generating on our laptops and mobile devices, storing in cloud services, transferring in email and text messages and pitching into social media sites.

Many companies are just starting to come to grips with the complex challenge of figuring out how to categorize and manage this deluge of unstructured data.

Sensitive data at risk

But what’s more concerning is the gaping security exposure.

It was unstructured data—in the form of a text message transcript of employees conversing about deflating footballs—that blindsided the New England Patriots NFL team and its star quarterback, Tom Brady.

Yet the full scope of risk created by unstructured data is much more profound.

“The risk that unstructured data poses dwarfs that of any other type of data,” says Adam Laub, product management vice president at STEALTHbits Technologies. “It is the least understood form of data in terms of access, activity, ownership and content.”

STEALTHbits helps companies that use Windows Active Directory identify and keep more detailed track of shared files that hold unstructured data. That may sound basic. Yet the fact that STEALTHbits is part of a thriving cottage industry of technology vendors helping organizations get a grip on unstructured data is truly a sign of the times. I met with Laub as he was pitching STEALTHbits’ technology at the recent RSA Conference in San Francisco. “Any single file can contain the data that puts an organization in the headlines, and turning a blind eye to the problem or claiming it’s too big to handle is not a valid excuse for why unstructured data hasn’t been secured properly,” Laub says.

A decade and a half has elapsed since the Y2K scare. During that period, business networks have advanced and morphed and now tie extensively into the Internet cloud and mobile devices.

Time to close loophole

Along the way, no one had the foresight to champion a standard architecture to keep track of—much less manage and secure—unstructured data, which continues to grow by leaps and bounds.

Criminals certainly recognize the opportunity for mischief that has resulted. It’s difficult to guard the cream when the cream can be accessed from endless digital paths.

Just ask Morgan Stanley. Earlier this year, a low-ranking Morgan Stanley financial adviser pilfered, then posted for sale, account records, including passwords, for 6 million clients. The employee was fired and is being investigated by the FBI. But Morgan Stanley has to deal with the hit to its reputation.

“The urgency is that your information is under attack today,” says Ronald Arden, vice president at Fasoo USA, a data management technology vendor. “Somebody is trying to steal your most important information, and it doesn’t matter if you’re a small company that makes widgets for the oil and gas industry or you’re Bank of America.”

Fasoo’s technology encrypts any newly generated data that could be sensitive and fosters a process for classifying which types of unstructured data should routinely be locked down, Arden told me.

Technology solutions, of course, are only as effective as the people and processes in place behind them. It is incumbent upon executives, managers and employees to help make security part and parcel of the core business mission. Those that don’t do this will continue to be easy targets.

Steps forward

Simple first steps include identifying where sensitive data exists. This should lead to clarity about data ownership and better choices about granting access to sensitive data, says STEALTHbits’ Laub.

This can pave the way to more formal “Data Access Governance” programs, in which data access activities are monitored and user behaviors are baselined. “This will go a long way towards enabling security personnel to focus on the events and activities that matter most,” says Laub.
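Identifying where sensitive data exists can start with something as simple as pattern scanning over shared documents. The sketch below is a toy version of that first step; the two patterns are illustrative assumptions, and real data-discovery tools use far broader rule sets than a pair of regular expressions.

```python
import re

# Illustrative patterns only; real discovery tools use broad rule sets.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b"),
}

def scan_text(text):
    """Return the kinds of sensitive data that appear in a document."""
    return sorted(kind for kind, pat in PATTERNS.items() if pat.search(text))

memo = "Team party tonight! HR note: employee SSN 123-45-6789 on file."
print(scan_text(memo))  # ['ssn']
```

A scan like this, run across file shares, gives a first map of which documents need ownership decisions and access restrictions.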

Smaller organizations may have to move much more quickly and efficiently. Taking stock of the most sensitive information in a small or mid-sized organization is doable, says Fasoo’s Arden.

“If you are a manufacturing company, the intellectual property around your designs and processes are the most critical pieces of information in your business, if you are a financial company it’s your customer records,” Arden says. “Think about securing that information with layers of encryption and security policies to guarantee that that information cannot leave your company.”

Some unstructured business data is benign and may not need to be locked down. “If I write you a memo that says, ‘We’re having a party tonight,’ that’s not a critical piece of information,” says Arden. “But a financial report or intellectual property or something related to healthcare or privacy, that’s probably something that you need to start thinking about locking down.”

Getting to 2020: the Finance Function

Even as economies recover, the insurance sector continues to face many competitive pressures and regulatory challenges. Yet a new drive for growth is emerging. The 2014 EY Global Insurance CFO Survey captures the priorities and challenges for finance and actuarial teams as they seek to support business growth strategies while addressing regulatory and cost pressures.

Delivering more value to the business through performance measurement and improved decision support is the top priority for the finance function through 2020. Among senior finance professionals participating in the survey, 71% indicated that “being a better business partner” ranked among their top three priorities, with 35% placing this as number one.

As insurance companies around the world continue to invest in data management and analytics capabilities, the role of finance and actuarial functions has become even more critical. The processes and systems supporting these functions are key to developing deep insights into business performance, as well as customer needs, preferences and behavior. In response, finance leaders have been increasing their efforts to improve the capabilities of their organizations to meet the new demands. In the survey, 89% of respondents stated that they have either begun a change program or are in the planning stage.

However, the drive to better insights is not without challenges. Among the issues is the impact of continuing regulatory compliance demands. According to 35% of those surveyed, implementing new regulatory and financial reporting requirements was the highest priority for finance and actuarial organizations; 56% ranked this among their top three. As a result, the ability of these organizations to strike a balance between delivering value to the business and meeting daily operational demands will continue to be a challenge.

Not surprisingly, the current data and technology footprint will require significant change to meet the challenges of the finance function of the future. Across the finance operating model, survey participants scored data as the least developed capability on average, while technology recorded the greatest gap between current and required future state.

Other Key Findings

  • Top three business drivers: #1 growth, #2 managing costs and #3 regulatory changes
  • Two-thirds of respondents rank data and technology issues among the top three challenges facing finance and actuarial functions; participants on average score data as their least developed capability
  • By 2020, the most significant shifts in maturity levels by operating model will be in data management and technology capabilities
  • Respondents expect onshore shared services to support transaction processing functions, with outsourcing selectively used for payroll and internal audits
  • Decision support and controls are expected to account for a larger share of finance and actuarial headcount by 2020

What insurers must do

We see three key areas where insurers can take action:

  • Modify current reporting processes by developing an efficient reporting solution architecture.
  • Deliver timely and relevant management information and link strategic objectives to performance indicators.
  • Improve finance and actuarial operational performance by using the right skills and processes to strike a balance between effectiveness and efficiency.

For the full survey from which this excerpt was taken, click here.

Solvency 2: An Outcome Very Different Than Planned

The original intention of the EU's Solvency 2, the regulatory requirement for capital held by insurers, was to create a framework that would inspire policyholder confidence and restore trust. The real outcome was to force insurers to undertake massive programs of data management at costs that, for some Tier 1 insurers, have exceeded $200 million. Some insurers said they would pass the cost on to their customers, which I’m sure wasn’t the intention.

In what was arguably worse, the cost became so great that other useful programs were put on hold because of this burning regulatory platform. The knock-on effect has been to create delay especially in customer-facing activities (which would have had a far better impact in improving confidence and trust).

Some international insurers suggested that the requirements might prevent them from trading in Europe – creating a “Fortress Europe” – but Solvency 2 seems to be emerging in multiple guises around the globe, in China, Latin America, South Africa and of course the U.S. in the form of RMORSA.

There’s lots written on this topic, such as http://www.solvencyiiwire.com/, and I won’t bore you, but as I looked out at the faces at a major conference in the U.S. where I spoke recently, I recognized the look I saw in many insurers in Europe in 2008 — that of not really knowing what was going to hit them.

Insurers were to discover that more than 80% of both cost and implementation time was absorbed in data management, 15% on analysis and the small balance on risk reporting. Yet the reporting element proved to be the only part visible – reminding me of an iceberg analogy, with the reporting being that part of the ‘berg visible above the waterline.

Comparing risk and regulation to an iceberg is interesting, and as I looked around the room at the conference, I wondered how many attendees were ready for what would be, for them, a long and difficult passage. But not, I hope, a Titanic one…

Realities of Post-Disaster Data Recovery

The construction industry’s dependence on information technology systems continues to expand with the dramatic shift from document management to data management. With this reliance comes an increased vulnerability to business disruption. Data management, business continuity and post-disaster data recovery require a shift in mindset from firefighting to fire prevention. Zero disruptions is a bold strategic imperative that provides a competitive advantage by enhancing field productivity, increasing office efficiency, reducing downtime and preventing data losses. Effective data backup and post-disaster recovery protocols are the essential steps to minimize business disruptions.

Data management today requires an enterprise view integrating a company’s increasingly complex networks. Data must be construed to encompass all information generated, received, transmitted, stored and retrieved throughout the organization. Additionally, data must be incorporated from its various physical and virtual locations, including mobile devices. Following are IT trends affecting AEC (architecture, engineering and construction) companies:

  • expansion of email as the predominant form of intra- and inter-company communication;
  • growth of online project management tools that use smartphones and tablets to access and transmit data;
  • increased adoption of document imaging to replace paper recordkeeping files;
  • growth of enterprise resource planning (ERP) platform systems and integration with best-in-class specialty software programs;
  • estimators’ use of the same database to work from multiple locations on complex projects;
  • increased adoption of, and massive data files generated by, BIM;
  • emergence of hosted and cloud-based data recovery systems;
  • expansion of e-discovery in litigation, which raises expectations for (and increases the risks of) record retention; and
  • proliferation of social media networks combined with bring-your-own-device policies, which creates new portals for hacking, malware and viruses.

The severity of natural disasters and the escalating number of man-made emergencies and technological disruptions compound the construction industry’s dependence on IT systems. Many of these disruptions “only” result in temporary IT system shutdowns, while others pose a threat to the viability of the business.

A company’s vulnerability to data loss can be increased or decreased by the actions taken (or not taken) with regard to data backup and recovery. A robust business continuity plan is the first step. Companies have many choices when selecting the best way to back up their vital information and mission-critical data.

The Need for a Comprehensive Business Continuity Strategy

Automatic offsite (hosted or cloud-based) data backup protocols at regular intervals are the best prevention for data loss. These backups must be set for every type of data and for every type of device accessing, transmitting or storing information.
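In miniature, a scheduled backup job can be as simple as archiving a directory under a timestamped name. The sketch below is a minimal local illustration only; a real protocol would ship the archive offsite or to a cloud service and verify that it restores cleanly, and the function and directory names here are assumptions for the example.

```python
import pathlib
import shutil
import tempfile
from datetime import datetime

def backup_directory(source_dir, backup_root):
    """Create a timestamped zip archive of source_dir under backup_root.

    A minimal sketch of a scheduled backup step; offsite transfer and
    restore testing are deliberately out of scope here.
    """
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive_base = pathlib.Path(backup_root) / f"backup-{stamp}"
    return shutil.make_archive(str(archive_base), "zip", source_dir)

# Demonstration with throwaway directories.
src = pathlib.Path(tempfile.mkdtemp())
(src / "estimate.txt").write_text("bid data")
dest = tempfile.mkdtemp()
archive = backup_directory(src, dest)
print(archive.endswith(".zip"))  # True
```

Pointing a scheduler (cron, Task Scheduler) at a job like this, with the destination on remote storage, is the skeleton of the automatic, regular-interval protocol described above.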

Another data recovery strategy is imaging the company’s server and running the restored replica image from a new server in a remote location. However, this strategy requires pre-planning. In a large-scale disaster, obtaining replacement servers may not be possible.

Causes, Costs and Consequences of Data Loss

Data disruption is a reality of the modern work environment. Causes of data loss include:

  • failure to initiate or maintain regular data backups;
  • hardware failure;
  • human error resulting in accidental deletion, overwriting of data or forgetting to add new IT systems/devices to backup protocols;
  • failure to test the backup and data recovery restoration process to determine adequacy;
  • software or application corruption;
  • power surges, brownouts and outages;
  • computer viruses, malware or hacking;
  • theft of IT equipment; and
  • hardware damage or destruction from vandalism, fire and water (rain, flood or sprinkler system discharge).

The consequences of lost data include direct loss of revenue from missing bid submissions or customer orders, direct expenses to pay for technical specialists to help recover data, decreased productivity during the shutdown and costs to re-key or obtain replacement data. For contractors selling directly to consumers, the loss of Internet connection for any extended time could prove costly. Lost data also can result in litigation for breach of confidential information plus adverse publicity.

A 2012 study commissioned by cloud-based data backup company Carbonite revealed 45% of small businesses (defined as fewer than 1,000 employees) had suffered a data loss. Fifty-four percent of the data losses were attributed to hardware failure, and the average cost for data recovery was $9,000.

Real-World Data Loss Scenarios

  • Laptop motherboard failure. A project estimator was working offline when the motherboard crashed. Because of a tight deadline, he had to restart the estimate from scratch. Although the bid was successfully submitted on time, the estimator fell behind on pricing other jobs that the company failed to win.
  • Lost iPhone. Pictures of a project safety incident with documentation of a mismarked “one-call system” utility spot were lost. The photo documentation had not been transmitted to the office, and the contractor lost the request for damages against the utility locating service. Moreover, the smartphone was not properly password-secured, allowing unauthorized access to contacts, client information and company data.
  • Desktop computer backup location not properly mapped to server. When a workstation was upgraded with a new desktop computer, it was not mapped to the server for automatic backup. The computer hard drive crashed, and no files were backed up. Recovery using the old desktop computer was slow, and data created on the new computer was lost.
  • New database not added to the nightly backup protocol. A company purchased a new customer relationship management database and, after a power outage, realized it had not been added to the nightly data backup protocol.
  • Onsite data backup location destroyed. The building housing an onsite backup server was struck by lightning, which started a fire and resulted in a total loss of all current and historical data.
  • Disaster recovery software not properly configured. While conducting a test of a company’s disaster recovery plan, it was discovered that some critical data was not being captured in the backup files.
  • Laptop and tablet stolen from a jobsite trailer. The field equipment had not been backed up for several weeks, resulting in the loss of key project documentation.

Best Practices for Data Management

Data management and IT network administration are strategic functions unique to each company. It is not possible to delineate all data management best practices, but the following guidelines should help enhance most companies’ post-disaster data recovery efforts:

  • Determine the company’s recovery-time objectives, and plan and budget accordingly. Identify which functions and systems must remain operational at the time of a disruption or disaster. This requires advance planning and budgeting for necessary systems and technical support services. It also helps prioritize risk-reduction strategies, including investments in data management backup system and security upgrades.
  • Develop a written business continuity plan that outlines specific responsibilities for protecting vital information and mission-critical data. The business continuity plan should include protocols for backup and synchronization of all office systems and virtual/mobile devices. It also should address the frequency and format for testing data management integrity and security, as well as how gaps will be identified and addressed.
  • Inventory the company’s vital information and mission-critical data, and verify it is being backed up. Key considerations include how the data is being backed up, by whom and how frequently, as well as where the backup data is stored. It is important to ensure the data backup and restoration process work as designed.
  • Initiate automatic scheduled backups, ensure the backup data is stored offsite, and test the adequacy of the data backup and restoration methods. Consider the added benefits of imaging the company’s servers to achieve a complete restoration of the data management system.
  • Develop a comprehensive diagram of the company’s integrated data management network, including all physical and virtual/mobile subsystems. Ideally, this will be an “as built” blueprint of the company’s configuration consisting of the hardware, operating systems, software and applications that make up the data management network.
  • Institute policies regarding the use of the company’s Internet, including security protocols. Implement policies for user authentication, password verification, unacceptable personal devices and reporting of lost equipment. It is essential to communicate these policies and security protocols to all users and to train new employees.
  • Establish proactive management of the company’s data and IT network. Ensure the company’s network administrator has state-of-the-art tools, including remote access, help desk diagnostics and anti-spam and malware protection. Request periodic updates on all software licensing audits and verification that all security patch updates have been installed on a timely basis. Establish a fixed replacement schedule for hardware and software.
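Several of the guidelines above come down to one discipline: proving that what was backed up can actually be restored intact. A checksum round trip is the simplest version of that test; the sketch below uses a plain file copy as a stand-in for a real backup/restore cycle, and all file names are illustrative.

```python
import hashlib
import pathlib
import shutil
import tempfile

def checksum(path):
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(pathlib.Path(path).read_bytes()).hexdigest()

# Sketch of a restore test: back a file up, restore it elsewhere, and
# confirm the checksum survives the round trip.
work = pathlib.Path(tempfile.mkdtemp())
original = work / "project.db"
original.write_text("mission-critical data")

restored = work / "restored.db"
shutil.copy(original, restored)  # stand-in for a real backup/restore cycle

print(checksum(original) == checksum(restored))  # True
```

A mismatch at this step is exactly the kind of gap (a mis-mapped workstation, an unconfigured database) that the real-world scenarios above describe, caught before a disaster rather than after.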

There is good news and bad news regarding business data management and recovery. The bad news is that the need for post-disaster data recovery can no longer be ignored. The increasingly complex and connected business world demands pre-planning for business continuity. The good news is that data management and recovery services are scalable to meet the custom needs of every business regardless of the size and scope of the operation and its degree of data dependence.

Reprinted with permission from Construction Executive, January 2014, a publication of Associated Builders and Contractors Services Corp. Copyright 2014. All rights reserved.