
InsurTech Forces Industry to Rethink

As consumers use more and more online sources or digital agents to manage their policies, local agents will become less important.

Some actors in the insurance industry have placed the whole sector under general suspicion in the minds of customers. Customers want to be able to identify with a company, and that can only be achieved by successful branding campaigns aimed at establishing a relationship between the company and the consumer. If consumers agree with the values a company stands for, they are more inclined to trust it. Young companies have the advantage of a fresh start: while traditional insurance carriers and traditional insurance agents are suffering a decline in reputation and need to win back trust, young companies can position themselves as an alternative to an unpopular industry and benefit from unbiased consumers.

See Also: A Mental Framework for InsurTech

In a way, InsurTech start-ups can provide a more personal service than traditional agents can, making use of marketing and brand advertising, as well as social media presence and availability. This becomes increasingly important as customers use an ever wider variety of channels to interact with their insurance agent; brick-and-mortar agents will have a hard time keeping up. Besides reputation, the use of technological advances will be an essential determinant of insurance agents' success. As consumers use more and more online sources or digital agents to manage their insurance policies, local insurance agents will become less important. A report recently released by the consulting firm McKinsey includes a chapter on "the end of an era for the local insurance agent." That sums up the future of traditional, analogue agents, whose business model will no longer withstand innovative alternatives. In addition to digital solutions and mobile-first applications, traditional agents will face competition from insurance carriers that focus more on establishing a direct relationship with their customers and gradually factor agents out of the equation.
Technology and simplification are changing nearly every aspect of our lives. Customers have become used to managing everything from shopping to paying bills with the help of apps, and our always-on culture is bringing more customized options into the very personal areas of finance. As a result, managing insurance policies offline is beginning to seem like an unnecessary and tedious task. Comparing insurance online has been popular ever since the first portals appeared, but the gap in consulting was harder to close. Insurance brokers watched the rise of insurance portals with equanimity, knowing those portals could not advise the mostly helpless customer on insurance matters. However, the development and distribution of insurance apps that offer an easy-to-understand overview of complicated insurance products, as well as independent advice, has led to fierce competition in the insurance market. Consumers do not want to store piles of paperwork in large folders or set aside their spare time to meet their insurance agent for coffee. As a result, these agents are losing customers to the innovative alternative. New players provide comprehensive services, replacing traditional brokers who do not have the resources to develop the tools necessary for better customer service. The new market players figure out what kind of solution consumers want and need, and they adapt accordingly. Achieving this requires pushing into the market and spending money: investors' money that established companies usually do not have. Providers of mobile-first insurance solutions drive change in the insurance industry with a clear focus on advisory automation and the mobile service experience.
Apps like Knip do not just use technological advances as an addition to traditional services; they change the business model the insurance industry is based on, following a consumer-friendly approach and making products more transparent. That benefit will be hard for traditional brokers and insurers to copy. They can implement technological features and develop their own apps, but their business is based on sales rather than advice, so they will lose customers to more innovative companies. Both traditional agents and insurance carriers are pondering this development, but their inflexible organizational structures will be hard to overcome in providing more technology- and customer-oriented services. To be sure, traditional brokers and advisers will still be needed as technology advances. Some people prefer to welcome their insurance broker into their home and would not dream of a mobile customer-broker relationship. Yet this group will continue to shrink and become more specialized over time, something everyone working in this sector should be aware of. Only the best of the best of the traditional brokers will survive, whether because of their special expertise, their marketing and brand advertising success or their operational efficiency. Consumers are so used to digital products that they will not hesitate to punish any provider that cannot or does not offer them. There are still challenges ahead for start-ups and innovative companies; patience and persistence are crucial. Because they are innovating in a very traditional industry, not everyone is excited about digital solutions, mostly because people do not know what to expect. Informing and educating are important to overcoming this fear. It might take a while, but it is starting to work.

See Also: InsurTech: Golden Opportunity to Innovate

What everyone needs to be aware of is that ignoring the development will not end it. Those who turn a blind eye to it now will end up being left behind.
Ten years ago, online banking was something extraordinary. Today, it is hard to imagine life without it, and very few people still fill out transfer forms at their local bank. This development will continue with more innovative ideas and more technological progress. Insurance is undergoing the same process, and, a few years from now, it will be perfectly normal to manage policies online and via mobile. The insurance industry realizes it needs to adjust to customers’ expectations if it does not want to be left behind.

Dennis Just


Dennis Just is the founder and CEO of Knip. At age 16, he founded his first company in the e-sports business, turned it into the largest German portal in that field and sold it three years later, shortly after graduating from school.

Is Driverless Moving Too Fast?

Is Tesla racing toward victory or calamity with its Autopilot? The answer hinges on a key issue in human/robot interaction.

An excellent article by Levi Tillemann and Colin McCormick at The New Yorker lays out the advantages that Tesla has in the race toward driverless cars. Some, however, think that Tesla is driving recklessly toward that goal. Is Tesla racing toward victory or calamity? The answer hinges on a key issue in human/robot interaction.

Don Norman, Director, Design Lab, University of California San Diego (Source: JND.org)

Don Norman, the director of the Design Lab at the University of California, San Diego, argues that the most dangerous model of driving automation is the "mostly-but-not-quite-fully-automated" kind. Why? "Because the more reliable the automation, the less likely the driver will be to respond in time for corrective action. Studies of airline pilots who routinely fly completely automated airplanes show this (as do numerous studies over the past six decades by experimental psychologists). When there is little to do, attention wanders." [Source: San Diego Union-Tribune]

Norman contends that car autopilots are more dangerous than airplane autopilots. Airplanes are high in the sky and widely separated, and pilots are well-trained and have several minutes to respond. Drivers are not nearly as well-trained and may have only seconds. Yet Tesla's "Autopilot" follows exactly the "mostly-but-not-quite-fully-automated" model about which Norman warns. I asked Norman about Tesla's approach. His response: "Tesla is being reckless. From what I can tell, Tesla has no understanding of how real drivers operate, and they do not understand the need for careful testing. So they release, and then they have to pull back."

As examples, Norman pointed to Tesla's highway passing, automatic parking and summon features. Norman considers the highway passing feature dangerous because Tesla's cars do not have sufficient backward-looking sensors. Mercedes and Nissan, he noted, have far better back-looking sensors.
Consumer Reports found serious issues with Tesla's automatic parking and summon features. Tesla's sensors had high and low blind spots, causing the car to fail to stop before hitting objects like duffel bags and bicycles. There were also issues with the user interface design: the parking control buttons on the car key fob were not marked, and the car continued to move when the iPhone app was closed. Consumer Reports told its readers, "It is critical to be vigilant when using this feature, especially if you have children or pets." Tesla fixed these problems once Consumer Reports raised its safety concerns. Here's Don Norman's observation about Tesla's quick response: "Good for Tesla, but it shows how uninformed they are about real-world situations." "Tesla thinks that a 1 in a million chance of a problem is good enough. No. Not when there are 190 million drivers who drive 2.5 trillion miles."

If Norman is right, Tesla owners will grow less attentive, rather than more vigilant, as Tesla's autopilot software gets better. Situations where their intervention is needed will become rarer but also more time-sensitive and dangerous. Indeed, customer experience with Tesla's early autopilot software produced a number of reports and videos of silly antics and near-calamitous cases, including a near accident captured on video by uploader Joey Jay and another particularly "fun and scary" segment.

If Norman is wrong, Tesla does have a huge advantage, as Tillemann and McCormick note. Other companies pursuing a semi-autonomous approach, like GM and Audi, have been slower to deploy new models with comparable capabilities. Google, which advocates a fully driverless approach for the reasons that Norman cites, is mired in state and national struggles to remove the regulatory limits on its approach.
Even if Google gets the green light, its pace is constrained by its relatively small fleet of prototypes and test vehicles. Tesla, on the other hand, has a powerful software platform that allows it to roll out semi-autonomous capability now, as it deems appropriate. And it is doing so aggressively. Autopilot is already on more than 35,000 Tesla models on the road, and Tesla just announced a promotion offering one-month free trials to all Model S and X owners. Soon, it will be preinstalled on all of the more affordable Model 3, of which more than 300,000 have been preordered. That's a critical advantage. The quality of autonomous driving software depends in large part on the test cases that feed each developer's deep learning AI engines. More miles enable more learning and could help Tesla's software outdistance its competitors.

The challenge, however, is that Tesla is relying on its customers to discover the problems. As noted in Fortune, Elon Musk has described Tesla drivers as essentially "expert trainers for how the autopilot should work." Tom Mutchler wrote in Consumer Reports that "Autopilot is one of the reasons we paid $127,820 for this Tesla." But, he also noted, "One of the most surprising things about Autopilot is that Tesla owners are willingly taking part in the research and development of a highly advanced system that takes over steering, the most essential function of the car."

Tesla's early-adopter customers are willing, even enthusiastic, participants. But should untrained, non-professional drivers be relied upon to be ready when Tesla's autopilot needs to return control to the human driver? Can they anticipate problems and intervene to retake control without being asked? Will they follow safety guidelines and use the autopilot only under recommended conditions, or will they push the limits as their confidence grows? Imagine the consequences if a new slew of Tesla owner videos ended with a catastrophic failure rather than a nervous chuckle.
It would be tragic for the victims and Tesla. It might also dampen the enthusiasm for driverless cars in general and derail the many benefits that the technology could deliver. While the mantra in Silicon Valley is “move fast and break things,” Elon Musk needs to reconsider how much that principle should apply to Tesla’s cars and customers.

How to Redesign Customer Experience

A new approach -- the Humalogy Scale -- lets you improve customer service while creating a lean organization that lowers costs.


Humans have amazing capabilities, and they can be amplified even further by the power of technology. When both are working in harmony, what was once impossible becomes possible. Technology is now the "X" factor that can help you become more efficient while more effectively serving your base. When it is intentionally aligned with human effort, technology acts as a weapon you can wield to strengthen your organization, increase the ability of your team and delight your customers or members. Discovering this human/technology balance is a process we often walk our clients through; many of these clients are in the financial services space, an industry being transformed by technology as much as any. There will be winners and losers in the financial space, and the defining variable will be how well you can learn to integrate humans and technology to deliver your business model. We call this integration Humalogy.

What is Humalogy?

Humalogy is the integration of technology and human effort to improve processes and offer a positive and meaningful impact on an organization. That happens only when Humalogy is properly balanced. Understanding the Humalogy balance is critical, because, if left unbalanced, it can be expensive to an organization, highly infuriating to customers or both. The balance you find will enable you to do some magical things. Here are just a few examples:

See Also: Tips on Improving Customer Experience

  • Increase your individual, team and organizational effectiveness and capacity through lean processes and efficiency
  • Increase the quantity of potential and current customers that you are able to effectively reach with your messaging
  • Create an environment of profit amplification by both reducing costs through automation and shaping a better customer experience through the use of digital tools
  • Implement customer service enhancements through technology-enabled convenience such as self-service
  • Enable people in your organization to be more productive and satisfied in their role, because technology has freed their time to accomplish tasks that humans do well while avoiding mundane, automated assignments. Simply, they’re able to focus on the satisfying, cerebral aspects of their careers.

The Humalogy Scale

We have developed the Humalogy Scale to measure the balance of human and technology effort. We use this scale with our clients, many in the risk insurance space. Some processes lean heavily on humans (H5 on the Humalogy Scale), and others rely primarily on technology (T5). Zero is an equal balance of effort from humans and technology. What is important to recognize is that there is no "universally correct" balance. The proper balance for any space is the one that yields the highest level of efficiency combined with the best possible customer or member experience.

For example, H5 would be an insurance agent traveling to an accident and manually filling out a claim. Moving across the scale, we find a claimant snapping a picture of the accident with a smartphone, filing a claim through a mobile app and receiving payment through direct deposit. This requires much less human effort and so sits high on the T side of the scale. By placing these processes on the Humalogy Scale, it becomes easier to determine where to apply technology to drive efficiency, scalability or repetition. At the same time, in our technology-augmented world, we need to be conscious that some processes can be improved by adding the human elements that supply empathy and innovation and build trust. Tasks more suited to humans involve rational processing of information, deep thinking, social and emotional intelligence, creativity, intuition and improvisation. Tasks more suited to computers execute rules or processes, involve repetition or mechanization, require big-data analysis or are too dangerous or too large or small for a human to accomplish.

Finding Humalogy Balance

Humalogy is important because, when you apply this process to your business, it becomes a lens that can help you improve customer service while creating a lean organization that lowers costs.
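The scale described above can be sketched in a few lines of code. This is purely an illustrative model under stated assumptions: the integer score from -5 (H5, fully human) to +5 (T5, fully technology-driven), the class name and the example scores are invented here for demonstration and are not part of the Humalogy methodology itself.

```python
from dataclasses import dataclass

@dataclass
class Process:
    """A business process placed on a hypothetical -5..+5 Humalogy score."""
    name: str
    humalogy: int  # -5 (H5, all human) .. 0 (balanced) .. +5 (T5, all technology)

    def label(self) -> str:
        """Render the score in the H5..T5 notation used in the text."""
        if self.humalogy < 0:
            return f"H{-self.humalogy}"
        if self.humalogy > 0:
            return f"T{self.humalogy}"
        return "balanced"

# The two claims-handling examples from the text, with assumed scores:
manual_claim = Process("agent drives to accident, fills out paper claim", -5)
app_claim = Process("claimant photographs accident, files via mobile app", 4)

print(manual_claim.label())  # H5
print(app_claim.label())     # T4
```

Placing each process on such a scale, even informally, makes the later exercise of deciding which processes to automate a matter of sorting by score rather than debate.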
Have you taken inventory of the technology expectations of your members? No industry is exempt from the evolving expectations of constituents who want access to services easily and instantly. Self-service is how industries are meeting customers where they are: customers are now equipped to complete tasks that once required a service representative, often from their personal tablet or smartphone. Defining which processes you can automate and offer as self-service using technology will help satisfy your customers and endear them to you. On the other hand, the wrong Humalogy balance can result in poor customer service and a loss of loyalty. If your approach to Humalogy is not planned, what was calculated for good can result in catastrophe. How many times have you felt alienated as a customer because a service provider tipped its Humalogy scale and traded personal touch for an automated call center? If someone wants to speak to a human representative, it is important to offer the opportunity. Humalogy is a tool that can be applied in a number of functional areas. The two primary ways we apply Humalogy in the risk insurance space are lean and relationship journey mapping.

Humalogy-Based Lean to Strengthen Process Efficiency

Humalogy-based lean is designed to help organizations make their back-office processes more efficient. Some companies follow Lean Six Sigma practices that have emerged from years of optimizing physical and manufacturing processes. These methods are powerful and effective but can be very narrowly focused on the process. At other times, this approach may improve the human parts of the process but fall short when it comes to implementing technology. On the other end of the spectrum, aggressive automation efforts driven by technologists may miss important nuances that are better handled by humans. In the worst case, a technology-centric approach can result in automating broken processes.
How do you get, and stay, on the right path so that you both improve your processes and automate appropriately? We recommend applying a Humalogy lens that lets you examine a process from several distinct angles:

  • It lets us decide if a process involves a greater emphasis on human effort or technology effort. This helps us understand which processes are too heavily human and in need of automation.
  • It helps determine which processes we should immediately devote attention to improving. We are able to prioritize more effectively.
  • It acts as a reminder that a solution isn’t always a technology solution. Often with processes, a greater human involvement is necessary to help a process run more effectively.
  • When we analyze processes, we are forced to diagram those processes to understand what is happening at each step. This gives deeper insight into how technology may be used to transform a process.

While Humalogy-based lean can help you improve back-office processes, studying Humalogy from the perspective of your customers will help improve their experience. This is accomplished by mapping the relationship journey.

Humalogy to Improve Customer Experience

Relationship journey mapping involves walking alongside your customers as they engage with your organization. We develop a subset of very targeted groups based on individual personas. In this process, we analyze together each critical stage of your customer or member journey and evaluate the touch points where you have the opportunity to engage directly with these personas. The goal of journey mapping is to maximize each opportunity and design the best possible experience for each customer.

See Also: Keen Insights on Customer Experience

The consideration of Humalogy is an important component of our journey mapping process. As you consider each of the personas who interact with your organization, you will also consider their proclivity to use technology at each stage. Would he want you to deliver all correspondence electronically? Would she be more willing to read a print newsletter you've sent her in the mail, or would an email with the same information suffice? Is he more likely to use a desktop computer or a smartphone? Would she be interested in a mobile application or an online portal? Journey mapping allows you to consider the needs of each individual and then discover ways to satisfy those needs. Developing and using proper journey maps allows you to create a one-to-one experience for each of your customers. You will understand how to provide positive engagements that they will likely choose to discuss with their networks. In short, you can increase your value to your customers, and that's really what it's all about. Technology is already transforming your life and your industry.
Technology can also, in an incredible way, transform your organization, everything from your day-to-day operations to the way you engage your customers.


Scott Klososky


Scott Klososky is a speaker, consultant and author, guiding leaders as they use technology to transform their organizations. Klososky is the founder and principal at Future Point of View and the author of four books, including his most recent title, "Did God Create the Internet? The Impact of Technology on Humanity."

How to Create Risk Transparency

A new era is quickly approaching where information and analysis have the potential to remove the cloud engulfing insurance risk.

There was a time not long ago when a bank originated a loan and kept that loan on the balance sheet until it was repaid. The amount banks could lend was limited to the deposits they had on hand and the banks' own ability to borrow. Today, credit risk is traded regularly, with specialized data and analytical services giving investors confidence that they understand the risks they are assuming. But there has been limited opportunity for investors to deploy capital against specific pools of insurance risks, because of a lack of that sort of transparency. With the vehicles that do exist, it has been difficult to structure the transfer of risk to meet investors' respective objectives and risk tolerances. However, insurance may be reaching a point in its evolution where the information gap will begin to narrow.

Until now, insurance risk has often been valued opaquely and highly subjectively. Today, actuaries set reserves based on highly summarized data, and underwriters set premiums based on claims experience that is extrapolated forward using historical loss development patterns and subjective future "trend" projections (or ad hoc substitute measures for risk), neither of which may represent the future risk of loss. Outside of property catastrophe risk, where the data elements are generally available in some detail, granular risk data simply has not existed. However, rapid change could now be approaching. Vehicle telematics, wearable sensors, connected machines and other components of the Internet of Things (and Beings) are producing real-time data that allow us to look at risk in real time, rather than relying on current industry practices.

See Also: A Better Way to Assess Cyber Risks?

Credible, data-driven risk indices may create a variety of opportunities, including:
  • Capital Providers: Investment in specific index-based structured insurance pools that are aligned with respective objectives and relative risk tolerances could improve on the alternatives available today, where those who want to invest in insurance risk are often restricted to investing in insurance companies or risk pools that involve assuming underlying exposure to the operational, asset and credit risk, as well as the insurance risk of the originating insurer’s business.
  • Insurance Clients: Clients are likely to observe premium and associated underwriting decisions more transparently and could thus anticipate the cost/benefit implications of decisions taken to reduce risk.
  • Regulators: Regulators could gain greater confidence in the balance sheet valuation disclosed by insurance companies, which has the potential to decrease the regulator's view of risk capital necessary to support risk.
One could argue that straightforward consumer and commercial loans are much simpler than the risks underwritten by insurers. However, a critical look at the complexity of the financial products currently being traded by investors makes that notion hard to support. In fact, many of the underlying risks facing lenders are very closely related to the risks facing insurers. Perhaps the biggest differentiating factor is the lack of standardization of insurance contracts, which creates a degree of complexity. From a contractual perspective, however, complex derivatives, other hard-to-value instruments and non-transparent assets can be at least as opaque and complex. Yet the core elements for assessing risk are available, and credible calculations exist within the valid range of assumptions.

The insurance industry could benefit from the increasing availability of relevant data. That data could be the byproduct of other applications: route data from fleet management software, vehicle data from predictive maintenance applications, traffic density data from road management applications or environmental data from various sources. Or it could be data custom-generated for insurance applications, such as the data from telematics devices used by personal auto insurers to capture driving behavior. I see the biggest promise in using the data exhaust from other applications. I suspect clients would be averse (in many cases) to additional data capture specifically for insurance but would be open to sharing data already captured, as long as there are appropriate safeguards to ensure that it does not disadvantage them as clients. The industry will need to invest in new analytical techniques to leverage these new data sources. In many other sectors of the economy, "big data" is having a real impact. This has required new tools and algorithms that may be unfamiliar to most analytical professionals within insurance.
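As a purely illustrative sketch of how a data-driven index might be assembled from telematics "data exhaust," the following combines two invented driving-behavior features into a single score. The feature names, weights and 0-100 scale are hypothetical assumptions made for this example, not an industry-standard index.

```python
def risk_index(miles: float, hard_brakes: int, night_miles: float) -> float:
    """Return a hypothetical 0-100 driving-risk score; higher means riskier.

    Combines hard-braking frequency (per 1,000 miles) with the share of
    miles driven at night, using made-up weights for illustration.
    """
    if miles <= 0:
        raise ValueError("miles must be positive")
    brakes_per_1k = hard_brakes / miles * 1000
    night_share = night_miles / miles
    # Weighted linear combination, capped at 100.
    score = 10 * brakes_per_1k + 50 * night_share
    return min(score, 100.0)

# A driver with 3 hard-braking events and 100 night miles over 1,000 miles:
print(risk_index(miles=1000, hard_brakes=3, night_miles=100))  # 35.0
```

The point of such an index is the one made in the text: because it is computed from observed behavior rather than extrapolated claims history, its inputs and weights can be disclosed, giving capital providers, clients and regulators a shared, transparent view of the risk.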
David Mordecai and Samantha Kappagoda, co-founders of the RiskEcon Lab at Courant Institute of Mathematical Sciences, which is among the world's leading applied mathematics and computer science research institutions, explained the necessary evolution: “The increasingly pervasive proliferation of remote sensing and distributed computing (e.g. wearable tech, automotive telematics), and the resulting deluge of 'data exhaust' should both necessitate and enable the emergence of digitized risk management programs. Ubiquitous peer-to-peer interactions between human 'crowds' and machine 'swarms' promise to dominate commercial and consumer activity, as already observed within omni-channel advertising exchanges and high-frequency algorithmic stock trading platforms. Financing and insurance functions involving risk-transfer, risk-sharing and risk-pooling will increasingly be facilitated by and executed seamlessly within code. Among others, Bayesian statistical and adaptive process control methods (e.g. neural networks, hidden Markov models)—originally employed within the telecommunication, electricity, chemical industry and aviation during the mid-20th century, and more recently adapted for voice, visual and text recognition, along with other supervised and unsupervised data mining and pattern recognition methods—will need to be widely adopted to identify, monitor, measure and value underlying risk factors.”

In my opinion, new data and new techniques are likely to create a degree of transparency in insurance risks that has never existed. That transparency could benefit capital providers (both insurance company investors and direct investors in insurance risk), clients and regulators. A new era is quickly approaching where information and analysis have the potential to remove the cloud engulfing insurance risk. There are likely to be substantial benefits for those forward-thinking companies that exploit the opportunity.

David Bassi


David Bassi is an industry leader with experience in underwriting, risk management and analytics. He has led efforts at prominent global companies to integrate advances in data science, technology and the capital markets into traditional business models.

Ending Cost-Shifting to Workers’ Comp

Doctors can't always know if an injury is work-related, but a test can provide a baseline for evaluating each worker.

An April 2016 study by the Workers' Compensation Research Institute (WCRI), titled "Do Higher Fee Schedules Increase the Number of Workers' Compensation Cases?", found that, in many states, workers' compensation reimbursement rates were higher than group health reimbursement rates. The study stated that cost shifting is more common with soft tissue injuries, especially in states with higher workers' compensation reimbursement rates: an estimated 20% increase in workers' compensation payments for physician services provided during an office visit is associated with a 6% increase in the number of soft tissue injuries called "work-related." This study goes hand-in-hand with another WCRI study, "Will the Affordable Care Act Shift Claims to Workers' Compensation Payors?" (September 2015), which said that if only 3% of group health soft tissue conditions were shifted to workers' compensation in Pennsylvania, costs could increase nearly $100 million annually; in California, this cost shifting could increase costs by more than $225 million.

See Also: What Will Workers' Comp Be in 20 Years?

Soft tissue injuries, typically classified as musculoskeletal disorders (MSDs), are muscle or nerve conditions that primarily affect the neck, back and shoulders and can include cumulative trauma, neck and back sprains/strains or any damage to the muscles, ligaments and tendons. They are often difficult to diagnose and treat because there are very few reliable objective tests that demonstrate soft tissue injuries. The diagnosis is often based on the patient's history and the doctor's physical examination, and it therefore frequently depends on the individual's subjective complaints of pain, as well as the individual's compliance and genuine effort during the musculoskeletal and neurological phases of the exam.
Historically, in workers' compensation, both the patient's subjective complaints and his or her effort during the physical exam have often been unreliable. Inaccurate histories and poor effort on physical exams can, more often than not, lead to misdiagnoses and ineffective or inappropriate treatments, which increase costs and shift the burden to the employer even more. In many states, the burden of determining the causation of a soft tissue injury, and of determining whether treatment is medically necessary under workers' compensation or group health, resides solely with the treating physician. In fact, states like Florida place an extra burden on doctors through an apportionment law stating that the individual is responsible for non-work-related treatment. If there is a major discrepancy in reimbursement between workers' compensation and commercial insurance, the treating physician is tempted to accept the patient's history of the event and has no incentive to investigate history that may place the causation of the patient's symptoms in doubt. If clear-cut evidence documenting a pre-existing condition is lacking or not reviewed, the physician's decision can be affected by secondary gain, and the physician is more likely to state that the soft tissue injury is work-related. In these economic times, the temptation to shift costs is hard for physicians to resist. That is compounded by the fact that soft tissue injuries are often hard to demonstrate radiographically or with objective testing. In addition, radiographic tests are unreliable at timing injuries. X-rays and MRIs can show chronic changes, like osteophytes and severely collapsed discs, that usually take years to develop; but if a patient states that all of the pain began after a work-related injury, the treating physician may be tempted to attribute causation to the work-related event despite conflicting (yet unclear) radiographic findings.
If this trend continues unchecked, employers’ workers’ compensation costs can skyrocket. The key to this issue is accepting only those claims that arise out of the course and scope of employment. The law in each jurisdiction has one simple common theme: The employee needs to be returned to baseline. An electrodiagnostic functional assessment soft tissue management (EFA-STM) program can resolve these issues. It is a bookend solution that measures current and new employees before and after a work-related event is reported. It assists in determining whether an injury arose out of employment and in the course of employment (AOECOE) and helps in providing better care for the work-related condition. EFA-STM is non-discriminatory. It objectively determines pre-injury status and whether there is a change in condition after a reported occurrence. A baseline assessment is performed, and the unread data is immediately stored in a secure database. When a work-related event is reported, a post-injury assessment is conducted and compared with the baseline test to determine whether there is a change in condition. Without a pre-injury exam for comparison, no radiographic test (including an MRI) can accurately time a soft tissue injury, and, thus, the ultimate opinion on causation of injury can be subject to bias. In addition, it is commonly accepted that an MRI, for example, shows structural abnormalities that are common even in asymptomatic patients. The EFA-STM program allows physicians to more accurately determine whether structural changes on an MRI are causing nerve/muscle irritation and disturbance. Therefore, more accurate diagnoses are made and more appropriate treatments are recommended. Unnecessary, costly and invasive tests (e.g., discography) and treatments can be avoided. 
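The bookend comparison at the heart of such a baseline-testing program can be sketched in a few lines of code. This is purely illustrative: the measurement sites, values and the 20% change threshold below are invented for the example and are not part of the actual EFA-STM product.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Assessment:
    """One electrodiagnostic reading per measurement site."""
    readings: dict  # site name -> measured value

# Illustrative threshold: flag any site whose reading moved more than
# 20% from baseline. The real program's criteria are not public.
CHANGE_THRESHOLD = 0.20

def change_in_condition(baseline: Assessment, post_injury: Assessment) -> list:
    """Return the sites showing a material change from the stored baseline."""
    changed = []
    for site, before in baseline.readings.items():
        after = post_injury.readings.get(site)
        if after is None or before == 0:
            continue  # no comparable reading for this site
        if abs(after - before) / abs(before) > CHANGE_THRESHOLD:
            changed.append(site)
    return changed

# Hypothetical readings: the baseline is captured at hire and stored
# unread; the post-injury assessment is taken after a reported event.
baseline = Assessment({"cervical": 1.00, "lumbar": 0.90, "shoulder": 1.10})
post = Assessment({"cervical": 1.02, "lumbar": 1.40, "shoulder": 1.12})
print(change_in_condition(baseline, post))  # only the lumbar site changed
```

The point of the design is that causation hinges on a change from a documented pre-injury state, not on a single post-injury snapshot: with no flagged change, the reported condition is unlikely to have arisen from the event.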
See Also: 25 Axioms of Medical Care in Workers' Comp System The EFA-STM program is specifically designed to allow better treatment of the work-related condition and has proven invaluable in preventing cost shifting to workers’ compensation. The program provides objective information that enables doctors to more accurately establish causation and to avoid the potential temptation to shift the burden to a workers’ comp carrier when a soft tissue injury is not work-related. Finally, the EFA-STM program minimizes the false-positive structural abnormalities commonly seen on an MRI and allows for more accurate diagnoses so that safer, more cost-effective treatments can be rendered.

Frank Tomecek


Frank J. Tomecek, MD, is a clinical associate professor of the Department of Neurosurgery for the University of Oklahoma College of Medicine-Tulsa. Dr. Tomecek is a graduate of DePauw University in chemistry and received his medical degree from Indiana University. His surgical internship and neurological spine residency were completed at Henry Ford Hospital.

Key Regulatory Issues in 2016 (Part 1)

Though historically under the purview of the states, U.S. insurers have been responding to influences at both the international and federal levels.

The complexities of the current regulatory environment undoubtedly pose significant challenges for the broad spectrum of financial services companies, as regulators continue to expect management to demonstrate robust oversight, compliance and risk management standards. These challenges are generated at multiple (and sometimes competing) levels of regulatory authority, including local, state, federal and international, as well as, in some cases, by regulatory entities that are new or have been given expanded authority. Their demands are particularly pressing for the largest, most globally active firms, though smaller institutions are also struggling to optimize business models and infrastructures to better address the growing regulatory scrutiny and new expectations. Across the industry, attention is focused on improving overall financial strength and stability, guided by the recommendations of international standards-setting bodies and U.S. regulatory mandates that encompass governance, culture, risk management, capital and liquidity. Though historically under the purview of individual states, the insurance sector in the U.S. has been responding to influences at both the international and federal levels. The efforts of the International Association of Insurance Supervisors (IAIS) to develop insurance core principles (ICPs), a common framework for the supervision of internationally active insurance groups (IAIGs) and capital standards have all laid the foundation for global regulatory change. These efforts have been further supported by new authorities given to the Federal Reserve Board, the Financial Stability Oversight Council and the Federal Insurance Office and by the designation of certain nonbank insurance companies as systemically important financial institutions (SIFIs). Following are some of the key regulatory issues we anticipate will have an impact on insurance companies this year:
1. Strengthening Governance and Culture

Despite heightened attention from regulators and from organizations themselves to strengthening governance structures and risk control frameworks, instances of misconduct (i.e., professional misbehavior, ethical lapses and compliance failures) continue to be reported across the financial services industry, including the insurance sector, with troubling frequency. Boards and senior management are now expected to define and champion the desired culture within their organizations; establish values, goals, expectations and incentives for employee behavior consistent with that culture; demonstrate that employees understand and abide by the risk management framework; and set a “tone from the top” through their own words and actions. Line and middle managers, who are frequently responsible for implementing organizational changes and strategic initiatives, are expected to be similarly committed, ensuring the “mood in the middle” reflects the tone from the top. Regulators are also assessing an organization’s culture by looking at how organizations implement their business strategies, expecting firms to place the interests of all customers and the integrity of the markets ahead of profit maximization. They will consider business practices and associated customer costs relative to the perceived and demonstrable benefit of an individual product or service to the customer, giving attention to sales incentives and product complexities. State and federal insurance regulators have joined the global push for enhanced governance, and, in 2016, insurers can expect heightened attention in this area through the Federal Reserve Board’s (Federal Reserve) supervision framework and its enhanced prudential standards (EPS) rule; the Financial Industry Regulatory Authority’s (FINRA) targeted review of culture among broker-dealers; and the National Association of Insurance Commissioners’ (NAIC) Corporate Governance Annual Disclosure Model Act, which became effective Jan. 1, 2016, and requires annual reporting following adoption by the individual states. Given the regulatory focus on conduct, insurers might experience some pressure to put in place governance and controls frameworks that specifically recognize and protect the interests of policyholders.

2. Improving Data Quality for Risk Data Aggregation and Risk Reporting

Financial institutions continue to struggle with improving their risk-data aggregation, systems and reporting capabilities, which means insurers, in particular, will be challenged by any coming changes in regulatory reporting, new accounting pronouncements, enhanced market opportunities and increasing sources of competition because of legacy actuarial and financial reporting systems. These data concerns are augmented by information demands related to emerging issues, such as regulatory interest in affiliated captives. In addition, there are the expected requirements of anticipated rulemakings, such as the Department of Labor’s Fiduciary Rule, which necessitates a new methodology or perspective regarding product disclosure requirements and estimations of the viability and benefits of individual products. There is also the Federal Reserve’s single counterparty credit limit (SCCL) rule, which requires organizations, including nonbank SIFIs, to track and evaluate exposure to a single counterparty across the consolidated firm on a daily basis. Data quality remains a challenge, with integrity continually compromised by outmoded technologies, inadequate or poorly documented manual solutions, inconsistent taxonomies, inaccuracies and incompleteness. Going forward, management will need to consider both strategic-level initiatives that facilitate better reporting, such as a regulatory change management strategic framework, and more tactical solutions, such as conducting model validation work, tightening data governance and increasing employee training. 
By implementing a comprehensive framework that improves governance and emphasizes higher data-quality standards, financial institutions and insurance companies should realize more robust aggregation and reporting capabilities, which, in turn, can enhance managerial decision making and ultimately improve regulatory confidence in the industry’s ability to respond in the event of a crisis. See Also: FinTech: Epicenter of Disruption (Part 1)

3. Harmonizing Approaches to Cybersecurity and Consumer Data Privacy

Cybersecurity has become a very real regulatory risk, one distinguished by the increasing volume and sophistication of attacks. Industries that house significant amounts of personal data (such as financial institutions, insurance companies, healthcare organizations, higher education organizations and retail companies) are at great risk of large-scale data attacks that could result in serious reputational and financial damage. Financial institutions and insurance companies in the U.S. and around the world, as well as their third-party service providers, are on alert to identify, assess and mitigate cyber risks. Failures in cybersecurity have the potential to affect operations, core processes and reputations but, in the extreme, can undermine the public’s confidence in the financial services industry as a whole. Financial entities are increasingly dependent on information technology and telecommunications to deliver services to their customers (both individuals and businesses), which, as evidenced by recently publicized cyber hacking incidents, can place customer-specific information at risk of exposure. Some firms are responding to this link between cybersecurity and privacy by harmonizing their approach to incident response, and most have made protecting the security and confidentiality of customer information and records a business and supervisory priority this year. 
State insurance regulators have a significant role in monitoring insurers’ efforts to protect the data they receive from policyholders and claimants. In addition, they must monitor insurers’ sales of cybersecurity policies and risk management services, which are expected to grow dramatically in the next few years. Insurers are challenged to match capacity with buyers’ needs and expectations for these new and complex product offerings, and a mismatch may lead to solvency issues. The NAIC, acting through its cybersecurity task force, is collecting data to analyze the growth of cyber-liability coverage and to identify areas of concern in the marketplace. The NAIC has also adopted Principles for Effective Cybersecurity: Insurance Regulatory Guidance for insurers and regulators, as well as the Cybersecurity Consumer Bill of Rights for insurance policyholders, beneficiaries and claimants. Insurance regulatory examinations regularly integrate cybersecurity reviews, and regulatory concerns remain focused on consumer protection, insurer solvency and the ability of the insurer to pay claims.

4. Recognizing the Focus on Consumer Protection

In the past few years, the Consumer Financial Protection Bureau and the Federal Trade Commission have pursued financial services firms (including nonbanks) to address instances of consumer financial harm resulting from unfair, deceptive or abusive acts or practices. The DOL Fiduciary Rule redefines a “fiduciary” under the Employee Retirement Income Security Act to include persons — brokers, registered investment advisers, insurance agents or other types of advisers — that receive compensation for providing retirement investment advice. Under the rule, such advisers are required to provide impartial advice that is in the best interest of the customer and must address conflicts of interest in providing that advice. 
Though intended to strengthen consumer protection for retirement investment advice, the rule is also expected to pose wide-ranging strategic, business, product, operational, technology and compliance challenges for advisers. In addition, the Securities and Exchange Commission (SEC) has announced it will issue a rule to establish a fiduciary duty for brokers and dealers that is consistent with the standard of conduct applicable to an investment adviser under the Investment Advisers Act (Uniform Fiduciary Rule). The consistent theme between these two rules is the focus on customer/investor protection, and the rules lay out the regulators’ concern that customers be treated fairly; that they receive investment advice appropriate to their investment profile; that they not be harmed or disadvantaged by complexities in the investment markets; and that they be provided with clear descriptions of the benefits, risks and costs of recommended investments. In anticipation of these changes, advisers are encouraged to review their current practices, including product offerings, commission structures, policies and procedures, to assess compliance with current guidance (including “suitability standards” for broker-dealers and fiduciary standards for investment advisers, as appropriate) and to conduct impact assessments to identify adjustments necessary to comply with the DOL Fiduciary Rule. Such a review should include a reassessment of business line offerings, product and service strategies and adviser compensation plans.

5. Addressing Pressures From Innovators and New Market Entrants

The financial services industry, including the insurance sector, is experiencing increased activity stemming, in large part, from the availability of products and services being introduced to meet the growing demand for efficiency, access and speed. 
Broadly captioned as financial technology, or FinTech, innovations such as Internet-only financial service companies, virtual currencies, mobile payments, crowdfunding and peer-to-peer lending are changing traditional banking and investment management roles and practices, as well as risk exposures. The fact that many of these innovations are being brought to market outside of the regulated financial services industry — by companies unconstrained by legacy systems, brick-and-mortar infrastructures or regulatory capital and liquidity requirements — puts pressure on financial institutions to compete for customers and profitability and raises regulatory concerns about the potential for heightened risk to consumer protection, risk management and financial stability. For insurance companies, the DOL Fiduciary Rule will affect the composition of the retirement investment products and advice they currently offer and, as such, creates opportunity for product and service innovation as well as for new market entrants. Insurers will want to pursue a reassessment of their business line offerings, product and service strategies and technology investments to identify possible adjustments that will enhance compliance and responsiveness to market changes. Regulators will be monitoring key drivers of profit and consumer treatment in the sale of new and innovative products developed within and outside of the regulated financial services industry. This piece was co-written by Amy Matsuo, Tracey Whille, David White and Deborah Bailey. 

Stacey Guardino


Stacey Guardino is a New York based partner in KPMG’s financial services regulatory practice. She has more than 25 years of experience serving diversified financial institutions focusing on insurance and bank holding companies.

How to Improve Claim Audits -- and Profit

A study found that a workers' comp audit must be designed to affect not just the actions of the adjuster but all elements of the claims process.

A session at RIMS 2016 illustrated how to methodically examine and review the right activities in claims audits to improve the bottom line. Speakers in this session were:
  • Jenny Novoa, senior director of risk management, Gap
  • Joe Picone, claim consulting practice leader, Willis Towers Watson
They explained that, in a claims management context, an audit assesses compliance with carrier and industry best practices and special handling instructions. A “typical” claim audit determines whether the TPA/carrier’s performance is meeting its obligations under the service agreement. It also determines the adequacy of reserves, benchmarks TPA/carrier and adjuster performance, measures against best practices, provides constructive observations and identifies and recommends areas for improvement. A group came together from several major companies, including Gap, Foot Locker, Saks/Lord & Taylor, Corvel and Willis Towers Watson, to study the claim auditing process. The study explored different areas of the process and was conducted over the course of about a year. The mission of the study was to determine several things, including:
  • Does the claim audit fairly measure the outcome of the claim?
  • Is there a better way to audit the claim?
  • How is “outcome” defined?
  • What factors are important in defining claims outcome?
  • Does a best practice score really equate to a good outcome?
The study group came up with categories of what matters most in the claims process, including: quality of the adjuster, overall health of the employee and quality of medical care. They looked at various audit criteria for the retail business, with “outcome” defined as days out of work. They also had a set of specific audit rules. See Also: How to Manage Claims Across Silos The group used a large sample of questions by category and compared the Best Practice Audit (BPA) with the Outcomes-Based Audit (OBA). The results were very different. A few observations from the study:
  • BPA audit scores did not identify any of the 28 claims with poor outcomes.
  • OBA identified just 10 of the 28 claims with poor outcomes.
  • The average OBA audit score was 91, and the average BPA score was 97.
  • The OBA overall audit score is much more in line with the overall outcome of the universe of claims audited.
More takeaways:
  • The team proved that audits must be designed to really affect not just the performance of the adjuster but all elements of the claims process.
  • Review your questions. Each question should be individually reviewed with regression analysis to determine its correlation with outcomes. Questions that show no correlation should be eliminated, and questions that do show correlation should be added.
  • Know that BPA can score 100, but the claim can still have a bad outcome.
  • OBA is a better predictor of outcomes than BPA.
The group determined that the correlation between a best-practice compliance audit score and outcome may be lost if the wrong activities are audited. Critical activities that are never audited may cause poor outcomes in a claim. Again, only when you methodically examine and review the right activities do you improve the bottom line.
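The per-question regression screen the takeaways describe can be sketched in a few lines. Everything here is invented for illustration (the question names, scores and the 0.5 correlation cutoff); a real review would use the full sample of audited claims and a proper significance test rather than a raw cutoff.

```python
from statistics import mean

def pearson_r(xs, ys):
    """Plain Pearson correlation; returns 0.0 when a series has no variance."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    if var_x == 0 or var_y == 0:
        return 0.0
    return cov / (var_x * var_y) ** 0.5

# Hypothetical audit data: one entry per claim. Each question is scored
# 1 (adjuster compliant) or 0 (not); the outcome is days out of work.
question_scores = {
    "q1_three_point_contact_made": [1, 1, 0, 1, 0, 1, 1, 0],
    "q2_reserve_set_timely":       [1, 1, 1, 1, 1, 1, 1, 1],  # no variance
    "q3_rtw_plan_documented":      [1, 0, 0, 1, 0, 1, 1, 0],
}
days_out_of_work = [12, 45, 60, 10, 75, 15, 9, 52]

# Keep only questions whose compliance scores actually correlate with
# the outcome; the 0.5 cutoff is arbitrary for the illustration.
CUTOFF = 0.5
kept = {q: pearson_r(scores, days_out_of_work)
        for q, scores in question_scores.items()
        if abs(pearson_r(scores, days_out_of_work)) >= CUTOFF}
# q2 drops out: every claim passed it, so it cannot predict anything.
```

This illustrates the study’s point: a question every adjuster passes (like q2 here) can inflate a best-practice score to near 100 while telling you nothing about which claims will have poor outcomes.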

FinTech: Epicenter of Disruption (Part 3)

Traditional insurers believe that 21% of their revenue is at risk to InsurTech start-ups within five years.

This is the third in a four-part series. The first article is here. The second is here. Typically, disruption hits a tipping point at which just less than 50% of the incumbent revenue is lost in about a five-year timeframe. Recent disruptions that provide valuable insight include streaming video’s impact on the video rental market. When broadband in the home reached ubiquity and video compression technology matured, low-cost streaming devices were developed and, within four years, the video rental business was completely transformed. The same pattern can be seen in the Internet-direct insurance model for car insurance. At present, 50% of the revenue from the traditional agent-based distribution model has been moved to direct insurance providers.

Revenue at risk will exceed 20% by 2020

According to our survey, the vast majority (83%) of respondents from traditional financial institutions (FIs) believe that part of their business is at risk of being lost to standalone FinTech companies; that figure reaches 95% in the case of banks. In addition, incumbents believe 23% of their business could be at risk because of the further development of FinTech, though FinTech companies anticipate they may be able to acquire 33% of the incumbents’ business. In this regard, the banking and payments industries are feeling more pressure from FinTech companies. Fund transfer and payments industry respondents believe they could lose as much as 28% of their market share, while bankers estimate that banks are likely to lose 24%.

A rebalancing of power

FinTech companies are not just bringing concrete solutions to a morphing consumer base, they are also empowering customers by providing new services that can be delivered with the use of technological applications. 
The rise of “digital finance” allows consumers to connect to information anywhere at any time, and digital services can address their needs in a more convenient way than traditional nine-to-five financial advisers can. According to our survey, two-thirds (67%) of the companies ranked pressure on margins as the top FinTech-related threat. One of the key ways FinTechs apply that margin pressure is through step-function improvements in operating costs; for instance, the movement to cloud-based platforms not only decreases up-front costs but also reduces continuing infrastructure costs. The threat may stem from two main scenarios. First, standalone FinTech companies might snatch business opportunities from incumbents, such as when business-to-consumer (B2C) FinTech companies sell their products and services directly to customers and position themselves as more dynamic and agile alternatives to traditional players. Second, business-to-business (B2B) FinTech companies might empower specific incumbents through strategic partnerships with the intent to provide better services.

FinTech, a source of opportunities

FinTech also offers myriad possibilities for the financial services (FS) industry. B2B FinTech companies create real opportunities for incumbents to improve their traditional offerings. For example, white-label robo-advisers can improve the customer experience of an independent financial adviser by providing software that helps clients better navigate the investment world. In the insurance industry, a telematics technology provider can help insurers track risks and driving habits and can provide additional services such as pay-as-you-go solutions. Partnerships with FinTech companies could increase the efficiency of incumbent businesses. Indeed, a large majority of respondents (73%) rated cost reduction as the main opportunity related to the rise of FinTech. 
In this regard, incumbents could simplify and rationalize their core processes, services and products and, consequently, reduce inefficiencies in their operations. But FinTech is not just about cutting costs. Incumbents partnering with FinTech companies could deliver a differentiated offering, improve customer retention and bring in additional revenues. Notably, 74% of fund transfer and payment institutions consider additional revenues to be an opportunity coming from FinTech. This is already true in the payments industry, where FinTech generates additional revenues through faster and easier payments and digital wallet transactions. This post was co-written by: John Shipman, Dean Nicolacakis, Manoj Kashyap and Steve Davies.

Haskell Garfinkel


Haskell Garfinkel is the co-leader of PwC's FinTech practice. He focuses on helping the world's largest financial institutions consume technological innovation and on advising global technology companies on building customer-centric financial services solutions.


Jamie Yoder


Jamie Yoder is president and general manager, North America, for Sapiens.

Previously, he was president of Snapsheet. Before Snapsheet, he led the insurance advisory practice at PwC. 

10 Reasons Why Healthcare Varies

What if healthcare came with a warning label? "Results vary. They can include hospital-acquired infections and premature death."

Imagine your recommended medical treatment came with this warning label: “Your results may vary. Your results are not guaranteed. Outcomes can include preventable complications, up to (and including) hospital-acquired infections, hospital readmission and premature death.” Caveat emptor, or “buyer beware,” has never been truer than in today’s healthcare system. The use of evidence-based medicine protocols delivers higher quality, lower prices and improved outcomes throughout the country for many different treatments. Scientific studies have proven the efficacy of following best-practice guidelines. Achievable results include reduced premature mortality, improved quality of life and better clinical outcomes, which mean faster recovery. See Also: Cutting Healthcare Costs Doesn't Lower Quality By no means is this a blanket assertion that the practice of all medicine can be reduced to a checklist, a differential diagnosis and a universal treatment regimen. The seven billion human beings on this planet each have trillions of cells and billions of possible variations. In addition, there are many social determinants of health, including social, economic and physical environmental factors. The fact is, no treatment regimen works 100% of the time on 100% of the people. However, there are proven, evidence-based strategies that effectively deliver higher quality and better outcomes at scale (which means lower costs). Therefore, it is incumbent upon healthcare providers and purchasers to live up to their fiduciary responsibility to act in the best interest of the consumer and the insured employee. So, what happens in the practice of medicine that results in so much variability in treatment? Today’s medicine is part science and part art. Unfortunately, for too many years, perverse reimbursement incentives have clouded and conflicted an industry that requires incredibly nuanced judgment on conditions with many variables and possible outcomes. 
Outcomes are largely determined by the skill and experience of a physician or team of physicians. Parity may exist in professional sports, but that is not the case in the practice of medicine. As a result, the practice of medicine is significantly influenced by individual providers and their practice patterns, beliefs, biases, needs and preferences, which is what we call “10 Reasons Why Medical Quality, Price and Outcomes May Vary.” See Also: Healthcare Costs: We've Had Enough! Depending on your location, your level of engagement and your particular treatment, the quality, price and outcome are likely to be affected by the actual provider of services. The 10 reasons why the practice of medicine is driven by the attitude, behavior and skill of the provider are laid out in the accompanying graphic. The typical American healthcare consumer still believes he is a patient and acts accordingly to eliminate the illness, not always recognizing the role he plays in his outcomes. The irreversible change taking place is that individuals have to learn to become consumers of healthcare by becoming engaged and taking responsibility for both their life outside the medical system and the choices they make when accessing medical care. The risks are real. Understanding the risk can empower recognition and awareness that acting like a consumer is in your best interest, and that might just save your life. For additional free assistance on avoiding wasteful, unnecessary or poor-quality medical tests, treatments or procedures, go to www.choosingwisely.org.

Craig Lack


Craig Lack is "the most effective consultant you've never heard of," according to Inc. magazine. He consults nationwide with C-suites and independent healthcare broker consultants to eliminate employee out-of-pocket expenses, predictably lower healthcare claims and drive substantial revenue.

Managing Behavioral Health at Work

Employers can reduce the duration of disability for behavioral health issues and perhaps see improvement in workers' comp, too.

At the RIMS 2016 Annual Conference, Kimberly George, senior vice president of Sedgwick, and Scott Daniels, director of disability for Comcast, discussed an approach to managing mental and behavioral health in the workplace. The discussion focused on how Comcast deals with these issues. Comcast has a very diverse workforce, owning a cable company, multiple television networks and even theme parks. Behavioral health claims not only affect your employees directly, but they also can have a significant impact on your business. According to a recent study by IBI, four of the top six employment-related concerns of employers related to the health of their workforce. The study also found that mental health diagnoses had the second-longest durations of disability in employers’ short-term disability programs. Comcast has had 1,300 to 1,600 behavioral health claims per year, paying millions of dollars in benefits. One area of concern for the company was that 60% of those being treated were not being seen by licensed behavioral health experts. Instead, they were being treated by general practitioners who lacked the expertise to adequately address the issues. Comcast is trying to focus on being an advocate for its workers on health issues, and part of that includes assisting them in being treated by the appropriate medical providers. See Also: A New Focus for Health Insurance Comcast’s program is currently focused on the group benefits side. The company hopes to someday expand it to workers’ compensation. If employees have a behavioral health diagnosis, they are required to treat with a practitioner specifically licensed in that area. Comcast does not direct employees to specific providers but instead works with the employee to help identify providers in the network. The Comcast employee assistance program (EAP) comes into play because the employee can receive a certain number of behavioral health visits under it at no cost. 
The program has been in place less than a year, but Comcast is already seeing significant decreases in the duration of disability for behavioral health claims. There is hope that this program can have a positive impact on workers’ compensation claims, as well. Under the EAP, Comcast can provide the behavioral health treatment outside the workers’ compensation claim to help address the psycho-social issues that could have an impact on the claim. This approach recognizes that you must treat the whole person to effectively manage workers’ compensation claims, and you cannot ignore psycho-social issues that may be affecting the case. One of the first resources that Comcast tapped in developing its program was its EAP provider. The provider offers a variety of resources to the workforce, not just in the area of behavioral health but also with a variety of lifestyle issues. The EAP was being underutilized before this program started, but the change in focus helped employees fully understand the benefits under their EAP. Resilience is also a very important issue that can affect both disability and workers’ compensation claims. Comcast is working with a vendor partner to assist employees in developing coping skills and becoming more resilient. Comcast feels that by strengthening the resilience of its workforce it can significantly reduce all disability in the workplace. Comcast is also using more telehealth, which is yielding positive results. It makes it easier for employees to receive medical care in a timely manner. This has been especially useful with behavioral therapy. The company is also hoping that the focus on getting the employee the proper care will decrease relapses in disability. Oftentimes, relapse is driven by the employee's not receiving the appropriate treatment. The overall focus at Comcast is establishing a culture of health for the workforce. The company wants employees to engage in the healthcare experience and become educated consumers. 
The hope is this culture will ultimately lead to healthier employees, which will result in fewer disability and workers’ compensation claims.