
The Multibillion-Dollar Opportunity

Seven key digitalization technologies have already begun to disrupt the industry. Their impact will accelerate in the next three to five years.

The business of property and casualty insurance—assessing risk, collecting premiums and paying claims—hasn’t changed much since 1861, when a group of underwriters sold the first policies to protect London homeowners against losses from fire. Recently, though, the insurance industry has embarked on a radical transformation, one spurred by a series of digital innovations whose widespread adoption is just a few years away.

Bain & Company and Google have identified seven key technologies—namely, infrastructure and productivity, online sales technologies, advanced analytics, machine learning, the Internet of Things, distributed ledger and virtual reality—that have already begun to disrupt the industry and whose impact will accelerate in the next three to five years. These new technologies are likely to be a boon for consumers, bringing more choice, better service and lower prices.

For those insurers ready to seize the initiative, digitalization presents an immense opportunity. The companies that stand to benefit the most are those that use the impetus of digitalization to rethink all their operations, from underwriting to customer service to claims management. The impact on both revenues and costs can be enormous. An analysis by Bain and Google shows that a prototypical P&C insurer in Germany that implemented these technologies could increase its revenues by up to 28% within five years, reduce claims payouts by as much as 19% and cut policy administration costs by as much as 72% (see Figure 1).

These pioneers in digital technology can gain an edge over their rivals by becoming more effective and efficient. They’ll be able to trim costs and pass on those savings to their customers, thereby winning new business and gaining market share. The digital laggards, by contrast, will find themselves fighting an intensified price war and scrambling to protect their competitive positions. Customers are pressing for change.
They now expect their insurers to offer simple, transparent and flexible products and services—all online. And companies have begun to respond. In Australia, for example, you can use your smartphone to snap a photo of something you want to insure, such as a bicycle; upload the picture into an app called Trov; and then request a policy for a specific period, say a month. Trov uses available data about you and your bicycle and, within seconds, comes back with an offer. If you like the terms, you press the “I accept” button, and you’re covered. Claims are also handled online, with a rapid exchange of photos and texts.

See also: Preparing for Future Disruption…

So far, companies have focused primarily on customer-facing applications. But some insurers are beginning to realize that digital means much more than cool and convenient apps for consumers; it is a force that will touch and reshape the very core of their business. Yet firms will reap the full benefit of digital technology only when they embrace its potential along the entire insurance value chain, including underwriting and claims management.

Seven disruptive technologies

To assess the impact of various technologies along the insurance value chain, Bain and Google identified and analyzed more than 100 digital use cases and focused on the 30 most likely to be disruptive within the next three to five years. Technologies that fall outside of that time frame, even potentially transformative ones like self-driving cars, biosensors and smart contact lenses, were excluded. The 30 use cases were grouped into seven broad categories and evaluated for the effect they would have on the revenues and profits of a prototypical German insurer—and by extension on the global insurance industry (see Figure 2):
  • Infrastructure and productivity. A modern IT architecture is critical for digital innovation. Many insurers consider the cloud the best option for processing, computation and storage. They can also use productivity tools such as coauthoring and video calling, and they can connect with their customers through a seamless, omni-channel approach.
  • Online sales technologies. Insurers can use cutting-edge techniques for targeting customers, identifying user groups and analyzing consumption patterns.
  • Advanced analytics (AA). With AA, insurers can gain extensive insights into customer needs and preferences. Insurers can also draw on it to help fight fraud.
  • Machine learning. With machine learning, insurers’ information systems can quickly adapt to new data, without the need for re-programming. Insurers can use machine learning to shape underwriting, price products and manage claims.
  • The Internet of Things. Networked devices in cars and buildings can protect people and property and facilitate proactive, preventive maintenance, thus reducing accidents—and claims. By analyzing data from sensors embedded in vehicles and other equipment, insurers can gain insights into customer behavior.
  • Distributed ledger technology. By arranging and documenting claims on distributed ledgers, insurers can greatly reduce processing time. A whole new field is opening up for smart contracts—that is, policies that are fully automated and updated based on a blockchain’s entire database.
  • Virtual reality (VR). The global fascination with the smartphone game Pokémon Go (strictly an augmented reality title) shows the appeal of immersive technologies, but VR also has the potential to transform the way information for underwriting is gathered, as well as the way claims are settled. For example, an insurer could use VR to create a three-dimensional image of a room or to reconstruct an accident in minute detail.
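The machine-learning and Internet of Things use cases above lend themselves to a concrete sketch. The snippet below prices a motor policy from telematics signals, the way a usage-based insurer might; the base premium, feature names and risk weights are hypothetical illustrations, not any insurer's actual rating model:

```python
BASE_PREMIUM = 600.0  # hypothetical annual premium in EUR

# Hypothetical per-unit risk loadings, of the kind a model might
# learn from historical claims data.
RISK_WEIGHTS = {
    "hard_brakes_per_100km": 12.0,
    "night_driving_share": 150.0,   # fraction of km driven at night
    "avg_speeding_kmh": 8.0,        # average km/h above the limit
}

def risk_adjusted_premium(telematics: dict) -> float:
    """Return a premium adjusted by observed driving behavior."""
    loading = sum(RISK_WEIGHTS[k] * telematics.get(k, 0.0)
                  for k in RISK_WEIGHTS)
    return round(BASE_PREMIUM + loading, 2)

# A cautious driver pays close to the base premium...
cautious = risk_adjusted_premium({"hard_brakes_per_100km": 0.5,
                                  "night_driving_share": 0.05,
                                  "avg_speeding_kmh": 0.0})
# ...while a risky driver pays a visible surcharge.
risky = risk_adjusted_premium({"hard_brakes_per_100km": 4.0,
                               "night_driving_share": 0.4,
                               "avg_speeding_kmh": 10.0})
```

A real rating engine would learn its weights from claims history and face regulatory filing requirements; the point here is only that behavior-based signals can feed a premium directly.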
The common feature all these technologies share is their practical relevance. They are already in use today in differing degrees and will be widely available in three to five years.

And more change is coming. Entirely new concepts in automotive insurance will be needed for driverless vehicles. Who is at fault in an accident if nobody was driving? 3D printers will unlock new possibilities for claims settlement. Imagine an insurer “printing” a new fender to replace the one bent in an accident. Even in the near future, though, insurance will look very different.

Key question to ask: Is it good for the customer?

Consider a car accident that occurs three years from now. New technologies will let the parties involved get help quickly and efficiently. Immediately following an incident, software built into the vehicles can assess the damage and notify a towing service, if necessary. Assuming the drivers are not seriously injured, they can use their smartphones to record 3D images of the damage and then send the images, together with the electronic address cards of all the involved parties, to their insurance companies. In the future, insurers won’t need to dispatch human adjusters to gather facts and evaluate accident damage. Using machine learning, automated advisers will draw on virtual reconstructions of the accident and a wealth of background data. They’ll enter into a virtual dialogue with customers and immediately inform them where any damage can best be repaired.

Insurers deciding which digital technologies to pursue can ask themselves a simple, and fundamental, question: Will it enhance the customer’s experience? Putting the customer first is more than a platitude. Simply put, an improved customer journey—one built on ultra-precise information, greater transparency, more flexibility and simplified interactions—is good for business.
Each of the 30 cases that formed the basis of this study will enhance the customer experience—and, at the same time, help companies increase revenues and contain costs.

See also: Mutual Insurance: Back to the Future?

Take the typical experience of a customer calling an insurer today. It’s likely an automated answering service will say to press buttons 1, 2 or 3 for various options. With machine learning, though, insurers will be able to serve a customer much faster and more effectively, without all the button-pushing. The system will instantly analyze the customer’s flow of communications across all channels, including past phone calls, letters, emails and even public social media postings. When the customer starts speaking, the computer can analyze the tone of voice, determining whether the caller is confused or angry or both. Armed with all this information, a virtual agent can assess the customer’s needs and suggest a solution. By the time a real-life agent comes onto the phone—if that’s even necessary—the customer’s problem will likely have been resolved.

Generally speaking, the moment of truth for every customer and every insurer comes when claims need to be processed. Digital technologies will be able to dramatically shorten the period between reporting and settling claims. That’s primarily because all relevant data will be collected within minutes and all parties involved will have access to the same information. Digital technologies will also open up new vistas in claims prevention, thanks to the Internet of Things. In the future, for example, a sensor will be able to monitor a household’s water consumption patterns, detecting potential leaks and interrupting the flow before the basement is flooded, thus preventing major damage and a costly claim.

The digital path to higher revenues and lower costs

Digitalization will create fascinating new possibilities for insurers. But what actual implications will these have for revenues and earnings in the next five years?
To answer that question, Bain and Google looked at a prototypical German P&C insurer that had adopted all 30 of the most promising digital use cases. Similar prototypes can be derived for specific business lines and for insurers operating in other countries, factoring in regional preferences.

Across markets, insurers that serve individual consumers, as opposed to business customers, are likely to experience the earliest and biggest bottom-line impact from digitalization. Underwriting risk and processing claims for business customers are relatively complex operations, making automation more challenging. But commercial insurers will still be able to benefit from innovation—including the use of 3D technology to register objects and machine-generated data to calculate policies.

Across the entire P&C sector, digitalization presents billions of dollars in opportunities to boost revenues and cut costs. To exploit these opportunities, the insurance industry needs a major rethink. Many insurers are focusing their digital efforts on product development and distribution, yet it’s underwriting and claims management that hold the biggest potential for change. It’s in those areas that machine learning, advanced analytics and the Internet of Things can have the biggest impact.

Based on the Bain and Google analysis, the prototypical German insurer that consistently pioneers the use of digitalization can expect its premium receipts to rise by about 28% in five years, with most of the increase coming from gains in market share. By operating more efficiently, the insurer will be able to lower its costs, reduce its prices and thereby attract more customers. At the same time, the company will be able to use some of the money saved from its new technologies to invest in more digital innovation—forming a virtuous cycle. As rich as the potential is for top-line growth, the opportunities for cost reduction are even greater.
By using digital technologies, a prototypical insurer can lower its gross costs by up to 29% in five years, with most of that savings coming from claims management. With digital tools, insurers will be able to underwrite risk more effectively, enhance prevention and minimize fraud. By deploying automated advisers and machine learning, they’ll save money on distribution and administration.

To P&C insurers battling in a fiercely competitive marketplace, digitalization can be a multibillion-dollar opportunity. The insurers most likely to reap these benefits are those that give primacy to improving the customer experience. Digital tools that don’t make the customer’s journey more efficient, economical and satisfying aren’t likely to help the insurer’s top or bottom lines. Insurers can use digital tools to deliver added services, lower premiums and an all-around better experience. Companies that do this well will reduce costs and raise revenues—and they’ll be that much further along on the road to achieving a broad-based, customer-focused digital transformation.

Signposts on the digital journey

  • Take the customer’s point of view. Digitalization is not an end in itself, nor is it primarily a means of increasing profitability. Rather, it is a way to serve evolving and demanding customers. Design digital use cases that improve the customer’s experience and add value. Profits will follow.

See also: Let’s Keep ‘Digital’ in Perspective

  • Expand your digital horizons. Insurers should establish a view now on the technologies that are likely to add the most customer value and differentiate them from their competitors. The biggest opportunities for gaining an edge lie in underwriting and claims management.

  • Launch and iterate. Rapidly evolving technologies and customer behavior present a challenge to long-term planning. Insurers should quickly bring new prototypes to market and continue to improve them. Companies should abandon tools that don’t improve the customer experience, help cut costs or give them an edge over their rivals.

  • Establish a digital culture. Digitalization means much more than technological change. Insurers should commit to new and improved ways of working and serving the customer, with employees who are trained and motivated to work in a digital environment.

This article was originally published by Bain & Company.

Henrik Naujoks


Dr. Henrik Naujoks is a partner at Bain & Company in Zurich and head of the Financial Services practice for Europe, Africa and the Middle East (EMEA). He has more than 20 years of management consulting experience and advises clients on corporate and business unit strategy, customer focus programs and post-merger integration in particular.

Startups Take a Seat at the Table

The mix of new voices and seasoned experts proves that innovation doesn’t have to come exclusively from one generation.

In an industry where experience matters, and where specific domain knowledge has traditionally been prized above all other things, startups are increasingly being included in strategic conversations and given a seat at the insurance table. Insurtech startups are bringing important emerging technology innovations and smart business solutions to a stalwart industry, and interest and investment in insurtech are climbing steadily. With the pace of change and competition increasing, as well, leading industry incumbents are beginning to pursue collaboration with fresh partners and platforms.

Age Is Just a Number

There is no right age for launching a startup, or for undertaking an innovation initiative, but many naively assume that younger is always better. In fact, a mix of experience in the industry being targeted, an innovative idea and an entrepreneurial state of mind is likely the best combination.

The Global Insurance Accelerator (GIA) in Des Moines, for example, supports insurtech startups worldwide through a mentoring system that matches industry professionals with startups to sharpen product-market fit. The average age of program participants working from Des Moines has increased each year since the program's inception in 2015. The average age was 35 in the first year, bumped up one year to 36 in 2016 and jumped to 40 in 2017.

See also: Will Startups Win 20% of Business?

This mix of new voices and seasoned experts proves that innovation doesn’t have to come exclusively from one generation. Leveraging industry knowledge and experience alongside ideas from newcomers can lead to great things when attacking problems worth solving.

Everyone Needs Mentors

Over the course of three cohorts at the GIA, a shift has occurred in the amount of insurance experience the entrepreneurs had coming into the program. In 2015, only a couple of participants had worked in the industry.
Now, in 2017, the pendulum has swung to the other end of the spectrum, and almost every member of the cohort has worked in insurance at some point during his or her career. However, this prior industry experience hasn’t diminished the impact the GIA’s mentors have on any given startup’s evolution. The amazing pool of mentors who have raised a hand and taken a front seat in helping these early-stage insurtech startups navigate a complex industry remains critical to the program’s success. Although the mentors' role is largely to guide and advise, almost all of the GIA’s more than 100 mentors have reported learning as much from the startups.

Collaboration Is Key

Six companies from the United States, Canada, Germany and Serbia are currently participating in the 100-day GIA program. The ideas and products offered by these insurtech startups differ, as do the technologies powering the innovation, but these startups are all entrepreneurs who understand the vast opportunities within the insurance sector.

Moving to the main stage, GIA’s insurtech startup cohort members gain a seat at the table this spring during the fourth annual Global Insurance Symposium in Des Moines. Sitting alongside peers in one of the global hubs of the insurance industry, these startups will be able both to learn from seasoned industry experts and to share wisdom of their own.

See also: 5 Challenges Facing Startups (Part 5)

The Global Insurance Accelerator experience will culminate in a panel discussion at the Global Insurance Symposium covering lessons learned and providing an opportunity to network with leaders from around the industry. This experience will allow GIA’s cohort to better understand the industry so transformation can continue from the inside out. Collaborative efforts like these will allow insurance industry players not only to remain relevant and competitive but also to transform the insurance industry by meeting customers’ needs through new and improved methods.

3rd District Upholds Validity of IMR

The ruling on independent medical review provides nuggets for challenges to the authority of the W.C.A.B. to review medical decisions.

The Third District Court of Appeal has issued its decision in Ramirez v. W.C.A.B., again upholding the constitutionality of the independent medical review (IMR) process for reviewing UR determinations and providing, perhaps, some additional nuggets for potential challenges to the W.C.A.B.'s decision in Dubon II, which concerns the authority of the W.C.A.B. to review medical decisions.

Ramirez is the third in a series of cases in which applicant attorneys have attempted to challenge the constitutionality of the IMR process on various grounds. In two prior decisions (Stevens v. W.C.A.B. and Margaris v. W.C.A.B.), different districts of the Court of Appeal rejected constitutional challenges to the IMR process based on arguments similar to those presented by the applicant in this case.

See also: Appellate Court Rules on IMR Timeframes

While the applicant’s arguments in this appeal were somewhat broader than in either of the prior appeals, the court’s rejection was just as emphatic. Ramirez's challenge to IMR was based on multiple arguments:
  • He argued the underlying UR was based on an incorrect standard, in effect appealing the UR determination itself to the court. The court rejected this argument on the grounds that the attack went to the heart of the determination of medical necessity, a determination that Labor Code § 4610.6(c) prohibits the court from making. The court noted the applicant attorney did not argue that the IMR reviewer used an improper standard, and the court could review the IMR determination only for the nonsubstantive reasons set out in Labor Code § 4610.6(h).
  • Ramirez also challenged the constitutionality of the IMR process arguing that it violates the separation of powers clause as well as state and federal principles of due process. Both of these arguments were rejected in much the same manner as the court in Stevens rejected a similar argument.
  • Ramirez argued that the W.C.A.B. decision in Dubon II, which limited the W.C.A.B.’s authority to review UR determinations to the timeliness of the decisions, was incorrectly decided and that other flaws in the UR process should allow the W.C.A.B. to assume jurisdiction over medical treatment issues. The court specifically rejected the argument that the W.C.A.B. had jurisdiction to review an IMR determination on the ground that the UR determination did not use the medical treatment utilization schedule (MTUS).
It is on this last point that the court’s language becomes interesting. The court reviewed the history of the Dubon decisions and the progression from an expansive view of the W.C.A.B.’s authority to the much narrower result in Dubon II, which limits the W.C.A.B.’s review to timeliness alone. The court does note that under Dubon II, where a UR determination is late, the W.C.A.B. may determine the medical necessity of the proposed treatment. After reviewing the W.C.A.B.’s decision and Cal. Code Regs. tit. 8, § 10451.2, the court goes on to state:
“To the extent the Board has any jurisdiction to review a utilization review as provided by this regulation, it has jurisdiction only over nonmedical issues such as timeliness of the utilization review as stated in the Final Statement of Reasons and Dubon II. We are not presented with a nonmedical issue. Any question that has the effect of assessing medical necessity is a medical question to be conducted by a qualified medical professional by way of independent medical review.  (§ 4610.6, subd. (i) [“In no event shall a workers’ compensation administrative law judge, the appeals board, or any higher court make a determination of medical necessity contrary to the determination of the independent medical review organization.”].) Whether the utilization reviewer correctly followed the medical treatment utilization schedule is a question directly related to medical necessity, and is reviewable only by independent medical review.”
While the court does not specifically say the W.C.A.B. was incorrect in ruling in Dubon II that an untimely UR determination vests jurisdiction with the W.C.A.B. on medical issues, the above language certainly implies, at the least, that any medical determination is beyond the W.C.A.B.’s authority. In the instant case, the court held there was no basis to challenge the UR decision, as it was timely and the other issues were not subject to W.C.A.B. review. The quoted language certainly provides food for thought and perhaps some additional basis to challenge the W.C.A.B.’s holding in Dubon II, which, so far, has not faced a serious challenge at the appellate level.

See also: IMR Practices May Be Legal, Yet…

Comments and Conclusions

That this court essentially followed the logic and reasoning of the prior appellate cases on this issue certainly suggests the options for challenging the IMR process are rapidly closing. A couple of additional challenges are still pending in the appellate courts: Zuniga, in the First District, raises one of the issues presented here (that the limitation on disclosure of the IMR doctor's identity prevents the applicant from challenging the doctor for bias, conflict of interest and the like), and the Southard and Baker cases address whether a late IMR is a valid IMR, an issue previously addressed in the negative in Margaris. So far, though, the appellate courts have shown little interest in challenging the legislature’s authority to create and mold the workers’ compensation system.

As one who has consistently believed the W.C.A.B. exceeded its jurisdiction in deciding it could address medical issues in Dubon II, in spite of the strongly stated legislative purpose prohibiting exactly that conduct, I am cautiously optimistic that someone will challenge that decision; even the W.C.A.B. might have second thoughts about maintaining its ability to decide medical issues.

Richard Jacobsmeyer


Richard (Jake) M. Jacobsmeyer is a partner in the law firm of Shaw, Jacobsmeyer, Crain and Claffey, a statewide workers' compensation defense firm with seven offices in California. A certified specialist in workers' compensation since 1981, he has more than 18 years' experience representing injured workers, employers and insurance carriers before California's Workers' Compensation Appeals Board.

A Lesson From a Serial Innovator

We get too focused on the technology. Disruptive innovation comes from the strategy that uses technology, not the technology itself.

Disruptive innovation is not about technology

Systems that are innovative at one time can become the “good enough” systems we need to overcome as they age and calcify. While it's inspiring to see new systems render old ones obsolete, this prescription of change creates a future where decisions about our collective future will be commercial engineering decisions, not social ones.

Disruptive innovation comes at you fast, and it is not about creating the best products and protecting profits. For example, with the launch of Apple Pay, the whole world can do something Kenyans have done every day for more than 10 years. M-PESA, the mobile payment system offered by Safaricom, has been used by most adult Kenyans and is the model for hundreds of digital payment startups around the world today.

See also: What Is the Right Innovation Process?

Most Kenyans don’t have bank accounts, making paper checks useless for all but the largest transactions. M-PESA was an appealing alternative to the status quo for transferring money from one city to another. Before you could transfer money through an SMS, it was common to give money to a taxi driver heading in that direction and ask him to deliver your payment for you. Safaricom, a leading mobile network provider in Kenya, won customers away from mainstream banking institutions by building a customer base, not by building the best technology.

Disruptive innovation refers to the strategy that employs technology; the technology itself isn't disruptive. Whether the application of a technology is disruptive depends on whether it is positioned with a disruptive strategy.

Shahzadi Jehangir


Shahzadi Jehangir is an innovation leader and expert in building trust and value in the digital age, creating scalable new businesses generating millions of dollars in revenue each year, with more than $10 million last year alone.

Big Data Can Solve Discrimination

With big data, we can better understand the causal paths between data generation and an event, leaving no need for stereotyping.

Big data has the opportunity to end discrimination. Everyone creates data. Whether it is your bank account information, credit card transactions or cell phone usage, data exists about anyone who is participating in society and the economy.

At Root, we use data for car insurance, an industry where rating variables such as education level or occupation are used directly to price the product. For a product that is legally mandated in 50 states, the consumer’s options are limited: give up driving, and likely your ability to earn a living, or pay a price based on factors out of your control. Removing unfair factors such as education and occupation from pricing leaves room for variables within an individual’s control — namely, driving habits. In this way, data can level the playing field for all consumers and provide an affordable option for good drivers whom other companies are painting with a broad brush. In the long term, everyone wins as roads become safer and driving becomes prohibitively expensive for irresponsible drivers.

This is just one example where understanding the consumer’s individual situation deeply allows for more precise — and more rational — decision making. But we know that the opportunity of big data goes beyond the individual. For example, the unfair practice of naively blanketing entire countries, religions or races as “dangerous” is a major topic in the news. What happens if you apply the lens of big data to this policy?

See also: Industry’s Biggest Data Blind Spot

Causal Paths vs. Assumption-Based Decisions

With the increased availability of data, we are able to better understand the causal paths between data generation and an event. The more direct the causal path, the better predictions of future events (based on data) will perform. Imagine having something as trivial as GPS location data from a smartphone on a suspected terrorist.
Variables such as having frequent cell phone conversations with known terrorists or being located within five miles of the last 10 known terrorist attacks will allow us to move away from crude, unjust and discriminatory practices and toward a more just and rational future.

Ahmad Khan Rahami, who placed bombs in New York and New Jersey, was flagged in the FBI’s Guardian system two years earlier. The agency found there weren’t grounds to pursue an investigation — a failure that may have been averted if the FBI had better data capture and analysis capabilities. Rahami purchased bomb-making materials on eBay and had linked to terrorist-related videos online before his attempted attack. Dylann Roof’s activities showed similar patterns in the months leading up to his attack on the Emanuel AME Church in Charleston, S.C.

The causal path between a hate crime or terrorist attack and the actions of Dylann Roof and Ahmad Khan Rahami is much more direct than factors such as religion, race or skin color. Yet we naturally gravitate toward making blanket assumptions, particularly if we don’t understand how data provides a better, more just approach. Today, this problem is more acute than ever. Discrimination is rampant — and the Trump administration's travel ban is unacceptable and unnecessary in the era of big data. For those unmoved by the moral argument, you should also know policies like the ban are hopelessly outdated. If we don’t begin to use data to make informed, intelligent decisions, we will not only continue to see backlash from discriminatory policies, but our decision making will be systematically compromised.

The Privacy Red Herring

Of course, if data falls into the wrong hands, harm could be done. However, modern techniques for analyzing and protecting data mitigate most of this risk. In our terrorism example, there is no need for a human ever to view GPS data. Instead, this data is collected, passed to a database and assessed using a machine learning algorithm.
The output of the algorithm would then direct an individual’s screening process, all without human interference. In this manner, we remove biased decision making from the process, along with the need for a “spy” to review the data.

See also: Why Data Analytics Are Like Interest

This certainly presents a challenge for the U.S. intelligence community, but it is an imperative one to meet. If used responsibly, analytics can provide insights based on controllable and causal variables. Privacy risk is no longer a valid excuse to delay the implementation of technologies that can solve these problems in a manner consistent with our values.

This world can be made a much better and safer place through data. And we don’t have to sacrifice our privacy; we can have a fair world, a safe world and a world that preserves individual liberties. Let’s not make the mistake of believing we are stuck with an outdated and unjust choice.
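The pipeline described above can be sketched in a few lines: raw behavioral features are reduced to a screening tier, and only that tier is ever surfaced, so no human reviews the underlying data. Every feature name, weight and threshold here is a hypothetical illustration, not any agency's actual model:

```python
# Hypothetical weights over normalized (0-to-1) behavioral features.
FEATURE_WEIGHTS = {
    "contact_with_flagged_accounts": 0.5,
    "restricted_material_purchases": 0.3,
    "proximity_to_prior_incidents": 0.2,
}

def risk_score(features: dict) -> float:
    """Combine normalized features into a single score in [0, 1]."""
    def clamp(v):
        return min(max(v, 0.0), 1.0)
    return sum(w * clamp(features.get(name, 0.0))
               for name, w in FEATURE_WEIGHTS.items())

def screening_tier(features: dict) -> str:
    """Map raw features to a tier; only this label leaves the system."""
    score = risk_score(features)
    if score >= 0.7:
        return "refer-for-review"
    if score >= 0.3:
        return "enhanced-screening"
    return "standard"
```

In practice, the scoring function would be a trained model rather than fixed weights, but the design point stands: the raw data stays inside the system, and only the screening decision comes out.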

Innovation takes root in unexpected places

A CEO says he can run his company entirely off his smartphone, and customers can interact with the company entirely via apps and chat, too. 

sixthings

This week will be quick, because I'm at the AAIS Main Event in beautiful Amelia Island, FL (a rough job, but somebody has to do it), where I delivered a talk to the general session and held a breakout.

The best example I've seen of innovation thus far here came from Art Meadows, CEO of Panhandle Farmers Mutual Insurance, a little company that would hardly be expected to be on the cutting edge. Panhandle is based in Moundsville, WV, population 8,813. TripAdvisor's list of Things to Do in Moundsville puts the West Virginia Penitentiary, which closed in 1995, at No. 1. No. 3 is the Archive of the Afterlife: The National Museum of the Paranormal. No. 8 is Foster Glass, which left town in 1986. Yet Meadows told the general session that he can now run his company entirely off his smartphone and that customers can interact with the company entirely via apps and chat, too. 

If the 60-something Meadows can be so advanced in rural West Virginia, can't the rest of us at least do somewhat better?

Two articles to call to your attention:

"An Insurtech Greenhouse: Future US-UK Regulatory and Fintech Collaboration," by our friend Paul Thanos, director of the office of finance and insurance industries at the Department of Commerce and a fellow at the Woodrow Wilson International Center for Scholars. It's a very smart piece laying the groundwork for U.S. companies to expand beyond our shores. You'll recognize the name of the guy quoted at the top of the piece.

"Teaching Watson the Urban Dictionary Turned Out to Be a Huge Mistake." This one offers a bit of comic relief but also a lesson. The comic relief: Once Watson absorbed the dictionary, it sometimes responded to queries with an answer like "bull****." The lesson: Every innovation has unintended and unforeseen consequences.  

Cheers,

Paul Carroll,
Editor-in-Chief 


Paul Carroll

Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.

Slump in VC Spending on Cyber?

It’s show-me time for startups battling for the dwindling pool of venture capital funds for cybersecurity startups.

sixthings
Venture capital funding in cybersecurity is cooling. And it’s show-me time for startups battling for the dwindling pool of funds. While the cybersecurity market is maturing, startups are still innovation drivers, and venture capitalists are keen on finding the next big unicorns. Large enterprises’ tendency to juggle products from multiple vendors—despite their wishes for seamless, one-vendor-only solutions—leaves the market perpetually fragmented. And the fact that cybersecurity threats are evergreen enables venture capitalists who specialize in the sector to operate with little regard for broader macroeconomic conditions.

Still, the ample opportunities afforded by the fragmented, constantly shifting market have bred too many me-too companies and fast followers, driving some venture capitalists to pause and reflect on the next phase. “It’s definitely overfunded, massively so,” Ravi Viswanathan of New Enterprise Associates told a panel at CB Insights’ Future of Fintech Conference last year.

After growing steadily since 2012, venture capital funding in cybersecurity dipped in 2016, alarming entrepreneurs. The cybersecurity market captured roughly $3.1 billion of venture funding in 2016, down from $3.8 billion a year earlier, according to research firm CB Insights. “You saw a material pause in the fourth quarter,” says Bob Ackerman, founder and managing director of Allegis Capital, which specializes in the sector. “You have too many undifferentiated companies. There’s a level of noise that develops as a result of that. … Cybersecurity is one of those areas where experience and domain knowledge matter a great deal.”

See also: Quest for Reliable Cyber Security

The cybersecurity market will undergo a few years of retrenchment with a host of companies shutting down, VCs say.

More judicious spending

But the market is hardly mature. Money will still be spent, just more selectively. At this phase, fewer deals will be struck. 
But those deals will be reserved for larger companies, with proven products further along in development. “The deal size and valuation is coming down a bit,” says Sean Cunningham, managing director of Trident Capital Cybersecurity, which raised $300 million this month for a fund to invest in cybersecurity startups. “I don’t think there’s any shortage of capital for the right type of companies. But the dollars being invested are smaller.”

Appthority is one of the companies that made Trident’s cut. Appthority, which develops mobile threat protection software for corporations, didn’t land its first paying customer until more than a year after it was founded in 2011. Four years later, its customer renewal rate stands at 98 percent, with about 20 percent of its revenue coming from the government sector. Heartened by solid proof of growth, venture capitalists poured in another $7 million in Series B funding last July, led by Trident Capital Cybersecurity. “You’re going to see a lot of startups out there, and good ones will rise to the top,” Cunningham says. “There’s ample supply of capital to fund them. They can get traction.”

Innovation niches

As seen in the early days of the internet, the cybersecurity market is recalibrating for a second wave of innovative technology that’s more comprehensive and cohesive. And that means more seamless products for large clients who are eager to cut down on the number of vendors. “Companies that can stand on their own two feet, deliver value, and have deep knowledge will do fine,” Ackerman says, citing one of the companies he’s invested in, EnVeil, which uses “homomorphic encryption” to secure data in operation. As more companies employ automation and “big data” to enhance efficiency and find new markets, data encryption products will continue to be in heavy demand. 
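The “homomorphic encryption” Ackerman mentions is the idea of computing on data while it stays encrypted. A toy sketch of the underlying algebraic property, using textbook RSA — which is multiplicatively homomorphic but, unpadded, not secure for real use; the tiny key here is for illustration only:

```python
# Tiny textbook RSA keypair (p=61, q=53); real keys are thousands of bits.
n, e, d = 3233, 17, 2753

def encrypt(m):
    return pow(m, e, n)  # modular exponentiation with the public key

def decrypt(c):
    return pow(c, d, n)  # modular exponentiation with the private key

a, b = 7, 12
# The server multiplies ciphertexts without ever seeing a or b...
c = (encrypt(a) * encrypt(b)) % n
# ...and the data owner decrypts the result: Enc(a) * Enc(b) = Enc(a * b).
assert decrypt(c) == (a * b) % n  # 84
```

Production schemes (and whatever EnVeil actually ships) are far more sophisticated, but the business point is the same: a vendor can operate on data it cannot read.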
The emergence of the industrial internet—the integration of complex machines with networked sensors and software—also will breed startups eager to provide cybersecurity solutions to power and water grids, refineries and pipelines. In May, Trident helped raise $6.6 million in Series A funding for Bayshore Networks, which develops cloud-based software that offers “visibility” into operational technology infrastructure, networks, machines and workers. Meanwhile, the proliferation of enterprise mobile devices will continue to see vulnerabilities and pose a ripe market for startups like Appthority, Cunningham says.

Early investors haven’t gone away

That VC dollars are chasing more evolved companies doesn’t mean early-stage investing is passé, Ackerman says. “That’s where the new things get started.” But cybersecurity, unlike more consumer-oriented technology sectors, is a competitive and difficult market, rife with startups struggling to recruit and market products.

See also: Paradigm Shift on Cyber Security

That’s partly why Allegis funded DataTribe, a startup studio based in Fulton, Maryland. It was designed to tap into the wealth of cybersecurity-savvy technologists in the region with experience or ties to the federal government and intelligence agencies. Ackerman also anticipates more mergers and acquisitions activity from large cybersecurity companies that may find it easier to acquire smaller niche players as they seek to add new product lines. As venture capitalists squeeze their wallets, startups lucky enough to land Series A funding also will have to justify more vigorously their pursuit of Series B funding, Cunningham says. “And unicorns are in trouble,” he says, referring to startups valued at over $1 billion.

The Trump factor

Meanwhile, venture capitalists are hopeful that the federal government, with President Trump at the helm and promising a rollback in regulations, will cut steps in federal procurement and stay engaged in securing networks. 
“We think the administration understands the value of national cybersecurity,” Cunningham says. “We’re not counting on incremental increases in spending. But we’re excited about the awareness level.” This post originally appeared on ThirdCertainty. It was written by Roger Yu.

Byron Acohido

Byron Acohido is a business journalist who has been writing about cybersecurity and privacy since 2004, and currently blogs at LastWatchdog.com.

Changing Business Models, 'New' ERM

Here are three key developments that insurers should incorporate into their evolution on enterprise risk management (ERM).

sixthings
Significant social, technological, economic, environmental and political forces are reshaping the needs and expectations of insurance buyers, as well as the business environment in which insurance providers operate. Even a partial list of these forces is daunting: aging populations in developed markets; different needs and purchasing behavior of younger buyers of insurance; self-driving vehicles; telematics; artificial intelligence; the internet of things; and persistent low interest rates. With so many forces in play, it’s difficult to determine the exact landscape of the new insurance world. But it’s not too early for insurers to prepare. Regardless of exactly how they plan to address a rapidly changing and more unstable world, one capability that will remain critical to all insurers’ success is enterprise risk management. We describe below three key developments that insurers should incorporate into their ERM evolution. Insurers’ business models are changing, and ERM needs to keep pace.

Stress testing will join economic capital as the main risk decision tool

VAR-based economic capital measures originated in banking and asset portfolio management more than 40 years ago. Over the last couple of decades, the insurance industry has widely adopted the concept. This is particularly true for insurers’ credit and market risk taking, areas where the VAR concept is endemic. For some aspects of insurance risk, like statistical variability around a stable mean, the concept also fits well. In an insurance world where credit, market and insurance are insurers’ main risks, economic capital is effective. But what if the world changes to one where other risks join these at center stage? Life insurance in a persistent low-interest-rate environment with rapidly evolving distribution models provides a clear example of recent change and its implications for ERM. 
The bulk of many life insurers’ liabilities and supporting assets are composed of permanent-type products they wrote when asset returns were markedly higher. These higher returns supported the stable distribution model of a sales force based on up-front commissions. In turn, this fit the products' complex features that needed such a model to explain and sell them. Delivering on these guarantees necessitated focus on the credit and market risks they created. And VAR was developed to manage these risks.

See also: Minding the Gap: Investment Risk Management in a Low-Yield Environment

However, now that asset returns are much lower, supporting this distribution model will be difficult. Fortunately, other less costly models are available and probably preferable to younger buyers of insurance. This demographic group has shown a preference for a more fit-to-purpose protection model that is less permanent and less complex. As a result, credit and market risks cease to be ERM’s overwhelming focus. Instead, strategic and operational challenges created by transitioning to and maintaining the new business model take center stage, as do the risk tools that can address these challenges. Among these, stress testing figures most prominently.

Trends in the property and casualty sector also point to a shift in risk focus and risk management tools. Impending and actual changes in the nature of driving and vehicle ownership will radically and permanently alter the auto insurance landscape. Developing an understanding of the implications of these changes and their risks to an insurance enterprise needs a tool like stress testing. Similarly, an increased emphasis on assisting customers with mitigating and managing their own risks, rather than just insuring them, moves more of an insurer’s risk profile out of the traditional risk-taker role and into a service provider model. VAR is a good risk tool for a risk taker, but stress testing is the tool best suited to the service provider model. 
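The contrast between the two tools can be made concrete. In this minimal sketch (simulated losses, an assumed 99.5% confidence level, and hypothetical scenario sensitivities), VaR reads a capital number off the tail of a loss distribution, while a stress test prices one named adverse scenario directly:

```python
import random

def value_at_risk(losses, confidence=0.995):
    """Historical-simulation VaR: the loss level exceeded in only
    (1 - confidence) of simulated outcomes."""
    ordered = sorted(losses)
    return ordered[int(confidence * len(ordered)) - 1]

random.seed(0)
simulated_losses = [random.gauss(0.0, 10.0) for _ in range(100_000)]
var_99_5 = value_at_risk(simulated_losses)   # statistical tail measure

# Stress test: a hand-built scenario, e.g. rates fall 200bp and lapses spike.
dv01, lapse_hit = 0.04, 3.0                  # hypothetical sensitivities
stress_loss = dv01 * 200 + lapse_hit         # deterministic scenario loss

print(round(var_99_5, 1), round(stress_loss, 1))
```

For a stable, well-understood book, the distributional VaR number is the more informative; for a structural change such as a new distribution model, only a named scenario speaks to the risk directly.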
Lastly, we note that rapidly emerging technologies, often cited for their role in shaping customer preferences, also shape insurers’ own capabilities. Insurers have begun to modernize their back offices, and computing power continues its exponential growth. Operational challenges and resource demands to implement new and improved risk tools, like stress testing, will diminish significantly. With benefits going up and costs going down, it seems clear that stress testing is on its way to a prominent ERM role.

Customer analytics decision platforms will become the key focus of model risk management efforts

Model risk management (MRM) is receiving extensive ERM focus at present. Much of the original impetus may have come from European companies seeking to validate their Solvency II internal models. In the U.S. and Canada, due in part to direct or indirect regulatory encouragement, the scope goes beyond economic capital and solvency models, and most insurers seek to apply their efforts to all models. The early priority for validation has skewed toward economic capital and complex liability valuation models. Insurers with advanced MRM capabilities have begun to focus more attention outside of risk and financial reporting models. This is to be expected to some degree, as insurers’ model validation activities work their way through their inventory of models. In addition, as they develop a working experience of risk rating their models, many are reconsidering the irrecoverable nature of product pricing decisions and the importance of getting those models right. In other words, while small errors in financial and risk reporting models can be rectified once errors are uncovered, losses from inadequate premium charges are permanent. The impetus for higher attention to pricing and risk selection models is further amplified when insurers implement newer, non-traditional approaches. 
Without a long history of successful use, newer customer analytic models put a higher priority on their timely and thorough validation. Additionally, we have observed insurers further enhancing their level of attention when these models move to autonomous execution mode. In this mode, the model makes decisions in an automated fashion without manual intervention or deliberation. Deploying more models of this sort is a common feature of most visions of the near-term future of insurance. As their use expands, so too should ERM’s focus on effective risk management of these models. In an environment in which these types of customer analytics decision platforms become an insurer’s key business engine, they also will need to become the key focus of MRM efforts. Small errors in financial and risk reporting models can be remedied; however, losses from inadequate premium charges are permanent.

Risk diversification measurement will become the single most important element in economic capital calculations

There is a continuing focus on the effectiveness of economic capital modeling, especially in connection with IAIS and regulatory efforts outside of the U.S. In the U.S. as well, insurers continue to look at how they can improve their calculations. However, one area we believe attracts insufficient attention is diversification. Not only is an effective understanding and quantification of diversification an important goal in the current insurance environment, it will likely become even more critical in the future. As the new risk profile moves away from a credit/market nexus to a more diverse insurance, business and strategic risk set, managing the interaction between and among them will be especially important. If customers move to a more holistic view of insurance and blur the distinctions between life, property and casualty and health, just quantifying the diversification across all insurance risks will be a key task on its own. 
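At its simplest, the diversification quantification described above is a variance-covariance aggregation: given standalone capital per risk and an assumed correlation matrix (all numbers here are hypothetical), total capital is the square root of c-transpose-R-c, and the diversification benefit is the gap to the simple sum:

```python
import math

def aggregate_capital(capitals, corr):
    """Variance-covariance aggregation of standalone capital amounts."""
    n = len(capitals)
    total_sq = sum(corr[i][j] * capitals[i] * capitals[j]
                   for i in range(n) for j in range(n))
    return math.sqrt(total_sq)

# Hypothetical standalone capitals for life, P&C and health risk.
capitals = [100.0, 80.0, 60.0]
corr = [[1.00, 0.25, 0.25],
        [0.25, 1.00, 0.25],
        [0.25, 0.25, 1.00]]

diversified = aggregate_capital(capitals, corr)   # ~171.5
benefit = sum(capitals) - diversified             # ~68.5 of capital relief
```

The benefit is driven almost entirely by the off-diagonal correlations, which is why measuring them well becomes the single most sensitive input to the calculation.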
See also: Developing A Safe Work Environment Through Safety Committees

Implications

If they haven’t done so already, CROs should start to sketch out a few versions of what their company might look like in the future and consider what might be required of their ERM capabilities. They can adjust and clarify this high-level road map as the future becomes clearer. Considerations CROs should keep in mind while creating this road map include:
  • On the life side in particular, credit and market risks will cease to be ERM’s overwhelming focus, but stress testing will figure more prominently in new business models.
  • Assisting customers with mitigating and managing their risks instead of just insuring them will move more of an insurer’s risk profile out of the traditional risk-taker role and into a service provider model. VAR is a good risk tool for a risk taker, but stress testing — which is becoming cheaper and easier to do — is better suited to the service provider model.
  • As advanced customer analytics decision platforms become an insurer’s key business engine, they will need to become the key focus of model risk management efforts.
  • As insurance becomes more holistic for customers, quantifying diversification across all insurance risks will be a key task for insurers.

Henry Essert

Henry Essert serves as managing director at PwC in New York. He spent the bulk of his career at Marsh & McLennan, where he served as managing director from 1988 to 2000 and as president and CEO of MMC Enterprise Risk Consulting from 2000 to 2003. Essert also has experience working with Ernst & Young, as well as MetLife.

Developing Programs for Shifting Channels

Like TV, insurance is changing at its core because of a reduction in “viewership” and the changing demographics of younger generations.

sixthings
Though cable TV has technically been in use since 1948, broadcast television was the staple of home entertainment for decades. It offered a handful of channels, but most viewing was done on ABC, CBS, NBC and PBS — the big four. As the number of satellites grew and the number of cable providers proliferated, so did channel options. According to Nielsen, today’s average home receives 189 channels of cable programming. This has obviously detracted from the viewership of the big four. To counter, the networks and early cable channels simply added new, sometimes niche channels to their network ecosystem to reach new market segments. NBC, now owned by Comcast, operates dozens of channels, such as CNBC, MSNBC, Syfy, USA Network, Bravo and the Weather Channel. And now cable providers, along with the big four, are being challenged by streaming TV via Netflix, Hulu, Amazon and others, which are popular with younger generations.

Is this analogous to what is happening in insurance? It could be, especially if we back up to consider how insurance is changing at its core because of a reduction in “viewership” and the changing demographics of younger generations. The term "channels" seems to be appropriate, because insurance is undergoing its own channel proliferation and change. To look at channel development more closely, I’ve drawn on many of the insights found in our Future Trends 2017 report.

Complexity and relevance

Insurance is often a complex product that is hard to research, buy and use, requiring a great deal of thought by customers. The problem is enormous for the insurance industry because every gap and point of complexity looks like a giant bull’s-eye for potential startup solutions. Those who can develop simplified products and make insurance easier for customers to understand and buy stand a good chance of capturing business from companies whose products and processes remain complex. 
See also: New Channels, New Data for Innovation

Innovators started attacking this opportunity years ago, led by online insurers like Progressive, Geico, USAA and Esurance, and aggregator and comparison sites like Compare.com. In the insurtech world, we see companies like Lemonade, Slice, Haven Life and Quilt challenging these pioneers. They recognized the tremendous opportunity offered by making the process of researching and buying auto, property or life insurance easier. They also made the product meet expectations of a new demographic.

But it isn’t just complexity that can drive insureds toward new products — in many cases, it is convenience, relevance and placement. A recent example is the partnership between startup CarSaver and retail giant Walmart. In the pilot program, Walmart will put CarSaver kiosks in stores in Houston, Dallas, Phoenix and Oklahoma City that allow consumers to select a car, finance it and insure it. CarSaver lists nine well-known auto insurance brands on its website as participating companies.

Insurtech startups have responded to demand by facilitating channel development. CB Insights reported that 18 of the top 20 deals in insurtech since 2015 were focused on P&C insurance distribution. These 20 deals accounted for about 82% of the $2.02 billion aggregate funding since the start of 2015. As of January 2017, Coverager listed 179 global companies classified as an “intermediary” that are an aggregator, provide online quotes, provide online purchasing or do any combination of the three.

On-demand insurer Slice is currently one of the best examples of relevance meeting convenience in channel development. Slice uses homesharing sites, such as Airbnb and HomeAway, to distribute temporary rental insurance. Allstate would be an example of a traditional insurer prepared to step into the same market space, now offering homesharing insurance in six states. 
Other innovative technology startups are occupying unique positions in the distribution space as enablers and connectors. Like selling pick axes to gold miners, their role is to simplify distribution processes for agents, brokers and carriers. AskKodiak, BoldPenguin, Indio and Insurr, for example, offer digital platforms to automate workflows and connect agents/brokers, carriers and risks in the commercial space. Others like Denim, MyNameFlow and InsuranceSocial.Media provide social and e-mail marketing platforms to insurance companies and agencies to link buyers with them.

Consumer preparedness

Despite all the activity in the front end of the value chain, most insurance is still ultimately sold through human interaction, either on the phone or face to face. Traditional insurers that are considering preparing their operations for additional channel use should move forward with more than just a hunch. Are consumers prepared mentally to jump insurers if presented with new channels? Would new channels allow insurers to reach new market segments? Majesco’s consumer and SMB research showed that the answer is, “Yes.” There is significant interest in at least considering new, non-traditional ways of obtaining insurance in the next 3-5 years. Among most generations of consumers, nearly 40% indicate they would be likely to try several alternative insurance acquisition methods. As consumers gravitate beyond traditional options, they will explore and seek alternatives across a wide spectrum of choices, regardless of whether or not their insurer offers them. Insurers who remain committed to only the agent channel will likely lose out on new customers and potentially existing customers who will seek alternative channels, placing their relevance and growth strategies at risk.

Stepping in

With the customer in control, the need for an ecosystem of channels is established. The remaining hurdle for some insurers is simply where to begin. 
Interestingly, many startup initiatives are organized as managing general agents (MGAs). The MGA structure is an ideal testing ground for new product innovations, programs and markets, because it allows the company to rely on its partners for capital, core systems and the carrying of risk while it focuses on assembling all of these components to meet the specific needs of unique markets and niches. Conning reported that the MGA market accounted for 14% of commercial lines business in 2015 and has been growing at a faster rate than the P&C market as a whole. But life-focused MGAs are also emerging rapidly from the insurtech world.

See also: 10 Trends at Heart of Insurtech Revolution

Other insurers will find it simplest to create a value-added channel that ties in closely with niche markets they may already serve. Majesco executive Bill Freitag gives some great examples in his latest blog, “It’s the Customer Experience, Stupid.” Some insurers will grow their channel development through M&A activity, acquiring insurtech startups or those who already have a blueprint for new channels.

Utilizing any of these approaches may work, but the foundation of all of them is the same. Innovative channel development begins by understanding insurance need, insurance use and customer experience enhancement. The best new channels will be those that exist at the point of need and fulfill the need without friction. To match those requirements, insurers must have created a flexible system for data acquisition, a scalable real-time solution for policy administration and lightweight approaches to testing and rollout. Cloud solutions and SaaS offerings are well-suited to these needs and can provide the flexibility needed for both new initiatives and low-cost testing methodologies. Collaborations and partnerships will be common in most cases of channel development. 
A SaaS solution provider such as Majesco can often act as the bridge between the innovative culture of the startup and the deep experience of the traditional insurer. It can also design a framework for adaptability that will accommodate new channels without compromising the capability for managing risk. Just as TV channels shifted, expanded and changed, so too should insurance channels. If not, insurers risk relevance and growth, two critical factors in a fast-changing marketplace.

Denise Garth

Denise Garth is senior vice president, strategic marketing, responsible for leading marketing, industry relations and innovation in support of Majesco's client-centric strategy.

Opportunities for Treatment Guidelines

Common-sense tips can lead to better outcomes for injured workers — and, ultimately, lower costs for payers.

sixthings
Medical treatment guidelines can be a great benefit to any workers’ compensation system. They can prevent unnecessary medical procedures and the prescribing of potentially harmful medications. However, they are not all the same, nor are they without challenges. Understanding a jurisdiction’s strengths and shortcomings, taking a strategic approach to developing treatment guidelines and following some common-sense tips can lead to better outcomes for injured workers — and, ultimately, lower costs for payers. That’s the view of workers’ compensation experts who spoke during our Out Front Ideas webinar on the subject. The panel included representatives from the regulatory, medical, pharmacy benefit management and third-party administrator communities. They were:
  • Amy Lee – special advisor, Texas Department of Insurance, Division of Workers Compensation
  • Douglas Benner, MD – chief medical officer of EK Health and national medical director of Macy’s Inc. Claims Services
  • Mark Pew – senior vice president, PRIUM
  • Darrell Brown – executive vice president and chief claims officer, Sedgwick
Dr. Benner brought a unique and important viewpoint to the panel. As a practicing physician for more than 30 years, he has firsthand experience practicing medicine under guidelines. He has also been involved in the development of treatment guidelines for both the Official Disability Guidelines (ODG) and the American College of Occupational and Environmental Medicine (ACOEM).

A majority of states now have some type of medical treatment or return-to-work guidelines in their workers’ compensation systems, and nearly half either have or are considering drug formularies. But there is some confusion about how they work within various jurisdictions and how effective they are. The speakers gave us great insights to better understand how to develop and implement successful treatment guidelines and how to get the most out of them.

Texas’ Example

Many in our industry look to Texas as a state with highly effective treatment guidelines. Texas had some of the highest workers’ compensation costs in the nation, along with some of the poorest return-to-work and patient satisfaction outcomes. After implementing treatment guidelines and a drug formulary, the state now boasts some of the best workers’ compensation outcomes in the nation, as well as lower costs. But the Texas story is not quite as simple or transferrable as you may think. As our panel explained, it took a multi-year, painstaking effort by representatives from all facets of the system to develop and implement the model now in place. The change also required a deep understanding of the workers’ compensation system as it existed in Texas for the treatment guidelines to get to the point they did. The changes in Texas began with legislative reforms in 2005. It would be two more years before the treatment guidelines were implemented and three years after that for the drug formulary to begin being phased in — first with new claims, then with legacy claims. 
One of the keys to Texas’ success was a change to include evidence-based medicine in the guidelines.

See also: Texas Work Comp: Rising Above Critics

EBM

Evidence-based medicine (EBM) is a term we hear often these days, but there’s disagreement about what it truly means. Texas sought to clarify the issue by including a statutory definition in the treatment guidelines, so it defined EBM as follows: “Evidence-based medicine means the use of current best quality scientific and medical evidence formulated from credible scientific studies, including peer-reviewed medical literature and other current scientifically based texts, and treatment and practice guidelines in making decisions about the care of individual patients.”

Texas switched to basing the guidelines on EBM to reform the previous consensus-based model, which was perceived as allowing for too much unnecessary medical care. EBM was chosen as the standard for selecting treatment guidelines, return-to-work guidelines and adjudicating claim-level disputes on medical care. It is also the standard expected from healthcare providers, payers and others. The idea of EBM is to provide a systematic approach to treating injured workers based on the best available science. Ideally, medical providers should base their treatment regimens on EBM, although it is also important to consider the specific needs of each individual patient. Unfortunately, some of the most pervasive medical conditions among injured workers have not been as heavily researched as other ailments, such as heart disease or hypertension. This means EBM is not the basis for every single medical condition. The developers of EBM for workers’ compensation consider all available research, “weigh” it in terms of quality and then fill in the “gaps” with a consensus of expert panels. That does not mean those particular guidelines are not scientific. 
For example, there is little research indicating someone with chest pains should undergo an electrocardiogram (EKG), but medical common sense dictates that is the appropriate action to take.

Formularies

Ensuring injured workers are given the most appropriate medications for their conditions is, or should be, the goal of drug formularies in workers’ compensation, according to the panelists. Not all drug formularies are the same, and it is helpful to understand their differences.

As we learned in the webinar, drug formularies started in the group health arena and were primarily a way to reduce costs, because out-of-pocket expenses are involved. Different tiers guide patients toward the best drug, with the aim of finding the one that is least expensive. Because workers’ compensation does not typically include co-pays, the goal for many jurisdictions is clinical efficacy: finding the medication that will produce the best outcome for the injured worker and return him or her to function and, ultimately, work.

See also: States of Confusion: Workers Comp Extraterritorial Issues

States such as Texas have a “closed” drug formulary, although it is not the same as a closed formulary in group health. Whereas in the group health context some medications are simply disallowed for reimbursement, formularies in workers’ compensation instead require pre-authorization for certain medications. The term “preferred drug list” is more appropriate for workers’ compensation.

Texas uses the Official Disability Guidelines for its list of “Y” and “N” drugs. All FDA-approved drugs are included, but those on the “N” list are not automatically paid for through the workers’ compensation system. Almost immediately after Texas implemented its drug formulary, prescribing patterns changed. Physicians began prescribing more medications on the “Y” list, rather than justifying the use of those on the “N” list.
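The “Y”/“N” decision rule described above amounts to a simple lookup. The following minimal sketch illustrates the idea; the drug names and list membership here are hypothetical examples chosen for illustration, not the actual ODG formulary data used by Texas.

```python
# Illustrative sketch of a workers' comp "preferred drug list" check.
# The drugs on N_LIST below are hypothetical examples, not the actual
# ODG "N" list used by the Texas formulary.

N_LIST = {"oxycodone", "carisoprodol"}  # "N" drugs: pre-authorization required


def formulary_status(drug: str) -> str:
    """Return how a prescription is handled under a Y/N formulary.

    All FDA-approved drugs are included in the formulary; "N" drugs are
    not excluded outright but must be pre-authorized before they are
    paid for through the workers' compensation system.
    """
    if drug.lower() in N_LIST:
        return "pre-authorization required"
    return "automatically reimbursable"


print(formulary_status("ibuprofen"))   # a "Y"-list drug in this sketch
print(formulary_status("Oxycodone"))   # an "N"-list drug in this sketch
```

The point of the sketch is that a “closed” formulary in workers’ compensation changes the default handling of a prescription rather than forbidding any FDA-approved drug.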
That was among the main goals of the drug formulary: to get prescribers to avoid opioids and other potentially dangerous drugs right from the start.

The formularies in other states’ workers’ compensation systems differ. However, the goal is the same: to encourage providers and others to prescribe the medications that are best for the injured worker, considering his or her injury and any comorbid conditions. Patient safety, rather than lower costs, should be the goal.

Many in the industry are closely watching California as it faces a summer deadline to finalize its drug formulary. There are estimates that about 25% of the state’s currently prescribed medications could be put on the fast track for approval and thus avoid delays from utilization review once the formulary is implemented.

Challenges

Having heard about the many potential benefits of treatment guidelines, we then turned to the panelists to discuss some of the obstacles and how to overcome them. Educating all stakeholders was among the most important strategies they mentioned. For example, a claims examiner may not see a requested treatment among those recommended in the guidelines for a particular jurisdiction and may issue a denial. But, upon further investigation, the treatment requested by the provider may turn out to be the best option for everyone involved.

In one California case, a claim was halted for several years, with indemnity expenses continuing to be paid, as the parties awaited the outcome of a dispute over an MRI scan. The case points to the need for those involved in a claim to be flexible. While following the guidelines should be the general rule of thumb, it’s also important that those overseeing a claim take a holistic approach and consider what really makes sense for the injured worker.

It is also vital to educate physicians on how to gain approval for treatments that stray from the guidelines.
Often, little or no explanation is provided as to why a particular patient needs a certain procedure or medication. Without complete information, the rate of denials increases. Texas took the unique step of implementing Appendix B to give physicians guidance on how to document exceptions to its guidelines.

The inconsistency of guidelines across jurisdictions can be frustrating, especially for organizations that operate in several of them. Again, those involved in a claim need to be informed about the guidelines used in each jurisdiction. It is important that everyone who reviews treatment recommendations, including claims examiners, nurses, physicians and even administrative judges, understands the treatment guidelines and their limits in the jurisdictions in which they operate. The decisions each person makes must be consistent for the guidelines to be most effective.

Keeping the guidelines current is another challenge for some jurisdictions. With medical science changing rapidly, it’s best if jurisdictions find a way to publish updated information as soon as possible and make it easily accessible.

The Future

While a majority of states have medical treatment guidelines in their workers’ compensation systems, 21 did not at the time of the webinar. About 20 states either have or are considering drug formularies. Additional efforts are underway at the state level to address medical care for injured workers. Several Northeastern states, for example, have placed limits on the number of days for which opioids can be prescribed. Some have limited it to seven days, while New Jersey is imposing a five-day limit. That trend is expected to continue.

See also: 25 Axioms Of Medical Care In The Workers Compensation System

Other states are looking at helping wean injured workers off opioids. New York recently rolled out a new hearing process to address claims that involve problematic drug use.
Progress is being made toward improving injured workers’ outcomes, and treatment guidelines and drug formularies are a big part of these efforts. The goals of better safety, better clinical outcomes, quicker return-to-work, shorter treatment periods and better overall outcomes should drive the conversations going forward.

To listen to the complete Out Front Ideas with Kimberly and Mark webinar on this subject, please visit Out Front.

Kimberly George

Kimberly George is a senior vice president, senior healthcare adviser at Sedgwick. She will explore and work to improve Sedgwick’s understanding of how healthcare reform affects its business models and product and service offerings.