
Why Healthcare Costs Bleed Firms Dry

Many employers are rising up against soaring healthcare costs. The secret to controlling them is to write your own checks for your employees' healthcare.

“It is impossible to prove something to someone whose salary depends on believing the opposite.” – Upton Sinclair

Today’s overpriced healthcare system is hurting American businesses and job creation, eating into profitability and, quite frankly, bleeding companies dry. What’s worse, the lack of cost control and price transparency has created a culture of helplessness and even resignation. But employers have had enough. Many are rising up and demanding change. They want lower costs and better care for their people and will no longer tolerate the status quo.

In 2007, I made it my mission to put an end to overpriced healthcare when my own companies’ healthcare costs were cutting dangerously into the bottom line. At the time, I operated numerous healthcare clinics throughout the Phoenix metro area. We found our best hourly employees were leaving us for jobs at larger corporations with better health insurance, and we couldn’t attract replacements with the same level of training. Productivity and efficiency plummeted. It was an absolute mess, and I felt like a failed CEO.

But we discovered a secret that no one else seemed to know – or at least nobody seemed to be saying aloud. It’s a secret we uncovered when we started doing something I had never heard of anyone doing: writing our own checks for our employees’ healthcare.

See Also: When a Penalty Is Not a Penalty

It seemed strange that the cost of giving birth at one hospital was $6,000, while the cost at a neighboring hospital was $17,000 – even though the same doctor had attended both births! Strange that an ankle X-ray could cost $1,200 in a hospital emergency room but only $35 at my own clinics. Stranger still that a simple antibiotic could cost $900 at one pharmacy when Walmart sold the exact same drug for only $12. Those observations helped lead to the secret to not overpaying for healthcare.

Controlling PLACE OF SERVICE is all that really matters

In the vast majority of cases, my employees could receive the right level of care in a setting that provided the same service (with the same or even better quality) at a much lower cost than in another setting. Of course, sometimes a hospital emergency room visit is absolutely necessary. On occasion, an urgent care center is the right option. But we saw that many medical expenses were needlessly incurred in hospitals and other expensive settings. MRIs, X-rays, blood tests, specialist consultations and other common procedures were costing my companies five to 20 times more than the exact same services performed across the street in an imaging center, lab or doctor’s office not owned by the hospital.

Why would someone choose to get a $3,600 MRI or $1,200 X-ray at a hospital instead of going to an imaging center across the street for an equally good $400 MRI or $35 X-ray? Why would anyone get a procedure at one hospital instead of paying 40% less for an identical procedure at another hospital around the corner? It’s not that people don't care. THEY DO! The answer is that they simply don't know – and the system is designed so that it is very hard for people to uncover this truth. It seems crazy, but this sort of thing happens systematically all the time. When employer health plans work well – when prices are transparent and employees are protected and guided away from overpriced services – then common sense prevails and costs stay in check.
But if people are part of a health plan that benefits from keeping costs hidden – and most do – business owners and their people simply don’t know they’re being duped. Why is this happening?
  1. Hospitals with the greatest market share negotiate much higher reimbursement rates from insurance companies. A December 2015 study by researchers from Yale, University of Pennsylvania and Carnegie Mellon University analyzed billions of hospital claims paid by commercial insurance companies to hospitals. The study concluded that costs at hospital systems with significant market share were as much as 12 times higher than at other, smaller hospitals – with no difference in quality. It was an important and revealing study, yet it failed to evaluate the even bigger differences in price for routine procedures performed at a hospital vs. outside a hospital – procedures that never needed to be done in a hospital in the first place. These price differentials and subsequent overpayments are even more shocking and have the biggest impact on overall healthcare cost.
  2. Hospitals are “buying” doctors so they can fill beds and price excessively. Even though hospitals lose approximately $165,000 each year for every primary care doctor and about $300,000 for each specialist they hire, this strategy has proven effective; it increases market share and allows hospital systems to negotiate higher prices with insurers. What’s more, these doctors are obligated to refer their patients for services or specialty care in an exorbitantly overpriced hospital setting. Of course, emergency procedures are occasionally necessary, and of course hospital infrastructure costs are always higher and will need to be taken into account when assessing fair pricing. But when millions of dollars are used to market elective services that are arbitrarily priced much higher than what is fair – well, this simply isn't fair to the unknowing business owner and employee. After all, they trust the healthcare system to guide and care for them.
  3. Urgent care centers are now owned by hospitals. This is no surprise: urgent care centers provide a perfect entry point for funneling services and profitable patients to hospitals and the doctors those hospitals employ. Following this same line of thinking, urgent care centers also help hospital systems gain market share, negotiate higher rates and “mine” the sickest people from among those patients.
  4. There are huge price differentials in prescription drugs. This problem is rampant in the healthcare industry, even extending to runaway prices in common prescriptions. The costs of medications vary dramatically depending on the pharmacy, the insurer and the way the doctor writes the prescription. The cost of a simple generic antibiotic can range from $12 at a grocery store to more than $50 at a widely known national pharmacy – and to more than $900 for the brand name that legally gets substituted when the pharmacy chooses. You might think the answer is obvious – just stop overpaying – but many people simply aren’t aware of the pricing tricks.
  5. High-deductible health plans partner with hospital systems. Often, such plans require that services be performed exclusively at a particular hospital’s health centers or affiliated urgent cares, imaging centers, doctor’s offices, etc. In other words, the hospital system that has negotiated higher rates with insurers now requires health plan participants to use their overpriced services. They say they have negotiated lower prices, but we see that costs are much lower when a patient pays cash outside the hospital.
In the case of high-deductible plans, it’s employees who get stuck with much of the bill. The premiums are cheaper upfront, but employees and their families are charged for services until their deductibles are met, often paying inflated prices for procedures performed in a hospital or affiliated setting. When they can’t afford to pay the deductible, employees often direct their frustration at their employers for providing this sort of coverage. And, sadly, many low-wage people will decide to forgo needed care.

See Also: Why Healthcare Costs Soar (Part 6)

What if brokers could help their small business clients by providing the negotiated fee schedule with the hospital system employees will be required to use? Or at least educate them about the dangers of using hospital facilities for services that could be performed outside a hospital? This is especially important for people with high deductibles. Though it’s not common to request the price list – and insurance companies won’t grant the request – it’s certainly common sense. Shouldn’t employees understand the costs before choosing a doctor or facility? Simply providing the fee schedule would at least give them and their doctors a fighting chance to make care decisions based on both quality and value. Increased transparency in an industry of hidden costs and unexpected medical bills would be a powerful step toward saying “NO” to the overcharging that the biggest healthcare facilities get away with every day.

The Importance of Data

Educating and guiding employees to the best places for service will have a huge impact on moving the cost needle. And, using data to identify the sickest employees and understand where they are getting their healthcare services is a great multiplier that brokers can use to help their business clients achieve more cost savings. If an insurance company will not agree, in writing, that all of the company’s data belongs to the business owner – regardless of whether they’re certain to renew – the business owner should walk away. Most traditional insurance companies will tell business owners they can’t give them this data because of privacy laws or HIPAA. The real reason is that they don’t want their clients to share the data with competing insurers and potentially lower their healthcare costs. In reality, business owners can own their data. Nothing in the law says otherwise. (Employers should never directly look at employees’ personal health information. This is just common sense.)

We encourage business owners to push harder and challenge the status quo way of thinking. We want them to demand cost transparency so they can control their own costs and still take great care of their people. Owning their employees’ data will enable the employer and their broker to negotiate fair pricing and educate their people about place of service more effectively. Brokers who rise to this challenge will find great opportunities to grow their business and create undying loyalty among their clients. Status quo healthcare costs are bloated with unnecessary administration, waste and overpricing, but businesses and brokers who understand how to choose the right place of service can save money and easily fund healthcare. The worst thing we can do is pay more.
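To make the place-of-service analysis concrete, here is a minimal, hypothetical sketch of how a broker or self-funded employer who owns the claims data might flag routine services billed in hospital settings and estimate the potential savings. It is not the author's actual tooling; the claim records, field names and benchmark prices are all illustrative assumptions, loosely based on the figures quoted above.

```python
# Hypothetical sketch: flag routine services billed in hospital settings
# and estimate the savings if the same service had been done elsewhere.
# Claims, field names and benchmark prices are illustrative assumptions.

# Illustrative non-hospital benchmark prices (imaging center, lab, clinic),
# loosely based on the figures quoted in the article.
NON_HOSPITAL_BENCHMARK = {
    "MRI": 400,     # vs. ~$3,600 when billed by a hospital
    "X-ray": 35,    # vs. ~$1,200 in a hospital emergency room
}

claims = [
    {"employee": "A", "service": "MRI",   "place": "hospital", "billed": 3600},
    {"employee": "B", "service": "X-ray", "place": "hospital", "billed": 1200},
    {"employee": "C", "service": "X-ray", "place": "clinic",   "billed": 35},
]

def potential_savings(claims):
    """Sum the gap between hospital billing and the non-hospital benchmark."""
    total = 0
    for claim in claims:
        benchmark = NON_HOSPITAL_BENCHMARK.get(claim["service"])
        if claim["place"] == "hospital" and benchmark is not None:
            total += max(claim["billed"] - benchmark, 0)
    return total

print(potential_savings(claims))  # 3200 + 1165 = 4365 across just two claims
```

Even this toy example surfaces thousands of dollars on two claims; scaled across an employee population, it is the five-to-20-times differential the article describes.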

David Berg

David Berg is co-founder and chairman of the board of Redirect Health. He helps oversee operations and develops innovative ways to enhance the company’s processes and procedures for identifying the most cost-efficient, high-quality routes for common healthcare needs.

Insurance M&A: Just Beginning

Foreign investment – especially from Japan and China – will fuel U.S. deals activity for the foreseeable future.

Insurance M&A activity in the U.S. rose to unprecedented levels in 2015, surpassing what had been a banner year in 2014. There were 476 announced deals in the insurance sector, 79 of which had disclosed deal values with a total announced value of $53.3 billion. This was a significant increase from the 352 announced deals in 2014, of which 73 had disclosed deal values with a total announced value of $13.5 billion. Furthermore, unlike in prior years, when U.S. insurance deal activity was isolated to specific subsectors, 2015 saw a significant increase in deal activity in all industry subsectors.

The largest deal of the year occurred in the property & casualty space when Chubb Corporation agreed on July 1 to merge with Ace. The size of the combined company, which assumed the Chubb brand, rivals that of other large global P&C companies like Allianz and Zurich. This merger by itself exceeded the total insurance industry disclosed deal values for each of the previous five years and represented 53% of the total 2015 disclosed deal value for the industry. However, even without the Chubb/Ace megamerger, total 2015 deal value was still nearly double that of 2014.

See Also: Insurance Implication in Asia Slowdown

While the insurance industry saw a significant increase in megadeals in 2015, there also was a significant increase in deals of all sizes across subsectors. Tokio Marine & Fire Insurance’s acquisition of HCC Insurance Holdings, announced in June 2015, was the second largest announced deal, with a value of $7.5 billion. The purchase price represented a 36% premium to market value before the deal announcement. The largest deal in the life space (and third largest deal in 2015) was Meiji Yasuda Life Insurance’s acquisition of Stancorp Financial Group for $5 billion. The purchase price represented a 50% premium to market value prior to the deal announcement and continued what now appears to be a trend of Asian-domiciled financial institutions (particularly from Japan and China) acquiring mid-sized life and health insurance companies by paying significant premiums to public shareholders.

The fourth and fifth largest announced deals in 2015 were very similar to the Stancorp acquisition. They also were acquisitions of publicly held life insurers by foreign-domiciled financial institutions seeking an entry into the U.S. In each of these instances, the acquirers paid significant premiums. In 2014, we anticipated this trend of inbound investment – particularly from Japan and China – and expect it to continue in 2016 as foreign-domiciled financial institutions seek to enter or expand their presence in the U.S.

Independent of these megadeals, the overwhelming majority of announced deals in the insurance sector relate to acquisitions in the brokerage space. These deals are significant from a volume perspective, but many are smaller transactions that do not tend to have announced deal values. In addition, there were a number of transactions involving insurance companies with significant premium exposure in the U.S., but which are domiciled offshore and therefore excluded from U.S. deal statistics. Some examples from 2015 include the acquisition of reinsurer PartnerRe by Exor for $6.6 billion, the $4.1 billion acquisition of Catlin Group by XL Group and Fosun’s acquisition of the remaining 80% interest of Ironshore for $2.1 billion.

See Also: New Approach to Risk and Infrastructure?

Drivers of deal activity
  • Inbound foreign investment – Asian financial institutions looking to gain exposure to the U.S. insurance market made the largest announced deal of 2014 and four of the five largest announced acquisitions in the insurance sector in 2015. Their targets were publicly traded insurance companies, which they purchased at significant premiums to their market prices. Foreign buyers have been attracted to the size of the U.S. market and have been met by willing sellers. Aging populations, a major issue in Japan, Korea and China, as well as an ambition to become global players, will continue to drive Asian buyer interest in the U.S. However, the ultimate number of foreign megadeals in the U.S. may be limited by the number of targets that are of desired scale and available for acquisition.
  • Sellers’ market – Coming out of the financial crisis, there were many insurance companies seeking to sell non-core assets and capital-intensive products. This created opportunities for buyers, as these businesses were being liquidated well below book values. Starting in 2014, the insurance sector became a sellers’ market (as we mention above, largely because of inbound investment). Many of the large announced deals in 2015 involved companies that were not for sale but were the direct result of buyers’ unsolicited approaches. This aggressiveness and the significant market premiums that buyers have paid on recent transactions should be cause for U.S. insurance company boards to reassess their strategies and consider selling assets.
  • Private equity/family office – Private equity demand for insurance brokerage companies continued in 2015, even as transaction multiples and valuations of insurance brokers increased significantly. However, we have also seen increased interest among private equity investors in acquiring risk-bearing life and P&C insurance companies. This demand has grown beyond the traditional PE-backed insurance companies that have focused primarily on fixed annuities and traditional life insurance products. Examples include: 1) Golden Gate Capital-backed Nassau Reinsurance Group Holdings’ announced acquisition of both Phoenix Companies and Universal American Corp.’s traditional insurance business; 2) HC2’s acquisition of the long-term care business of American Financial Group Inc.; and 3) Kuvare’s announced acquisition of Guaranty Income Life Insurance. We anticipate private equity activity will continue in both insurance brokerage and carrier markets in 2016.
  • Consolidation – While there has been some consolidation in the insurance industry over the past few years, it has been limited primarily to P&C reinsurance. With interest rates near historic lows and minimal increases in premium rates over the last few years, we expect the economic drivers of consolidation to increase in the industry as a whole as companies seek to eliminate costs to grow their bottom lines.
  • Regulatory developments – MetLife recently announced plans to spin off its U.S. retail business in an effort to escape its systemically important financial institution (SIFI) designation and thereby make the company’s regulatory oversight consistent with most other U.S. insurers’. MetLife’s announcement was followed by fellow SIFI AIG’s announcement that it intended to divest itself of its mortgage insurance unit, United Guaranty. The two other non-bank financial institutions that have been designated as SIFIs, GE Capital and Prudential Financial, have differing plans. While GE Capital has been in the process of divesting most of its financial services businesses, Prudential Financial has yet to announce any plans to sell assets. In other developments, the new captive financing rules the NAIC enacted in 2015 and the implementation of Solvency II in Europe may put pressure on other market participants to seek alternative financing solutions or sell U.S. businesses in 2016 and beyond.
  • Technological innovations – The insurance industry historically has lagged behind other industries in technological innovation (for example, many insurance companies use multiple, antiquated, product-specific policy administration systems). Unlike in banking and asset management, which have been significantly disrupted by technology-driven, non-bank financing platforms and robo-advisers, the insurance industry has not yet experienced significant disruption to its traditional business model from technology-driven alternatives. However, we believe that technological innovations will significantly alter the way insurance companies do business – likely in the near future. Many market participants are focusing on being ahead of the curve and are seeking to acquire technology that will allow them to meet new customer needs while optimizing core insurance functions and related cost structures.
Implications
  • We expect inbound foreign investment – especially from Japan and China – to continue fueling U.S. deals activity for the foreseeable future. If there is an impediment to activity, it likely will not be a lack of ready buyers but instead a lack of suitable targets.
  • Private equity will remain an important player in the deals market, not least because it has expanded its targets beyond brokers to the industry as a whole.
  • The need to eliminate costs to grow the bottom line will remain a primary economic driver of consolidation.
  • Regulatory developments are driving divestments at most, though not all, non-bank SIFIs. This remains a space to watch, as a common insurance industry goal is to avoid federal supervision.
  • Actual and impending technological disruption of traditional business models is likely to lead to increased deal activity as companies look to augment their existing capabilities and take advantage of – rather than fall victim to – disruption.

Gregory Galeaz

Greg Galeaz is currently PwC’s U.S. insurance practice leader and has over 34 years of experience in the life and annuity, health and property/casualty insurance sectors. He has extensive experience in developing and executing business and finance operating model strategies and transformations.


John Marra

John Marra is a transaction services partner at PwC, dedicated to the insurance industry, with more than 20 years of experience. Marra's focus has included advising both financial and strategic buyers in conjunction with mergers and acquisitions.

Are You Fit Enough for Growth?

The math on growth doesn’t work unless you find ways to spend less in unimportant areas and allocate those savings to more important ones.

When it comes to scrutinizing costs, most insurance companies can say, “Been there, done that. Got the T-shirt.” Managers are familiar with the refrain from above to trim here and cut there. The typical result is flirtation with the latest management trends like lean, outsourcing and offshoring. However, the results tend to be the same: budgets reflect last year’s spending plus or minus a couple of percent. Meanwhile, managers attempt to develop strategies to capitalize on the trends reshaping the industry – customer-centricity, analytics, digital platforms and disruptive delivery and distribution models. Yet, after all of the energy companies exert to reduce expenses, there is often little left over to spend on these strategic initiatives.

Why do you need to look at your expense structure?

A variety of pressures have led carriers to improve their cost structures. In all parts of the market, low interest rates and investment returns are forcing carriers to scrutinize costs to improve return on capital, or even to maintain profitability to stay in business. P&C carriers with lower-cost distribution models have been able to channel investments into advertising and take share, forcing competitors to reduce costs to defend their positions. Consolidation in the health, group and reinsurance sectors has forced smaller insurers to either a) explore more scalable cost structures or b) put themselves up for sale. For life and retirement companies, lower interest rates have taken a toll on the competitiveness of investment-based products. This spells trouble for companies that have not adequately sorted out their expense structure.

And a shrinking insurance company sooner or later will run afoul of regulators, ratings agencies, distributors and customers. Even if expenses are shrinking, if revenue is declining more quickly, then the downward spiral will accelerate. It is virtually impossible to maintain profitability without growth. Expenses increase with inflation, tick upward with each additional regulatory requirement and can spike dramatically when attempting to meet customer and distributor demands for improved experiences and value-added services. The reality is that companies have to grow, and that’s difficult in a mature market, especially in times when “the market” isn’t helping.

What’s the key to success, then? In short, growth comes from better capabilities, service, customer-focus and products – all of which require continuing investment in capabilities.

See Also: 2016 Outlook for Property-Casualty

The math doesn’t work unless you’re finding ways to spend less in unimportant areas and allocate those savings to more important ones. If your answer to any of the following questions is “no,” then it’s important that you look at your allocation of resources for capital, assets and spending:
  • Are you making your desired return on capital?
  • Are your growth levels acceptable?
  • Do you have an expense structure that lets you compete at scale?
The transformation of insurers from clerk-intensive, army-sized bureaucracies to highly automated financial and service operations has been a decades-long process. The industry has invested heavily enough in standardization and automation that one would expect it to be a well-oiled machine. However, when we look under the covers, we see an industry with a considerable amount of customization and one-offs. In other words, the industry behaves more like a cottage industry than an industrial, scalable enterprise.

We know that expenses are difficult to measure, let alone control. But why are they so intractable? The industry’s poorly kept secret is that insurers, even larger ones, have sold many permutations of products with many different features. All of these have risk, service, compensation, accounting and reporting expenses, as well as coverage tails so long the company can’t help but operate below scale. The issue is scale.

What defines operating at scale for you? A straightforward way to answer this question is to consider whether you’re operating at a level of efficiency on par with or better than the best in the marketplace. Where do you draw the line? The top 10% to 15%? The top 20% to 25%? Next, ask yourself if you, in fact, are operating at scale. Remove large policies and reinsurance that disguise operating results, then sort out how many differentiated service models you are supporting. Are you in the bottom half of performers? Are you in the top 50% but not the top quartile? Are you in the top quartile but not the top decile? (A toy version of this benchmarking check appears at the end of this article.) Every insurer needs a more versatile and flexible expense structure to fully operate at scale and be more competitive.

Competition is changing

Customers now have access to a wealth of information and are increasingly using it to make more informed choices. New market entrants are establishing a foothold in direct and lightly assisted distribution models that make wealth management services more affordable for more market segments. Name brands are establishing customer mind-share with extensive advertising. FinTech is shifting the way we think about adding capabilities and creating capabilities in near real time. Outsourcers are increasingly proficient and are investing in new technologies and capabilities that only the largest companies can afford to build at scale.

See Also: Don't Do It Yourself on Property Claims

The competitive landscape will continue to change. More products will be commoditized – after all, consumers prefer an easy-to-understand product at a readily comparable price. As they do now, stronger companies will go after competitors with less name recognition and scale and lower ratings. Customer research and behavioral analytics will more accurately discern life-long customer behavior and buying patterns for most lifestyles and socio-demographic groups. The role of advisers will change, but customers of all ages will still like at least occasional advice, especially when their needs – and the products they purchase to meet them – are complex. Table stakes are greater each year and now include internal and external digital platforms, data-derived service (and self-service) models, omni-channel distribution models and extensive use of advanced analytics. The need to improve time-to-market has never been more important. Scale matters. Because they can increase scale, partners also matter even more than in the past.
If they have truly complementary capabilities, new partners can help you improve your cost curve because you can leverage their scale to improve yours (and vice-versa).

In conclusion, all companies – regardless of scale – need to ensure that their capital and operating spending aligns with their strategy and capabilities and the ways they choose to differentiate themselves in the market. In this transformative time, the ones that can’t or won’t do this will fall increasingly behind the market leaders.

Implications: Leave no stone unturned
  • Managing expenses is a job that is never finished. Even if you’ve already looked at expenses, it doesn’t mean that you get a pass from scrutinizing them afresh. You will always have to keep rolling that particular boulder up the hill. Acknowledging that you could always manage expenses better is the first step to doing it well.
  • Identify and commit to the cost curves that get you to scale. This may require new thinking about sourcing partners and which evolving capabilities hold the most promise for the future of the company. How transformative do your digital platforms need to be? Can the cloud help you operate more efficiently and economically? How constraining is your culture, management and governance?
  • Every company needs to invest. Every company needs to be “fit for growth.” You will need to increase expenses where they help you compete and decrease them where they don’t. Admittedly, this is hard to do, but the companies that don’t do it successfully will be left by the wayside.
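As a loose illustration of the scale benchmark described earlier ("Are you in the bottom half of performers? The top quartile? The top decile?"), here is a minimal sketch of ranking a carrier's expense ratio against a peer set. The peer figures, the ratio definition and the band thresholds are invented for illustration only; a real benchmarking exercise would first normalize for large policies, reinsurance and differentiated service models, as noted above.

```python
# Hypothetical sketch: where does our expense ratio sit versus peers?
# Peer ratios are invented; lower is better (expenses / net premiums).
peer_expense_ratios = [0.24, 0.27, 0.29, 0.31, 0.32, 0.33, 0.35, 0.38, 0.41, 0.44]
our_ratio = 0.30

def percentile_rank(value, peers):
    """Share of peers we beat (a lower expense ratio is better)."""
    better_than = sum(1 for p in peers if value < p)
    return better_than / len(peers)

rank = percentile_rank(our_ratio, peer_expense_ratios)
if rank >= 0.90:
    band = "top decile"
elif rank >= 0.75:
    band = "top quartile"
elif rank >= 0.50:
    band = "top half, below top quartile"
else:
    band = "bottom half"

print(f"Better than {rank:.0%} of peers -> {band}")
```

On these invented figures the output is "Better than 70% of peers -> top half, below top quartile", which is exactly the kind of answer the questions above are meant to force into the open.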

Bruce Brodie

Bruce Brodie is a managing director for PwC's insurance advisory practice focusing on insurance operations and IT strategy, new IT operating models and IT functional transformation. Brodie has 30 years of experience in the industry and has held a number of leadership positions in the insurance and consulting world.

The 5 Charts on Insurance Disruption

Here are five charts that show the high-level forces disrupting the industry, as well as how they connect and combine for even more impact.

The high-level forces (people, technology and market boundaries) are responsible for insurance’s driving influences — new expectations, innovations and new competition that individually exert tremendous transformation pressure on the industry. The forces don’t operate in isolation, however. They are connected and combine to create an even more powerful and disruptive impact on the industry. Majesco developed a model to reflect these forces.

[Chart: the three forces and how they combine]

The combined impact is creating a powerful market shift that brings the three together, creating unprecedented innovation and disruption. It reflects what author Malcolm Gladwell calls a “tipping point.” A tipping point occurs when an idea, trend, behavior or expectation crosses a threshold and spreads like wildfire, changing the fundamentals of business. These are often sudden, as we have seen in other tipping points over the last century, reflected in the move from the industrial age to the information age and now to the digital age. Each move created leaps in innovation and transformation.

People

The makeup of the market is shifting. Insurers who ignore the shift will be challenged to retain their customers, let alone grow their businesses. This shift is being driven by demographic, cultural, economic and technological forces. They present new challenges and opportunities for the insurance industry that will require insurers to rethink their strategies, products, channels and processes to reach a fast-changing market.

[Chart: people]

Market Boundaries

The combination of the sharing and platform economy trends is dissolving traditional boundaries and the long-held competitive advantages of incumbents. Just as start-ups can now access technology as a service, they can also access resources (sourcing and crowdsourcing), designing, manufacturing and more as a service, giving any company access to the resources needed to compete. As a result, companies must compete on more than brand, product, price or distribution. They must compete on innovative approaches.

[Chart: market boundaries]

New Entrants

[Charts: new entrants]

Shifts in the Industry

[Chart: shifts in the industry]

To download the full report, click here.

Denise Garth

Denise Garth is senior vice president, strategic marketing, responsible for leading marketing, industry relations and innovation in support of Majesco's client-centric strategy.

Tricky Issues Being Raised

AI (artificial intelligence) means we have to figure out how to sell to machines -- or become slaves to the machines.

Welcome my son, welcome to the machine. Where have you been? It's alright, we know where you've been. You've been in the pipeline, filling in time, provided with toys and Scouting for Boys. [Pink Floyd, "Welcome To The Machine," from the album “Wish You Were Here” (1975)]

Machine Markets

This past summer, a North American energy insurer raised, with us, an interesting problem. The company was looking at insuring U.S. energy companies that were about to offer reduced electricity rates to clients who allowed the company to turn appliances on and off—for example, a freezer. Freezers in the U.S. can hold substantial and valuable quantities of foodstuffs, often worth several thousand dollars. Obviously, the insurer was worried about correctly pricing a policy for the electricity firm in case there was some enormous cyber-attack or network disturbance; nowadays, the control systems are as important as the power supply.

Imagine, for example, coming home to find your freezer off and several thousand dollars of defrosted mush inside it. You ring both your home and your contents insurer, which notes you have one of those new-fangled electricity contracts. The fault was probably the electricity company's—go claim from them. You ring the electricity company—the company denies it had anything to do with turning off your machine; if anything, it was probably the freezer manufacturer that is at fault. The freezer manufacturer knows for a fact there is nothing wrong; you and the electricity company must have installed things improperly. And, of course, it might have been your error. Perhaps you unplugged the freezer to vacuum your house and forgot to reconnect things. Or, perhaps you were a bit tight on funds and thought you could turn mush into instant cash.

In the future, machines will make decisions and send buy-and-sell signals to each other that have large financial consequences. We pointed out to our North American friends that they, the insurer, should perhaps tell the electricity company which freezers to shut off first, starting with the ones with the cheapest contents. With billions of people on the planet, we may need several tens of billions—or even low trillions—of ledgers recording these transactions in case of disputes: a freezer-electricity-control ledger, an entertainment system, a home security system, heating-and-cooling systems, a telephone, an autonomous automobile, a local area network, telephone recording, etc.

Perhaps the most significant announcement of 2015 came in January from IBM and Samsung. The two announced their intention to work together on mutual distributed ledgers (aka blockchain technology) for the Internet of Things. IBM and Samsung developed ADEPT (Autonomous Decentralized Peer-to-Peer Telemetry) for distributed networks of devices. The companies foresee a future of 10 billion people with hundreds of networks and a trillion distributed ledgers.

Meet the New Boss

Many forecasters are predicting rapid and extreme traditional job losses because of automation. At Oxford, Carl Benedikt Frey and Michael Osborne caused a stir in 2013 when they published a paper containing detailed research that estimated 47% of U.S. jobs were at risk over the next decade. The new machine "bosses" won’t be quite the same as the old bosses. Human nature evolves slowly; machines evolve quickly. Only other machines will likely be able to keep up with the day-to-day evolution of choice that other robots will create.
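The article does not spell out how a device-event ledger would actually work, so here, purely as a loose, hypothetical illustration (and not IBM/Samsung's ADEPT design), is a toy, single-node, hash-chained log of device commands of the kind a freezer-electricity-control ledger might record, so that a later dispute ("who switched the freezer off?") can be checked against a tamper-evident record.

```python
import hashlib
import json
import time

# Toy sketch of a tamper-evident device-event ledger (not ADEPT itself).
# Each entry hashes the previous one, so retroactive edits are detectable.

def add_entry(ledger, actor, device, action):
    prev_hash = ledger[-1]["hash"] if ledger else "genesis"
    entry = {
        "timestamp": time.time(),
        "actor": actor,          # who issued the command
        "device": device,
        "action": action,        # e.g. "power_off", "power_on"
        "prev_hash": prev_hash,
    }
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(entry)
    return entry

def verify(ledger):
    """Recompute the hash chain; any tampering breaks the links."""
    prev_hash = "genesis"
    for entry in ledger:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev_hash"] != prev_hash:
            return False
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if recomputed != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

ledger = []
add_entry(ledger, "electricity_co", "freezer-42", "power_off")
add_entry(ledger, "electricity_co", "freezer-42", "power_on")
print(verify(ledger))  # True; alter any recorded field and this becomes False
```

This is a single-node toy; a real mutual distributed ledger would replicate and agree on these entries across the homeowner, the electricity company, the freezer manufacturer and the insurer, which is what would make it useful in the dispute described above.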
Thinking about artificial intelligence (AI) as a customer can also be mind-bending. AI customers will require AI salespeople. AI salespeople need to operate in machine time, much faster than human time. There are some inklings of what this might look like in the world of high-frequency trading, but, with trillions of networks, this will be much more competitive, dangerous and, potentially, lucrative.

Correspondent and transaction banks will have either a great role or no role. It will be a great role if they can up their game to sell to the machine. It will be no role if they can’t respond quickly. It’s interesting to look at the CPMI Working Group on Correspondent Banking’s recent report with the idea of selling to trillions of machines in mind. Banks are cutting down banking relationships just as their need may be about to explode. They are restricting international network access just as IBM, Samsung, Apple, Google and others are seeking global reach. Know Your Customer & Anti Money Laundering (KYC/AML) strictures are being bolstered by Know Your Customers’ Customers (KYCC), perhaps leaving global firms to make payment arrangements without their banks, a role cryptocurrencies are all too happy to assume. Heck, that’s what they were designed for. Another implication is that KYC utilities will arise to displace part of the role of banks, because of banks’ own inefficiency and lack of creativity. Major accounting firms and some governments, e.g. Estonia, seem to be moving into these spaces.

Information sharing is too costly and inefficient. Mutual distributed ledgers have a huge role in information-sharing initiatives, such as legal entity identifiers (LEI), International Bank Account Numbers (IBAN) and Bank Identifier Codes (BIC). Furthermore, much messaging is expensive and error-prone. MT 103 and MT 202 payment messages are expensive, and the structure has been used to obscure the ultimate beneficial transactor. Again, mutual distributed ledgers may have a role here. I am aware of at least two global banks that are implementing internal mutual distributed ledgers to cut out internal SWIFT transfers.

Land & Expand or Miss & Contract

A World Bank survey commissioned by the FSB concluded that correspondent banking services are declining in roughly half the emerging market and developing economy jurisdictions surveyed. This decline is in addition to the decline in retail cross-border payments and remittances. Regulation may be restricting financial services just when multitudes of cross-border micro payments may be taking off. The level of skills in current transaction banking is pitiful when contrasted with the challenges ahead: deploying artificial intelligence, support vector machines, mutual distributed ledgers, predictive analytics, agnostic broadcasting of time stamps, evolutionary user interfaces, etc. These are the challenges that transaction banks face today; imagine what they’ll face when machines begin to evolve.

The "technological singularity" is a hypothetical event that will occur when artificial intelligence (“strong AI”) takes control. Some talk about "technology rapture" to describe the possibility of the gods of AI coming down to take command of us all. Transaction banks are going to have to get serious about selling to the machines, or they're going to become slaves to the machines.

Michael Mainelli

Michael Mainelli co-founded Z/Yen, the City of London’s leading commercial think tank and venture firm, in 1994 to promote societal advance through better finance and technology. Today, Z/Yen boasts a core team of 25 highly respected professionals and is well capitalized because of successful spin-outs and ventures.

Union Pacific Leads on Suicide Prevention

Union Pacific Railroad sponsored a company-wide suicide awareness program and touched nearly 10,000 employees in one day.

Suicide is always a difficult topic no matter what the setting. It can be a particularly difficult topic to discuss in the workplace. Employers continually struggle to decide how far to intrude into the emotional lives of their employees, though in the U.S. alone the economic burden of major depressive disorders was estimated at $210.5 billion in 2010. The associated breakdown of costs suggests:
  • 45% to 47% attributed to direct costs
  • 5% to suicide-related costs
  • 48% to 50% to workplace costs
Of course, the financial costs pale in comparison with the emotional impact the death of a co-worker can have in the workplace. The rail industry workforce is highly representative of “men in the middle years,” the age group known to be prone to suicide. Between 80% and 90% of the workforce is male, and 35% to 40% are veterans. Additionally, witnessing suicide-by-train is an all-too-frequent trauma for rail workers.

To address the problem of suicide in the workplace, Union Pacific Railroad sponsored a company-wide suicide awareness program and roll-out on Sept. 10, 2015. The day’s activities paralleled the worldwide Suicide Prevention Day.

See Also: 6 Things to Do to Prevent Suicides

Making personal contact with so many employees and leaving behind literature was no easy task but worth the effort of nearly 200 volunteers throughout the UP system. Volunteers met their fellow employees as they reported to work or left work on Sept. 10. The volunteers, most of whom are part of the Peer Support and Operation Red Block teams, handed out wallet-size cards about suicide and also gave employees a key chain with the inspirational message, "Stay Connected." Together, both groups estimated they touched nearly 10,000 employees on the day.

“The volunteers were overwhelmed by the personal stories they heard throughout the day about how other UP employees had been impacted by suicide,” said Harry Stewart, manager of Peer Support programs. Peer-to-peer programs are vital to efforts like suicide prevention because peers have strong credibility with their co-workers, as many have lived experiences to share that help reduce the stigma attached to many of life’s most pressing problems, including suicide. “Coaching and encouraging their fellow employees on where to go for help when life gets tough can make a big impact,” stated Matt Schumacher, system coordinator for Operation Red Block at UP.

UP plans to make Suicide Prevention Day an annual event and hopes to touch more employees in 2016 with messages of hope and caring and the all-important bridge to resources for help.

Mark Jones

Mark Jones, PhD is the director of the Employee Assistance Program at Union Pacific Railroad located in Omaha, Nebraska. Jones has been the EAP director at UPRR since 2005. Jones’s job duties at UPRR include overseeing a service delivery model covering 50,000 employees and their dependents.

How to Manage 'Model Risk' (and Win)

Thinking on model risk should undergo the same sort of shift that enterprise risk management has seen in recent years.

One of the fastest-growing concerns on insurers’ enterprise risk agenda is managing model risk. From being a phrase that primarily actuaries and other modelers used, “model risk” has become a major focus of regulators and the subject of intense activity and debate at insurers. How model risk management has evolved from ad hoc efforts to its current stage is an interesting story. But more interesting still is what we believe could be its next stage – generating measurable business value.

Ad Hoc

Organizing and using experience to predict future claims is core to the business of insurance. Recognizing the importance of models, insurers and industry professionals, particularly actuaries, have long incorporated model reviews into their work. As new models were introduced or changes made to existing ones – especially if third-party systems were involved – insurers were careful to ensure consistency between old and new models. Additionally, internal and external auditors’ procedures recognized the risk that models entail and incorporated verification and testing in their processes.

See Also: Secret Sauce for New Business Models?

What distinguishes this earliest stage is not that model risk was ignored but rather that model risk management was dispersed and generally informal. Practices differed across the industry, across different types of professional organizations and across different parts and functions within an insurer. Standards for documentation, both of the models and the validation process, were largely absent. Typically, not all models were reviewed. Establishing a comprehensive inventory of all significant models was not the norm. Likewise, it was not common for insurers to follow consistent procedures to validate models across the enterprise.

Reactive

Although a comprehensive guide to help banks mitigate potential risks arising from reliance on models was available as early as 2000, concerted attention to the issue in insurance can be dated to the Great Recession and its aftermath. In reaction to the events of 2008/2009, regulators and insurers themselves revisited their risk management processes and governance. The U.S. Federal Reserve Board took the lead in promulgating new requirements for the banking sector, including supervisory guidance on model risk management issued in 2011. Many insurers, especially those designated as systemically important financial institutions (SIFIs), have been working to adopt these guidelines. In 2012, the North American CRO Council released its model validation principles for risk and capital models, which included eight core validation principles. For insurers operating in Europe, Solvency II provided the potential to use an internal model to establish their capital requirements. To take advantage of this opportunity, insurers needed to adhere to model validation expectations prescribed by regulators. In the U.S., the ORSA Guidance Manual requires insurers to describe their validation process.

Reacting to the 2008/2009 crisis and regulators’ demands, insurers began to establish the key elements of an enterprisewide model risk management program:
  • Governance and independence policies;
  • An inventory and risk assessment of all significant models; and
  • Documentation and validation standards.
Only after these basic building blocks had been put in place did insurers develop the practical experience to begin their transition to the next, active stage.

Active

The shift from the reactive stage to the active stage effectively started in 2014. In the early months of that year, PwC conducted a survey of 36 insurers operating in the U.S. The survey provided the opportunity for participants to assess their programs across 10 dimensions characterizing the key elements of a model risk management (MRM) process. Modal responses across these dimensions were typically “weak” or “developing.” Almost all insurers admitted they had work to do and indicated that they had plans in place to improve their processes.

In the intervening two years, we have observed a significant investment in MRM capabilities. In the absence of detailed insurance-focused regulatory guidelines, most insurers have shaped their developments to best fit their own circumstances. For example, while there has been a near-uniform increase in resources allocated to MRM, how insurers deploy these resources has differed significantly. Some have formed large centralized model management functions, and others have allocated most of the validation responsibility to business units. How the responsibilities are dispersed across risk, actuarial, compliance and audit functions varies considerably. We expect that most of these differences are attempts to fit the task to the insurer’s existing structure and culture. Likewise, we have seen insurers, both individually and as a group, more actively develop procedures that better fit the unique circumstances of the insurance sector instead of banking or financial services in general. Three areas in which the insurance sector is increasing its attention are:
  1. Incorporating the unique aspects of actuarial models and the development of standards by actuarial professional organizations;
  2. Emphasizing the process of assumption setting and the governance of this process; and
  3. Emphasizing monitoring and benchmarking, necessitated by the long time frame and the lack of market data to measure the performance of many insurance models (a simple illustration follows this list).
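As a loose illustration of the kind of ongoing monitoring described in the third point, here is a hypothetical actual-versus-expected check for a claims model. The figures and the 10% tolerance are invented, and a real validation standard would cover far more (assumption governance, benchmarking, sensitivity testing); this only shows the basic mechanic of flagging model drift over time.

```python
# Hypothetical actual-vs-expected monitoring check for a claims model.
# All figures and the 10% tolerance are illustrative assumptions.

expected_claims_by_year = {2013: 1_050_000, 2014: 1_100_000, 2015: 1_180_000}
actual_claims_by_year   = {2013: 1_020_000, 2014: 1_250_000, 2015: 1_175_000}

TOLERANCE = 0.10  # flag years where actuals drift more than 10% from the model

def monitor(expected, actual, tolerance=TOLERANCE):
    """Return the years (and relative errors) where the model misses by more than the tolerance."""
    breaches = {}
    for year, exp in expected.items():
        error = (actual[year] - exp) / exp
        if abs(error) > tolerance:
            breaches[year] = round(error, 3)
    return breaches

print(monitor(expected_claims_by_year, actual_claims_by_year))
# {2014: 0.136} -> a breach that would be escalated under the MRM policy
```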
Productive

Recent discussions with forward-thinking insurance company executives and board members lead us to think a fourth stage may be next. The common theme is recognition that an insurer’s key asset is the information it possesses and the models it has developed to turn this information into support for profit-generating decisions. Seen in this light, models are not inconveniences substituting for “real” data. Rather, they are the machinery that insurers use to turn their raw materials (data) into salable, profitable customer solutions.

See Also: How to Remove Fear in Risk Management

Model risk management then becomes the mechanism to ensure this machinery is performing at its best. This includes the normal activities that one would associate with maintenance, like finding and correcting inadequate performance. But it also provides a way to determine how better machinery can be developed and brought online. In many respects, the transition to this stage mirrors the transition that has occurred in risk management in general. Not too long ago, risk management was seen as a strictly defensive activity. It was more about saying “no” than finding the right opportunities to say “yes.” Now, risk management is seen as an important strategic activity that plays a central role in an insurer’s deployment of capital and its selection of growth opportunities.

Putting models and the data that feeds them at the center of an insurer’s value creation engine, instead of at its periphery, provides a new perspective. And, by moving model risk management to the productive stage, insurers can better use this new perspective to address customer expectations in an information-rich environment.

Implications
  • Model risk management is no longer an ad hoc or reactive activity. An active approach is now a necessity to meet internal and external stakeholder demands.
  • Insurers are attempting to develop model risk management practices that fit the needs of their industry. They will need to continually communicate to regulators, standards setters and other stakeholders how the business of insurance has unique characteristics compared with elsewhere in financial services.
  • Models are among insurers’ greatest assets; they are the machinery that insurers use to turn data into salable, profitable customer solutions. Putting models and the data that feeds them at the center of value creation can provide new perspectives that better address customer expectations. Model risk management becomes the tool to keep this machinery productive.

Henry Essert

Henry Essert serves as managing director at PwC in New York. He spent the bulk of his career working for Marsh & McLennan. He served as managing director from 1988-2000 and as president and CEO, MMC Enterprise Risk Consulting, from 2000-2003. Essert also has experience working with Ernst & Young, as well as MetLife.

Promise, Pitfalls of Cyber Insurance

There are eight ways that insurers, reinsurers and brokers could put cyber insurance on a more sustainable footing while still generating growth.

Cyber insurance is a potentially huge but still largely untapped opportunity for insurers and reinsurers. We estimate that annual gross written premiums will increase from around $2.5 billion today to $7.5 billion by the end of the decade. Many insurers and reinsurers are looking to take advantage of what they see as a rare opportunity to secure high margins in an otherwise soft market. However, wariness of cyber risk is widespread. Many insurers don’t want to cover it at all. Others have set limits below the levels their clients seek and have imposed restrictive exclusions and conditions – such as state-of-the-art data encryption or 100% updated security patch clauses – that are difficult for any business to maintain. Given the high cost of coverage, the limits imposed, the tight attaching terms and conditions and the restrictions on claims, many companies question whether their cyber insurance policies provide real value.

Insurers are relying on tight policy terms and conditions and conservative pricing strategies to limit their cyber risk exposures. But how sustainable is this approach as clients start to question the value of their policies and concerns widen about the level and concentration of cyber risk exposures?

The risk pricing challenge

The biggest challenge for insurers is that cyber isn’t like other risks. There is limited publicly available data on the scale and financial impact of attacks, and threats are rapidly changing and proliferating. Moreover, the fact that cyber security breaches can remain undetected for several months – even years – creates the possibility of accumulated and compounded future losses.

See Also: Better Way to Assess Cyber Risks?

While underwriters can estimate the cost of systems remediation with reasonable certainty, there isn’t enough historical data to gauge further losses resulting from impairment to brands or to customers, suppliers and other stakeholders. And, although the scale of potential losses is on par with natural catastrophes, cyber incidents are much more frequent. Moreover, many insurers face considerable cyber exposures within their technology, errors and omissions, general liability and other existing business lines. As a result, there are growing concerns about both the concentrations of cyber risk and the ability of less experienced insurers to withstand what could become a rapid sequence of high-loss events.

So, how can cyber insurance be a more sustainable venture that offers real protection for clients, while safeguarding insurers and reinsurers against damaging losses?

Real protection at the right price

We believe there are eight ways that insurers, reinsurers and brokers could put cyber insurance on a more sustainable footing while taking advantage of the opportunities for profitable growth.
  1. Clarify risk appetite – Despite the absence of robust actuarial data, it may be possible to develop a reasonably clear picture of total maximum loss and match it against risk appetite and tolerances. Key inputs include worst-case scenario analysis. For example, if your portfolio includes several U.S. power companies, then what losses could result from a major attack on the U.S. grid? What proportion of claims would your business be liable for? What steps could you take now to mitigate losses, from reducing risk concentrations in your portfolio to working with clients to improve safeguards and crisis planning? (A toy version of this kind of scenario calculation appears after this list.) Asking these questions can help insurers judge which industries to focus on, when to curtail underwriting and where there may be room for further coverage. Even if an insurer offers no stand-alone cyber coverage, it should gauge the exposures that exist within its wider property, business interruption, general liability and errors and omissions coverage. Cyber risks are increasingly frequent and severe, loss contagion is hard to contain and risks are difficult to detect, evaluate and price.
  2. Gain broader perspectives – Bringing in people from technology companies and intelligence agencies can lead to more effective threat and client vulnerability assessments. The resulting risk evaluation, screening and pricing process could be a partnership between existing actuaries and underwriters who focus on compensation and other third-party liabilities, and technology experts who concentrate on data and systems. This is similar to the partnership between chief risk officer (CRO) and chief information officer (CIO) teams that many companies are developing to combat cyber threats.
  3. Create tailored, risk-specific conditions – Many insurers currently impose blanket terms and conditions. A more effective approach would be to make coverage conditional on a fuller and more frequent assessment of the policyholder’s vulnerabilities and agreement to follow advised steps. This could include an audit of processes, responsibilities and governance within a client’s business. It also could draw on threat assessments by government agencies and other credible sources to facilitate evaluation of threats to particular industries or enterprises. Another possible component is exercises that mimic attacks to test both weaknesses and plans for response. As a result, coverage could specify the implementation of appropriate prevention and detection technologies and procedures. This approach can benefit both parties. Insurers will have a better understanding and control of risks, lower exposures and produce more accurate pricing. Policyholders will be able to secure more effective and economical protection. Moreover, the assessments can help insurers forge a closer, advisory relationship with clients.
  4. Share data more effectively – More effective data sharing is the key to greater pricing accuracy. For reputational reasons, many companies are wary of admitting breaches, and insurers have been reluctant to share data because of concerns over loss of competitive advantage. However, data breach notification legislation in the U.S., which is now set to be replicated in the E.U., could help increase available data volumes. Some governments and regulators have also launched data-sharing initiatives (e.g., MAS in Singapore and the U.K.’s Cyber Security Information Sharing Partnership). In addition, data pooling on operational risk, through ORIC, provides a precedent for more industrywide sharing.
  5. Develop real-time policy updates – Annual renewals and 18-month product development cycles will need to give way to real-time analysis and rolling policy updates. This dynamic approach could be likened to the continuous updates pushed out for security software, or to the way credit insurers dynamically manage limits and exposures.
  6. Consider hybrid risk transfer – Although the cyber reinsurance market is relatively undeveloped, a better understanding of evolving threats and maximum loss scenarios could encourage more reinsurers to enter the market. Risk transfer structures likely would include traditional excess of loss reinsurance in the lower layers, and the development of capital market structures for peak losses. Possible options might include indemnity or industry loss warranty structures or some form of contingent capital. Such capital market structures could prove appealing to investors looking for diversification and yield. Fund managers and investment banks could apply reinsurers’ or technology companies’ expertise to develop appropriate evaluation techniques.
  7. Improve risk facilitation – Considering the complexity and uncertainty surrounding cyber risk, there is a growing need for coordinated risk management solutions that bring together a range of stakeholders, including corporations, insurance/reinsurance companies, capital markets and policymakers. Some form of risk facilitator – possibly brokers – will need to bring together all parties and lead the development of effective solutions, including the cyber insurance standards that many governments are keen to introduce. Evaluating and addressing cyber risk is an enterprise-wide matter – not just one for IT and compliance.
  8. Enhance credibility with in-house safeguards – If an insurer can’t protect itself, then why should policyholders trust it to protect them? If the sensitive policyholder information that an insurer holds is compromised, then it likely would lead to a loss of customer trust that would be extremely difficult to restore. The development of effective in-house safeguards is essential in sustaining credibility in the cyber risk market, and trust in the enterprise as a whole.
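As flagged in point 1, here is a minimal Python sketch of how an underwriting team might aggregate worst-case scenario losses across a book of cyber policies and compare the total against a stated risk appetite. The policies, loss shares and risk appetite figure are illustrative assumptions, not data from this article.

```python
# Minimal sketch: aggregate worst-case cyber losses across a portfolio
# and compare them against a stated risk appetite. All figures are
# illustrative assumptions, not market data.

from dataclasses import dataclass

@dataclass
class Policy:
    insured: str           # policyholder name
    industry: str          # sector used to map policies to scenarios
    limit: float           # maximum the insurer would pay on this policy
    expected_share: float  # share of the limit assumed to be claimed in
                           # the worst-case scenario (0.0 - 1.0)

def scenario_loss(portfolio: list[Policy], industry: str) -> float:
    """Gross loss to the insurer if the worst-case scenario hits one
    industry, e.g. a coordinated attack on U.S. power companies."""
    return sum(p.limit * p.expected_share
               for p in portfolio if p.industry == industry)

portfolio = [
    Policy("Utility A", "power", 50_000_000, 0.80),
    Policy("Utility B", "power", 30_000_000, 0.60),
    Policy("Retailer C", "retail", 20_000_000, 0.40),
]

RISK_APPETITE = 60_000_000  # maximum tolerable loss from a single scenario

loss = scenario_loss(portfolio, "power")
print(f"Worst-case power-grid scenario loss: ${loss:,.0f}")
if loss > RISK_APPETITE:
    print("Exceeds risk appetite: curtail underwriting or reduce concentration.")
```

In practice the expected loss shares would come from threat assessments and claims experience rather than hard-coded guesses, but even a rough calculation like this makes portfolio concentrations visible.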
See Also: The State of Cyber Insurance
Key questions for insurers as they assess their own and others’ security
From the board on down, insurers need to ask:
  • Who are our adversaries, what are their targets and what would be the impact of an attack?
  • We can’t defend everything, so what are the most important assets we need to protect?
  • How effective are our processes, assignment of responsibilities and systems safeguards?
  • Are we integrating threat intelligence and assessments into active cyber defense programs?
  • Are we adequately assessing vulnerabilities against the tactics and tools perpetrators use?
Implications
  • Even if an insurer chooses not to underwrite cyber risks explicitly, exposure may already be part of existing policies. Therefore, all insurers should identify the specific triggers for claims, and the level of potential exposure in policies that they may not have written with cyber threats in mind.
  • Cyber coverage that is viable for both insurers and insureds will require more rigorous and relevant risk evaluation informed by more reliable data and more effective scenario analysis. Partnerships with technology companies, cyber specialist firms and government are potential ways to augment and refine this information.
  • Rather than simply relying on blanket policy restrictions to control exposures, insurers should consider making coverage conditional on regular risk assessments of the client’s operations and on the actions the client takes in response to the issues identified in those reviews. This more informed approach can enable insurers to reduce uncertain exposures and facilitate more efficient use of capital while offering more transparent and economical coverage.
  • Risk transfer built around a hybrid of traditional reinsurance and capital market structures offers promise to insurers looking to protect balance sheets (a simple layered allocation is sketched after this list).
  • To enhance their own credibility, insurers need to ensure the effectiveness of their own cyber security. Because insurers maintain considerable amounts of sensitive data, any major breach could severely affect their market credibility both in the cyber risk market and elsewhere.
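As a rough illustration of the hybrid risk transfer point above, the following Python sketch allocates a single gross cyber loss across an insurer's retention, a traditional excess-of-loss reinsurance layer and a capital-market layer for peak losses. The attachment points, limits and loss amount are invented for the example; real towers would be far more nuanced.

```python
# Illustrative allocation of a gross cyber loss across a hybrid tower:
# insurer retention, excess-of-loss reinsurance, then a capital-market
# layer (e.g. an industry loss warranty style cover) for peak losses.
# Attachment points and limits are assumptions for the example only.

def allocate(loss: float, layers: list[tuple[str, float, float]]) -> dict[str, float]:
    """layers: list of (name, attachment, limit) sorted by attachment."""
    allocation = {}
    for name, attachment, limit in layers:
        recovered = min(max(loss - attachment, 0.0), limit)
        allocation[name] = recovered
    return allocation

tower = [
    ("insurer retention",          0,           25_000_000),
    ("excess-of-loss reinsurance", 25_000_000,  75_000_000),
    ("capital-market layer (ILW)", 100_000_000, 150_000_000),
]

gross_loss = 180_000_000
for layer, amount in allocate(gross_loss, tower).items():
    print(f"{layer}: ${amount:,.0f}")
```

The point of such a structure is that the reinsurance layers absorb the more probable losses while the capital-market layer responds only to peak events, which is what could make it attractive to investors seeking diversification and yield.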

Joseph Nocera

Joe Nocera leads the cybersecurity and privacy practice nationally for the financial services industry at PwC. His experience ranges from IT auditing to large-scale systems implementation. He has significant experience in helping organizations meet regulatory demands such as Sarbanes-Oxley.

Innovation: a Need for 'Patient Urgency'

Trying to time a disruptive innovation precisely is a fool's errand. A strategy has to emerge opportunistically, through "patient urgency."

sixthings

In corporate innovation, little else matters if your timing is wrong. Moving too fast killed Ron Johnson’s attempts to turn around J.C. Penney. Johnson plunged too quickly into a wholesale remake of the century-old chain’s stores. He didn’t take time to test alternative possibilities—even though, as the developer of the Apple stores, he experimented with every little detail for months in a mock-up before going to market. Johnson also threw out Penney’s long-standing sales strategy. He got rid of discounts—and alienated tons of existing customers—before validating that his new approach would attract enough new customers.

Moving too slowly killed Blockbuster. It ignored Netflix’s subscription-based, DVDs-by-mail model for years. Then, afraid that it was too late, it bet big on its own version even though it had dire economic and operational implications.

Precise timing, however, is a fool’s errand. Disruptive innovations, by definition, deal with future scenarios that are hard to read and where neither the right strategy nor timing is clear. How can you project customer interest for a product that customers haven’t yet seen? How can you deliver detailed timelines and budgets when new products depend on technology breakthroughs?  The strategy has to emerge over time. The timing has to be opportunistic.

To deal with the vagaries of innovation, leaders at Blockbuster, Penney and hundreds of other large-company innovation failures that I’ve studied would have benefited from a strong dose of “patient urgency.”

See Also: Does Your Culture Embrace Innovation?

Patient urgency is one of the distinguishing traits that John Sviokla and Mitch Cohen identified in their study of 120 self-made billionaires, as reported in their excellent book “The Self-Made Billionaire Effect: How Extreme Producers Create Massive Value.” Patient urgency is the combination of foresight to prepare for a big idea, willingness to wait for the right market conditions and agility to act straight away when conditions ripen.

Sviokla and Cohen found that their subjects were no better prognosticators than other people—“they cannot predict the exact right time to make an investment or to bring a product to market.”

They did not, however, sit back and wait. Neither did they just jump in and hope for the best. They learned about the market, made early investments and deals, tested ideas in the market and actively made improvements and adjustments. When the market became ripe, they were ready.

The Sviokla and Cohen finding squares with my research and experience. Reed Hastings of Netflix, for example, knew from Day One that people would eventually stream their movies over the Internet. He experimented with different versions of streaming video for more than a decade. He repeatedly killed ventures when he saw they would not quite work. When the conditions were right, he moved quickly to transform Netflix into a huge streaming business.

Google’s driverless car program is another great example of patient urgency. As I’ve discussed, driverless cars have the potential to save millions of lives and throw trillions of dollars in existing revenue up for grabs while sending a tsunami of business disruption across multiple industries. Google has methodically developed potentially differentiated technology in this fertile arena while keeping its options open on how to capture the resulting business value.

The problem for most large companies, however, is that neither “we’ll figure it out as we go” nor “we’ll launch when the market is right” fits with traditional planning mindsets. Operating budgets hate uncertainty. They demand detailed, time-lined projections of human resources, costs and revenue—even when those demands just yield guesses disguised as numbers. This severely limits experimentation, adaptation and risk taking. To break the organizational tendencies that dampen corporate innovation, here are three ways to encourage patient urgency:

1. Think big. Focus on big ideas that have the potential to build massive value. Develop vivid alternative future scenarios to illuminate how existing businesses might get crushed or, in a kinder world, be transformed because of disruptive innovations. Getting everyone on the same page about the stakes involved will help the organization start earlier and bide its time longer.

2. Structure early investments like financial options rather than full-fledged go-to-market plans. Ideas that could turn into multibillion-dollar businesses do not deserve billions in investments right away. Invest millions, or even tens of thousands, to test and elaborate them. Each stage of funding should focus on clarifying key questions like whether the product can be built, whether it meets real customer needs, whether it can beat the competition and whether it makes strategic sense. The goal is to invest a little at a time to develop the idea while preserving the right, but not the obligation, to launch the innovation.

3. Budget for innovation as a portfolio of options. Rather than force detailed projections for individual options, plan and budget at the portfolio level. As I’ve previously discussed, the overall allocation and prioritization of the innovation portfolio should depend on a company’s investment capabilities and competitive circumstances. This limits the overall risk while allowing flexibility to shift investments between individual initiatives based on experimental results and shifting market conditions. The portfolio approach also demands that multiple (potentially competing) options be tested—thereby short-circuiting the tendency to focus on one all-or-nothing bet.
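To put numbers on the option-style funding in point 2 and the portfolio budgeting in point 3, here is a small, hypothetical Python sketch in which each initiative receives its next tranche of funding only while it keeps passing stage tests, and spend is tracked against a portfolio-level budget. The initiatives, stage costs and outcomes are invented for the example.

```python
# Hypothetical sketch: fund innovation initiatives in stages, like options,
# and track spend against a portfolio-level budget. Figures are invented.

STAGES = ["can it be built?", "does it meet a real need?", "can it win?"]

# Each initiative: cost to run each stage's test, and whether it passed.
initiatives = {
    "driverless delivery": {"costs": [50_000, 250_000, 2_000_000],
                            "passed": [True, True, True]},
    "in-store robotics":   {"costs": [50_000, 250_000, 2_000_000],
                            "passed": [True, False, False]},
    "smart shelving":      {"costs": [50_000, 250_000, 2_000_000],
                            "passed": [False, False, False]},
}

PORTFOLIO_BUDGET = 5_000_000

total_spent = 0
for name, data in initiatives.items():
    spent = 0
    for cost, passed in zip(data["costs"], data["passed"]):
        spent += cost   # pay to run this stage's test
        if not passed:
            break       # option abandoned; no further spend on this idea
    total_spent += spent
    print(f"{name}: ${spent:,.0f}")

print(f"Portfolio spend: ${total_spent:,.0f} of ${PORTFOLIO_BUDGET:,.0f}")
# Betting the full $2.3M per idea up front would cost $6.9M, most of it
# on ideas that would have failed their early tests.
```

Even this toy version shows the core logic: most of the portfolio budget flows to the one idea that keeps earning its next stage, while the failures are cut off cheaply.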

See Also: Innovation Trends in 2016
Patient urgency avoids the large-company tendency to swing from complacency to panic. It loosens the constraints of shortsightedness and inappropriate planning models that lull large companies into thinking incrementally for too long, as Blockbuster did. It also lessens the chances of being late to the game and having to risk everything on a single desperate idea, like Penney, only to have it not pan out.

Who Is to Blame on Oklahoma Option?

There is plenty of blame to go around for the current mess, but a good deal of it belongs to the folks at the Oklahoma Insurance Department.

sixthings
I’ve been highly critical of the Oklahoma Option, the alternative workers’ compensation system that was recently found to be unconstitutional by that state's Workers’ Compensation Commission. I’ve been critical of the backers of the system, as well as the employers that willingly set up plans in this closed and tightly controlled scheme. And while I’ve questioned how the Oklahoma Insurance Department, headed by Commissioner John Doak, could have approved plans so obviously deficient in comparison to those in the workers’ compensation system, I’ve never accused commissioners of being otherwise involved. I just assumed it was stupidity, incompetence or slothfulness that allowed plans, required to provide benefits that are “equal to or better” than those provided under the workers’ comp laws of the state, to be approved for use when they were ultimately substandard.
That all changed last week, during the second opt-out session held at the 32nd WCRI Annual Issues & Research Conference. The speaker who changed my point of view was James Mills, director of workers’ compensation and captive insurance at the Oklahoma Insurance Department. Mills went on at length about how proud they were at OID to have developed a “powerful system with options” for employers in their state. I do not recall his mentioning that those options have been found to discriminate against their employees, and were therefore unconstitutional. He did not address that at all. In fact, he spoke so positively about Opt Out that he sounded to me just like the concept's biggest promoter, Dallas attorney Bill Minick. He was just like Minick’s mini-me, or a mini-Minick, if you will.
It became apparent from his presentation that the OID approvals were not born of incompetence; no, the agency was instead directly culpable in the development and promotion of a scheme that creates discriminatory sub-classes of employees in the state.
See Also: Strategic Implications of the Oklahoma Option
Mini-Minick did not explain how plans that have draconian reporting requirements (most require an incident be reported in 24 hours or less, or all benefits may be denied) got approved when the state system allows for 30 days. He did not explain how plans that exclude a wide variety of injuries or conditions, like asbestos exposure or workplace violence, could be approved when the state system covers them. He did not explain how plans that do not even let an employee testify at an appeal of his denial could get by the OID. Frankly, there are many areas where the alternative plans come up short when compared with the state system, and mini-Minick didn’t explain any of them. He simply touted the OID's desire to work to preserve these options for employers.
Clearly, Commissioner Doak appears to be a healthy proponent of the Oklahoma Option. This made clear to me why plans that have left everyone around the nation scratching their heads got approved in the first place. Of course, not every action may be intentional. There is obviously room for a little incompetence, as well. This is, after all, the same insurance department that in 2012 issued an email announcement that an Insurance Commissioners Award for Tornado Awareness would be given to “the girl with the biggest [breasts].” (Seriously. Click here to read about it if you doubt me.) Pesky details like proofreading emails or comparing benefit levels don’t appear to be a top priority in Doak’s department.
I suppose it is appropriate that the agency is run by a man whose last name rhymes with the sound Homer Simpson makes when he is completely flummoxed.
I was speaking with some people at WCRI the morning following mini-Minick’s session when this topic came up. One of them pointed out that employers always take the blame in situations like this, even though what they were doing was approved and legal. The problem was the legislative and regulatory environment that created the system to begin with. There is validity in that view, although the employers still should be held to account for the plans they adopted. “Because I could” was never an excuse that worked for me when I was in trouble as a child, and I suspect it will not be widely accepted in the public eye today (unless, maybe, you are Donald Trump, but that is another topic entirely).
See Also: The State of Workers' Comp in 2016
For the failures of the Oklahoma Option, there is plenty of blame to go around, but a good deal of it apparently lies with the folks charged with watching the hen house. And it does not sound as though they understand their mistakes, which means they are likely destined to repeat them.

Bob Wilson

Bob Wilson is a founding partner, president and CEO of WorkersCompensation.com, based in Sarasota, Fla. He has presented at seminars and conferences on a variety of topics related to both technology within the workers' compensation industry and bettering the workers' comp system through improved employee/employer relations and claims management techniques.