
How to Handle New 'Ban-the-Box' Laws

A national movement has developed to ban certain questions about job applicants' criminal history, and employers need to act fast.

The term "ban the box" refers to the question on hiring applications that asks whether the applicant has a criminal record or conviction; if so, the applicant must check the "Yes" box. Ban-the-box laws restrict employers from including questions about prior arrests or convictions on initial employment applications. Their purpose is to reduce unfair barriers to employment for people with criminal records. The ban-the-box movement requires employers to act, and fast. Numerous states and cities have enacted such laws, and we expect more to follow in the near future.

Illinois' ban-the-box equivalent, titled the Job Opportunities for Qualified Applicants Act, takes effect Jan. 1, 2015. Illinois is prohibiting private and public employers from asking about an applicant's criminal history until after the employer selects the applicant for an interview or provides the applicant with a conditional offer of employment. Illinois' act applies to all private-sector employers with 15 or more employees. There are exceptions. The act does not apply to: (1) jobs that cannot be held by convicted criminals under federal or state law, (2) jobs requiring licensing under the Emergency Medical Services System Act and (3) jobs requiring fidelity bonds. The act gives the Illinois Department of Labor (IDOL) the power to investigate alleged violations and authorizes IDOL to impose civil penalties of up to $1,500. We expect that IDOL will start fining employers as soon as the act goes into effect.

Private-employer ban-the-box laws currently exist in California, Colorado, Connecticut, Delaware, Illinois, Maryland, Massachusetts, Nebraska, New Jersey, New Mexico and Rhode Island. Numerous cities have passed similar laws. Pending legislation exists in Florida, Georgia, Louisiana, Michigan, New Hampshire, North Carolina, Ohio and numerous cities.
Some states' laws prohibit employers from asking about criminal history on the initial employment application or before conducting an interview, while others prohibit such inquiries until after the employer makes a conditional offer of employment. Be wary, as ban-the-box laws also vary in the types of criminal-history questions employers may ask applicants. For example, some laws allow employers to ask only about specific convictions and explicitly prohibit asking about non-conviction arrests or expunged records. Exemptions can vary as well, with exclusions for facilities or employers that provide programs, services or care to minors or vulnerable adults.

Because each state's ban-the-box law may differ, it is important for employers to reevaluate their pre-employment and hiring practices. Employers affected by ban-the-box laws that do not update their applications and pre-employment processes risk being investigated and fined on an individual and potentially class-wide basis. Employers that operate in multiple states need to be diligent in making sure their applications are tailored to each state and city.

The takeaway: Have your HR department or labor counsel review your employment applications and company policies to ensure that questions regarding an applicant's criminal history comply with applicable laws. Additionally, consider providing compliance training to employees involved in interviewing and hiring so they are knowledgeable about the new laws.

Laura Zaroski wrote this article with Joseph M. Gagliardo and Lily M. Strumwasser of Laner Muchin.

Laura Zaroski


Laura Zaroski is the vice president of management and employment practices liability at Socius Insurance Services. As an attorney with expertise in employment practices liability insurance, in addition to her role as a producer, Zaroski acts as a resource with respect to Socius' employment practices liability book of business.

Top 6 Myths About Predictive Modeling

Despite what many think, the most important issue isn't which model to choose, and the biggest challenge isn't technical.

Even if you’ve been hiding under a rock the past 25 years, it’s almost impossible to avoid hearing about how companies are turning around their results through better modeling or how new companies are entering insurance using the power of predictive analytics. So now you’re ready to embrace what the 21st century has to offer and explore predictive analytics as a mainstream tool in property/casualty insurance. But misconceptions are still commonplace. Here are the top six myths, dispelled:

Myth: Predictive modeling is mostly a technical challenge.

Fact: The predictive model is only one part of the analytics solution. It’s just a tool, and it needs to be managed well to be effective. The No. 1 point of failure in predictive analytics isn’t technical or theoretical (i.e., something wrong with the model) but rather a failure in execution. This realization shifts the burden of risk from the statisticians and model builders to the managers and executives. The carrier may have an organizational readiness problem or a management and measurement problem. The fatal flaw that’s going to derail a predictive analytics project isn’t in the model but in the implementation plan. Perhaps the most common manifestation of this is when the implementation plan around a predictive model is forced upon a group:
  • Underwriters are told that they must not renew accounts above a certain score
  • Actuaries are told that the models are now going to determine the rate plan
  • Managers are told that the models will define the growth strategy
In each of these cases, the plan is to replace human expertise with model output. This almost never ends well. Instead, the model should be used as a tool to enhance the effectiveness of the underwriter, actuary or manager.

Myth: The most important thing is to use the right kind of model.

Fact: The choice of model algorithm and the calibration of that model to the available data are almost never the most important things. Instead, the biggest challenge is simply having a credible body of data on which to build a model. In “The Unreasonable Effectiveness of Data,” Google research directors Halevy, Norvig and Pereira wrote: “Invariably, simple models and a lot of data trump more elaborate models based on less data.” No amount of clever model selection and calibration can overcome the fundamental problem of not having enough data. If you don’t have enough data, you still have options: You could supplement in-house data with third-party, non-insurance data; append insurance industry aggregates and averages; or use a multi-carrier data consortium, as we are doing here at Valen.

Myth: It really doesn’t matter which model I use, as long as it’s predictive.

Fact: Assuming you have enough data to build a credible model, choosing the right model still matters a great deal -- though maybe not for the reason you’d think. The right model might not be the one that delivers the most predictive power; it also has to be the model with a high probability of success in application. For example, you might choose a model that is transparent and intuitive, rather than one that relies on complex machine-learning techniques, if the intuitive model is the one underwriters will actually use to help them make better business decisions.

Myth: Predictive modeling only works well for personal lines.

Fact: Personal lines were the first areas of success for predictive modeling, owing to the large, homogeneous populations they serve.
But commercial lines aren't immune to the power of predictive modeling. There are successful models producing risk scores for workers' compensation, E&S liability and even directors & officers risks. One of the keys to deploying predictive models in lines with thin policy data is to supplement that data, either with industry-wide statistics or with third-party (not necessarily insurance) data.

Myth: Better modeling will give me accurate prices at the policy level.

Fact: Until someone invents a time machine, the premiums we charge at inception will always be wrong. For policies that end up being loss-free, we will charge too much. For the policies that end up having losses, we will charge too little. This isn’t a bad thing, however. In fact, this cross-subsidization is the fundamental purpose of insurance and is necessary. Instead of being 100% accurate at the policy level, the objective we should aim for in predictive analytics is to segment the entire portfolio of risks into smaller subdivisions, each of which is accurately priced. See the difference? Now the low-risk policies can cross-subsidize one another (and enjoy a lower rate), and the high-risk policies will also cross-subsidize one another (but at a higher rate). In this way, the final premiums charged will be fairer.

Myth: Good models will give me the right answers.

Fact: Good models will answer very specific questions, but, unless you’re asking the right questions, your model isn’t necessarily going to give you useful answers. Take time during the due diligence phase to figure out what the key questions are. Then, when you start selecting or building models, you’ll be more likely to select a model with answers to the most important questions. For example, there are (at least) two very different approaches to loss modeling:
  • Pure premium (loss) models can tell you which risks have the highest potential for loss. They don’t necessarily tell you why this is true, or whether the risk is profitable.
  • Loss ratio models can tell you which risks are the most profitable, where your rate plan may be out of alignment with risk or where the potential for loss is highest. However, they may not necessarily be able to differentiate between these scenarios.
Make sure that the model is in perfect alignment with the most important questions, and you'll receive the greatest benefit from predictive analytics.
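The segment-level pricing idea described above can be sketched with a toy example (all figures are hypothetical): set each risk band's rate to that band's average loss, so cross-subsidy happens within bands rather than across the whole book.

```python
# Toy illustration of segment-level pricing (hypothetical data).
# Each policy's premium is set to the average loss of its risk band,
# so low-risk policies cross-subsidize one another at a low rate and
# high-risk policies do so at a higher rate.

policies = [
    {"id": 1, "band": "low",  "loss": 0},
    {"id": 2, "band": "low",  "loss": 200},
    {"id": 3, "band": "high", "loss": 0},
    {"id": 4, "band": "high", "loss": 1800},
]

def band_rates(policies):
    """Average loss per risk band = the band's premium rate."""
    totals, counts = {}, {}
    for p in policies:
        totals[p["band"]] = totals.get(p["band"], 0) + p["loss"]
        counts[p["band"]] = counts.get(p["band"], 0) + 1
    return {band: totals[band] / counts[band] for band in totals}

rates = band_rates(policies)
print(rates)  # {'low': 100.0, 'high': 900.0}
```

Note that total premium still equals total loss across the book; the segmentation only changes who subsidizes whom, which is exactly the "fairer premiums" point above.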

Bret Shroyer


Bret Shroyer is the solutions architect at Valen Analytics, a provider of proprietary data, analytics and predictive modeling that helps insurance carriers manage and drive underwriting profitability. Bret identifies practical solutions for client success, finding opportunities to bring tangible benefits from technical modeling.

Obamacare Backlash: What Comes Next?

After a lot of grandstanding, the new Congress and the president will likely make a series of changes, including repealing what is seen as a bailout for insurers.

The firestorm over comments made by MIT economist Jonathan Gruber has not helped the cause of the White House and defenders of the ACA in Congress. The historic landslide in the recent midterm elections will also bring a major legislative backlash in the new House of Representatives and U.S. Senate early in 2015.

I had high hopes for the ACA. I have been a supporter of healthcare reform dating back to my college and graduate school days many decades ago. The goals of universal coverage, elimination of pre-existing condition limitations and allowing dependents to stay on their parents' plans until age 26 are all things I fully support.

However, the rollout of the ACA was a debacle. The campaign promises that "you can keep your plan if you like it" and "you can keep your doctor" were given four Pinocchios by the Washington Post, not exactly a friend of the GOP. Ask Nixon.

Now, after the self-proclaimed architect of the ACA touted the lack of transparency in the design of the ACA to fool American voters, and how stupid "we" are, don't expect a warm and fuzzy reaction in a GOP-controlled Congress. The House Democratic leader and the White House are now busy "misremembering" the major role that Gruber played in drafting the ACA and are hoping the public will eventually, too.

What's next? Both the House and the Senate will vote to repeal Obamacare in January. The House has already done this a few dozen times, but now Harry Reid can't block a vote in the Senate. The president will veto this legislation. There will be political grandstanding, with press conferences and dire predictions on both sides. Nothing will happen. There is very little, if any, chance the GOP will have the votes to override a presidential veto.

What will most likely pass is a repeal of the tax on medical device manufacturers, which reportedly has bipartisan support. This will be problematic for the president and ACA supporters, because it will directly affect the proposed financing of the ACA. That lost revenue will have to be accounted for. Let me guess: higher costs to consumers and to companies providing health benefits to employees and their families? Is that correct, Mr. Gruber?

What will also likely pass Congress is a bill repealing the little-known provision providing a safe harbor to health insurance companies under the ACA, which essentially allowed a federally financed bailout if they end up losing money.

The ACA is here to stay, in my opinion, but incremental changes will be attempted. The GOP will support legislation to lower malpractice costs, allow small employers to band together in purchasing cooperatives, allow health insurance to be sold across state lines and make the implementation and administration of the ACA a state responsibility and not run by the federal government.

The president's own recent in-house advisory group recommended that the ACA be run by the states, because healthcare, like politics, is all local. This received very little, if any, play in the mainstream media. In fact, at least two major, national, mainstream news outlets have yet to even mention the controversy surrounding Jonathan Gruber's videotaped comments and the firestorm it has created.  I guess they misremembered to run the story.

Finally, for other possible changes, see a previous article of mine at Insurance Thought Leadership on April 9, 2014, regarding how the ACA has gutted major elements of the bipartisan healthcare reform efforts in Massachusetts by virtually eliminating experience rating for small to mid-size employers.

Gruber predicted health insurance premiums would go down because of the ACA. Please tell that to all the small and mid-sized employers across the U.S. I have not heard from one whose costs are going down. Maybe they misremembered.

It's time to fix the ACA with a bipartisan effort, studying what works and what doesn't -- and certainly not basing changes on what someone in an ivory tower believes. He thinks we are all stupid anyway.


Daniel Miller


Dan Miller is president of Daniel R. Miller, MPH Consulting. He specializes in healthcare-cost containment, absence-management best practices (STD, LTD, FMLA and workers' comp), integrated disability management and workers’ compensation managed care.

11 Things That Matter Most in Managing Risk

Few organizations even think about the fact that they have a risk culture, and building the right one is crucial.

Having just returned from another industry gathering where practitioners are trying to get a read on the keys to success in risk management, I thought I’d share some thoughts that I often include in my presentations and RIMS workshops. Suffice it to say, no two practitioners who are having much success are doing exactly the same thing or following a template-based strategy. I offer this introduction to say two things: There is no one right way to practice risk management, and, by extension, the best risk strategies are those that are aligned with, if not custom-designed to fit, the priorities of the organizations for which they are intended.

One thing is nearly certain: A risk strategy can’t be successfully executed without a risk framework to make actionable those strategies that inform success. A framework might best be guided by one of the risk standards that increasingly inform how the work can best be done, but a standard is not a prerequisite to success. By contrast, a risk culture is a prerequisite.

Your corporate culture represents the ways in which management and governance prefer employees to behave. It is typically tied to a set of values such as honesty, integrity and excellence. But do you realize that you also have a risk culture, even if you haven’t purposely defined and implemented one? Whether your organization is risk-averse, risk-assumptive or somewhere in between, your employees have risk-taking and risk-managing behaviors that, without a specific design and strategy for the risk culture you desire, will not likely be the behaviors or culture you most need. Therefore, communicating about risk culture can be most valuable to your long-term risk-management effectiveness.

What matters most in achieving this desired state? Rather than produce another list of top 10 items, here are 11 things that, in my opinion, matter most in effectively managing risk.
If you operate with these elements in place, you will be more likely to have an effective strategy that other leaders will both contribute to and enable through resources.

  • Downside Protection: This is job one. The first priority is to make sure reasonably preventable loss is addressed through both mitigations and financing tactics. Management and governance rightly assume this is under way.
  • Influence and Gumption: Every senior risk leader must have the respect to be heard and the gumption to push back on risk owners and stakeholders with whom he may disagree.
  • Consistency: With risk processes and sub-processes being the way in which the work gets executed, it is essential that they are consistently applied by all users.
  • Process Rigor: Processes that produce results and have impact require a rigorous approach to how they are designed, measured for effectiveness and continuously improved.
  • Data Interpretability: There must be actionable information about results and impact.
  • Communication Clarity: Beginning with a clear definition of risk itself, an entire sub-strategy for communicating your messaging will ensure you reach the "right recipients at the right time with the right message."
  • Reliable Measurability: Not every risk can or should be quantitatively measured, but, when you do measure, make sure the measure is as believable as possible.
  • Value Creation: Recognizing and leveraging risk for gain is the necessary evolution of the discipline's practitioners if they ever hope to move beyond the tactical.
  • Embedded Risk Culture: Driving consistent and aligned risk-taking behaviors and decisions across the enterprise can only be achieved by embedding a well-defined and disciplined risk culture.
  • Managing to Appetite and Capacity: Risk cannot be effectively managed without a clear view into how much risk you are taking, want to take and have the capacity to take or assume.
  • Aligning Risk and Performance: The ultimate outcome for risk professionals is to manage risk relative to performance. Alignment, if not integration, between risk and performance is essential to achieving short- and long-term goals.

So there you have it: the 11 things that matter most in managing risk effectively. Sure, there are many other tactical elements of a good risk strategy and framework, but I believe they will naturally flow out of these elements when put into practice with the proper senior-level mandate and regular reinforcement of the strategy.

Christopher Mandel


Christopher E. Mandel is senior vice president of strategic solutions for Sedgwick and director of the Sedgwick Institute. He pioneered the development of integrated risk management at USAA.

The Future of Money: Not What You Think

Decentralization is the future. Insurers can innovate -- or watch as individuals form risk pools without any corporate intermediaries.

Never underestimate the ability of the human species to adapt to changes in its environment. All humans are engineers. If there is too much friction in a system, they will fix it, or they will replace it. When banks add overdraft penalties, impose service fees, constrain capital, restrict mobility or compromise the public trust in any way, all those engineers will make a “correction.” Money, after all, is a social agreement.

Today, young people are encountering a financial game that they cannot win playing by the rules that are presented to them. The result should surprise no one – they will either not play the game, or they will change the rules. In fact, innovation in banking is happening at an astonishing rate; unfortunately, bankers are not necessarily doing it.

Because banking touches every part of our lives, so, too, will any innovation that occurs in the domain of banking.

Look at Bitcoin. It is more than just a cute new social app like Facebook or Twitter -- it embodies a new idea called decentralization. If it is possible to decentralize banking, it would also be possible to decentralize everything: insurance, engineering, education, production (i.e., corporations), legislation and even governance. Nothing is immune from the next wave of Internet innovation that is bearing down on us right now, not tomorrow.

Because this is an insurance audience, allow me to mention that the easiest (technically), and likely the first, big innovation to arise from the decentralization movement will be the decentralization of insurance. With the advent of smart-contract platforms such as Ethereum and Ripple Labs, people can form their own risk-sharing pools to cover a whole suite of perils now in the domain of insurance. (For the lawyers and politicians out there, it is also nearly trivial to set up voting, escrow, contract enforcement, etc., via the sort of blockchain protocol that is the basis for Bitcoin.)
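As a purely illustrative sketch (the names and numbers are invented, and this is not any real platform's API), the mechanics of such a peer-to-peer pool fit in a few lines; on a platform like Ethereum, the escrow and payout logic would live in a smart contract instead:

```python
# Minimal sketch of a decentralized risk-sharing pool (hypothetical).
# Members pay contributions into a shared pot; claims are paid from the
# pot, scaled pro rata if the pot can't cover them all. No insurer sits
# in the middle -- the pool itself is the "carrier."

class RiskPool:
    def __init__(self):
        self.balance = 0.0
        self.members = set()

    def join(self, member, contribution):
        """A member enters the pool by escrowing a contribution."""
        self.members.add(member)
        self.balance += contribution

    def settle(self, claims):
        """Pay claims in full if funded; otherwise scale all pro rata."""
        total = sum(claims.values())
        factor = min(1.0, self.balance / total) if total else 1.0
        payouts = {m: amount * factor for m, amount in claims.items()}
        self.balance -= sum(payouts.values())
        return payouts

pool = RiskPool()
for m in ["alice", "bob", "carol", "dave"]:
    pool.join(m, 100)                                  # pot = 400
payouts = pool.settle({"bob": 300, "dave": 500})       # claims = 800 > pot
print(payouts)  # each claim scaled by 400/800 = 0.5
```

The pro rata scaling is the toy stand-in for what an insurer's capital and reinsurance normally provide; a real pool would also need rules for pricing contributions to each member's risk.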

Last year, I published an article called “What if Everyone Was a BitCoin?” The core idea was that there are several problems with Bitcoin:

  • Concentration of wealth is worse than the dollar.
  • The proof of work that creates coin is trivial except for the fact that it is difficult.
  • The valuation was speculative.

Today, there are hundreds of companies forming, and being funded in the millions of dollars, that are investing in innovations that would create thousands, if not millions, of alt-coins with the characteristics of Bitcoin but without its impracticalities.

For example, MaidSafe was able to introduce a currency called Safecoin that provides a way to take unused computational capacity that members are willing to contribute and build a decentralized server network. This network encrypts data flowing through it, creating a secure and anonymous Internet. What happens to big data when people stop sharing the streams of information available on today's Internet?

Further, innovations such as Curiosumé (by this author) could have wide-ranging implications for everything from education to corporate HR and factors of production. Curiosumé is an open-source development project designed to replace the resume as a means of describing one's interests, skills and abilities; the tag line is, "Because the resume must die."

Swarm.co allows individuals to invest time and money in decentralized innovations without banks, insurance, corporations, etc. A new generation of venture capitalists such as DApps Fund is already funding new startups in crypto-currencies and demonstrating high convertibility and liquidity.

Every month, thousands of people are coming together at Meetup (itself an earlier social innovation) to learn, teach and collaborate on open-source platforms such as Ethereum, Bitcoin, Ripple and many others. Every day, for each article warning of the dangers of Bitcoin, there is another article about an ex-bank CEO coming out strongly in favor of financial innovation in the crypto space. What is certain is that every impression placed on the public regarding these new technologies is bad for the status quo in banking and insurance.

Resistance predictably comes from the public voice of banks and governments, which have the most invested in the way things are. This is not to say that they are bad and wrong, just that they have the greatest infrastructure in place to support the existing system. Changing their minds is like pushing electric cars against the tide of Big Oil; lines have been drawn in concrete.

What we are seeing is not a “revolution” with a central army in a field of battle; there is simply a natural progression happening fueled by rational efficiency and nothing else. But change is inevitable.

As with previous financial innovations, my guess is that some trader may discover that the true risk associated with a particular crypto-asset is less than what the risk-adjusted market valuation indicates it is. Then, a financial instrument will be developed to exploit the risk-arbitrage. Some readers may recall the saga of Michael Milken, who correctly observed that companies with low credit scores were in some cases less likely to fail than their risk valuations indicated. This led to the creation of junk bonds and, ultimately, the idea that risk valuations can be skirted. To Milken’s credit, the assumption held until greed set in (which is not the fault of the asset).

I believe something similar may or must happen in finance to spawn internal innovation. For example: the insurance industry does not necessarily care about risk per se; the industry cares mostly that the risk is priced correctly. Soon, the insurance industry may realize that the risk of assets backed in crypto-currencies is lessened because of increased liquidity, fewer restrictions and regulations and rapid convertibility and because they are underwritten by better fundamental assets than the dollar. The industry will develop financial instruments that exploit this risk arbitrage and profit considerably.

But if the insurance industry does not innovate in this future form of value, then people will build their own instruments. These new ideas and technologies will enable millions of entrepreneurs and billions of engineers to print their own money one social agreement at a time. My advice to the insurance industry is to get in, help out and adapt before your customers leave you behind.

(Editor's note: You are invited to join the author at The Future of Money and Technology Summit in San Francisco, Dec. 2, 2014, for his panel: "Everything That Can Be Decentralized Will Be Decentralized."

The description is:

Much of our society today is based on centralized organizations that allocate our land, labor and money to create the things that we need. Today, we have an opportunity to specify and design any number of decentralized applications that can also produce all the things that society needs -- except with stunning efficiency. This is a conversation about what is not only possible but becoming increasingly probable. This group of speakers represents innovations that decentralize data, venture capital, productivity, currency, contracts and knowledge -- and that’s just the beginning.

The speakers are:

Paige Peterson - Maidsafe

Sam Onat Yilmaz - DApps Fund

Joel Dietz - Swarm.co

Christian Peel - Ethereum

Moderator: Dan Robles, The Ingenesist Project)


Dan Robles


Daniel R. Robles, PE, MBA is the founder of The Ingenesist Project (TIP), whose objective is to research, develop and publish applications of blockchain technology related to the financial services and infrastructure engineering industries.

Should You Buy Coverage for Professional Fees?

The minor endorsement can make a major difference on a property claim.

Property insurance claims require significant time, effort and attention from risk management, finance and operations personnel. From the moment the loss is reported, insurers will make what seems like an endless stream of requests for information, and they'll scrutinize every figure presented. Then the insured has to put the claim together and present it to the property insurers. The amount of activity is often more than the policyholder anticipates. Insurers understand the burden this places on the policyholder, and that is why most insurers offer professional fees coverage. This minor endorsement can make a major difference in both effort and outcome.

Here's an example of professional fees wording from a recent policy referring to the coverage for actual costs incurred by the insured: "reasonable fees payable to the insured's: accountants, architects, auditors, engineers and other professionals; for producing and certifying any particulars or details contained in the insured's books or documents, or such other proofs, information or evidence required by the company resulting from insured loss payable."

As you can see, the wording is intended to cover the additional costs associated with the claim.

Here's what's generally not covered:

1) "attorneys, public adjusters and loss appraisers, including any of their subsidiary, related or associated entities either partially or wholly owned by them or retained by them for the purpose of assisting them,

2) "loss consultants who provide consultation on coverage or negotiate claims."

The specific wording of the endorsement will vary and should be carefully reviewed before engaging outside claim services. Some wording is broad and will cover most consultants. Other wording is more restrictive and eliminates certain classes of consultants. To determine what's best for your business, consider the available service providers and evaluate who would best represent your interests.

Often, policyholders don't fully understand the nature of this coverage. Some don't know of it. Some are unaware if they have it. Others may not know if or when to involve a specialist in their claim.

Don't confuse the purpose of this coverage with the "free" help that the insurance adjuster offers. The adjuster's job is to confirm coverage and audit the claim. It is the responsibility of the insured to measure, document and present the claim. If the adjuster's consultants offer to help measure the loss and put the claim together, it would be like having the IRS prepare your taxes. As a courtesy, you should notify the adjuster that you plan to use a claim preparation firm and disclose billing rates and proposals, but the decision to hire is yours, and, if the work matches the coverage, the insurance company is required to pay for it within reason. The consultant is engaged by the insured, and invoices are reimbursed by the insurance company as part of the claim.

So who is the best choice to help you prepare your claim? Forensic accountants are the most common and appropriate service provider for claim preparation. Forensic accountants can help with:

  1. the tedious and burdensome tasks associated with the claims process
  2. expertise on the adjustment process
  3. efficient interface with policyholder data gathering resources
  4. maximizing recovery and expediting claim resolution
  5. making the formal claim presentation

While the policyholder still needs to produce information, the claim preparers will efficiently package the information in the form of claim presentations. Some brokers have a claim preparation unit, but there could be a conflict of interest there, as well. The broker is an intermediary between the insured and the insurer. It is difficult to walk that line and truly be supportive of the insured. Most brokers accept contingent commissions based on the profitability of an engagement during the policy year, and the client executives have incentives to use their own services. While not a clear conflict, it certainly has potential to influence the position of the insured.

The good news is there are firms that won't come with baggage -- i.e., conflicts of interest. The best solution is a third-party, independent firm that has ample experience and can represent your interests with a specialized skill set. Remember, the firm must be skilled in the complexities of property damage and business interruption claims.

It is critical to have your claim preparation team vetted ahead of a loss. Finding time to interview forensic accountants and review proposals after a loss can waste precious time and derail a claim before it even gets going.

"Do your due diligence and find the best fit for your organization by arranging introductions to your finance/accounting leadership. It is worth the effort when you find the right partner," says John Lafferty, manager, risk and insurance management, at Air Products & Chemicals.

If you have property exposure, it's wise to have your forensic accountants in place and to have the coverage for their services. Risk managers should include professional fees coverage in their discussions with underwriters. With most carriers, it should not materially affect your premium -- if at all. As the market continues to soften, many policyholders are enjoying rate reductions with improved terms, so this is the perfect market climate to explore professional fees coverage if you don't have it. If you do have coverage, look for increased limits. A good benchmark for limits would be 1% to 2% of your probable maximum loss. This should easily cover the costs for claim preparation from a reputable firm.
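As a back-of-the-envelope sketch of that benchmark -- the function name and the sample probable maximum loss figure below are hypothetical, only the 1%-2% rule comes from the text:

```python
def professional_fees_limit_range(probable_max_loss, low_pct=0.01, high_pct=0.02):
    """Return the suggested (low, high) professional fees limit,
    benchmarked at 1%-2% of the probable maximum loss (PML)."""
    return (probable_max_loss * low_pct, probable_max_loss * high_pct)

# Example: a hypothetical $250M probable maximum loss
low, high = professional_fees_limit_range(250_000_000)
print(f"Suggested professional fees limit: ${low:,.0f} to ${high:,.0f}")
# → Suggested professional fees limit: $2,500,000 to $5,000,000
```

A company with a $250 million probable maximum loss would thus look for roughly $2.5 million to $5 million in professional fees limits.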

If you apply this information and incorporate these recommendations, the next time you have a property loss with business interruption the process will be smoother and results will impress you and your executives. So find your team and get that coverage. You'll be prepared to recover whatever loss comes your way.


Christopher Hess

Christopher B. Hess is a partner in the Pittsburgh office of RWH Myers, specializing in the preparation and settlement of large and complex property and business interruption insurance claims for companies in the chemical, mining, manufacturing, communications, financial services, health care, hospitality and retail industries.

How to Understand Your Risk Landscape

Boards have a new fiduciary duty: to manage information about risks with the same controls they apply to accounting.

This is part two of a series of five on the topic of risk appetite and its associated FAQs. The author believes that enterprise risk management (ERM) will remain locked in organizational silos until boards are mobilized in their comprehension of the links between risk and strategy. This is achieved either through painful and expensive crises or through the less expensive development of a risk appetite framework (RAF). Understanding risk appetite is very much a work in progress for many organizations. The first article made a number of general observations based on experience in working with a wide variety of companies. This article describes the risk landscape, measurable and unmeasurable uncertainties and the evolution of risk management.

The Risk Landscape

Lessons learned following the great financial crisis (GFC) include the importance of establishing an effective risk governance framework at the board level. In essence, two key questions must now be addressed by boards. First, do boards express clearly and comprehensively the extent of their willingness to take risk to meet their strategic and business objectives? Second, do they explicitly articulate risks that have the potential to threaten their operations, business model and reputation?

To be in a position to provide credible answers to these fundamental questions, we must first seek to understand the relationship between risk and strategy. It is RMI's experience that risk and strategy are intertwined. One does not exist without the other, and they must be considered together. Such consideration needs to take place throughout the execution of strategy. Consequently, it is vital that due regard is given to risk appetite when strategy is being formulated. Crucially, risk is now defined as "the effect of uncertainty on objectives."
It is clear, therefore, that effective corporate governance is strategy- and objective-setting on the one hand, and superior execution with due regard for risks on the other. This particular landscape is what we in RMI refer to as the interpolation of risk and strategy. For this reason, RMI describes board risk assurance as assurance that strategy, objectives and execution are aligned. Alignment is achieved through operationalization of the links between risk and strategy, which will be described in the final article in this series.

Before further discussion, however, we would like to draw attention to observations, based on our practical experience, that give cause for concern, namely:

1. Risk appetite: While we now have a globally accepted risk management standard3 and a sharper regulatory definition of effective risk management for regulated organizations, there is as yet much confusion, and neither a consensus nor internationally accepted guidance, as to the attributes of an effective risk appetite framework.

2. Risk reporting: In relation to risk reporting, two significant matters arise:
  • Risk registers that are generated primarily on the basis of a compliance-centric requirement, as distinct from an objectives-centric4 approach, tend to contain lists of risks that are not explicitly associated with objectives. As such, they offer little value in terms of reporting on risk performance. (Note: RMI supports the adoption of a board-driven, objectives-centric approach5 to reporting and monitoring risks to operations, the business model and reputation.)
  • Risk registers and other reporting tools detail known risks -- what we know we know. They tend not to detail emerging or high-velocity risks that have the potential to threaten the business model. As such, they tend to be of limited value in terms of reporting or monitoring either unknown known6 or unknown unknown7 risks.
This is a matter that should give boards cause for concern, given the pace of change, hyper-connectivity and the disruptive nature of new technologies.

3. Risk data governance: The quality, rigor and consistency in the application of accounting data that is present in well-managed organizations does not exist equally in those same organizations in the risk domain. The responsibility of directors to use reliable accounting information and apply controls over assets (internal controls) as part of their legally mandated role extends equally to information pertaining to risks that threaten financial performance. The latter is not, however, treated in a fashion equivalent to accounting data. Whereas the integrity of accounting data is assured through the use of proven and accepted accounting systems subject to audit, information pertaining to risks typically relies on disparate Excel spreadsheets, Word documents and PowerPoint files, with weak controls over the efficacy of copying and pasting data from one level of report to another. Weaknesses and failings in risk data governance can be addressed in much the same way as other governance requirements. For example:

a. Comprehensive training for business line managers and supervisors on:
  •  (Risk) Management Processes,
  •  (Risk) Vocabulary,
  •  (Risk) Reporting,
  •  Board (Risk) Assurance Requirements
b. Performance in executing (risk) management roles and responsibilities included in annual performance appraisals.

c. System8 put to process through the use of database/workflow solutions, providing an evidence basis of assurance that:
  • The quality, timing, accessibility and auditability of risk performance data is as rigorously and consistently applied as that for accounting data,
  • Dynamic management of risk data (including risk appetite/tolerance/criteria) can be tracked at the pace of change
  • Tests can be applied to the aggregation of risks to objectives at the pace of change and prompt interdictions applied when required,
  • Reports, or notification, of significant risks are escalated without delay, and without risk to the originator of information.
4. Lack of understanding of the nature of the risks that need to be mastered in the boardroom: Going back to our definition of risk as the effect of uncertainty on objectives: There are many types of objectives -- for example, economic, financial, political, regulatory, operational, customer service, product innovation, market share, health and safety, etc. -- and there are multiple categories of risk. But what is uncertainty? Uncertainty9 is the state, even partial, of deficiency of information related to understanding or knowledge of an event, its consequence or its likelihood. There are essentially two kinds of uncertainty:

1. Measurable uncertainties: These are inherently insurable because they occur independently (for example, traffic accidents, house fires, etc.) and with sufficient frequency as to be reckonable using traditional statistical methods. Measurable uncertainties are treated individually through traditional (risk) management supervision, and residually through insurance. They are funded out of operating profits.

2. Unmeasurable uncertainties: These are inherently uninsurable using traditional methods because of the paucity of reliable data. For example, whereas we can observe multiple supply chain and service interruptions, data breaches, etc., they are not sufficiently similar or comparable to be soundly fitted to a probability distribution and statistically analyzed. Unmeasurable uncertainties are treated on a broad basis through organizational resilience. For the top 5-15 corporate risks10 that are typically inestimable in terms of likelihood of occurrence, the organization seeks to maintain an ability to absorb and respond to shocks and surprises and to deliver credible solutions before reputation is damaged and stakeholders lose confidence. Unmeasurable uncertainties are funded out of the balance sheet.
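To make the distinction concrete, here is a minimal sketch, with entirely hypothetical figures, of why measurable uncertainties are "reckonable": when events are independent and frequent, observed frequency and severity can be combined into an expected annual loss, which is the basis for insuring them and funding them out of operating profits.

```python
# Illustrative only: pricing a measurable uncertainty from observed data.
# The event counts and loss amounts below are hypothetical.
yearly_event_counts = [4, 6, 5, 7, 4, 6]          # e.g., vehicle accidents per year
observed_losses = [12_000, 8_500, 15_000, 9_800]  # cost per event, in dollars

mean_frequency = sum(yearly_event_counts) / len(yearly_event_counts)
mean_severity = sum(observed_losses) / len(observed_losses)

# Expected annual loss = frequency x severity -- the basis of insurability.
expected_annual_loss = mean_frequency * mean_severity
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")
# → Expected annual loss: $60,400
```

No equivalent calculation is possible for unmeasurable uncertainties: without a stable frequency and comparable losses, there is no credible probability distribution to price against, which is exactly why they fall to resilience and the balance sheet instead.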
The hyper-connected and multispeed world in which we live today has driven the effect of unmeasurable uncertainties on company objectives to unprecedented heights, and so amplified the risk potential enormously.

5. Urgent need to recognize the mission-critical importance of building resilience and preparing management to always be ready to offer credible solutions in the face of unexpected shocks and surprises.

Figure 1 below describes the evolution of risk management, as depicted within the red dotted line11, and the next stage of that evolution (resilience) as envisioned by RMI.


Figure 1: Evolution of risk and the emergence of “resilience” as the current era in the evolution of 21st century understanding of risk  

Resilience was the theme that ran through the World Economic Forum's Global Risks 2013, Eighth Edition report. Resilience was described as the capability to:
  1. Adapt to changing contexts,
  2. Withstand sudden shocks, and
  3. Recover to a desired equilibrium, either the previous one or a new one, while preserving the continuity of operations.
The three elements in this definition encompass both recoverability (the capacity for speedy recovery after a crisis) and adaptability (timely adaptation in response to a changing environment). The Global Risks 2013 Report emphasized that global risks do not fit neatly into existing conceptual frameworks but that this is changing insofar as the Harvard Business Review (Kaplan and Mikes12) recently published a concise and practical taxonomy that may also be used to consider global risks13. The report advises that building resilience against external risks is of paramount importance and alerts directors to the importance of scanning a wider risk horizon than that normally scoped in risk frameworks. When considering external risks, directors need to be cognizant of the growing awareness and understanding of the importance of emerging risks. Emerging risks can be internal as well as external, particularly given growing trends in outsourcing core functions and processes.
It is also interesting to observe the diversity in understanding of emerging risk definitions. For example:
  • Lloyd's: An issue that is perceived to be potentially significant but that may not be fully understood or allowed for in insurance terms and conditions, pricing, reserving or capital setting.
  • PwC: Those large-scale events or circumstances beyond one's direct capacity to control, that have impact in ways difficult to imagine today.
  • S&P: Risks that do not currently exist.
The 2014 annual Emerging Risks Survey (a poll of more than 200 risk managers predominantly based at North American re/insurance companies) reported the top five emerging risks as follows:
  1. Financial volatility (24% of respondents)
  2. Cyber security/interconnectedness of infrastructure (14%)
  3. Liability regimes/regulatory framework (10%)
  4. Blowup in asset prices (8%)
  5. Chinese economic hard landing (6%)
Maintaining business defense systems capable of defending the business model has become an additional fiduciary requirement for the board, alongside succession planning and setting strategic direction15.

References:
1. Influenced by COSO (Committee of Sponsoring Organizations of the Treadway Commission), Enterprise Risk Management (ERM): Understanding and Communicating Risk Appetite, by Dr. Larry Rittenberg and Frank Martens.
2. Source: ISO 31000 (Risk Management, 2009). ISO 31000 is now the globally accepted risk management standard.
3. The new globally accepted risk management standard (ISO 31000) is not intended for the purposes of certification. Rather, it contains guidance as to risk management principles, a framework and a risk management process that can be applied to any organization, part of an organization, project, etc. As such, it provides an overarching context for the application of domain-specific risk standards and regulations -- for example, Solvency II, environmental risk, supply chain risks, etc.
4. Risk Communication: Aligning the Board and C-Suite, Exhibit 1: Top Challenges of Board and Management Risk Communication, by the Association for Financial Professionals (AFP), the National Association of Corporate Directors (NACD) and Oliver Wyman.
5. The Conference Board Governance Centre, Risk Oversight: Evolving Expectations of Boards, by Parveen P. Gupta and Tim J. Leech.
6. An unknown known risk is one that is known, and understood, at one level of an organization (typically top, middle or lower-level management) but not known at the leadership and governance levels (i.e., executive and board levels).
7. An unknown unknown risk is a so-called black swan (The Black Swan: The Impact of the Highly Improbable, by Nassim Nicholas Taleb).
8. Specified to the ISO 31000 series.
9. Source: ISO 31000 (Risk Management, 2009). ISO 31000 is now the globally accepted risk management standard.
10. More than 80% of volatility in earnings and financial results comes from the top 10 to 15 high-impact risks facing a company: Risk Communication: Aligning the Board and C-Suite, by the Association for Financial Professionals (AFP), the National Association of Corporate Directors (NACD) and Oliver Wyman.
11. Source: Institute of Management Accountants, Statements on Management Accounting, Enterprise Risk Management: Frameworks, Elements and Integration.
12. Kaplan and Mikes, Managing Risks: A New Framework, Harvard Business Review.
13. Kaplan and Mikes' third category of risk is termed "external" risks, but the Global Risks 2013 report refers to them as "global risks." They are complex and go beyond a company's scope to manage and mitigate (i.e., they are exogenous in nature).
14. Audit and Risk, 21 July 2014, Matt Taylor, Protiviti UK.
15. The Financial Reporting Council has determined that it will integrate its current guidance on going concern, risk management and internal control, and make associated revisions to the UK Corporate Governance Code (expected in 2014). Emphasis is expected to be placed on the board's making a robust assessment of the principal risks to the company's business model and ability to deliver its strategy, including solvency and liquidity risks. In making that assessment, the board will be expected to consider the likelihood and impact of these risks materializing in the short and longer term.

Peadar Duffy

Peadar Duffy is founder and chairman of Risk Management International (RMI) a firm that has been advising clients in relation to risk in Ireland and internationally for more than 20 years. He is a member of the International Organisation for Standardization (ISO) TC 262 Working Group 2, which is currently undertaking a review of the global standard for risk management (ISO 31000).

Better Way to Handle Soft-Tissue Injuries

Perform tests to set a baseline before an injury occurs, then conduct the same tests after an injury and compare the two.

The most costly problem facing employers today is work-related soft-tissue injuries, more commonly known as work-related musculoskeletal disorders (WRMSD). According to OSHA, WRMSD account for 34% of lost work days in the U.S., a third of the dollars spent in workers' compensation and a third of all work-related injury cases. Not surprisingly, soft-tissue injuries -- to the ligaments, tendons and fibers of the body that connect the bones -- are difficult to diagnose. Standard diagnostic tests such as X-rays or imaging are frequently unable to document the presence of pain and loss of function. As a result, diagnoses are often subjective, leading to poor treatment (including unnecessary surgery and overuse of narcotics), extra lost work time, precariously high medical costs and, at times, fraudulent claims. There is a need for accurate, timely and evidence-based diagnosis and treatment to curtail escalating costs and improve clinical outcomes, as these case studies show:

Case 1

A 44-year-old gentleman had undergone a baseline EFA. (The Electrodiagnostic Functional Assessment, or EFA, combines multichannel wireless electromyography (EMG) with range-of-motion testing and integrates that with a functional output.) He is employed as an unloader in the shipping department. He alleged a work-related injury in October 2014, five weeks into his employment. He stated that he injured his shoulders when he put his hands out to block a fall. He complained of bilateral shoulder pain, radiating to the right upper extremity. He rated the pain as an eight on a scale of one to 10. But an EFA found no change from the baseline test.

Outcome: Because there was no change from his baseline, he was released from treatment and advised to see his primary care physician for any further medical needs.

Case 2

A 37-year-old gentleman was employed as a loader. He alleged a work injury in October 2014; when he bent to lift some ice, he felt a pain in his lower back.
He complained of radiating lower back pain, into the left lower extremity, rated as a 6/10. He was referred by his occupational medicine doctor, as there were no objective findings, and his subjective complaints seemed out of proportion. An EFA revealed normal EMG activity, with chronic, unrelated pathology.

Outcome: When he returned for his follow-up evaluation after the EFA, he still had the same subjective complaints. After his doctor reviewed the EFA findings, he stated that he felt much better and asked for a release to return to full duty at work.

Case 3

A 34-year-old gentleman was employed as a mix/truck driver. He had undergone a baseline EFA in June 2014 and had a work-related motor vehicle accident in September 2014. His head struck the roof of his truck, and he was not wearing a hard hat. He complained of neck, shoulder and head pain. When an EFA was compared with the baseline, chronic, unrelated pathology was noted. However, the comparison also revealed a change in the paracervical region. This change was consistent with the date and mechanism of injury and with his subjective complaints.

Outcome: The EFA comparison was able to identify and redirect care, away from the chiropractic care that he was receiving. After imaging studies were performed and the results were found to be consistent with the EFA findings, he received site-specific, conservative care for his work-related injury, and his symptoms improved.

It is our opinion that the EFA-STM provides a bookend solution, comparing a pre-injury test to a post-injury assessment to objectively and accurately determine AOECOE (arising out of employment/course of employment) status. One must base a medical evaluation on facts, not subjective complaints. When that is accomplished, proper diagnosis and treatment are rendered, and outcomes are improved. The authors invite you to join them at the NexGen Workers' Compensation Summit 2015, to be held Jan. 13 in Carlsbad, CA.
The conference, hosted by Emerge Diagnostics, is dedicated to the lessons of the past, the current status and the future of workers' compensation. The conference is an opportunity for companies to network and learn, as well as contribute personal experience to the general knowledge base for workers' compensation. Six CEU credits are offered. For more information, click here.

Frank Tomecek

Frank J. Tomecek, MD, is a clinical associate professor of the Department of Neurosurgery for the University of Oklahoma College of Medicine-Tulsa. Dr. Tomecek is a graduate of DePauw University in chemistry and received his medical degree from Indiana University. His surgical internship and neurological spine residency were completed at Henry Ford Hospital.

Go Ahead! Let Rivals Put Ads on Your Site

Learn from Amazon. If you're not a fit, don't just deliver a "sorry" page. Steer the customer elsewhere, helping him -- and generating revenue.

The thought of placing links to competitors' products or services on a company website would normally stop chief marketing officers or marketing managers dead in their tracks. After all, once you've attracted a consumer to your website, it's only natural to expect him to purchase your product. However, what if that specific item isn't available? Or it's sold out? Or it's not the right fit? You've exhausted your advertising investment on this potential customer, only to deliver a "sorry page," forcing him to leave your site and head elsewhere. But what if your "sorry pages," instead of being a dead end for a consumer, could still bring in revenue? It sounds like some Twilight Zone marketing nightmare, but it's becoming a much more widely used practice -- and a smart one at that. Businesses need to come to terms with the fact that they don't own customers; customers are, at best, momentarily rented. This can leave brands in the lurch when it comes to sales and brand loyalty. However, there are many options a business can apply that ultimately generate revenue, even without sales, when the consumer turns to make a purchase from a competitor. But competitor advertising placed on your own website? Here is how it ends up being a win-win. Say you are a car insurance company looking for policyholders with specific characteristics. A consumer visits your website and fills out an application but does not meet your ideal customer criteria. Now you have two options: reject the consumer outright, or pair that rejection with additional options, some of which may be best obtained through a competitor. By suggesting an alternate purchase through a link to a competitor's website -- "You may be interested in policies from XXXX" -- you'd be providing prime real estate for a competitor to make a sale through a tailored advertisement.
You receive revenue from an otherwise lost opportunity by selling advertising space on your site to these competing brands, while also building positive brand recognition. These consumers have shown intent, and you’ve helped them get what they need, which makes you a valuable asset. Instead of shutting down the deal and turning your back on the customer, you are aiding them in their search, while providing a competitor with a valuable lead, for which they are paying. Win-win, right? Many customers would consider that a positive brand experience. After all, it shows that your brand is ultimately focused on helping the customer find what she needs, even if you are not the one to give it to her. Consider Amazon. This booming company not only has an enormous selection of goods for customers to choose from, but it also provides links to external shopping sites instead of sorry pages when a specific product is not available.
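The accept-or-redirect decision described above can be sketched as follows. Everything here is illustrative -- the eligibility criteria, competitor names and URLs are invented for the example, not taken from any real program:

```python
# Hypothetical sketch of replacing a dead-end "sorry page" with paid
# competitor suggestions. Criteria and partner data are illustrative.

IDEAL_CUSTOMER = {"min_age": 25, "max_claims": 1}

COMPETITOR_ADS = [
    {"name": "Acme Auto Insurance", "url": "https://example.com/acme"},
    {"name": "Roadstar Coverage", "url": "https://example.com/roadstar"},
]

def respond_to_application(age, prior_claims):
    """Accept qualifying applicants; otherwise return a rejection that
    still surfaces (paid) competitor links instead of a dead end."""
    if age >= IDEAL_CUSTOMER["min_age"] and prior_claims <= IDEAL_CUSTOMER["max_claims"]:
        return {"status": "accepted", "next_step": "quote"}
    return {
        "status": "declined",
        "message": "You may be interested in policies from our partners:",
        "suggestions": COMPETITOR_ADS,  # each click is billable ad inventory
    }

print(respond_to_application(age=22, prior_claims=3)["status"])  # → declined
```

The design point is simply that the rejection branch still returns something of value: the consumer gets alternatives, and the listed partners pay for the placement.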

Patrick Quigley

Patrick Quigley is the CEO of Los Angeles-based advertising technology company Vantage Media. He has more than 15 years experience in sales, marketing, product management and engineering with both public and private companies.

How Not to Innovate (and How to): IBM v. Facebook, Amazon

Facebook and Amazon are investing for the long term. IBM? Not so much.

You don’t often see a CEO squash a great quarter with an analyst call script designed to batter his company’s stock price. Yet that’s what Mark Zuckerberg recently did—to his great credit. One of the biggest challenges for public companies is to make investors prioritize long-term value over short-term profits. So, rather than running a victory lap after beating Wall Street’s expectations for the third quarter, Zuckerberg laid out a long-term strategy that drove loads of investors away. To understand why this was a smart move, look at Amazon.com and IBM.

Few CEOs have been more audacious than Amazon CEO Jeff Bezos in managing Wall Street. Bezos told investors 17 years ago: “It’s all about the long term. . . . Our goal is to move quickly to solidify and extend our current position while we begin to pursue the online commerce opportunities in other areas…. We will continue to make investment decisions in light of long-term market leadership considerations rather than short-term profitability considerations or short-term Wall Street reactions." Bezos has, ever since, eschewed profitability to invest in—and deliver—innovation and growth in ever-expanding retail categories and adjacent markets, including tablets, phones, original content and cloud services. In return, Amazon attracted investors who believed in his long-term time horizon, as evidenced by Amazon’s astronomical price-to-earnings ratio and a market cap that stands at nearly $140 billion. Amazon’s costs have risen to eat up all revenues -- but the stock price has rocketed. At some point, Bezos will have to deliver profits commensurate with Amazon’s scale, but, for now, investors continue to give him wide latitude to invest for the long term.

Contrast Bezos’ approach with that of his counterparts at IBM. In 2006, IBM CEO Sam Palmisano told investors that IBM would focus on earnings per share, ahead of growing IBM’s business.
Palmisano’s rationale, as he recently told Harvard Business Review, was that IBM was a “mature business” that needed to respond to the demands of shareholders who “wanted more margin expansion and cash generation than top-line growth.” Palmisano dubbed his plan Roadmap 2010 and committed to increasing earnings per share from $6 to $10 by 2010. In return, IBM amassed like-minded investors who drove up IBM’s share price in lockstep with earnings growth. Roadmap 2010 was so successful with investors that Palmisano and his successor, Ginni Rometty, doubled down on it with Roadmap 2015—which called for IBM to deliver $20 per share by 2015.

While investors loved them, Roadmaps 2010 and 2015 took IBM far off course in meeting the needs of its customers and employees. The problem is that while IBM might well be a mature company, it competes in an industry that was far from mature in 2006, and is even less mature today. Rather than accepting old age, IBM needs to be as agile as startups and growth-minded goliaths—like Amazon. But Palmisano and Rometty locked IBM into a set of investors and expectations that left less and less room for agility. Rather than staying on top of the rapid advances in information technology products and services redefining its industry—and every one of its target markets—IBM management had to focus on financial and business reengineering to increase earnings for its investors, as promised. One industry insider summed up IBM’s predicament this way: "They are paying the price for moving to services—which was smart—but not investing in the fundamentals to support services, e.g., recruiting, training, staffing, etc. They kept their earnings afloat by financial engineering, such as loading up on debt to fund buybacks." IBM paid a high price for not innovating. It suffered declining revenues for the last 10 quarters—to the point where Rometty had to abandon Roadmap 2015.
The superiority of Amazon’s long-term market leadership approach over IBM’s short-term profitability considerations is evident in the two giants' battle over cloud services. Who could have imagined in 1997 that one of the next big things in IT services would be cloud services, and that Amazon would be the dominant player in that space? If anyone should have foreseen and owned the cloud, it should have been IBM—given its deep client relationships and long history of running data and networking centers for corporate clients. Yet, BusinessWeek estimates that Amazon Web Services is one of the fastest-growing software businesses in history:
The growth of Amazon’s cloud business is unprecedented, at least when compared to other business software ventures. It’s grown faster after hitting the $1 billion revenue mark than Microsoft, Oracle, and Salesforce.com. You would need to turn to Google—which had the advantage of the vast consumer market—to find a business that grew faster.
Amazon has also vanquished IBM in head-to-head competition, including for a high-profile 10-year, $600 million contract to provide cloud services to the CIA. What’s more, Amazon’s initial success has drawn other deep-pocketed competitors like Google and Microsoft. The resulting price war might have more dire consequences for IBM, which does not have Amazon’s long-term investment flexibility.

So, when Zuckerberg prepared for his recent earnings call, he could have easily crafted a story that short-term investors would have loved. Facebook beat expectations for both top-line revenue and bottom-line profits. It also made great strides in showing that it would dominate in the mobile space—a transition about which many observers (including me) had been skeptical. Facebook’s share price would surely have gone for a nice ride if Zuckerberg had simply focused on monetizing his social network. Instead, Zuckerberg took a page from Bezos’ playbook and laid out five- and 10-year visions with aspirations far beyond the core Facebook network. He told investors that he would build a series of other billion-user products—before starting to monetize them. What’s more, he said that Facebook would build the next major computing platform, which he believes will revolve around augmented reality. And he made it clear that this vision required significant investment and that, like Bezos, he would prioritize long-term market leadership over short-term profitability: "We’re going to prepare for the future by investing aggressively." The subsequent drop in share price meant that a lot of investors got the message, and left. Zuckerberg still needs to deliver on his long-term strategy. But he left little doubt about his intentions and made sure that his investors were working on the same assumptions. That’s smart.