
The Power of the Right Prototype

There's that moment during a prototype presentation when everything in the room changes. Eyes light up. People lean forward in their chairs.


I'm a long-time advocate for leveraging prototypes and demos in the enterprise to explore ideas and emerging technologies. And there's a particular reason: It's that precise moment during a prototype presentation when everything in the room changes. Eyes light up. People sit forward in their chairs. The conversation shifts from potential problems to possibilities. Executives become transfixed by ideas of transformation. The power of prototypes to persuade is undeniable.

The practice of prototyping has typically been isolated in certain pockets of the organization. However, as technology purchasing is distributed across the company, more executives can benefit from the prototype's ability to put an idea or technology into the context of the business. Through a prototype, a technology once considered the latest consumer fad turns into a vehicle to advance enterprise innovation. Prototypes make a big, amorphous idea personal, relatable and feasible.

I was recently onsite with a client who was keen on prototyping, and the first question I asked was: Why? Not because I was skeptical that prototyping would work, but because understanding the why behind a prototype is imperative for picking the right prototype for your project. Different business drivers demand different types of prototypes, and prototypes often need to evolve as the lifecycle of the idea or technology matures. Prototypes are more than rough first productions of an object. They span the physical, high-technology and digital dimensions and can take various forms; even a workflow diagram can be considered an early prototype.

[Table: types of prototypes]

For example, PwC used digital storytelling via video to communicate our vision of the future of shopping as well as to portend the potential impact of wearables on the insurance industry. In this case, video was the most effective prototyping medium because the technology is already available in the marketplace. The emphasis was on inspiring executives to realize what is possible rather than on testing the technology's capability or feasibility. By contrast, when a client asked us to explore the use of sensors for increasing business intelligence, we built a smart refrigerator to test the feasibility and usability of the technological components in physical form, given that the Internet of Things is a nascent technology. A particular area of focus was exploring the transmission of data via the cloud.

Challenges With Getting Started

As businesses expand their prototyping programs, they will face challenges. Here are some of the obstacles they will need to overcome.

1) New Skills. Prototyping requires a combination of creative design skills and rapid iterative development skills. Most companies have yet to cultivate these skills. Our 6th annual Digital IQ survey of nearly 1,500 business and technology executives found that only 19% of respondents rated their IT organization's prototyping skills as "excellent."

2) New Prototyping Processes. Traditional business processes like building a business case and determining an ROI don't apply to prototyping. Businesses must find new ways to plan, fund and evaluate prototyping initiatives, using measures such as the ability to identify the next opportunity for growth, efficiency and effectiveness: How many ideas were generated, how many advanced to the next stage and how many were taken to market?

3) A New Understanding. Executives need to understand why prototyping is vital in today's fast-paced business climate, and the word needs to spread like wildfire across the enterprise. The good news is that it only takes one prototype presentation to turn someone into a believer.

Technology is touching every aspect of our lives, and businesses must constantly explore ways to better engage customers, employees and suppliers. Technology is moving too fast for companies to wait until a vendor hands them the next version of their product or service. Prototyping is no longer reserved for a handful of companies that are considered the kings of creativity. It’s necessary for all companies.


Chris Curran

Chris Curran is a principal and chief technologist for PwC's advisory practice in the U.S. Curran advises senior executives on their most complex and strategic technology issues and has global experience in designing and implementing high-value technology initiatives across industries.

3 Steps Toward Better Meetings

How often did you attend a meeting in the last week without knowing why you were even there? Better meetings are possible -- and necessary.

How many meetings did you attend last week that lacked a specific agenda, started late and then ended late? How often did you attend a meeting without knowing why you were even there? How many meetings actually resulted in a new idea or significant decision?

With about 11 million business meetings occurring each day, one thing is clear: Meetings are a mainstay of business culture. When they are conducted effectively, they inspire and ignite innovation and lead to higher-performing teams and a stronger bottom line. When they are ineffective and irrelevant, they plague all of us with the notion that this time together was wasteful, costly and inefficient.

Too many meetings fail to generate any meaningful return on the investment of our time and energy. And they undermine our productivity. Our meeting-intensive culture forces people to complete their work in the margins of their day, early in the morning and late at night, hurting their health, motivation and work-life balance.

Something has to give.

It is time for better meetings. It is time for a meeting revolution.

Start the revolution by questioning the value of each meeting you attend, by preparing for your meetings and by ensuring that the right people, and only the right people, are invited.

1. QUESTION THE VALUE OF EACH AND EVERY MEETING YOU ATTEND

Instead of automatically accepting the next meeting request, pause and consider the meeting's return on investment for you. Ask yourself:

  • Will this meeting assist me in achieving my goals?
  • How does the purpose of the meeting align with the company's strategic priorities?
  • What contribution can I make in the meeting?
  • Will anyone even notice if I'm not present?
  • Will this meeting be energizing, or will it suck the life right out of me?
  • Will this meeting be a rehash of the last five meetings I attended?
  • Is attending this meeting the highest and best use of my time right now?
Remember, every time you say yes to one thing, you are saying no to something else.

2. SUCCESS AND EFFECTIVENESS DEPEND ON YOUR PLANNING

As you prepare for your next meeting, ask yourself the following questions:

  • Why do we need to meet?
  • What is the purpose of the meeting?
  • Is this an informational, decision-making, problem-solving, brainstorming, team-building or instructional/skill-building meeting? Or a combination of a few of these?
  • What is the outcome I want to achieve as a result of this meeting?
  • Is there an alternative format I can use to achieve the outcome?
  • If a meeting is essential, what is the ideal meeting format to achieve the meeting outcomes: an in-person meeting, a virtual meeting or a combination of the two?
  • Who needs to attend the meeting?
  • What information do I need from the attendees?
  • What do the attendees need to know or complete in advance of the meeting to achieve the outcome?
  • What expectations do I have for the meeting attendees regarding preparation and participation? How will I communicate these expectations?
  • What is the ideal length of the meeting to accomplish the stated purpose of the meeting?

Use your answers to guide you in planning and preparing to have better meetings.

3. INVITE THE RIGHT PEOPLE AND ONLY THE RIGHT PEOPLE

To think about who to invite to your meeting, start by recognizing that there are four types of meeting attendees: the decision maker, the influencer, the resource person and the executor.

  • The decision maker is the primary authority.
  • The influencer has the pull and network within the organization to advocate and popularize meeting decisions and initiatives.
  • The resource person has specific knowledge, skills and expertise needed to inform the decisions and create plans for executing those decisions.
  • The executor has the knowledge, skills, resources and authority to successfully complete the work resulting from the meeting.

An ideal meeting has each of these types in attendance. Of course, one person can represent multiple roles, and more than one representative of a specific role may be required. For example, you may need three executors to complete a complex project discussed during the meeting.

To determine who really needs to attend the meeting, ask yourself:

  • What is the meeting outcome?
  • Who in the organization must be present to achieve the outcome?
  • Who is the decision maker?
  • Who is the influencer?
  • Who is the resource person?
  • Who is the executor?
  • If there are people who will not be invited to the meeting but who have been invited to similar meetings in the past, how will I communicate my rationale for excluding them?

Without the right people in the meeting, nothing will be accomplished, and everyone's time will be wasted. To have better meetings, invite the right people and only the right people.

A decision maker is not necessary to start a meeting revolution. A meeting revolution starts with one person choosing to do something differently and then communicating with colleagues about why she has chosen a different approach.

Thirty-seven percent of employee time is spent in meetings. So, when you choose to lead a meeting revolution, you are not only ensuring that this investment of time and energy generates a significant return on investment, you're also giving your team time back to do the work they're good at, the work they're hired to do and the work that will grow the business.

What can you do right now?

  • Here's a game-changing question for you: Are you a planner, prioritizer, arranger or visualizer? Find out your productivity style in less than 10 minutes; take my free productivity style assessment.
  • Want to take it to the next level? Share the assessment with your team, then start a conversation about your respective productivity styles and what you each need to work well.

Share your thoughts on how these strategies worked for you! Please leave a comment on this post.

This article originally appeared on fastcompany.com.


Carson Tate

Carson Tate is a productivity consultant and the founder of Working Simply. She is the author of Work Simply: Embracing the Power of Your Personal Productivity Style. She serves as a coach, trainer and consultant to executives at Fortune 500 companies, including AbbVie, Deloitte, Wells Fargo and United Technologies.

How Big Data Can Transform Auto

Big data can already tell you whether you need more staff in the field, what historical repair costs are by year, make and model -- and more.

Big data is everywhere and is becoming more powerful in every aspect of our daily lives. The hunger, and ability, to parse and analyze diverse data sets goes to the highest levels of our government, large corporations and even start-ups, which are finding ways to add immense value. Although we in the claims industry are not involved in international intrigue or attempting to create a surveillance network, we can harness data to effectively influence our decisions. This will ultimately achieve better outcomes by putting specific files in the most skilled hands, streamlining processes and eliminating unnecessary touch-points.

As technology continues to advance exponentially, the auto claims process can be radically improved. For many years, auto insurance claims departments squelched innovative thinking in favor of tired cost-cutting measures. The departments would look at cycle time, average appraisal cost, repair cost, parts usage and total loss option documentation, but these data points are only a small portion of the picture and don't fully address or reveal the root causes. To get to that, we need to dig a bit deeper.

Today, so much more data can be effectively captured and analyzed that a claims manager could nearly predict outcomes and channel resources for the best results. Need more staffing in the field? What are the historical repair costs of a certain year, make and model with a matching impact location? Where are higher labor rates going to result in more total losses? Which companies allow their policyholders excessive rental on subrogation demands?

Here are a few examples of how auto claims technology and resulting data can be channeled to improve workflows, eliminate unnecessary touch points and create a more efficient process.
  1. Predictive Dispatching. Imagine putting the right claims into the right hands every time. Through predictive modeling based on a series of data points, the likely repairability of a vehicle can be ascertained to a solid degree from first notice of loss. Combine that with a robust dispatching solution, and the right field representative could be sourced based on factors such as: closest location, past performance, experience levels, historical cycle time results, CSI ratings, estimating software platform, workload volume, current volume for the day and comparisons with other representatives. Algorithms can instantly give the file to the person who is most likely to provide the best outcome (see the illustrative scoring sketch after this list). Should the file qualify for a self-service type claim or even trigger a potential total loss, the appropriate resources can be called into action, thus saving money and wasted time.
  2. Subrogation Demand Analysis. Why do some insurers focus intently on the front end of a claim to ensure accuracy yet will accept subrogation demands from carriers without the same scrutiny? Sure, much of the disconnect may be because of the compartmentalization between various departments within a claims organization, but, with data analysis, subrogation reviews can be parsed, reviewed and stored to track trends. Which companies overpay their insureds for rental when the repair is minor? Do a large number of your files end up in arbitration? If so, what are the triggers? Data can even catch potential fraud if, say, a VIN has been reported multiple times for the same damage.
  3. Where and when? For quite a while now, most digital cameras have had information embedded in each photograph. With the spread of smartphones and tablets, geocode location data can also be included. What does this mean? If detailed photo information is needed, oftentimes exchangeable image file format (EXIF) data can document what brand of camera took the photo, the time it was taken and the exact coordinates (a short sketch of reading this data follows the list). This means a field representative's day can be reconstructed or a re-inspection date can be confirmed to ensure the condition of a vehicle at a specific point in time. While it’s never guaranteed that every phone or camera will provide the exact EXIF information, adjusters will find that, in most situations, a lot of valuable information can be gleaned.
  4. Claim delay analysis. In auto claims, the standard cycle time for a vehicle damage inspection is 48 hours. Can this always be achieved? If each step of the appraisal process is documented from assignment to completion, any issues and resulting data points along the way can lead to a treasure trove of information. You might discover that an incorrect phone number or vehicle location leads to 25% of all delayed files. This would be a valuable training tool to discuss with adjusters, stressing the importance of documenting accurate information before a dispatch. On the other hand, data may show that a certain percentage of files are delayed because an appraiser is overloaded. This could lead to an immediate focus on better allocating field resources. Capturing and recording specific data points along the way can serve as a tool to detect bottlenecks, inefficiencies and areas for process improvement.
  5. Location volume analysis. Do you need to increase your field staff? Just think if you could pull up a map at a moment's notice and track volume in a certain location. You could track average repair costs in various zip codes, cities and states and break the information down to an even more granular level. Detailed data can be run over time and compared with newly written policies to predict staffing needs in certain areas before the losses even occur. Think about doing this in real time without needless Excel files and manual processes. This can help in forecasting and budgeting for future years, enabling efficient management and allocation of expenses.
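To make the predictive-dispatching idea in item 1 concrete, here is a minimal sketch in Python of one way a weighted scoring rule could pick a field representative. The factors, weights and data structure are illustrative assumptions only, not a description of any particular carrier's or vendor's model; in practice, the weights would be fit to historical claim outcomes rather than set by hand.

```python
from dataclasses import dataclass

@dataclass
class FieldRep:
    name: str
    distance_miles: float        # distance from the loss location
    avg_cycle_time_days: float   # historical average inspection cycle time
    csi_score: float             # customer satisfaction index, 0-100
    open_assignments: int        # current workload

# Illustrative weights only; a production model would be calibrated to outcomes.
WEIGHTS = {"distance": -0.5, "cycle_time": -2.0, "csi": 0.3, "workload": -1.5}

def score(rep: FieldRep) -> float:
    """Higher is better: nearby, historically fast, well rated and not overloaded."""
    return (WEIGHTS["distance"] * rep.distance_miles
            + WEIGHTS["cycle_time"] * rep.avg_cycle_time_days
            + WEIGHTS["csi"] * rep.csi_score
            + WEIGHTS["workload"] * rep.open_assignments)

def dispatch(reps: list[FieldRep]) -> FieldRep:
    """Route the new assignment to the highest-scoring available representative."""
    return max(reps, key=score)
```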
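As a companion to item 3, the sketch below reads camera, timestamp and GPS details from a photo's EXIF data using the Pillow library. The file name is hypothetical, _getexif() is a widely used but technically private Pillow method, and not every device writes all of these tags, so production code should treat each field as optional.

```python
from PIL import Image, ExifTags

def _to_decimal(dms, ref):
    """Convert EXIF degrees/minutes/seconds values to signed decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    value = degrees + minutes / 60 + seconds / 3600
    return -value if ref in ("S", "W") else value

def photo_metadata(path):
    """Return camera make/model, capture time and GPS coordinates, when present."""
    raw = Image.open(path)._getexif() or {}
    named = {ExifTags.TAGS.get(tag, tag): value for tag, value in raw.items()}

    meta = {
        "camera": f"{named.get('Make', '')} {named.get('Model', '')}".strip(),
        "taken_at": named.get("DateTimeOriginal"),
    }

    gps = {ExifTags.GPSTAGS.get(tag, tag): value
           for tag, value in (named.get("GPSInfo") or {}).items()}
    if "GPSLatitude" in gps and "GPSLongitude" in gps:
        meta["latitude"] = _to_decimal(gps["GPSLatitude"], gps.get("GPSLatitudeRef", "N"))
        meta["longitude"] = _to_decimal(gps["GPSLongitude"], gps.get("GPSLongitudeRef", "E"))
    return meta

# Example with a hypothetical file name:
# print(photo_metadata("inspection_photo.jpg"))
```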
Data, when simplified and made usable, is incredibly powerful. Hardly any of us leaves the house today without a smartphone packed with valuable data: phone contacts, mapping directions, family photos, etc. In the claims industry, we must likewise surround ourselves with data, innovating and developing in ways that will let claims leaders manage from quantifiable data instead of basing decisions on emotion and misperceptions.

Ernie Bray

Ernie Bray, chairman and CEO of ACD, has more than 20 years of experience in the insurance and automobile claims industry. Bray is a dynamic force in driving innovation and technology to transform the auto claims industry and connect a highly fragmented business sector.

3 Ways to Fix Operations Reports

Operations reports are a paradox: Business users ask for information, but what they need is advice. Here is how to help.

In a prior life, I worked as a business systems analyst for a global hard drive manufacturer. After successfully navigating the Y2K crisis, we found ourselves inundated with custom report requests. We did an analysis and found that our enterprise system had more than 2,000 custom-coded operations reports, only 70 of which had been run in the last 90 days. Of course, the actively used operations reports were the source of endless user complaints and enhancement requests. That's how we knew we had a good report: Complaints signaled actual use. Perhaps you've heard this broken record before; it happens everywhere.

It's not hard to understand how this happens. A business person is trying to make a decision. Do I have enough resources? Are there bottlenecks I need to address? Was the process change I made last month effective? To guide the decision, she needs information, so she asks for a report. In the change request, she identifies data fields and recommends an output format. If the report is done well, it helps her make her decision. But that's not the end of the story. Once the decision is made, the business person needs to make the next decision. Now that I know I need more resources, where should I position them? Last month's process change wasn't effective, so what can I do now? The old report becomes obsolete. The person needs another report (or an enhancement of the one requested). Rinse and repeat 20 times for 100 business users, and you get what we had: roughly 100 active reports and 1,900 inactive ones.

Let's face it, operational reporting is like fighting a land war in Asia. There are no winners; there are only casualties. Although some reporting is unavoidable, there are three things you can do to drive improved business impact:

Get closer to the decision: Business users may request information, but they're looking for advice. Put the effort into understanding the decisions they are trying to make. It will affect how you conceive your solution.

Apply the 20/80 rule: Providing information is an unending task. Put in the 20% of effort that yields 80% of the business value. Then take your savings and…

Invest in innovation: Stop reinforcing outdated paradigms. Columns and rows are food for machines, not humans. Data visualization, advanced analytics, social media and external data sources – the opportunities abound. Save some capacity to pursue them.

Operational reporting is a paradox: Business users sometimes get what they ask for, but they never get what they need. What they ask for is information; what they need is advice. The historical paradigm for reporting is primarily financial: a statement of fact, in a standard format, used by external parties to judge the quality of the company. A financial report has no associated internal decisions – the only purpose of a financial report is to state unadulterated fact. Operational reporting, on the other hand, is fundamentally about advice. A business person needs to make a decision to influence the financial outcomes. The facts are simply a pit stop on the journey toward a decision.


Eugene Lee

Eugene Lee is vice president of data and analytics at Guidewire Software, serving as the business leader for Guidewire’s analytics solutions. He uses internal and external data sources to lead innovation and deliver analytical insight to improve the quality of business decision-making across the property and casualty insurance industry lifecycle.

Best Practices for Predictive Models

With predictive models, many think the sole issue is picking the right technology, but the details of implementation are crucial.


There's little doubt about the proven value of using predictive analytics for risk selection and pricing in P/C insurance. In fact, 56% of the insurers at this year's Valen Analytics Summit that are not currently using predictive analytics in underwriting plan to start within a year. However, many insurers haven't spent enough energy planning exactly how they will implement analytics to get the results they want. It's a common misconception that competitive advantage is won by simply picking the right model.

In reality, the model itself is just a small part of a much larger process that touches nearly every part of the insurance organization. Embracing predictive analytics is like recruiting a star quarterback; alone, he's not enough to guarantee a win. He requires both a solid team and a good playbook to achieve his full potential.

The economic crash of 2008 emphasized the importance of predictive modeling as a means to replace dwindling investment income with underwriting gains. However, insurance companies today are looking at a more diverse and segmented market than pre-2008, which makes the "old way of doing things" no longer applicable. The insurance industry is increasing in complexity, and with so many insurers successfully implementing predictive analytics, greater precision in underwriting is becoming the "new normal." In fact, a recent A.M. Best study shows that P/C insurers are facing more aggressive pricing competition than any other insurance sector.

Additionally, new competitors like Google, which have deep reservoirs of data and an established rapport and trust with the Millennial generation, mean that traditional insurers must react to technologies faster than ever. Implementing predictive analytics is the logical place to start.

The most important first step in predictive modeling is making sure all relevant stakeholders understand the business goals and organizational commitment. The number one cause of failure in predictive modeling initiatives isn't a technical or data problem, but instead a lack of clarity on the business objective combined with a defect in the implementation plan (or lack thereof).


ASSESSMENT OF ORGANIZATIONAL READINESS

If internal conversations are focused solely on the technical details of building and implementing a predictive model, it's important to take a step back and make sure there's support and awareness across the organization.

Senior-Level Commitment - Decide on the metrics that management will use to measure the impact of the model. What problems are you trying to solve, and how will you define success? Common choices include loss ratio improvement, pricing competitiveness and top-line premium growth. Consider the risk appetite for this initiative and the assumptions and sensitivities in your model that could affect projected results.

Organizational Buy-In - What kind of predictive model will work for your culture? Will this be a tool to aid in the underwriting process or part of a system to automate straight-through processing? Consider the level of transparency appropriate for the predictive model. It's usually best to avoid making the model a "black box" if underwriters need to be able to interact with model scores on their way to making the final decisions on a policy.

Data Assets - Does your organization plan to build a predictive model internally, with a consultant or a vendor that builds predictive models on industry-wide data? How will you evaluate the amount of data you need to build a predictive model, and what external data sources do you plan to use in addition to your internal data? Are there resources available on the data side to provide support to the modeling team?

MODEL IMPLEMENTATION

After getting buy-in from around the organization, the next step is to lay out how you intend to achieve your business goals. If it can be measured, it can be managed. This step is necessary to gauge the success or failure post-implementation. Once you've set the goals for assessment, business and IT executives should convene and detail a plan for implementation, including responsibilities and a rollout timeline.

Unless you're lucky enough to work with an entire group of like-minded individuals, this step must be taken with all players involved, including underwriting, actuarial, training and executive roles. Once you've identified the business case and produced the model and implementation plan, make sure all expected results are matched up with the planned deliverables. Once everything is up and running, it is imperative to monitor the adoption in real time to ensure that the results are matching the initial model goals put in place.

UNDERWRITING TRAINING

A very important but often overlooked step is making sure that underwriters understand why the model is being implemented, what the desired outcomes are and what their role is in implementing it. If the information is presented correctly, underwriters understand that predictive modeling is a tool that can improve their pricing and risk selection rather than undermine them. But some underwriters who rely solely on their own experience and knowledge may feel threatened by a data-driven underwriting process. In fact, nearly half of the attending carriers at the 2015 Valen Summit cited lack of underwriting adoption as one of the primary risks in a predictive analytics initiative.

Insurers that have found the most success with predictive modeling are those that create a specific set of underwriting rules and showcase how predictive analytics are another tool to enhance their performance, rather than something that will replace them entirely. Not stressing this point can result in resistance from underwriters, and it is essential to have their buy-in. At the same time, it is also important to monitor the implementation of underwriting guidelines, ensuring that they are being followed appropriately.

KEEPING THE END IN MIND

Many of the challenges and complexities in the P/C marketplace are out of an individual insurer's control. One of the few things insurers can control is their use of predictive modeling to know what they insure. It's one of the best ways an insurer is able to protect its business from new competitors and maintain consistent profit margins.

Using data and analytics to evaluate your options allows you to test and learn, select the best approach and deliver results that make the greatest strategic impact.

While beginning a predictive analytics journey can be difficult and confusing at first glance, following these best practices will increase your chances of getting it right on the first try and ensuring your business goals will be met.


Bret Shroyer

Bret Shroyer is the solutions architect at Valen Analytics, a provider of proprietary data, analytics and predictive modeling to help all insurance carriers manage and drive underwriting profitability. Bret identifies practical solutions for client success and opportunities to bring tangible benefits from technical modeling.

Stop Overpaying for Pharmaceuticals

Studies show huge overpayments for pharmaceuticals, but claims administrators have simple ways to end the problem.

Legislators in all jurisdictions have attempted to rein in the cost of pharmaceuticals in workers' compensation in an effort to reduce insured employers' workers' compensation premiums.

California, in particular, passed legislation between 2002 and 2007 to reduce pharmaceutical costs, yet expected reductions have not been forthcoming. Attention needs to focus on whether claims administrators have taken full advantage of this legislation and whether they could be doing more to help reduce the cost of pharmaceuticals.

A recent Workers Compensation Research Institute (WCRI) study titled "Are Physician Dispensing Reforms Sustainable?" found that the average price paid in California for 5mg and 10mg Cyclobenzaprine, a muscle relaxant, ranged from $0.35 to $0.70 per tablet (from the first quarter of 2010 through the first quarter of 2013). An independent study of Medi-Cal pharmaceutical prices used for California Workers' Compensation found, however, that since 2009, 10mg Cyclobenzaprine has been priced at $0.10 per tablet and as low as $0.05, while 5mg Cyclobenzaprine has been priced at $0.16 per tablet and has also been as low as $0.05. The comparison suggests that claims administrators have overpaid.

The 2006 California Commission on Health and Safety and Workers' Compensation (CHSWC) study titled "Impact of Physician-Dispensing of Repackaged Drugs on California Workers' Compensation, Employers Cost, and Workers' Access to Quality Care" also showed significant cost differences. For example, an insured employer's estimated total cost for each tablet dispensed at the correct Medi-Cal price of $0.10 was $0.29 per tablet. For each tablet dispensed at a price of $0.35, estimated total costs increased by between $0.70 and $0.99. When dispensed at $0.70 per tablet, estimated total costs increased by between $1.69 and $1.98 per tablet. This significant increase is directly caused by claims administrators paying far more than the published Medi-Cal price.

What can claims administrators do to ensure they do not overpay for medications?

First: Monitor medications dispensed. Second: Ensure that no more than the legislated maximum price is paid.

The California Department of Industrial Relations (DIR) website provides a medication pricing inquiry screen that requires entry of a National Drug Code (NDC) and a few other details; obtaining the price of a medication on the date it was dispensed takes approximately 10 seconds. In addition, current pharmaceutical pricing data is available that can be loaded into a claims administrator's computer system or program, such as a spreadsheet. To complement the DIR's offerings, the U.S. Food and Drug Administration (FDA) website also provides NDC inquiry and download facilities, plus a downloadable file of medication suppliers showing their labeler code(s) along with their company name. The labeler code is the first of the three parts of the NDC and identifies the supplier of the medication. For claims administrators wanting to know more about medications, the FDA offers the "Orange Book" for download, listing all FDA-approved medications along with therapeutic equivalence evaluations. With all this free information, California workers' compensation claims administrators have no excuse for overpaying.
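As a rough illustration of how downloaded pricing data could be put to work inside a claims system, the sketch below (in Python) loads a fee schedule into memory and flags billed amounts above the scheduled maximum. The file layout, column names and hyphenated NDC format are assumptions made for the example and would need to be matched to the actual DIR or FDA downloads; a real implementation would also look up the price in effect on the dispense date.

```python
import csv

def load_fee_schedule(path):
    """Map NDC -> maximum allowed price per unit from a downloaded CSV.
    The column names ('ndc', 'max_unit_price') are assumptions for this sketch."""
    with open(path, newline="") as f:
        return {row["ndc"]: float(row["max_unit_price"]) for row in csv.DictReader(f)}

def labeler_code(ndc):
    """The labeler (supplier) code is the first segment of a hyphenated NDC."""
    return ndc.split("-")[0]

def check_line_item(fee_schedule, ndc, quantity, billed_amount):
    """Compare a billed pharmacy line against the scheduled maximum."""
    max_unit_price = fee_schedule.get(ndc)
    if max_unit_price is None:
        return f"NDC {ndc} not on schedule; review manually"
    overpayment = billed_amount - max_unit_price * quantity
    if overpayment > 0:
        return f"Overpaid by ${overpayment:.2f} (labeler {labeler_code(ndc)})"
    return "Within schedule"
```

In practice, a check like this would run inside the existing medical bill review workflow described above, adding pharmacy price verification to a process already in place.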

For jurisdictions that utilize the average wholesale price (AWP) to set their maximum price for a medication, claims administrators will need to license pricing information from sources such as Medi-Span (Wolters Kluwer Health) or Red Book (Truven Health Analytics). Both offer extensive pharmaceutical information for download into a claims administrator's computer system or, alternatively, use of the vendor's inquiry facilities.

The passing of legislation in California that set the same prices for medications regardless of dispenser (i.e. pharmacy, mail order/PBM or physician) has provided opportunities for medications to be dispensed by a physician without paying a higher price and for more accurate and timely details relating to medications being available to claims administrators.

The invoice a physician submits (either paper or electronic) includes services rendered at the person's medical appointment, with a report outlining the person's current medical conditions and other pertinent information, including the date of the next medical appointment. Receiving billing details on the same invoice for medications dispensed, which would include the NDCs, quantities dispensed and prices charged, provides the claims administrator with an excellent opportunity to review the appropriateness of the medication against the diagnosis and treatment plan as well as the prices charged, all in one step. In addition, there is the opportunity to review any physician treatments that differ from the norm (i.e. guidelines), which may be necessary so as not to interfere with any non-work-related treatments under the control of the person's own physicians.

In cases of pain management and where step-therapy is used, the claims administrator can ensure that physician-dispensed medication quantities are limited to the next medical appointment and assist in determining when the person may be able to either return to work or stay at work during their recovery. In many cases, acute pain is treated with acetaminophen (aka paracetamol) and nonsteroidal anti-inflammatories (NSAIDs), allowing a person to either stay at work or return to work earlier. At times, however, narcotic analgesics may be required to control pain; these block pain receptors in the brain, slowing the person's cognitive function and reaction times and possibly restricting their ability to either stay at work or return to work early.

Claims administrators also have the opportunity to monitor a physician's pharmacy formulary to ensure they are dispensing medications from suppliers with the lowest or the average lowest price for a medication. Claims administrators should never have to pay the "no substitution" price for a physician-dispensed medication. For some medications, the Medi-Cal "no substitution" price can be much higher than the regular price.

Considering that claims administrators currently perform some form of medical bill review, to include pharmacy price and utilization verification would add minimal additional effort to the overall medical bill payment process, regardless of whether the physician's invoice is received on paper or electronically.

Claims administrators with computer systems that monitor medications through the NDC have the opportunity through physician dispensing to invoke timely automated processes based on the NDCs shown on the physician's invoice. For example, if claims administrators use an adaptation of the biopsychosocial and shared-decision making frameworks (i.e. collaboration) to address a stay at work (SAW) or early return to work (ERTW), a more empathetic approach to claims handling is required. This SAW/ERTW approach can be enhanced through invoking processes based on the physician's submitted NDCs, which may include: a pre-defined questionnaire associated with distress and risk, focusing on somatic and emotional symptoms; a pre-existing anti-depressant medications questionnaire that establishes whether the person is already taking anti-depressants; and a cultural sensitivity questionnaire relating to a person's religious or spiritual beliefs and their cultural and language preferences. The results from these questionnaires can directly influence the medical treatment pre-authorized by the claims administrator as well as assist in determining when the person is likely to return to "normality." All this information directly influences the cost of the claim, which in turn determines the future premiums paid by the insured employer. For claims administrators who do not have capabilities such as these in their computer systems, there are systems available.

Having physician-dispensed medications billed in a timely way on the same invoice as other medical services improves both transparency and accountability. This recent WCRI study has highlighted that insured employers in California may have paid higher premiums for policy periods from 2011 through 2014, caused by claims administrators overpaying for the 5mg and 10mg Cyclobenzaprine medications, which was only brought to the attention of the workers' compensation community in 2015.

Considering that expected savings from the enacted California legislation relating to pharmaceuticals have not been forthcoming, it is only a matter of time before insured employers conduct their own studies investigating how much has been overpaid for dispensed medications and how much this overpayment may have increased their premiums since 2007. Depending on the findings from this type of study, one possible outcome is that California workers’ compensation insurers could be forced to restate their claims costs associated with pharmaceuticals, with all pharmaceutical overpayments by their claims administrators treated as an expense outside of their workers’ compensation insurance portfolios.


John Bobik

John Bobik has actively participated in establishing disability insurance operations during an insurance career spanning 35 years, with emphasis on workers' compensation in the U.S., Argentina, Hong Kong, Australia and New Zealand.

Reinventing Life Insurance

Reinventing life insurance is necessary to get the industry out of doldrums that have lasted for decades. A proven method is the "LITE" model.

Many life insurance executives with whom we have spoken say that their business needs to fundamentally change to be relevant in today’s market. Life insurance does face formidable challenges. First, let’s take a hard look at some statistics. In 1950, there were approximately 23 million life policies in the U.S., covering a population of 156 million. In 2010, there were approximately 29 million policies covering a population of 311 million. The percentage of families owning life insurance assets has decreased from more than a third in 1992 to less than a quarter in 2007. By contrast, while less than  a third of the population owned mutual funds in 1990, more than two-fifths (or 51 million households and 88 million investors) did by 2009. A number of socio-demographic, behavioral economic, competitive and technological changes explain the trends -- and the need for reinventing life insurance:
  • Changing demography: Around 12% of men and an equal number of women were between the ages of 25 and 40 in 1950. However, only 10% of males and 9.9% of females were in that age cohort in 2010, and the percentage is set to drop to 9.6% and 9.1%, respectively, by 2050. This hurts life insurance in two main ways. First, the segment of the overall population that is in the typical age bracket for purchasing life insurance decreases. Second, as people see their parents and grandparents live longer, they tend to de-value the death benefits associated with life insurance.
  • Increasingly complex products: The life insurance industry initially offered simple products with easily understood death benefits. Over the past 30 years, the advent of universal and variable universal life, the proliferation of various riders to existing products and new types of annuities that highlight living benefits significantly increased product diversity but often have been difficult for customers to understand. Moreover, in the wake of the financial crisis, some complex products had both surprising and unwelcome effects on insurers themselves.
  • Individual decision-making takes the place of institutional decision-making: From the 1930s to the 1980s, the government and employers were providing many people life insurance, disability coverage and pensions. However, since then, individuals increasingly have had to make protection/investment decisions on their own. Unfortunately for insurers, many people have eschewed life insurance and spent their money elsewhere. If they have elected to invest, they often have chosen mutual funds, which often featured high returns from the mid-1980s to early 2000s.
  • Growth of intermediated distribution: The above factors and the need to explain complex new products led to the growth of intermediated distribution. Many insurers now distribute their products through independent brokers, captive agents, broker-dealers, bank channels and aggregators and also directly. It is expensive and difficult to effectively recruit, train and retain such a diffuse workforce, which has led to problems catering to existing customers.
  • Increasingly unfavorable distribution economics: Insurance agents are paid front-loaded commissions, some of which can be as high as the entire first-year premiums, with a small recurring percentage of the premium thereafter. Moreover, each layer adds a percentage commission to the premiums. All of this increases costs for both insurers and consumers. In contrast, mutual fund management fees are only 0.25% for passive funds and 1% to 2% for actively managed funds. In addition, while it is difficult to compare insurance agency fees, it is relatively easy to do so with mutual fund management fees.
  • New and changing customer preferences and expectations: Unlike their more patient forebears, Gens X and Y – who have increasing economic clout – demand simple products, transparent pricing and relationships, quick delivery and the convenience of dealing with insurers when and where they want. Insurers have been slower than other financial service providers in recognizing and reacting to this need.
A vicious cycle has begun (see graphic below). Insurers claim that, in large part because of product complexity, life insurance is “sold and not bought,” which justifies expensive, intermediated distribution. For many customers, product complexity, the need to deal with an agent, the lack of perceived need for death benefits and cost-of-living benefits make life products unappealing. In contrast, the mutual fund industry has grown tremendously by exploiting a more virtuous cycle: It offers many fairly simple products that often are available for direct purchase at a nominal fee.

[Graphic: the vicious cycle vs. the virtuous cycle]

Reasons for optimism

Despite the bleak picture we have painted so far, we believe that reinventing life insurance and redesigning its business model are possible. This will require fundamental rethinking of value propositions, product design, distribution and delivery mechanisms and economics. Some of the most prescient insurers are already doing this and focusing on the following to become more attractive to consumers:
  • From living benefits to well-being benefits: There is no incentive built into life policy calculations for better living habits because there traditionally has been very little data for determining the correlation between these behaviors and life expectancy. However, the advent of wearable devices, real-time monitoring of exercise and activity levels and advances in medical sciences have resulted in a large body of behavioral data and some preliminary results. There are now websites that can help people determine their medical age based on their physical, psychological and physiological behaviors and conditions. We refer to all these factors collectively as “well-being behaviors.” Using the notion of a medical age or similar test as part of the life underwriting process, insurers can create an explicit link between “well-being behaviors” and expected mortality. This link can fundamentally alter the relevance and utility of life insurance by helping policyholders live longer and more healthily and by helping insurers understand and price risk better.
  • From death benefits to quality of life: Well-being benefits promise to create a more meaningful connection between insurers and policyholders. Rather than just offering benefits when a policyholder dies, insurers can play a more active role in changing policyholder behaviors to delay or help prevent the onset of certain health conditions, promote a better quality of life and even extend insureds’ life spans. This would give insurers the opportunity to engage with policyholders on a daily (or even more frequent) basis to collect behavioral data on their behalf and educate them on more healthy behaviors and lifestyle changes. To encourage sharing of such personal information, insurers could provide policyholders financial (e.g., lower premiums) and non-financial (e.g., health) benefits.
  • From limited to broad appeal: Life insurance purchases are increasingly limited to the risk-averse, young couples and families with children. Well-being benefits are likely to appeal to additional, typically affluent segments that tend to focus on staying fit and healthy, including both younger and active older customers. For a sector that has had significant challenges attracting young, single, healthy individuals, this represents a great opportunity to expand the life market, as well as attract older customers who may think it is too late to purchase life products.
  • From long-term to short-term renewable contracts: Typical life insurance contracts are for the long term. However, this is a deterrent to most customers today. Moreover, behavioral economics shows us that individuals are not particularly good at making long-term saving decisions, especially when there may be a high cost (i.e., surrender charges) to recover from a mistake. Therefore, individuals tend to delay purchasing or rationalize not having life insurance at all. With well-being benefits, contract durations can be much shorter -- even only one year.
  • Toward a disintermediated direct model: Prevailing industry sentiment is that “life insurance is sold, not bought,” and by advisers who can educate and advise customers on complex products. However, well-being benefits offer a value proposition that customers can easily understand (e.g., consuming X calories per day and exercising Y hours a day can lead to a decrease in medical age by Z months), as well as much shorter contract durations. Because of their transparency, these products can be sold to the consumer without intermediaries. More health-conscious segments (e.g., the young, professional and wealthy) also are likely to be more technologically savvy and hence prefer direct online/call center distribution. Over time, this model could bring down distribution costs because there will be fewer commissions for intermediaries and fixed costs that can be amortized over a large group of early adopters.
We realize that life insurers tend to be very conservative and skeptical about wholesale re-engineering. They often demand proof that new value propositions can be successful over the long term. However, there are markets in which life insurers have successfully deployed the well-being value proposition and have consistently demonstrated superior performance over the past decade. Moreover, there are clear similarities to what has happened in the U.S. auto insurance market over the last 20 years. Auto insurance has progressively moved from a face-to-face, agency-driven sale to a real-time, telematics-supported, transparent and direct or multi-channel distribution model. As a result, price transparency has increased, products are more standardized, customer switching has increased and real-time information is increasingly informing product pricing and servicing.

Implications

Significantly changing products and redesigning a long-established business model is no easy task. The company will have to redefine its value proposition, target individuals through different messages and channels, simplify product design, re-engineer distribution and product economics, change the underwriting process to take into account real-time sensor information and make the intake and policy administration process more straight-through and real-time. So, where should life insurers start? We propose a four-step “LITE” (Learn-Insight-Test-Enhance) approach:
  • Learn your target segments’ needs. Life insurers should partner with health insurers, wellness companies and manufacturers of wearable sensors to collect data and understand the exercise and dietary behaviors of different customer segments. Some leading health and life insurers have started doing this with group plans, where employers have an incentive to encourage healthy lifestyles among their employees and therefore reduce claims and premiums.
  • Build the models that can provide insight. Building simulation models of exercise and dietary behavior and their impact on medical age is critical. Collecting data from sensors to calibrate these models and ascertain their efficacy will help insurers determine appropriate underwriting factors (a toy sketch of such a model follows this list).
  • Test initial hypotheses with behavioral pilots. Building and calibrating simulation models will provide insights into the behavioral interventions that need field testing. Running pilots with target individuals or specific employer groups in a group plan will help test concepts and refine the value proposition.
  • Enhance and roll out the new value proposition. Based on the results of pilot programs, insurers can refine and enhance the value proposition for specific segments. Then, redesign of the marketing, distribution, product design, new business, operations and servicing can occur with these changes in mind.
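To make the "Insight" step a little more tangible, here is a deliberately toy sketch of the kind of model an insurer might calibrate: it nudges a "medical age" up or down from chronological age based on sensor-derived activity and sleep. Every baseline and coefficient below is invented purely for illustration; real underwriting factors would come from calibrating against actual behavioral and mortality data, as described above.

```python
def estimated_medical_age(chronological_age, avg_daily_steps, avg_sleep_hours):
    """Toy illustration only: all baselines and coefficients are invented."""
    adjustment = 0.0
    # More active than a 7,000-step baseline -> "younger"; less active -> "older".
    adjustment -= (avg_daily_steps - 7000) / 3500 * 0.5
    # Sleeping within a 7-9 hour band counts in the policyholder's favor.
    adjustment += -0.5 if 7 <= avg_sleep_hours <= 9 else 0.5
    return round(chronological_age + adjustment, 1)

# A 40-year-old averaging 10,500 steps and 8 hours of sleep:
# estimated_medical_age(40, 10_500, 8) -> 39.0
```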

Jamie Yoder

Jamie Yoder is president and general manager, North America, for Sapiens.

Previously, he was president of Snapsheet. Before Snapsheet, he led the insurance advisory practice at PwC.


Anand Rao

Anand Rao is a principal in PwC’s advisory practice. He leads the insurance analytics practice, is the innovation lead for the U.S. firm’s analytics group and is the co-lead for the Global Project Blue, Future of Insurance research. Before joining PwC, Rao was with Mitchell Madison Group in London.

Getting to 2020: The Finance Function

An EY survey finds the finance function trying to become a much better business partner -- but lacking the right data and analytics.

Even as economies recover, the insurance sector continues to face many competitive pressures and regulatory challenges. Yet a new drive for growth is emerging. The 2014 EY Global Insurance CFO Survey captures the priorities and challenges for finance and actuarial teams as they seek to support business growth strategies while addressing regulatory and cost pressures. Delivering more value to the business through performance measurement and improved decision support is the top priority for the finance function through 2020. Among senior finance professionals participating in the survey, 71% indicated that “being a better business partner” ranked among their top three priorities, with 35% placing this as number one.
As insurance companies around the world continue to invest in data management and analytics capabilities, the role of finance and actuarial functions has become even more critical. The processes and systems supporting these functions are key to developing deep insights into business performance, as well as customer needs, preferences and behavior. In response, finance leaders have been increasing their efforts to improve the capabilities of their organizations to meet the new demands. In the survey, 89% of respondents stated that they have either begun a change program or are in the planning stage.
However, the drive to better insights is not without challenges. Among the issues is the impact of continuing regulatory compliance demands. According to 35% of those surveyed, implementing new regulatory and financial reporting requirements was the highest priority for finance and actuarial organizations; 56% ranked this among their top three. As a result, the ability for these organizations to strike a balance between delivering value to the business and meeting daily operational demands will continue to be a challenge. Not surprisingly, the current data and technology footprint will require significant change to meet the challenges of the finance function of the future. Across the finance operating model, survey participants scored data as the least developed capability on average, while technology recorded the greatest gap between current and required future state.
Other Key Findings
  • Top three business drivers: #1 growth, #2 managing costs and #3 regulatory changes
  • Two-thirds of respondents rank data and technology issues among the top three challenges facing finance and actuarial functions; participants on average score data as their least developed capability
  • By 2020, the most significant shifts in maturity levels by operating model will be in data management and technology capabilities
  • Respondents expect onshore shared services to support transaction processing functions, with outsourcing selectively used for payroll and internal audits
  • Decision support and controls are expected to account for a larger share of finance and actuarial headcount by 2020
What Insurers Must Do

We see three key areas where insurers can take action:
  • Modify current reporting processes by developing an efficient reporting solution architecture.
  • Deliver timely and relevant management information and link strategic objectives to performance indicators.
  • Improve finance and actuarial operational performance by using the right skills and processes to strike a balance between effectiveness and efficiency.
For the full survey from which this excerpt was taken, click here.

David Foster

David Foster has more than 30 years' experience working in the financial services industry, in insurance and, for the last 10 years, in a management consulting/advisory capacity. Foster leads Ernst & Young’s finance change agenda for insurers. Foster is also a member of Ernst & Young’s insurance industry board and Global Solvency II task force.

How Strict Can a Dress Code Be?

Enforcing a dress code can infringe on the rights of those with sincere religious beliefs. Here are tips on how to avoid the problem.

Does your company have a "look" or standard of dress it requires in the workplace? No hats, or maybe no beards? Can you deviate from the dress code?

Increasingly, employees and applicants for employment are making "failure to accommodate" claims on the grounds that they were discriminated against based on their need for a change or exception to a workplace grooming or dress policy. Examples of religious discrimination or failure to accommodate can include: not hiring the applicant because she doesn't fit the company's "look" or placing an employee in a non-customer-facing position because of religious attire or grooming (e.g., long beard, piercings, head scarf).

The law

Title VII of the Civil Rights Act of 1964 ("Title VII"), 42 U.S.C. § 2000e, et. seq., as amended, prohibits employers with at least 15 employees from discriminating in employment hiring, recruitment, promotion, benefits, training, job duties, termination or any other aspect of employment on the basis of religion. It also prohibits retaliation for complaining of religious discrimination or for participating in the investigation of such claims, and for denying reasonable accommodations, including accommodations for religious attire or grooming standards. It is the EEOC's position that an employer is required to reasonably accommodate an employee's religious beliefs or practices, unless doing so would cause more than a minimal burden on the operations of the employer's business.

Title VII provides protection only for sincerely held religious beliefs and practices, including those relating to dress and grooming. These protections are broadly interpreted and cover not only traditional religious beliefs but also those that are new and uncommon. If an employee makes a request for accommodation based on personal preference rather than religious belief, there are no Title VII protections or implications. However, the requirement that employers and their management learn to distinguish between these two types of requests can be daunting and dangerous in light of the litigious society we live in.

Recent case

In February 2015, the United States Supreme Court heard arguments in a case filed against Abercrombie & Fitch, where a Muslim applicant was rejected after wearing a head scarf (known as a hijab) to an interview, based on the hiring manager's belief that such covering violated the company's rigid "look" policy, which forbids caps and hats. The applicant never asked for an accommodation, and the employer never opened a dialog as to whether a reasonable accommodation to the dress code would be necessary. Once a ruling is issued, we hope the Supreme Court will provide guidance as to when an employer has an obligation to open a dialog about religious accommodation without the employee or applicant making such a request.

Takeaway

To ensure compliance with the law, employers must be informed and vigilant when applying workplace uniform, "look" or grooming policies, particularly as they apply to employees or applicants in need of a religious accommodation. Management or hiring decision makers should be trained on how to implement religious accommodation requests, specifically, learning to identify and understand religious clothing accommodation requests and how to properly engage in such discussion. When in doubt as to the proper handling of a religious clothing accommodation, we suggest that you contact a labor and employment lawyer before making employment decisions. Your attorney can also help identify potential pitfalls in uniform, look or other clothing policies. Further, a well-designed employment practices liability (EPL) insurance policy should be purchased to mitigate potentially costly financial damage, should you be faced with a discrimination suit based on religious dress or grooming.


Laura Zaroski

Laura Zaroski is the vice president of management and employment practices liability at Socius Insurance Services. As an attorney with expertise in employment practices liability insurance, in addition to her role as a producer, Zaroski acts as a resource with respect to Socius' employment practices liability book of business.

'War for Talent' Is Not Necessary

The "war for talent" focuses on the wrong issue. Instead, employers should retain existing talent by fixing morale-sapping processes.

Everyone is talking about the war for talent. Must there be a war? Or can your organization accomplish what it wants to (needs to) by fighting the battle for retention quietly, within its own borders?

There's only so much a company can manufacture with its brand to catch the attention of top talent. Smart candidates understand that the truth about you is displayed by your existing employment base. Your employment brand is much less the result of intentional messages created for an external audience than it is the result of the vibe your existing employee base creates in the marketplace. Social media has increased the volume of this voice exponentially.

To retain the talent you have and to make that social voice work for you, you should:

  1. Make sure existing talent knows it's being treated fairly in terms of pay and recognition.
  2. Make sure each employee leaves work for the day, week, month with a sense of achievement.
  3. Make sure your processes continue to improve. Smart people don't like working with dumb processes.
  4. Make sure each employee has at least a small sense of camaraderie.
  5. Relate all work to the customer experience. Most employees would rather work for customers than bosses!

Brilliant people or brilliant processes?

In some cases, employers believe they need brilliant employees because they are the only ones who have been proven to find their way through the maze of terrible processes. Often, employers don't even realize this is the reason they're looking for talent. A well-known Japanese company's leader once said, "In America, you have brilliant people working with average processes; in Japan, we have average people working with brilliant processes." Something to think about.

Let's face it. There are only so many brilliant people to go around. Most of us are closer to average. Given this, is it really smart to fight a war for talent? Or would it be smarter to work on processes -- remove waste (things customers wouldn't pay for), minimize the things customers do have to pay for but would rather not (things normally required by law) and spend time working on processes that create value? It's about the customer.

The focus of everyone's work must be on creating value for customers. The message should be, "Shoot for referrals, settle for retention." The entire workforce should be motivated by this. Common purpose builds workplaces worth working for.

Unfortunately, it has been my experience that it is easier to convince a janitor of this than it is most executives. Executives make a lot of money. There's a lot of temptation to protect their domains, their technical areas of work, their lines of business, their territories, etc. What these executives need to understand is that customers flow horizontally through the spectrum of work performed by each executive's area of influence. The thicker the borders between those areas of influence, the harder it is for employees who want to satisfy customers to get their jobs done. These employees, especially the smart ones, eventually leave the organization. They definitely don't recommend their own workplace to people they care about.

What if you actually won the war?

So before you begin to fight a war for talent, my recommendation is to think internally. If you did win the war for talented people, what kind of environment would they be working in? What kinds of processes would they be forced to work with? Are your best employees already recommending others to work at your company? If not, why not?

Don't just sit there, ask them!