
8 Steps to Beat All 8 CPCU Exams

CPCU is usually done as a self-study program and can be hard for those who haven't done something like it, but eight strategies can get you through.


Now that we've gotten you excited based on earlier articles such as this one, and you're ready to start CPCU today, here's some guidance on how to actually get it done and survive the tests. This article is lovingly dedicated to "those poor souls studying for the CPCU designation."

Please keep in mind that doing CPCU is very much like trying to eat an elephant; there's only one way to do it: one bite at a time.

I asked my friend and all-around Wonder Woman, Carly Burnham, to share the strategies she used in completing the designation. I met Carly in 2011 when she was at a turning point in her career. She felt stuck in her position as a call center sales agent and wasn't sure of the next step. She wasn't even sure whether insurance was an industry she could make a career in. She had an interest in underwriting but had no idea how to get there. We met through the Gen Y Associate Resource Group at Nationwide Insurance.


I could clearly see she was bright and hard-working and was looking for a challenge, so I asked her if she had heard about the CPCU. Over coffee, I told her all about why CPCU is awesome and convinced her to go for it. To make things even more interesting, I challenged her to do it in a year, while working full time and finishing a part-time MBA program. To my surprise, she took me up on it. Even more impressively, she met the goal and finished all eight tests in just short of 12 months.

When I talked to Carly about this article, she shared the following thought with me, "The CPCU is usually done as a self-study program, and if you haven’t tackled online courses or some other self study program, it can be challenging to know where to start. I was lucky to have your mentorship, and, looking back, I'd say these eight strategies were really what helped me meet the audacious goal that we set."

  1. Set Your Own Timetable

Decide up front when you are going to finish your CPCU. If you don't choose an end date, you could stretch the entire process out for YEARS. On average, people take at least two years to finish, but many insurance professionals have been working on their CPCU for longer than that. Decide when you want to be done and commit to the deadline. If you are trying to finish to advance your career, focus on finishing before you begin to apply for new roles. If you want to finish in time to attend the annual meeting in a certain city, set your end date as the last month that you can qualify for that meeting. Having an end date and an understanding of your motivation will help you push through challenges along the way.


  2. Find an Accountability Partner

Your accountability partner may be a current CPCU or someone who is also pursuing the designation. He or she should be someone with whom you can share the reason for your pursuit of the CPCU. If he or she understands your motivation, it will be easier to push you to stay the course and finish by your goal date.


  3. Create a Spreadsheet on Google Drive to Share With Your Accountability Partner

On this spreadsheet, you will want to map out the dates that you will take each exam to achieve your goal date. Once you have mapped out exam dates, you can work backward using the chapter summaries at theinstitutes.org to identify when you will read each chapter of the text for the exam and when you will take your practice exams.
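If you'd rather script the plan than hand-build it, the same backward mapping can be sketched in a few lines. This is a hypothetical illustration (the pacing, chapter names and function are my own, not The Institutes'):

```python
from datetime import date, timedelta

def study_schedule(exam_date, chapters, days_per_chapter=4, review_days=7):
    """Work backward from an exam date: one reading slot per chapter,
    leaving a review window for practice exams before test day."""
    plan = []
    start = exam_date - timedelta(days=review_days + days_per_chapter * len(chapters))
    for i, chapter in enumerate(chapters):
        plan.append((start + timedelta(days=i * days_per_chapter), chapter))
    plan.append((exam_date - timedelta(days=review_days), "Begin practice exams"))
    return plan

# Example: a hypothetical exam with three chapters
for day, task in study_schedule(date(2025, 6, 30), ["Ch. 1", "Ch. 2", "Ch. 3"]):
    print(day.isoformat(), task)
```

Drop the resulting dates into the shared spreadsheet, and adjust the pacing to match your own reading speed.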

  4. Devote Certain Hours of Your Day to Studying

When studying, consistency is key. If you focus best at the beginning of the day, set aside an hour or two in the morning and commit to showing up at the same place each day to read the chapters that you laid out in your spreadsheet for that day. Choose the time that works best for you, but aim to make it a routine so that you don't have to decide every day whether you are going to stay at the office an extra hour or go to the coffee shop before work starts. If it's part of your daily rituals, you won't have to use willpower to get your studying done.

  5. Read the Entire Book

First, read The Institutes' guide to preparing for their exams. As they mentioned, there is no single way to prepare. But I found that reading the entire book first helped me establish a base level of knowledge. Next, I would take a practice exam, as a sort of pre-test. The practice exam would let me know which chapters I was weak on. With this information, I could pinpoint the best way to spend my time. If I needed to, I could re-read chapters and test on those individual chapters until I felt comfortable moving on to the next chapter.

  6. Use the Mobile App

The Institutes have created a mobile app called Smart QuizMe for Apple and Android phones. Using this in any spare time you have will also help you feel confident with the information and the style of questions on the practice exams. You can set the app to run through certain chapters or the whole book depending on what you want to focus on. Because it's on your phone, you can use it even if you only have five or 10 free minutes. The questions on the app tend to be clustered, so question 100, 101, 102 and 103 might be the same question with only one word changed. This really teaches you how changing a small part of a question can result in a different answer. The app is particularly helpful for the most detail-oriented tests, especially 520. One word of warning: Don't depend entirely on the app without doing the online practice exams; you could easily fool yourself into thinking you're ready when there are significant parts you haven't yet mastered.


  7. Pass the Practice Exams a Few Times

Leave at least four and preferably a full seven days before the real test to take the online practice exams. Passing them will give you the confidence you need to take the exam without feeling rushed or unsure of your answers. The practice exams are very similar to, and sometimes harder than, the actual exams. You will also have the opportunity to research any questions you missed and make sure you understand the concept before test day. Nothing beats going into the real test feeling confident, and nothing gets you more confident than the online practice exams. The practice exams are the key to the kingdom!

  8. Get the Proper Support

Make sure your family, close friends and other support systems fully understand that the CPCU is a BIG DEAL and that you will require lots of support while you get through it. Make sure they know this isn't just another license or minor designation but a serious commitment that only 4% of people in our industry have gotten through.

To help my family understand, I explained that I was pursuing something akin to a master's degree in insurance, and I was doing it in a year, while working 40 hours a week -- most people outside the industry will need the designation explained in a similar way to fully understand the commitment you've made. Also, join the CPCU Candidates Facebook Group; they'll provide you with tons of encouragement and answer your questions. Most importantly, you won't feel like you're the only person in the world putting yourself through the challenge of CPCU.

One Bonus Tip:

Know ahead of time that 540 - Finance and Accounting for Insurance Professionals is a special beast of a test. To ensure proper preparation for this one, allow yourself 50% more time than usual; so if you have given yourself two months each for 500, 520 and 530, give yourself three months for 540. Buy a financial calculator (preferably the Texas Instruments BA-II Plus) and learn how to use it. The book won't teach you how to use it, so you'll have to get help from someone who does - if you have a hard time finding someone, there are decent tutorials on YouTube and at Atomic Learning. Use the calculator for all the practice tests, and don't forget to bring it on exam day!
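To give a flavor of what the calculator practice is for: much of 540 comes down to time-value-of-money arithmetic of the kind the BA-II Plus automates. Here is a minimal sketch, with illustrative numbers that are not from any exam:

```python
def future_value(pv, rate, periods):
    """Compound a present value forward: FV = PV * (1 + r)^n."""
    return pv * (1 + rate) ** periods

def present_value(fv, rate, periods):
    """Discount a future value back: PV = FV / (1 + r)^n."""
    return fv / (1 + rate) ** periods

# $1,000 invested at 5% a year for 10 years
print(round(future_value(1000, 0.05, 10), 2))  # 1628.89
```

On the calculator, the same computation is a few keystrokes in the time-value-of-money worksheet, and that fluency is exactly what the practice tests reward.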

I am passionate about spreading the word about the CPCU, and I was glad to have met Carly at that turning point in her career. Her commitment has paid off: she recently became a commercial lines underwriter at Erie Insurance; she's loving the new job, and she's fully committed to the industry. She credits her designation with helping her get the interview but says it goes even further than that: "The knowledge that I gained in earning my CPCU gave me the confidence to pursue a true career in the industry, and I now use the knowledge every day in my role as an underwriter. This designation gives you a broad understanding of the industry, but it also gives you practical, technical information that is essential to being a successful insurance professional."

If you've had similar experiences, share them in the comments. If you have questions about the pursuit of your CPCU, message me. There are really no excuses left. Let's get going and get your CPCU. You will never regret it.



Tony Canas

Tony Canas is a young insurance nerd, blogger and speaker. Canas has been very involved in the industry's effort to recruit and retain Millennials and has hosted his session, "Recruiting and Retaining Millennials," at both the 2014 CPCU Society Leadership Conference in Phoenix and the 2014 Annual Meeting in Anaheim.

Data Security Critical as IoT Multiplies

Manufacturers are rushing products to market to connect to the Internet of Things (IoT) but are paying very little attention to data security.

When this century commenced, delivering new technology as quickly as possible, with scant concerns about quality, became standard practice. Consumers snookered into buying version 1.0 of anything were essentially quality-control testers.

How soon we forget. As we enter the age of the Internet of Things, companies are pushing out computing devices optimized to connect to the Web with little thought to security implications.


ESET security researcher Cameron Camp has been paying close attention to data security. He recently sat down with ThirdCertainty to share his observations (answers edited for clarity and length):

3C: New devices with the capacity to link to the Internet seem to hit the market every day, and eager early adopters snatch them up. Why should they slow down?

Camp: Companies are going to live and die on whether they get to market fast. I think security tends to be an afterthought, and I'm concerned that some of the manufacturers don't really have a solid way forward right now.

3C: That sounds ominous. What can and should we be doing?

Camp: We have to think about security in new ways. We have to secure the person, the experience and the data at rest and in motion at all times, and that's not going to be done with a PC attitude toward security.

We don't understand how to protect that data at all times and on a multitude of platforms. If you're working on machines at home, and a lot of them are connected, and you have a breach on one, you have a breach on lots of them. All hackers need is a toehold into your system.

3C: What if someone doesn't buy every new gizmo that comes along? Are they safe?

Camp: Hackers are finding interesting and novel ways to break into all kinds of things. Routers are one of the first things that really need security to be dealt with, because everyone has one. If your router is one to three years old, it is a gateway to get into everything you own.

3C: Why don't routers get patched like PCs?

Camp: The manufacturer will be notified that these things are wide open to attacks, and they don't seem to want to do anything; they're more interested in the next product cycle. People replace a router when it dies after five years. In the meantime, if they're vulnerable for four of those years, we have a big problem.

Manufacturers have to keep the revenue up; they don’t do that by supporting their routers forever, especially low-cost routers. In the Internet of Things, if you have many sensors around the house, and you raise the cost of those sensors by $1, it makes your system cost too much. Nobody's going to buy it, and you're going to be out of business.

3C: Everyone is worried about their routers now; anything else consumers need to be concerned about?

Camp: The people who are good at breaking into Internet of Things devices may not be good at exploiting them, but they are good at entry, and they're going to sell that to the highest bidder.

Many of these devices run a full Linux operating system; that means they are a server. You can load things on them and exfiltrate data, because Linux was always built to be networked; it was built to be in a server environment.

3C: Is there some good news on the horizon?

Camp: I think there's going to be standardization around operating system ecosystems. We're going to see default operating systems used on the Internet of Things so a manufacturer can focus on its own sensor, its own technology, and just drop in a secure operating system. Right now, there are many different permutations. In five years, we're not going to see that; we're going to see just a few that everyone uses, so if there's a security issue, people will understand more how to patch them.


Byron Acohido

Byron Acohido is a business journalist who has been writing about cybersecurity and privacy since 2004, and currently blogs at LastWatchdog.com.

Mitigating Risk With Small Deliverables

Providing batches of small deliverables decreases development time, increases the frequency of feedback and cuts costs.


For far too long, products have failed to make it to market. Or, if they finally do, they are over budget and grossly unpopular because of poor implementation of product development principles. Product development as a discipline has exploded with the ascendancy of the tech industry, because of the emphasis on innovation and product releases. This has created a paradigm shift in how risk is mitigated relative to bringing new products to market.

Of all the variables involved in getting a successful product to market, batch size is of paramount importance. The emphasis on small deliverables is the key to mitigating risk in modern product development.

What Is Batch Size?

According to Eric Ries, author of The Lean Startup, "Batch size is the unit at which work-products move between stages in a development process." An example of a batch in the context of a software project is the set of code written in the development phase to implement a single, complete new feature of a product.

There are three key advantages to delivering features in small batch sizes: a decrease in development time, an increase in feedback and a drop in cost.

Development Time Is Decreased

The ability to reduce the time from development to launch is not only a competitive advantage but also a risk mitigator. The ability to deliver a unit of work to your users quickly is a key benefit of delivering in small batch sizes. It is far better to have your new feature in your users' hands sooner rather than later so you can test the assumptions that are the basis for future iterations.

Reduced development time also reduces the risk of obsolescence. In the past, I worked with a company to develop a technology that was first to market in its product category. The problem was that the product was delivered in large batch sizes. By the time the product was ready for market, the underlying technology had moved to another medium. Had the product been delivered in small increments, it would have been possible to integrate technology consistent with the pace of the industry.

The Feedback Loop Is Increased

Delivering on iterations of your product in small batch sizes increases the frequency of feedback from your user base, so you can be sure your product development is consistent with the expectations of your customers.

If you deliver a large batch of features to your user base, there is a likelihood that one or more features will be ill-received. This means the other delivered features were predicated on a false assumption. Engineers will make assumptions all through the construction process based on the most current information. This situation adds an inordinate amount of risk to a rollout that could have been mitigated with correct assumptions established from the outset.

Cost Is Decreased

When development of a software feature is finished, there are still activities that need to occur before the feature goes live. These include code testing, project meetings and process reviews. It is a common misconception among project managers that transaction costs associated with development cycles are fixed. In fact, these activities grow exponentially as the batch size grows and, as a result, are extremely costly. For example, troubleshooting a bug in code that relates to one feature is relatively straightforward. Troubleshooting a bug where code affects several related features is exponentially more complex and time-consuming.
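A toy model, my own simplification rather than anything from the article, makes the growth concrete: if a bug can involve any pair of the features shipped together, the number of pairs to rule out grows roughly with the square of the batch size.

```python
def interaction_pairs(n_features):
    """Pairwise feature interactions a bug could span: n * (n - 1) / 2."""
    return n_features * (n_features - 1) // 2

# Shipping 12 features at once vs. three batches of 4
print(interaction_pairs(12))       # 66 pairs to consider in one big batch
print(3 * interaction_pairs(4))    # 18 pairs across three small batches
```

The big batch carries more than three times the debugging surface of the three small batches combined, even though the same 12 features ship either way.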

Large batches increase risk in product development by causing schedule slippage, which in turn costs money. According to analysis by Donald Reinertsen, project schedules slip exponentially as time to delivery drifts. See figure 1 below:


Figure 1

In conclusion, small deliverables reduce risks associated with product development. A major advantage of reduced batch size is that the time in development decreases. This allows your deliverable to get into your users' hands more rapidly and creates a feedback loop. Smaller batches keep your project on budget and increase the likelihood of reaching your ultimate goal of getting to market.


Joe Estes

Joe Estes has worked with startups, enterprises and government agencies for more than 14 years to develop successful and lasting software products. He has been at the forefront of mobile app development - leading the work at <a href="http://www.jpl.nasa.gov/">NASA's Jet Propulsion Laboratory</a>, then, at iViu, developing an indoor, micro-granular location service that is used by some of the largest retailers in the world. Joe co-founded <a href="http://goalabilityapp.com">Goalability</a>, whose app is used all over the world to achieve goals by motivating social networks.

Next-Gen Analytics Drive Efficiency

Major advancements in technology will make new analytics platforms more accessible to insurers, brokers and agencies of all sizes.


Across the insurance industry, analytics has become the most heralded technology investment for insurance companies, wholesalers, brokers, agents and other intermediaries. Yet, with soft market conditions, thinning margins, intensifying competition, escalating M&A activity and rapidly evolving technology, making the right choices and investments is imperative.

Not surprisingly, many insurers and intermediaries have opted to maintain systems that have long since become obsolete rather than jumping into what they're led to believe may be the next generation of features.

The issue is where to focus. Selecting the right platform essentially comes down to assessing an organization's current and future information needs and matching them to today's technology offerings.

To put this in a wider context, let’s take a closer look at some of the dynamics currently facing the commercial insurance marketplace.

Broker and Agent Challenges

Competition

Brokers are moving aggressively to build their business across all market segments, industries and geographies.

Consolidation

The pace of M&A is accelerating among mid-sized and smaller broker/agents. This affects all players - whether you are an acquirer, likely to be acquired or face direct competitors with added capabilities or resources through M&A transactions.

Focus on Maintaining or Improving Margins

All players are driving not only for market share but for growth that will deliver higher margins. Accordingly, competition among brokers and agents is intensifying for the most profitable business segments.

Drive for Efficiency

With the sustained soft commercial insurance market, brokers and agents face an imperative for greater efficiency and improvement in workflow structure.

Rising Client Expectations

Despite generally lower rates for insurance coverage, clients continue to have higher service expectations of all their providers.

Reconciling Old and New Technology

Brokers and agents must balance the need to incorporate technology against the cost of development, implementation and integration of any new system with legacy systems.

Insurer Challenges

Strengthening Distribution Network

Many insurers are working to develop and expand business with national, regional and local brokers and agents.

Navigating Competition

Insurers face intensifying competition for all types of business, especially in profitable coverage lines and with high-margin client segments.

Emerging Risks and Opportunities

Evolving risks provide opportunities for innovation both in terms of creating products and improving existing coverage lines.

Maintaining Individual Producer Relationships

Insurers want ways to address a lack of visibility into accounts and account owners at national, regional and local brokers; to remain effective, they must keep up with contact changes at brokers and shifting responsibilities of client managers.

Growth vs. Retention

With each renewal, insurers must balance the need for client retention with the drive for new business.

Better Data and Feedback

To meet aggressive growth goals, insurers need improved market intelligence and feedback on product offerings and potential solutions for improving both retention and capturing new business. They also can gain from formal feedback mechanisms to track reasons for lost business. 

Building Solutions: Meeting the Industrywide Need for Tech-Based Analytics

In recent years, global brokers have developed and implemented proprietary systems that enable them to capture details of individual placement transactions on a global basis and gain insights on pricing trends, terms and conditions, market penetration, client and carrier characteristics and success rates.

This information also yields substantial benchmarking data for senior management, individual brokers and marketing executives. For insurers, these analytical capabilities have proved invaluable in developing and refining insurance products and services, targeting industry segments more effectively and operating more profitably.

Within the U.S., however, the largest global brokers still account for only about 20% of the overall commercial insurance marketplace. As insurers strive to expand their business with national, regional and local brokers and agents, they need similar robust analytical capabilities to achieve efficiencies.

Today, most insurers have access to a variety of technology-based solutions for tracking and analyzing various types of claims. However, beyond what's currently available from the largest brokerages for their own business, insurers generally lack similar solutions for identifying and managing their incoming business and continuing clients across the spectrum of their broker and agency relationships.

The next generation of analytical platforms will enable insurers to track, manage, understand and evaluate business obtained from each of their brokerage and agency relationships. Insurers will be able to pinpoint producers at each broker directly responsible for placing business by geography, client size, industry sector and other delineators.

With a clear line of sight across their entire portfolio, insurers will have enhanced abilities to develop and roll out new products and policy features, especially those targeted to specific industry sectors or client types. They will be able to get rapid feedback on why they lost business or failed to win new clients.

Meanwhile, as mentioned, brokers and agents face similar challenges with respect to their accounts and underwriter relationships. Given the number and spread of clients in their books, brokers want solutions that enhance their efficiency and ability to service and grow their business.

New platforms also will offer brokers and agents the ability to track and monitor their business, as well as to strengthen and expand their relationships with insurers. Individual producers will be able to view their own accounts and benchmark them against those of the same size, geography, industry and other factors.

This, in turn, will help brokers and agents identify and address gaps in client programs, expand the insurers providing quotes for individual clients and specific coverage lines, negotiate pricing terms and conditions more effectively and elevate overall service delivery and performance.

The same platforms will offer views for broker/agency senior leadership that will detail account profitability; help assess performance by producer, office or region; and make informed decisions about resource allocation, sales, marketing and planning. After a merger or acquisition, the systems will help accelerate business integration, including the development of consistent service delivery across the combined book.

Major advancements in technology, including dramatic decreases in data storage costs afforded by the cloud, will make new analytics platforms more affordable and accessible to insurance companies, brokers and agencies of all sizes. Ultimately, the widespread availability, real-time information, feedback and robust capabilities of the next generation of analytics platforms will propel the insurance distribution system into the 21st century.

Stay tuned.


Kabir Syed

Kabir Syed is founder and CEO of RiskMatch, a business intelligence and analytics firm that provides insurance portfolio management and placement solutions to insurance brokers and insurers. During the course of his 20-year insurance industry career he has held leadership positions in marketing, strategic planning, operations and analytics.

What Makes Us Get Sick? Look Upstream

The job for doctors -- and for us as patients -- isn't just to heal but to figure out what makes us sick in the first place and stop it in its tracks.

The headline comes from a TED conference speech by Rishi Manchanda, who has worked as a doctor in South Central Los Angeles. After about 10 years, he realized, "His job isn't just about treating a patient's symptoms, but about getting to the root cause of what is making them ill: the 'upstream' factors like a poor diet, a stressful job, a lack of fresh air. It's a powerful call for doctors to pay attention to a patient's life outside the exam room."

This story has a WOW factor.

Let me repeat: Dr. Manchanda came to realize it's not enough to treat a patient's symptoms, but to get to the root cause of what makes people sick. Regular readers of Cracking Health Costs will know this is a familiar message. Of all the things that cause us to die too early, medical care can only deal with about 25%. The rest is about how you live your life.

This also explains why typical corporate wellness programs fail. They're trying to ameliorate symptoms but ignore the root cause of syndromes such as high blood pressure, high cholesterol, etc. It's not enough to walk into a smoke-filled house and turn on the exhaust fan. You need to put out the fire, too.

Wellness buyers, e.g. benefit managers, need to have the same epiphany as Rishi Manchanda.

I've been writing a series of posts about root causes of illness: loneliness, job stresses, life dissatisfaction, etc. I also firmly believe the time is right to start thinking about employee ailments in an entirely different way.

My next book, An Illustrated Guide to Managing Your Health—How to Improve Your Health in 40 Common-Sense Steps, could well be called, "For better health, look upstream."


Tom Emerick

Tom Emerick is president of Emerick Consulting and cofounder of EdisonHealth and Thera Advisors.  Emerick’s years with Wal-Mart Stores, Burger King, British Petroleum and American Fidelity Assurance have provided him with an excellent blend of experience and contacts.

Using the Workplace to Prevent Suicide

We call on all organizations to implement policies and programs that promote a mentally healthy workforce and prevent suicide.


Most deaths by suicide are among people of working age. Suicide is the leading cause of death for males aged 25–44 years and females aged 25–34 years. The proportion of suicides that are work-related is unclear. One Australian study found that 17% of suicides in Victoria from 2000–2007 were work-related. Applying this estimate nationally, approximately 3,800 suicides across Australia over the decade to 2011 may have been work-related.

Adults spend about a third of their waking hours at work. The workplace provides a unique opportunity to provide key health information and intervention. Suicide Prevention Australia (SPA) sees the workplace as playing a vital role in the creation of a suicide safe community.

The World Health Organization suggests worker suicide is a result of a complex interaction between individual vulnerabilities and work-related environmental factors that trigger stress reactions and contribute to poor mental well-being. Employers have a legal responsibility to provide a safe and healthy workplace, including managing psychosocial stressors.

Suicide Prevention Australia believes urgent action is required to address a range of systemic issues, including managing unemployment, workers' compensation and coronial processes. In addition, we call on organizations of all sizes to implement workplace policies and programs that promote a mentally healthy workforce and prevent suicidal behaviors.

Understanding of the cost of workplace stress is continuously building and includes productivity losses because of "presenteeism" (the act of coming to work despite sickness, physical or mental) and absenteeism as well as workers' compensation claims. No detailed and independent costing exists on the cost of suicide and suicidal behavior to the Australian economy.

A plausible estimate was calculated to be $17.5 billion per year, including productivity costs. Death claims paid out by group life insurers in superannuation for suicide exceed $100 million per year. Monetary value aside, suicide cuts lives short and leaves scars.

Suicide is mostly preventable, yet significant gaps exist in our understanding of the relationship between work and suicide, limiting prevention efforts. SPA has reviewed the existing evidence with these gaps in mind.

We ask employers to draw on the information provided in this document and call on them to:

  • Promote a workplace culture that is inclusive, de-stigmatizes mental health problems and encourages help-seeking. Sharing stories about personal experiences with suicide and mental health problems can be a powerful way to address stigma. In appropriate settings and with support and informed consent of all parties involved, leaders are encouraged to share their own stories, highlighting positive coping strategies and sources of help.
  • Prioritize psychosocial workplace safety. This includes identifying ways to reduce work-related stressors.
  • Understand and value the person as a human being rather than a resource. This includes understanding the interactions between what happens within the workplace and other aspects of life including family, relationships, cultural background, health, etc. This will help facilitate an understanding of the meaning of work for people and the impact of stress, loss or failure of work on their lives.
  • Promote mental health and suicide awareness within the workplace, paired with clear and communicated pathways to support for those in need.
  • Establish mechanisms for the recognition and early detection of mental health and emotional difficulties in the workplace.
  • Provide employees with access to appropriate self-help or professional interventions and treatment, for example via employee assistance programs linked to external community health resources. Pathways to care should be well promoted within the workplace, making sure employees feel encouraged to draw on these supports and understand the confidential nature of services. This will help overcome potential fear of breach of privacy.
  • Frame suicide prevention programs in a manner that respects the cultural backgrounds and needs of the target audience, taking into account factors such as cultural and linguistic diversity, indigenous status and diverse sexualities and genders.
  • Be prepared for suicide to touch the lives of your employees and to respond appropriately. Lived experience of suicide can include having thoughts about taking one's own life, making a suicide attempt, caring for someone who is suicidal, being bereaved by suicide, witnessing a suicide or being exposed to suicide in some other way. These experiences will take on different meaning and importance for every person and can have lasting impacts.

To help individual employers achieve this, we ask that industry and employer groups:

  • Establish relationships with key suicide prevention and mental health organizations.
  • Develop industrywide guidelines for suicide prevention.
  • Invest in the development of multifaceted suicide prevention programs tailored for the industry. This is especially urgent for industries characterized by relatively ready access to suicide means, elevated risk of suicide or a high proportion of male workers.

Government plays a vital role in suicide prevention. Action is required by government to address systemic issues that contribute to work-related suicide. We call on government to:

  • Promote policies and practices that encourage employment, as this will give more people protection against one of the more significant risk factors for suicide.
  • Invest in both labor market programs and suicide prevention programs (including mental health promotion) during times of economic downturn.
  • Provide access to counselling services (via employment pathway services) for individuals unemployed for more than four weeks.
  • Provide suicide intervention skills training for front-line staff working with the long-term unemployed.
  • Fund research into the relationship between work and suicide to inform suicide prevention activities.
  • Review the role of the workers' compensation system in suicide prevention, minimizing harm and maximizing opportunities for intervention with those vulnerable to suicide. To achieve this, workers' compensation claims databases require improvement, and research is required to better understand the relationship between workers' compensation and suicide.
  • Give coroners adequate resources to ensure that coronial investigations include the role of work in suicide deaths.
  • Develop guidelines for suicide prevention in line with the Australian Work Health and Safety Strategy 2012-2022.
  • Implement, through state and territory governments, the recommendations under the proposed harmonized workplace health and safety regime, which increases the focus on duty of care, including mental health.
  • Invest in mental health and suicide prevention in the workplace.

Sue Murray

Sue Murray has a background in education and a specialty in health promotion and has been a passionate advocate for improving the health and well-being of the community throughout her career. Murray has more than 25 years' experience in the community sector and has held positions with responsibility for education, media, communications, fundraising and organisational leadership.

Are 'Best Practices' Really Best?

Benchmarking against "best practices" can make a company more competitive, but three problems often crop up and actually reduce value.

Best practices can help companies gain a competitive advantage. Often, however, they do the opposite. There are various reasons for this, but, in our experience, three major problems arise when companies implement what they perceive to be best practices.

  • First, the benefits are elusive. They are often difficult to measure, with no baseline or true comparison, and the costs to implement them are often excessive and misunderstood. This often means there are significant implementation costs and effort with few tangible results.
  • Second, best practices exist in the rearview mirror. By the time you have adopted them, business conditions and the right ways for implementing them will have changed.
  • Third, once adopted, these practices can be very difficult to change. The organization has invested emotion, credibility, time and money, and it's very difficult to abandon a practice even if it doesn't work or needs to be adapted.

What's often missing is clarity on the best practices that are most relevant to the business.

Companies naturally want to be competitive, and many seek inspiration outside their own industry in their search for the best-of-the-best. Companies tend to reach broadly, embracing a great number of potential best practices. Conversely, some companies narrowly benchmark themselves against only their peers. Following either extreme often results in missed opportunity for real improvement. In addition, if implementation occurs in silos, then best practices tend to compete with one another and increase complexity and overhead.

Although the benefits from deftly applied best practices can be real and demonstrable, they often are more elusive than anticipated and can result in frictional costs, impasses, noise in the system and ultimately few concrete benefits to the bottom line. Moreover, implementation costs can be high but may not be visible until the cost of adoption or compliance becomes evident throughout the organization. Finally, benefits may elude adoption, specification, measurement and capture.

Our observations

Best practices tend to be selected and defined in isolation. Once they have a mandate, individual business and functional areas, centers of excellence and shared services often tend to implement new practices without giving sufficient consideration to downstream implications, such as costs. New best practices come with costs, and when they are supposed to result in higher service levels, they may come with higher-than-average costs.

Accordingly, the application of new practices requires balance and compromise. They may reduce expenses in part of the organization but may increase frictional costs and place an extra burden on people, process and technology elsewhere.

A common problem is that many companies attempt to implement too many new practices in too many places. Most organizations' ability to adapt to change is limited, and change tends to be undermanaged, especially when new best practices are required by new control mandates.

Many companies also tend to overestimate the opportunities that standardization offers. We have seen some companies try to standardize everything and inevitably encounter challenges they had not anticipated. New best practices need to be capable of changing and evolving over time, as well as adapting to local differences and requirements.

Lastly, many companies fail to conduct a cost/benefit analysis when instituting new best practice mandates. If a best practice is central to the business strategy's success, then frictional costs are an acceptable risk. However, excessive application of best practice improvements can waste resources (e.g., a need for excess staff to perform the work, complex policies and procedures, standardization for its own sake and demands for "unnatural acts of cooperation" that hinder the business' ability to respond to changes in the marketplace).

What should companies do differently?

Many companies habitually rely on practices that are not appropriate for them and therefore fail to execute their desired strategy effectively. To help prevent these problems, we suggest using a success framework that prioritizes and sharpens a company's focus on improvement efforts. This framework should have the following six characteristics:

The success framework: each characteristic is a mechanism...

  1. Prioritization: to identify what's most important and to align implementation effort with strategy.
  2. Proportion: to confirm that the implementation effort is proportionate to the practice's perceived value.
  3. Readiness: to assess organizational appetite and readiness.
  4. Implementation: to assign authority, accountability, responsibility and appropriate resources.
  5. Impact: to track impact, including both intended and potentially unintended consequences.
  6. Change: to drive continuous improvement and to authorize a full stop if warranted.

It is critical to first identify which best practices are worth the effort, through prioritization.

Implementing leading practices can cost money, but there may or may not be tangible benefits or related savings. The question to ask is: How good is good enough? Moreover, when the implementations of best practices compete with each other for time and focus, there are frictional costs that further minimize expected benefits. Being cognizant of frictional costs and avoiding them is critical to optimizing investment and benefit realization.

What we've concluded

Best practices are about performing better and therefore adding strategic and operational value.

Accordingly, because of the highly subjective nature of "best," we suggest the term "value added practice" (VAP) instead. By putting value at the center of practice improvement efforts, a company can better plan and implement new practices. Frameworks for investment and continuous improvement are key, especially at larger organizations where budgets, controls and approvals tend to be complex.

Before embarking on a new best practices initiative, a company should perform a quick self-diagnosis. Are you trying to implement a best practice for its own sake, or are you clearly focusing on the value you hope to realize?

If you plan to invest in new capabilities without tying them to specific business objectives, then you should step back and determine just how implementing new best practices will benefit the company.


Bruce Brodie

Bruce Brodie is a managing director for PwC's insurance advisory practice focusing on insurance operations and IT strategy, new IT operating models and IT functional transformation. Brodie has 30 years of experience in the industry and has held a number of leadership positions in the insurance and consulting world.

Electronic Health Records Hurt Care

Electronic health records force practitioners to focus on checking the boxes and distract them from their mission: providing great care.

Patient care as we know and expect it will diminish because of electronic health records (EHR) requirements. Society will suffer a slow degradation of artful interactive provider attention in deference to "data-field" medicine.

I am not simply referring to the very real and challenging issues in the technical application of EHR systems. Rather, I point out a more serious and insidious future threat to the actual human aura in medical practice.

There exists an unintended but real incentive for doctors and clinicians to treat task completion as clicking through the data interface rather than interacting with and treating the patient. Legal requirements, reimbursements and potential penalties force the EHR to top priority. In turn, clinicians as EHR users come to anticipate the truncated, template-driven and limited means of expressing case events via electronic reports, and their interactions with patients may be curtailed accordingly.

I know this sounds callous and insulting to all good medical providers. To them, I say no insult is intended, and the fault of this perverse incentive is not theirs. They might honestly assess their experience and the actions of peers and associates within their practices given the advent of EHR. To providers, I ask: What about EHR might be sucking the creative life out of your optimal vision for the practice of your specialty?

My most stark encounter with this reality comes from a chance discussion with a longtime friend. She is a nurse practitioner who, for decades, has treated both ER and family-practice patients. As family friends, we never talk shop, and this particular conversation was not solicited by me. I politely asked, "How's it going?" and got a surprising, soul-baring burst of frustration.

She expressed disdain. She prides herself on her mastery of triage, symptom investigation, on-the-spot research and communication with involved family members, and she wants to take the wide approach to patient situations as a service to them and to the doctors or specialists who may eventually carry the case. But electronic records don't allow the narratives or collective points of data she would prefer. As a result, her value is diminished, and the patient ultimately gets poorer attention.

As she described her situation, I began to understand the rigid decision-tree "intelligence" in narrowing prompts for information based on how case records are initiated. She has persevered and found cumbersome work-around methods (such as editing previous fields to change next options, etc.) to combine or add issues or thoughts to a record beyond the template's desired straight line of thought. Unfortunately, she explained, taking extra time to do anything is neither advisable nor encouraged because of the volume of patients requiring care.

Quick Tip: The Hunger for Data Should Not Put the Cart Before the Horse

As a foreshadowing about healthcare in general, consider what the supreme focus on automation and data collection has done to workers' compensation. I have written extensively about the advent of electronic claim systems, over decades, reducing the adjusting job from that of an intelligent, intuitive personal-interactive specialist to the current task-level data entry clerk. We are now well into the post-paper-file generation of claim adjusters who know their job only as data-interface. Will medical clinicians meet the same fate when our current generation of providers, like my friend, move on? Will future clinicians, knowing only electronic records, assume that the decision tree of the EHR interface supersedes intuitive medicine?

Let's hope not. Unfortunately, a simple Google search for "problems with EHR" will not sit well with anyone who embarks on some research in this area.

In claim adjusting, as in medicine, we need to intelligently feed the hunger for data but rail against a perverse desire to let automation increase case volumes or assume the template is sacrosanct. I am certainly not against all the good that electronic medical records bring to the party. However, we must first let practitioners do their jobs, not let "data screen medicine" dumb down patient care.

Perhaps provider-run coalitions should dictate standards for ever-improving EHR frameworks and interfaces so their highest-quality, real-time nimble intelligence can be best captured in all patient events. I know at least one nurse practitioner who has a lot to say on that subject.


Barry Thompson

Barry Thompson is a 35-year-plus industry veteran. He founded Risk Acuity in 2002 as an independent consultancy focused on workers' compensation. His expert perspective transcends the status quo to build highly effective employer-centered programs.

Insurers Are Turning to Dubious Securities

With interest rates so low, carriers are turning to dubious securities known as ILS to generate better returns. This may not end well.

The latest California Department of Insurance market share report said that workers' compensation carriers combined to write $11.43 billion in premium in 2014, up 11% from 2013, the fourth consecutive double-digit increase. Premium growth lately is more a reflection of increasing payrolls than of rates, which have held rather steady in the last few years. But there is another, more insidious reason for recent premium growth: low interest rates. Chief financial officers can lock in only pathetic returns using traditionally safe investments because of the moribund interest rate environment.

So they are looking to alternative investment vehicles -- and history doesn't look kindly on that kind of activity involving dubious securities. The last time "interesting" financial assistance came into the California (and subsequently national) market was in the mid-1990s, when the Unicover/Craigwood reinsurance scheme was being pitched to minimize carrier risk ... and dozens of carriers folded.

Now we have a new investment vehicle that is picking up steam, and I think it portends trouble if not kept in check. It's called ILS.

In aviation, ILS is Instrument Landing System - a way for aircraft to find the runway under a layer of clouds and fog. In insurance, ILS is insurance-linked securities.

The most common ILS, and what brought this alternative to prominence, are catastrophe (CAT) bonds: risk-linked securities that transfer a specified set of risks to investors. They were first used in the mid-1990s in the aftermath of Hurricane Andrew and the Northridge earthquake.

Wikipedia has a good explanation: "An insurance company issues bonds through an investment bank, which are then sold to investors. These bonds are inherently risky ... and usually have maturities less than three years. If no catastrophe occurred, the insurance company would pay a coupon to the investors, who made a healthy return. On the contrary, if a catastrophe did occur, then the principal would be forgiven, and the insurance company would use this money to pay their claim-holders. Investors include hedge funds, catastrophe-oriented funds and asset managers."
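The cash-flow mechanics described in that quote can be sketched in a few lines of Python. The principal and coupon figures below are hypothetical, chosen only to illustrate the two outcomes; real CAT bonds have more elaborate triggers and payout schedules.

```python
def cat_bond_payout(principal, coupon_rate, catastrophe_occurred):
    """Simplified one-period cash flow of a catastrophe bond.

    If no qualifying catastrophe occurs, investors earn their coupon.
    If one does, the principal is forgiven and flows to the insurer
    to pay claim-holders. Illustrative sketch only.
    """
    if catastrophe_occurred:
        # Principal is released to the insurer to pay claims.
        return {"investor_receives": 0.0, "insurer_receives": principal}
    # Quiet period: investors collect the coupon on their principal.
    return {"investor_receives": principal * coupon_rate,
            "insurer_receives": 0.0}

# A hypothetical $100 million bond paying an 8% coupon:
quiet_year = cat_bond_payout(100_000_000, 0.08, catastrophe_occurred=False)
bad_year = cat_bond_payout(100_000_000, 0.08, catastrophe_occurred=True)
```

The asymmetry is the whole point: in most years investors pocket an above-market coupon, and in the rare bad year they lose the principal, which is exactly what funds the insurer's catastrophe claims.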

At least one insurance investment observer indicates alarm at the "convergence" of the insurance and capital markets. Michael Moody, MBA, ARM, in the April edition of Rough Notes magazine writes about "Capital Market Convergence" and describes how the money behind the capital structure of the insurance industry is increasingly being collateralized and sold off to investors with the single intent of increasing yield on capital invested: "With interest rates continuing at historically low levels, most institutional investors are looking for better yields. Currently, many of the ILS products are producing results that are 5% to 6% higher than traditional investments."

Here's the issue: There will be many investment people who know nothing about the insurance product providing the capital. Financial instruments such as credit default swaps (CDS) and collateralized debt obligations (CDOs) and others created by Wall Street will move capital out of the insurance industry to the detriment of the insured public, and this includes workers' compensation.

Moody understatedly writes: "Agents and brokers who have accounts that utilize significant amounts of reinsurance need to be aware of the advancements that are being made in the ILS market. The old days of competing on price are disappearing. Capital market professionals believe it is only a matter of time before reinsurance and ILS will be used in the same manner that reinsurance is purchased in layers today. It will not be uncommon to find excess limit programs that are made up of a combination of reinsurance and ILS. The genie is out of the bottle, and the capital markets appear to be willing to embrace the convergence with the insurance/reinsurance concept. As a result, agents and brokers who are interested in a long-term view of the insurance industry would be well advised to monitor this situation closely, as it will remain extremely fluid for some time."

Certainly, departments of insurance will protect us from dubious securities, right? After all it is their job to regulate the insurance market and ensure a safe, healthy industry.

Well, that didn't happen when Unicover/Craigwood came around, and there's no reason to believe that any regulating agency is going to be proactive; traditionally, regulators are reactive. By the time they are alerted and take action, it's too late - carriers disappear, guaranty associations are swamped and state funds take up the slack (as in 2000, when the State Fund covered 50% of the California market).

California, and the nation's work comp market, is one bad ILS away from disaster. Carriers won't be looking for the runway under the clouds - rather, they'll be looking for insolvency relief.


David DePaolo

David DePaolo is the president and CEO of WorkCompCentral, a workers' compensation industry specialty news and education publication he founded in 1999. DePaolo directs a staff of 30 in the daily publication of industry news across the country, including politics, legal decisions, medical information and business news.

San Andreas -- The Real Horror Story

In the wake of the disaster movie "San Andreas," insurers should take two actions to remedy the woeful lack of use of earthquake insurance.

For the past two weeks, the disaster movie "San Andreas" has topped the box office, taking in more than $200 million worldwide. The film stars Dwayne "The Rock" Johnson, who plays a helicopter rescue pilot who, after a series of cataclysmic earthquakes on the San Andreas fault in California, uses his piloting skills to save members of his family. It's an action-packed plot sure to keep audiences on the edge of their seats.

As insurance professionals who specialize in quantifying catastrophic loss, we can't help but think of the true disaster that awaits California and other regions in the U.S. when "the big one" actually does occur.

The real horror starts with the fact that 90% of California residents DO NOT maintain earthquake insurance. The "big one" is likely to produce economic losses in either the San Francisco or Los Angeles metropolitan areas in excess of $400 billion. With so little of this potential damage insured, thousands of families will become homeless, and countless businesses will be affected - many permanently. The cost burden for the cleanup, rescue, care and rebuilding will likely be borne by the U.S. taxpayer. The images of the carnage will make the human desperation we saw in both Hurricane Katrina and Superstorm Sandy pale by comparison.

The reasons given for such low take-up of earthquake insurance generally fall into two categories: (1) Earthquake risk is too volatile, too difficult to insure and, as a result, (2) is too expensive for most homeowners.

Is California earthquake risk too volatile to insure?

No.

The earthquake faults in California, including the Hayward, the Calaveras and the San Andreas faults, are the most studied and best understood fault systems in the world. The U.S. Geological Survey (USGS) publishes updated frequency and severity likelihood estimates every six years for the entire U.S. This means that estimation of potential earthquake losses, while not fully certain, can be reasonably achieved in the same manner that we currently estimate potential losses from perils such as tornadoes and hurricanes. In fact, the catastrophe (CAT) models agree that, on a dollar-for-dollar exposure basis, losses from Florida hurricanes that make landfall are likely more severe and more frequent over time than California earthquakes, yet nearly 100% of Florida homeowners actually maintain windstorm insurance. If hurricane risk in Florida isn't too volatile for insurers to cover, then earthquake risk in California should follow that same path.

Isn't earthquake coverage expensive?

Again, the answer is a resounding no.

The California Earthquake Authority (CEA), the largest writer of earthquake insurance in the U.S., has a premium calculator that quotes mobile homes, condos, renters and homeowners insurance. For example, a $500,000 single-family home in Orange County, CA, can be insured for about $800 a year, or roughly the same price as a traditional fire insurance policy. To protect a $500,000 home, an $800 investment is hardly considered expensive.

The real question should be: Are California homeowners getting good value? CEA policies carry very high deductibles -- typically in the 10% to 15% range -- and the price is “expensive” when the high deductibles are considered. As one actuary once explained it to us, "With that kind of deductible, I'll likely never use the coverage, so like everyone else I'll cross my fingers and hope the 'big one' doesn’t happen in my lifetime."
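To make the value question concrete, here is the arithmetic implied by the figures above. The dollar amounts are the article's illustrative ones (a $500,000 home, roughly $800 a year in premium, a deductible of 10% to 15% of insured value), not quotes from the CEA calculator.

```python
# Figures quoted in the text, used for illustration only.
home_value = 500_000       # insured value of the home, in dollars
annual_premium = 800       # approximate annual CEA premium

# Dollar deductible at each end of the quoted 10%-15% range.
deductible_low = home_value * 0.10   # homeowner absorbs the first $50,000
deductible_high = home_value * 0.15  # ... or the first $75,000

# Premium as a fraction of insured value.
rate_on_value = annual_premium / home_value  # 0.16% of the home's value

print(f"Deductible range: ${deductible_low:,.0f} to ${deductible_high:,.0f}")
print(f"Premium rate on value: {rate_on_value:.2%}")
```

Seen this way, the actuary's complaint is easy to understand: the premium looks cheap against the home's value, but the policy pays nothing until losses exceed $50,000 to $75,000.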

It's this lack of value that's the single biggest impediment preventing millions of California homeowners from purchasing earthquake insurance. It's also an area that has much room for improvement.

How can we as an industry raise the value proposition of earthquake coverage? Consider the following:

  1. The industry can make better use of technology, especially the CAT models. California is earthquake country, but it's also a massive state. This map shows that the high-risk areas mostly follow the San Andreas fault and the branches off that fault. There are many lower-risk areas in California, and the CAT models can be used to distinguish the high risk from the low risk. Low risk exposures should demand lower premiums. Even high-risk exposures can be controlled by using the CAT models to manage aggregates and identify the low-risk exposure within the high-risk pools. We expect that CAT models will help us get back to Insurance 101 by helping the industry to better understand exposure to loss, segment risks, correct pricing, manage aggregates and create profitable pools of exposure.
  2. The industry can bundle earthquake risks with other risks to reduce volatility. Earthquake-only writers (and flood as well) are essentially "all in" on one type of risk, to steal a common poker term. Those writers will fluctuate year to year; there will be years with little or no losses, then years with substantial losses. That volatility affects retained losses and also affects reinsurance prices. Having one source of premium means constantly conducting business on the edge of insolvency. Bundling earthquake risks geographically and with other perils reduces volatility. The Pacific Northwest, Alaska, Hawaii and even areas in the Midwest and the Carolinas are all known to be seismically active. In fact, Oklahoma and Texas are now the new hotbed regions of earthquake activity. Demand in those areas exists, so why not package that risk? Reducing volatility will reduce prices and help stabilize the market. We estimate that in parts of California, volatility is the cause of as much as 50% of the CEA premium.

Hollywood has produced yet another action-packed film. But to add a touch of realism, Hollywood screenwriters should consider making the leading actor, The Rock, a true hero - an "insurance super hero" who sells affordable earthquake insurance.


Nick Lamparelli

Nick Lamparelli has been working in the insurance industry for nearly 20 years as an agent, broker and underwriter for firms including AIR Worldwide, Aon, Marsh and QBE. Simulation and modeling of natural catastrophes occupy most of his day-to-day thinking. Billions of dollars of properties exposed to catastrophe that were once uninsurable are now insured because of his novel approaches.


James Rice

James Rice is senior business development director at Xuber, a provider of insurance software solutions serving 180+ brokers and carriers in nearly 50 countries worldwide. Rice brings more than 20 years of experience in the insurance technology, predictive analytics, business intelligence (BI), information services and business process management (BPM) sectors.