
New Message From Lemonade

Lemonade announced a new coverage stance on guns and gun owners, but does it benefit society or just Lemonade?

sixthings
In August, I wrote about “Lemonade’s Bizarre New Self-Service Approach.” Lemonade pitches this “self-service” approach as a benefit to customers, who can apparently delete coverage for spouses, mortgage companies and others unilaterally — something that could create serious problems for consumers who are not counseled on the possible repercussions of such actions. In other words, Lemonade is reducing its workload, increasing consumers’ risk of loss and making customers thank Lemonade for it.

Earlier this month, I posted on LinkedIn about “Lemonade’s ‘Zero Everything’ Policy.” Again, Lemonade touts something that largely benefits… Lemonade, and does so in a way that makes consumers think they are getting the benefits.

See also: Politics of Guns and Workplace Safety

Just this week, I learned of yet another odd Lemonade decision regarding a new coverage stance on guns and gun owners. This is from the company’s blog: “Guns, And Why Lemonade Is Taking A Stand“ According to the blog post, Lemonade’s next policy form revision will include the following:
  1. “We will exclude assault rifles altogether. We simply don’t understand why civilians need military-grade weapons, and we prefer not to insure them.
  2. We will add requirements that firearms be stored securely and used responsibly, upon penalty of voiding coverage. We believe guns should be treated mindfully and soberly, not as a plaything, a status symbol or an ideological prop. Reasonable people, we believe, can agree on that.”
On the first point, Lemonade did not elaborate on how “assault rifle” will be defined, nor did the company indicate whether the exclusion applies to loss of the rifle itself or, more likely, to liability claims for bodily injury (BI) to others. If the latter, this policy sounds like it punishes victims as much as gun owners. Lemonade’s homeowners (HO) policies already exclude liability coverage for intentional losses like the recent Las Vegas shooting, so even if the perpetrator had an HO policy with the company (or most other insurers), the policy wouldn’t cover victims’ claims. But what about an accidental or negligent shooting, or BI that arises when someone takes a gun owned by someone else and the owner is sued? Most HO policies would respond to such claims. It doesn’t sound like Lemonade wants any part of that.

On the second point, Lemonade says that its revised policies will require that all firearms be stored “securely and used responsibly” or else there is no coverage. How do you determine what “securely” and “responsibly” mean if those are the terms actually used in the revised policy language? And, again, if such an exclusion is invoked, who loses? It sounds like the victims of such negligence would have no insurance proceeds to access.

So, who is potentially the primary beneficiary of these changes? Lemonade, it seems, because it would not be paying claims to innocent victims of its insureds’ negligence. Insurance policies sometimes exclude BI or property damage (PD) arising from illegal activities, though most auto policies, for example, don’t exclude BI or PD that arises from driving under the influence. The reason is that one of the primary purposes of liability insurance, from a social standpoint, is to benefit innocent victims of such careless actions.
See also: Examining Potential of Peer-to-Peer Insurers

What we could be seeing with this change, however, is that Lemonade’s insurance products may base coverage on what LEMONADE thinks is morally right or wrong. The company's blog post says it is being upfront about what it thinks is “good” and “not good,” and that Lemonade is not into “gun worship” or “vigilante” gun owners and is doing its part to “solve gun violence.” It sounds like what Lemonade is actually doing is taking away a victim’s source of financial recovery for the negligence of Lemonade’s customers. Does that benefit society, or does it benefit Lemonade?

Bill Wilson


William C. Wilson, Jr., CPCU, ARM, AIM, AAM is the founder of Insurance Commentary.com. He retired in December 2016 from the Independent Insurance Agents & Brokers of America, where he served as associate vice president of education and research.

2017 Outlook for Homeowners ROE

The estimated prospective return on equity for homeowners this year is 4.5%, down from 6.7% in 2016. There are three key themes.

The estimated prospective ROE for homeowners this year is 4.5%, down from 6.7% in 2016. There are three key themes to note regarding homeowners insurance in 2017:

Growth

The homeowners line of business continues to grow; premiums increased to $91 billion in 2016 from $89 billion in 2015. The rate of growth has slowed from prior years, and slower growth is expected in the near future, with less aggressive but still positive rate change in the pipeline. Further, catastrophe losses are rising faster than inflation, and coverage gaps persist for perils like flood, suggesting opportunities exist for carriers to find premium through coverage innovations.

Divergent Markets

From the macroscopic perspective of this study, there are at least three different homeowners markets:
  1. Florida, a market unto itself. Eight of Florida's 10 largest carriers have limited name recognition outside the Florida market, though several are expanding to other coastal states. Remove Florida, and the U.S. ROE increases to 9.1%, suggesting the assumptions of this study (a nationwide carrier with an A.M. Best “A” rating) differ from market reality in the Sunshine State.
  2. The hurricane-exposed coast, excluding Florida. Hurricane coast states posted an ROE of 6.7% in this year’s study. At present, these states are characterized by heavy regulation; strong competition between established brands and younger carriers; and sophisticated risk differentiation based on granular, catastrophe-savvy rating plans.
  3. Everybody else. The remainder of the U.S. owns a respectable 12.2% ROE, with market share largely dominated by big-name national and super-regional brands. Regulatory considerations are easier to navigate than in coastal states. Catastrophe risk has unique challenges associated with less robust models for thunderstorm, wildfire and flash flood risks as compared with hurricane risks.
California and Washington are unique because of their strict regulatory environments, but they otherwise resemble the other states in the cohort in terms of perils and players of note, in part because earthquake endorsements are not required for home loans, show limited take-up and are ultimately excluded from this analysis because they roll up to the earthquake annual statement line.

Technology

This year’s study examines “one dollar of homeowners premium,” which highlights 8 cents of loss adjustment and 21 cents of policy acquisition costs (12 cents for commissions and brokerage plus 9 cents for other acquisition costs). These areas of the value chain are coming under attack from insurtech startups eager to test established carriers’ ability to adapt to rapidly evolving technology. Aon’s Digital Monitor currently tracks more than 40 startups, backed by nearly $2 billion in venture capital, that are attacking these areas of the property and casualty value chain (not all in homeowners, specifically). Mobile and software-as-a-service platforms, drone and satellite imagery, and proprietary catastrophe-detection, internet-of-things-enabled hardware promise to continue to apply pressure to traditional homeowners carriers’ approach to the business of insurance.

ROE Study Methodology

The basis of the prospective ROE estimate is industry, state and aggregate statutory filing data, including reported direct losses, expenses, payout patterns and investment yields. We replace actual historical catastrophe losses, as measured by Property Claim Services, with a multi-model view of expected catastrophe loss. On-leveling of direct premiums to current rates uses rate filings of the top 20 insurance company groups by state. Finally, estimated capital requirements and reinsurance costs assume a nationwide personal lines company writing both home and auto business at a capitalization level consistent with an A.M. Best “A” rating.
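The "one dollar of homeowners premium" view above is simple enough to check as arithmetic. In this sketch, the 8-cent loss-adjustment figure and the 12-cent plus 9-cent acquisition split come from the study as quoted; nothing is assumed about the remaining cents of the dollar.

```python
# Split one dollar of homeowners premium per the study's quoted figures.
# Only the loss-adjustment and acquisition cents are from the text.
loss_adjustment = 0.08    # 8 cents of loss adjustment expense
commissions = 0.12        # 12 cents for commissions and brokerage
other_acquisition = 0.09  # 9 cents of other acquisition costs

policy_acquisition = commissions + other_acquisition
print(f"policy acquisition: {policy_acquisition:.2f}")  # → 0.21
print(f"value chain under insurtech attack: {loss_adjustment + policy_acquisition:.2f}")  # → 0.29
```

Together, the two items under attack from insurtech startups account for 29 cents of every premium dollar.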
The ROE estimates exclude earthquake shake losses; the premium and losses for that coverage are recorded on a separate statutory line of business.

See also: 10 Trends on Big Data, Advanced Analytics

The diversification available to a nationwide personal lines insurer affects the ROE calculation. For instance, homeowners business in California diversifies Gulf and East Coast hurricane exposure for a nationwide insurer. A California standalone would incur higher capital and reinsurance costs than the California portion of a nationwide insurer with similar premium volume in the state. Similar results are to be expected for any other regional or single-state insurer.

This study's catastrophe normalization replaces the local impacts from large events — like Harvey, Irma or the first- and second-quarter hail and wind losses experienced in 2017 — with the modeled catastrophe average annual loss. The prospective impact to the line from these events remains to be seen, and future versions of this study may attempt to measure impacts to rate level and reinsurance pricing.

The 2017 nationwide ROE estimate of 4.5% falls below our 2016 estimate of 6.7%. Profitability challenges to the line include: (1) a slowdown of rate increases (and decreases by some major carriers) that failed to pace loss and expense inflation, and (2) premium and exposure growth that pushed up the A.M. Best capital requirements to maintain the assumed “A” rating. Declining costs of reinsurance to capitalize the volatility inherent in the homeowners line were insufficient to offset the increased capital charges. Softening reinsurance costs cumulatively added over 210 bps of ROE in our study since 2013; after the catastrophe losses of 2017, the reinsurance and capital markets will be closely watched for pricing signals.

The maps above and below show, in loss ratio points, the amount by which catastrophe experience exceeds the modeled average annual loss.
Adjusting combined ratios for expected versus historical catastrophe loss is an important step in distinguishing weather-related randomness from inadequately priced business. Historical catastrophes can distort measures of results at a state level, causing the noise to overwhelm the signal. While state-level adjustments can be significant, the 10-year nationwide experienced catastrophe loss ratio of 13% is meaningfully lower than the modeled expected catastrophe loss ratio of 23%.

2016 ended the dearth of hurricane activity that was a boon for Gulf Coast carriers for nearly 10 years. The Gulf states plus Florida had 30 points of favorable results relative to expected from 2007 through 2016, and, as of the time of this publication (even with Harvey and Irma), that favorable experience is more than 24 points of performance lift.

The five-year retrospective comparing catastrophe experience with modeled expectation is favorable for much of the country. States on the eastern slopes of the Rockies into the plains (including Colorado, Nebraska and Montana) experienced pain primarily from hail-driven losses in several of the last five years. Texas is an interesting case study because the lull in hurricane activity drives overall favorable experience, overwhelming thunderstorm losses that contributed a five-point drag on the loss ratio. The five-year averages reflect the period from 2012 to 2016. Across the country, the first two quarters of 2017 saw the highest thunderstorm-loss levels since 2011, and the third quarter included multiple major landfalling hurricanes. Taken together, these should partially erode the favorable experience of the previous five years.

The percentages in the map above show the direct target combined ratios necessary to fund reinsurance costs and allocated capital for retained risk by state, including catastrophe and non-catastrophe risk.
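The catastrophe adjustment described above amounts to stripping the experienced catastrophe component out of a loss ratio and substituting the modeled average annual loss. A minimal sketch, using the 13% experienced and 23% modeled figures from the text; the 55-point reported loss ratio is a placeholder, not a figure from the study:

```python
def cat_normalize(reported_lr, experienced_cat_lr, modeled_cat_aal_lr):
    """Swap the historical catastrophe component of a loss ratio
    for the modeled expected catastrophe loss (all in points)."""
    return reported_lr - experienced_cat_lr + modeled_cat_aal_lr

# 13 and 23 are the nationwide figures from the text; 55 is illustrative.
print(cat_normalize(55.0, 13.0, 23.0))  # → 65.0
```

Because the modeled expectation exceeds the recent experience nationwide, normalization pushes the adjusted ratio up, which is why the benign-catastrophe years flattered reported results.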
The targets are for a sample of nationwide companies only and will vary among individual companies because of state distribution of premiums, capital adequacy standards, target return on capital, allocation methodologies, reinsurance and other considerations. For a diversified insurer with a footprint similar to the industry’s, the target combined ratios fall into three main categories: (1) Florida, (2) other hurricane-exposed states and (3) states not materially exposed to hurricanes.

The map above shows average approved rate changes filed between January 2016 and August 2017 for the top 20 homeowners groups by state that made a filing in the period. Rate activity, while still positive, continues the slowdown observed in last year’s study. Notable decreases came from at least one large industry carrier, suggesting potential divergence in pricing levels that the averages fail to reflect. Rate changes on the coast, including Florida and Texas, ticked up significantly versus observations from last year. For Florida, in particular, rate activity was likely insufficient to on-level for the assignment-of-benefits and claims adjustment issues facing the state’s carriers.

See also: How to Drive More Quotes

Homeowners as a growth engine continues to be the headline for the insurance industry through 2016; the line has outpaced GDP and most other underwriting segments since 2010. Direct written premiums increased from $71 billion in 2010 to $91 billion in 2016, with a projected $93 billion for 2017 given prospective rate activity. A strong component of growth through 2015 was the emphasis on rate adequacy, with indicated rate levels increasing over 30% since 2010. Policyholders changing carriers will prevent the industry from realizing the full aggregate benefit of individual carriers’ rate actions. The “S” shape of the rate change curve suggests the line should be watched carefully.
The rate activity through 2015 is now fully earned, and rates since 2015 show more modest increases. Time will tell whether rate increases around 2% will be sufficient to track loss and expense inflationary pressures. Our study suggests that, at prospective 2018 rates and before income taxes, insurers keep slightly more than four cents of profit for every premium dollar they earn. The four cents of direct profit is shared among the primary carrier, reinsurance partners and the U.S. Treasury. The full report is available here.

Greg Heerde


Greg Heerde is head of Americas Analytics for Aon Benfield, a group of professionals that provides catastrophe modeling, actuarial, rating agency, consulting and other strategic and technical services to Aon Benfield clients. He has 20 years of experience in the insurance industry. Prior roles with Aon Benfield include developing and leading the rating agency advisory practice and managing director in the Aon Benfield securities corporate finance team. Prior to joining Aon, Heerde was a manager in the reinsurance team of the business assurance (audit) practice of PricewaterhouseCoopers.

Big Opioid Pharma = Big Tobacco?

The crisis calls for an investigation by Congress, lawsuits by individual states, counties and cities, class actions and more.

Have you noticed that big opioid pharma (BOP), the manufacturers and distributors of prescription opioids, is under attack? I have. In fact, I've written about it for a while. You can read “Suing Big Opioid Pharma - The Next Big Thing?” from 3/13/17, “780,069,272 Pain Pills” from 12/20/16, “Suing Big Opioid Pharma” from 9/27/16 and “Patients Sue Physicians and Pharmacists” from 5/22/15. As I've followed the strategic initiative, it reminds me of Big Tobacco. As a refresher, Big Tobacco was accused (informally, at first, and then collectively, over time) of knowing that tobacco was dangerous and addictive but keeping it a secret. In November 1998, attorneys general from 46 states entered into the Tobacco Master Settlement Agreement (MSA) with the four major tobacco companies:
The states settled their Medicaid lawsuits against the tobacco industry for recovery of their tobacco-related health-care costs, and also exempted the companies from private tort liability regarding harm caused by tobacco use. In exchange, the companies agreed to curtail or cease certain tobacco marketing practices, as well as to pay, in perpetuity, various annual payments to the states to compensate them for some of the medical costs of caring for persons with smoking-related illnesses. In the MSA, the original participating manufacturers (OPM) agreed to pay a minimum of $206 billion over the first 25 years of the agreement.
See also: Misconception That Leads to Opioids

Throughout 2017, I have saved every article I read on the subject of the opioid problem (see below). You are more than welcome to read the articles in full, but I think the dates (constant throughout the year), sources (a wide variety of publications) and headlines (provocative and descriptive) provide a sweeping perspective on the scope of this activity. The full scope of the opioid crisis includes an investigation by Congress; lawsuits by individual states, counties and cities around the country (and in Canada); collaboration among attorneys general; and class action lawsuits (and maybe others).

The initiation of most of this action is not academic; it is personal. Take Mike Moore, the former Mississippi attorney general who was the first to sue Big Tobacco using a then-unproven legal strategy. His nephew started with Percocet as prescribed by a doctor in 2006. By 2010, he was using street fentanyl. Moore saved his nephew from an overdose by taking him directly to the hospital.
As he’s watched the tobacco victory pay off in declining smoking rates, he’s also seen easy access to powerful pain medication spark a new deadly crisis. He’s convinced this is the moment to work the same mechanisms on the drug companies that forced the tobacco industry to heel — and he’s committed himself to making that happen. “It’s clear they’re not going to be part of the solution unless we drag them to the table.”
The primary argument against BOP is the same as the one against Big Tobacco: BOP knew the dangers of its products but misled consumers (in this case, prescribers) by purposefully obfuscating the truth.

See also: Opioids: Invading the Workplace

If you look at the evidence (anecdotal and factual), it appears as though there was a strategic effort to hide the truth. Of course, much of this is still alleged — not proven in a court of law — and BOP will have an opportunity to make its arguments. Except... In May, Purdue Pharma settled a class-action lawsuit in Canada for $20 million. But of course, settlements always include the language “no admission of guilt.” As I stated in a post:
$20 million (or 0.064% of OxyContin revenue) to settle? This is a rounding error for Purdue Pharma. But not to those who became dependent/addicted and lost anything from an active lifestyle to life itself. Fair and equitable? That was a rhetorical question — I don't believe it is either fair or equitable. Not so much the dollar amount, but the fact that it will not hurt Purdue at all in the pocketbook. If the goal of a lawsuit is to change behavior because it's too painful not to, then this probably didn't hit the mark.
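The "0.064% of OxyContin revenue" characterization can be sanity-checked by backing out the revenue figure it implies. The arithmetic below uses only the two numbers quoted above; the implied revenue is a derived illustration, not a reported figure.

```python
# Back out the revenue implied by a $20 million settlement
# described as 0.064% of OxyContin revenue.
settlement = 20_000_000
share = 0.064 / 100  # 0.064% expressed as a fraction

implied_revenue = settlement / share
print(f"${implied_revenue / 1e9:.2f} billion")  # → $31.25 billion
```

At that scale, the settlement really is a rounding error, which is the post's point.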
Whether you believe the opioid epidemic is real or not (I do), or whether you think at least some of the deaths from illicit street heroin and fentanyl are a consequence of over-prescribing prescription opioids (I do), I think we can all agree it's wrong for a company to tell its customers there is no danger when there really is (and when the company knows it). In this case, it can be deadly. So if BOP wants to know where this is heading, they just need to refresh their memories about what happened with Big Tobacco. What happened then is about to happen again.

Mark Pew


Mark Pew is a senior vice president at Prium. He is an expert in workers' compensation medical management, with a focus on prescription drug management. Areas of expertise include: abuse and misuse of opioids and other prescription drugs; managing prescription drug utilization and cost; and best practices for weaning people off dangerous drug regimens.

Is Shrinking a Growing Trend?

What does minimalism have to do with innovation? Everything, if you are in an industry like insurance that intersects with people’s stuff.

I have been fascinated hearing my futurist colleagues talk about the subject of minimalism as an emerging trend. While not new, the practice of deliberately ridding oneself of things that do not add real value to life is apparently gaining traction for a variety of reasons that point to the need to understand it better. Just the fact that I was fascinated by it suggests that I may be a minimalist myself and not even know it.

I read an article by Courtney Carver called “25 Reasons Why You Might Be A Minimalist,” and I saw myself in about 18 of them. The one that made me chuckle the most is, “If you can’t stop giving stuff away, and your dog is afraid he might be next, you may be a minimalist.” I don’t have a dog, which is another sign, but my husband expressed a similar concern to me last year after I spent my entire summer sabbatical finding ways to reduce the amount of stuff in our lives; he wanted to make sure he was not on the list.

Minimalism, which might sound like a religion or cult, is sometimes described as having fewer than 100 things in your life. However, it is more of a mindset than a set of strict rules. For example, Steve Martin’s character in the 1979 movie “The Jerk” has a memorable scene in which he is running away from home and says, “I don’t need anything. Except this.” (He picks up an ashtray.) Then he gradually picks up several other items and says, “The ashtray, this paddle game and the remote control,” and so on, until he is laden with stuff that was actually useless but to him was important.

See also: Innovation: ‘Where Do We Start?’

So, what does minimalism have to do with innovation? Everything, if you are in an industry that intersects with people’s stuff. Let’s see, that would be insurance, investments, banking, real estate, education, entertainment, food, packaged goods, appliances, furniture, clothing, media, pets… what isn't included?
When an emerging trend intersects with your business and there is a potential new need that your capability can fulfill, that’s called innovation “white space.” New growth opportunities are generally not obvious, so to find them we must triangulate emerging trends/drivers of change with basic human needs that never change, plus the current and potential capabilities of the company. This is the fuzziest of the fuzzy front end of innovation. To get a better idea of how this is done, try this exercise: Pick a trend, such as minimalism, and:
  1. Choose a human need, such as the need for freedom, stability, simplicity or control (refer to MD Trend Framework Wheel).
  2. Brainstorm many possible future states around how the trend and the need would potentially affect the core of your business competency or provide opportunity.
Here are examples from four different industries. If you cross:
  1. Real Estate: With the need for simplicity, you may imagine that more people in the future may live in tiny homes. If tiny homes are not considered real estate, how would that future hurt you or provide opportunity?
  2. Insurance: With the need for control, you may imagine that people might scrutinize the value of insurance from every direction and expect ultimate transparency. How do products and experiences stack up?
  3. Banking: With the need for stability, you may imagine that more people in the future may pay off their debt faster than they are required to. How does that change your business model?
  4. Education: With the need for freedom, you might imagine that fewer students opt for higher, specialized degrees and instead want real-world experience. If your institution relies on the predictable growth of the student body, how will you lean in?
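Mechanically, the exercise above is a cross-product: every (trend, need, industry) combination is a candidate brainstorming prompt. A toy sketch using the article's own examples; the lists and the prompt wording are illustrative, not part of any formal method.

```python
from itertools import product

trends = ["minimalism"]
needs = ["freedom", "stability", "simplicity", "control"]
industries = ["real estate", "insurance", "banking", "education"]

# One brainstorming prompt per combination of trend, need and industry.
prompts = [
    f"How might {trend} plus the need for {need} reshape {industry}?"
    for trend, need, industry in product(trends, needs, industries)
]
print(len(prompts))  # → 16
```

Even one trend and a handful of needs and industries yield 16 distinct starting points, which is why the next step is de-risking the choice among them rather than generating more.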
These are just a few examples. You certainly could, however, imagine many more permutations, combinations and scenarios. So how do you know which ones to bet on? While nothing is guaranteed, there are ways to de-risk your choices through the application of different types of research, understanding what your organization will support and developing your team’s consumer-focused insight and design skills. See also: Can Insurance Innovate?   If you would like to learn more about this topic and how to de-risk the process, please register to attend, or receive the replay of, a WebEx hosted by LOMA and sponsored by LexisNexis Risk Solutions. In the meantime, clean out your closet and see how it makes you feel!

Advanced Telematics and AI

Two award winners exemplify how the use of technology like telematics and artificial intelligence is maturing in the insurance industry.

Solution providers are quickly advancing innovative approaches to the next generation of emerging technologies that are affecting both insurers and policyholders. SMA has acquired a unique historical view of how solution providers are innovating within insurance through our annual SMA Innovation in Action Awards – and we have seen how that innovation has changed over the six years of the program. The two winners of this year’s SMA Innovation in Action Awards for Solution Providers are excellent examples of how the use of technology like telematics and artificial intelligence (AI) is maturing in the insurance industry.

TrueMotion’s smartphone usage-based insurance (UBI) product combines mobile technology, machine learning and data science to help insurers better understand and mitigate risk, acquire more profitable customers and increase customer loyalty, as well as give individual users feedback on their driving habits. TrueMotion’s solution illustrates how solution providers are expanding the possibilities of existing technologies. Telematics is firmly established as an insurance technology and continues to spread from personal to commercial lines of business. TrueMotion’s innovation lies in the refinements it has made and in how it is using telematics, sensors and mobile technology to effect measurable improvements in drivers’ behavior. Not only does TrueMotion use the sensors in a user’s own smartphone in place of an on-board diagnostics (OBD) device, the company can determine when a driver is actually behind the wheel based on the position of the phone. This solves a weakness in OBD-based telematics: how to link good or bad driving habits to a specific driver rather than a specific vehicle, while still excluding trips that driver makes as a passenger.

See also: Strategist’s Guide to Artificial Intelligence

The sophistication in data collection and the advanced analytics on the back end allow TrueMotion to give drivers immediate feedback for more effective behavioral modification.
TrueMotion also takes an innovative approach in monitoring how the user interacts with her phone while driving, allowing for real-time alerts if distracted driving reaches dangerous levels.

Solution providers are also moving beyond emerging technologies in the physical domain, like connected vehicles and drones, to those of the virtual realm, like artificial intelligence, cognitive computing and blockchain. These technologies have been slower to gain a foothold in the insurance world, partly because of a slower development timeline and partly because their insurance applications were still to be determined. That is already changing. Today, interest in artificial intelligence and machine learning, in particular, has skyrocketed.

Intellect SEEC’s Intellect Risk Analyst is AI-based risk discovery and assessment software for the commercial insurance industry. It aggregates data from more than 800 structured and unstructured data sources and applies rules set up by the underwriter. Machine learning analyzes the resulting underwriting decisions and refines the software's data aggregation and analysis capabilities in response. This solution gives insurers an accessible way to apply big data and AI to their existing transactional processing – in this case, underwriting. Integrating big data with advanced AI technologies gives underwriters access to new information and insights on the proposed risk. These new insights for risk assessment and pricing, together with the continuous improvement of data aggregation through machine learning, are powerful assets for underwriting.

See also: It’s Rush Hour in Telematics Market

In these continually changing times, it becomes critical for all members of the insurance ecosystem to collaborate on new ways of doing business. Technologies that are focused on improving risk and harnessing the power of AI and machine learning, for example, showcase how resources and technology are interacting to change the insurance landscape.
Innovation is flourishing across the insurance industry, and there are still great opportunities ahead. As solution providers continue to grow their use of advanced technologies into new territory, the potential value of their partnership increases for insurers. We expect to see solution providers’ innovations play a key role in the transformation of the insurance ecosystem.

Karen Furtado


Karen Furtado, a partner at SMA, is a recognized industry expert in the core systems space. Given her exceptional knowledge of policy administration, rating, billing and claims, insurers seek her unparalleled knowledge in mapping solutions to business requirements and IT needs.

'Close Enough' Isn't Good Enough

Hyper-precise location data is needed. Without it, 5% to 10% of homeowners and auto policies are mispriced -- by as much as 87%.

Insurers stake their businesses on their ability to accurately price risk when writing policies. For some, faith in their pricing is a point of pride. Take Progressive. The auto insurer is so confident in the accuracy of its pricing that it facilitates comparison shopping for potential customers—making the bet that it can afford to lose a policy another insurer has underpriced, effectively passing off riskier customers to someone else’s business.

A number of data points go into calculating the premium of a typical home or auto insurance policy: the claim history or driving record of the insured; whether a security system like a smoke or burglar alarm is installed; the make, model and year of the car or the construction of the home. Another contributing factor, of course, is location, whether because of an area’s vehicle density or crime statistics or homes’ distance from a coastline. Insurers pay close attention to location for these reasons, but the current industry-standard methods for determining a location—whether by zip code or street-segment data—often substitute an estimated location for the actual one. In many cases, the gap between the estimated and actual location is small enough to be insignificant, but where it’s not, there’s room for error—and that error can be costly.

Studies conducted by Perr&Knight for Pitney Bowes looked into the gap between the generally used estimated location and a more accurate method, to find out what impact the difference had on premium pricing. The studies found that around 5% of homeowner policies and a portion of auto policies—as many as 10% when looking at zip-code-level data—could be priced incorrectly because of imprecise location data. Crucially, the research discovered that the range of incorrect pricing—in both under- and overpriced premiums—could vary significantly.
And that opens insurers up to adverse selection, in which they lose less-risky business to better-priced competitors and attract riskier policies with their own underpricing.

Essentially, this report discusses why a “close enough is good enough” approach to location in premium pricing overlooks the importance of accuracy—and opens insurers to underpricing risk and adverse selection. The first part of this paper discusses the business case for hyper-accurate location data in insurance, before going into more detail on the Perr&Knight research and the implications of its findings, as well as considerations when improving location data. It concludes with a few key takeaways for insurers going forward. We hope you find it constructive and a good starting point for your own discussions.

The Business Case for Better Location Data

Precise location data helps insurers realize increased profits by minimizing risk in underwriting, thereby reducing underpricing in policies. These factors work together to improve the overall health of the insurer’s portfolio.

“The basic, common-sense principle is that it’s really hard to determine the risk on a property you’re insuring if you don’t know where that is,” says Mike Hofert, managing director of insurance solutions at Pitney Bowes. “Really, the key question is, how precisely do you need to know where it is? If you’re within a few miles, is that close enough?”

While most of the time, Hofert says, the answer might be yes—especially for homes in major hurricane, landslide or wildfire zones, because those homes all have a similar location-based risk profile—it’s not always the case. Where it’s not, imprecise location data can have costly consequences. “There are instances where being off by a little bit geographically turns into a big dollar impact,” he says.

See also: Competing in an Age of Data Symmetry

Currently, industry-standard location data for homeowners policies typically relies on interpolated street data.
That means that streets are split into segments of varying length, and homes within a segment are priced at the same risk. However, explains Jay Gentry, insurance practice director at Pitney Bowes, the more precise method is to use latitude and longitude measured at the center of the parcel, where the house is. That can be a difference of a few feet from the segment, or it can be a difference of 500 feet, a mile or more. “It just depends on how good the [segment] data is,” Gentry says.

And that flows into pricing, because when underwriters can more accurately assess the risk of a location—whether it’s where a home is located or where a car is garaged—policies can be priced according to the risk that location actually represents.

It’s tempting to look at the portion of underpriced policies and assume that they’re zeroed out by the overpriced policies an insurer is carrying, but Gentry says that’s the wrong way to look at it—it’s not a “zero sum” game. “If you really start peeling back the layers on that, the issue is that—over a period of time—it rots out the validity of the business,” he says. “If you have an over- and underpriced scenario, the chances are that you’re going to write a lot more underpriced business.”

A key point here is reducing underpricing because, when the underlying data leads to policies that are priced at a lower rate than they should be, not only does it open an insurer up to paying out on a policy it hasn’t received adequate premiums for, but underpriced policies may also end up constituting a larger and larger portion of the overall book. This is essentially adverse selection. Michael Reilly, managing director at Accenture, explains that if the underlying pricing assumptions are off, then a certain percentage of new policies will be mispriced, whether at too high or too low a rate.
“The ones that are overpriced, I’m not going to get,” he says, explaining that the overpriced submissions will find an insurer that more accurately prices at a lower rate. “The ones that are underpriced, I’m going to continue to get, and so, over time, I am continuing to make my book worse,” he says. “Because I’m against competitors who know how that [policy] should be priced correctly, my book will start to erode.”

And, if a policy is seriously underpriced, losses could easily outweigh all else. Gentry recalls the example of an insurer covering a restaurant destroyed in the Tennessee wildfires in 2016, which it had underpriced because of an inaccurate understanding of that location’s susceptibility to wildfire. “The entire block was wiped out by the wildfire, and [the insurer] had a $9 million claim that they will never recoup the loss on, based upon the premiums.”

The Value of Precision

Perr&Knight is an actuarial consulting and insurance operations solutions firm, assisting insurers with a range of activities including systems and data reporting, product development and regulatory compliance. It also commonly carries out research in the insurance space, and Pitney Bowes contracted it to conduct a comparison of home and auto policy pricing using industry-standard location data and its Master Location Data set. We spoke with principal and consulting actuary Dee Dee Mays to understand how the research was conducted and what it found. The following conversation has been edited for clarity and length.

How was each study carried out, and what kinds of things were you looking to find?

On the homeowners side, we looked at the geocoding application versus the master location data application. And on the personal auto side, we looked at three older versions, called INT, Zip4 and Zip5, and we compared those results with the master location data result.
In both cases, we selected one insurance company in one state—a large writer—and had Pitney Bowes provide us with all of the locations in the state. For homeowners, they provided us with a database of single-family, detached home addresses and which territory each geocoding application would put an address in. They provided us with that database, and then we calculated what the premiums would be based on those results and how different they would be, given the different territory that was defined.

For both cases, we picked a typical policy, and we used that one policy to say, “Okay, if that policy was written for all these different houses, or for a vehicle with all these different addresses, how much would the premium differ for that one policy under the various systems?”

And what did you find?

What we found [for homeowners] was that 5.7% had a change in territory. So, almost 94% had no change under the two systems. It's coming down to the 5% that do change. I think what is more telling is the range of changes. The premium could, under the master location data, either go up 87% or go down 46%. You can see that there's a big possibility for a big change in premiums, and I would say that the key is, if your premium is not priced correctly—if your price is too high compared with what an accurate location would give you—you are probably not going to write that risk. [If] someone else was able to write it with a more accurate location and charge a lower premium, the policyholder would say, “Well, I want to go with this lower premium.”

See also: Location, Location, Location – It Matters in Insurance, Too

So, you're not going to get the premium that's too high, but if you're using an inaccurate location and you come up with a lower premium than an insurer that was using an accurate location, you are more likely to write that policyholder.
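To put those percentages in dollar terms, here is a minimal sketch in Python. The $1,000 annual premium is an assumed round number for illustration; only the +87% and -46% extremes come from the study.

```python
# Illustrative only: a hypothetical $1,000 annual premium, combined
# with the extreme swings the Perr&Knight study reported (+87%/-46%).
base_premium = 1000.0

# If accurate location data would raise the premium 87%, the policy
# is currently underpriced by that difference.
corrected_up = base_premium * 1.87
underpricing_gap = corrected_up - base_premium

# If accurate data would lower the premium 46%, the policyholder is
# overcharged and is likely to defect to a better-priced competitor.
corrected_down = base_premium * (1 - 0.46)
overpricing_gap = base_premium - corrected_down

print(f"underpriced by ${underpricing_gap:,.2f} per year")  # about $870
print(f"overpriced by ${overpricing_gap:,.2f} per year")    # about $460
```

On a single policy, the extremes amount to hundreds of dollars a year in either missed premium or lost business.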
The studies were conducted based on policies for homeowners in Florida and vehicle owners in Ohio; what kinds of conclusions can we draw about policies in other states?

I think it really depends on what the individual insurance company is using to price its policies. One [example] is that it’s now more common in states like California, Arizona, even Nevada, for companies to have wildfire surcharges—and they determine those based on the location of the property. So it’s definitely applicable in another state like that, because any time you’re using location to determine where the property is and you have rating factors based on the location, you have the potential that more-accurate data will give you a better price for the risk that you’re taking.

Putting a Plan in Place

Michael Reilly works at Accenture with the insurance industry and advises underwriters on pricing efficiencies; he also works with Pitney Bowes to educate insurers about location data and its potential to affect the accuracy of premium pricing. We talked to him about Perr&Knight’s findings and the impact that more precise location data can have on pricing. The following conversation has been edited for clarity and length.

Given the finding that more than 5% of policies can be priced incorrectly due to location, what's the potential business impact for insurers?

It’s a very powerful element in the industry when your pricing is more accurate, when you know that you’ve priced appropriately for the risk that you have. And when there’s this leakage in here, you’ve got to recognize that the leakage isn’t just affecting the 5% to 6% of policies. That leakage, where they’re underpriced, has to be made up from an actuarial discipline. So that underwriting leakage is actually spread as a few more dollars on every other policy that’s in the account. That jacks up all their pricing just a little bit, and it makes them a little bit less competitive.
If their pricing is more accurate, that improves the overall quality of their book and improves their ability to offer better pricing throughout their book.

What are some of the reasons insurers have been slow to act on improving location data?

I think it’s coming from multiple elements. With anything like this, it’s not always a simple thing. One thing is, there are carriers that don’t realize it—don’t realize there is an opportunity for better location [data] and how much that better location [could] actually contribute to their pricing.

The second is that the lack of awareness is twofold, because it’s also a lack of awareness by the business. Typically, data purchases are handled either by procurement or by IT, and the business doesn’t think about the imprecisions in its data. They just trust that the data they get from their geolocation vendor is good, and they move on with life.

The other piece is the fact that replacing a geospatial location source is not [a matter of] taking one vendor [out] and plugging in a new one, right? We have all these policies that are on the books, and I’ve got to figure out how I handle that pricing disruption so I don’t lose customers that are underpriced. I want to manage them through it. I need to look at how I’m pricing, and actually go look in my filing. Do I have to refile because I have a change in rate structure, or does my filing cover the fact that I replaced it with an accurate system? So, I need to look at a couple of different things in order to get to the right price.

And then, quite frankly, once they open the covers on this, it also starts to raise other questions of, “Oh, wait a second.” If this data element is wrong or this data element can be better, which other data elements can be improved, or what new data elements can be considered?
Could the fire-protection score be changed, or average drive speed be used? That’s why we’re starting to talk to carriers and say we might as well look for the other areas of opportunity, as well, because we probably have more leakage than just this. This is the tip. It’s very easily identifiable, very easily measurable, but it’s probably not the only source of leakage within your current pricing.

See also: 10 Trends on Big Data, Advanced Analytics

What we’re trying to help [insurers] do is say, look, if you’re going to purchase this new data, let’s make sure that we have a plan for how we’re going to get in and start to achieve the value relatively quickly. In most cases, if it’s a decent-sized carrier, we know they’re issuing X number of wrong quotes per day because of not having the right location information. So how do we fix this as fast as possible, so we’re not continuing to make the problem worse?

And when you say realizing value quickly, what would be a typical timeline?

There are a couple of elements that will come into play. If someone has to do a refiling, the refiling itself will take a period of time. Assuming they don’t have to do a refiling—and not in all cases will they need to—and depending upon their technology, if they can immediately switch geolocations for new business and post-renewals, then you can do that in a very, very short window. At least start to make sure that all new quotes are priced correctly. Then the question comes in as to how you want to handle renewals—whether you want to spread the pricing increase over one year or two years, or along those lines. That usually takes a little bit more time to implement within a system, but probably not a significantly long period—only a couple of months, and then a year to run through your entire book to fully realize the value. Now, if you have to do a filing, all of that could be delayed by X number of months.
Key Considerations

Given that location has a material impact on premium pricing, the onus is on insurers to have the most accurate location data available. Those that do will have a competitive advantage over those that don't. Keep in mind the following considerations:
  • "Close enough" is not always good enough. Even though location is close enough most of the time, imprecision can have big costs when it masks proximity to hazards.
  • The portion of policies affected may be small, but it can have big cost impacts. The range of under- and overpricing varied widely, with some premium pricing off by more than $2,000. And, as Michael Reilly points out, the impact of underwriting leakage is actuarially spread across the entire portfolio, making premiums incrementally less competitive.
  • Underpricing is not "zeroed out" by overpricing. In fact, underpricing opens insurers to adverse selection, in which overpriced policies are lost to more accurately priced competitors, and underpriced policies make up a greater proportion of the business.
  • Time to value can be quick – and new rate filings are not always needed.
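The adverse-selection dynamic in these takeaways can be sketched in a few lines of Python. Everything in this toy model is assumed for illustration: the book size, the roughly 6% mispriced share, the mispricing range and the stylized shopper behavior (overpriced policies defect, underpriced ones stay) are invented, not drawn from the studies.

```python
import random

random.seed(42)  # deterministic, for the example

TRUE_PREMIUM = 1000.0    # what each policy "should" cost (assumed)
BOOK_SIZE = 10_000
MISPRICED_SHARE = 0.06   # roughly the ~5-6% of policies cited above

# Quote every policy; a mispriced one is charged 54%-187% of true cost.
book = []
for _ in range(BOOK_SIZE):
    if random.random() < MISPRICED_SHARE:
        charged = TRUE_PREMIUM * random.uniform(0.54, 1.87)
    else:
        charged = TRUE_PREMIUM
    book.append(charged)

# Stylized behavior: overpriced policies defect to competitors with
# accurate pricing; underpriced policies stay on the books.
retained = [p for p in book if p <= TRUE_PREMIUM]
shortfall = sum(TRUE_PREMIUM - p for p in retained)

print(f"retained {len(retained)} of {BOOK_SIZE} policies")
print(f"annual premium shortfall: ${shortfall:,.0f}")
```

The retained book collects less premium than its risk warrants for every year the mispricing persists, which is the "not a zero-sum game" point: the overpriced half of the error walks away, while the underpriced half compounds.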
You can download the full report here.

Lynda Brendish


Lynda Brendish is a contributor for Forbes Insights, who has written extensively about location intelligence, customer engagement, insurance and data and analytics. In addition to her work with Forbes Insights, her writing has also appeared in a number of other publications, including The Guardian, LA Times, LA Weekly, Marie Claire Australia and Delta Sky magazine.

Roadblocks to Good Customer Relations

A good customer relationship management (CRM) system makes it possible to “keep it personal” while providing superior service.

|
For many small to medium-sized insurance carriers, government risk pools and captives, providing personalized customer service continues to be a priority. If your organization or your partners leverage digital technologies, you may have customers who expect that personal touch across channels. For these carriers, having a solid customer relationship management (CRM) strategy is a priority. And while in theory a strategy is a great start, execution on that strategy often hits a couple of roadblocks, especially when it comes to finding the right technology to organize and standardize records related to better management of customer service, marketing and sales. After all, customer data is a lifeline to success for any business, but not having all of your customer data in one easily accessible place is a challenge commonly faced by insurers on a growth path.

See also: Yes, Personalize — but Get it Right!

This is complicated when insurers’ functional business units operate in silos. For example, consider this workflow scenario: If underwriting can’t access a policyholder’s payment history or other financial records held in accounting, underwriting must email the accounting department to obtain them. Meanwhile, the customer, impatient for his quote, calls the carrier and is connected to a customer service representative who should be able to view all customer transactions, interactions, renewals, cancellations and other changes being made to the policyholder record, yet is unable to view data that reflects any issues behind the underwriter’s delays.
Another roadblock relates to a common complaint among small to medium-sized insurers with limited or frozen budgets—the feeling by employees (users) of having to “do more with less.” Here we have a difficult and potentially negative cycle: If the insurer is operating with outdated technologies and processes and its spreadsheets and email platforms are overwhelmed by a growing customer database, the employee is unable to meet the customer’s needs and, over time, experiences burnout. The customer, meanwhile, is already shopping for another insurance carrier.

For companies responding to these challenges by moving beyond a customer service excellence strategy and on to actual execution of a solution, an integrated CRM system is the next logical step. This type of technology puts the company in control—and requires rethinking existing processes and creating process efficiencies. The inclusion of collaboration tools in the CRM helps make this task possible and creates a “team” effect even with the smallest of customer service departments.

By their nature, CRM systems are rules-based, so customer data and records can be made available to the employees who need them, when they need them. For example, consider the importance of receiving an automated alert of a policyholder suspension, which triggers an audit trail, or the ability to build out custom fields to include additional categories, contact types based on demographics, channel partner status and more. The CRM should automate contacts, quotes, sales, tasks, calendar scheduling and more. But remember, this data automation doesn’t take place in a vacuum; it needs to be insurer-driven and should map to the policyholder’s unique requirements. It also should map to the distribution channel’s requirements—yet another source of critical customer data and the key to a better understanding of the policyholder’s existing status and changing needs.
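As a rough sketch of the rules-based behavior described above, the Python below shows how a suspension event might automatically raise an alert and write to an audit trail. The event name, the single rule and the data structures are invented for illustration and do not reflect any particular CRM product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class AuditTrail:
    """Append-only log of (timestamp, event, detail) records."""
    entries: list = field(default_factory=list)

    def record(self, event: str, detail: str) -> None:
        self.entries.append((datetime.now(timezone.utc), event, detail))

def on_policy_event(event: str, policy_id: str,
                    trail: AuditTrail) -> Optional[str]:
    """One hard-coded rule: a suspension triggers an alert and an audit entry."""
    if event == "policyholder_suspended":
        trail.record(event, f"policy {policy_id}: alert sent to service team")
        return f"ALERT: policy {policy_id} suspended"
    return None  # other events pass through silently

trail = AuditTrail()
alert = on_policy_event("policyholder_suspended", "HO-1234", trail)
print(alert)  # ALERT: policy HO-1234 suspended
```

The point of the rule-plus-audit-trail shape is that the alert and the record are produced by the same event, so the service representative and the compliance log can never disagree about what happened.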
See also: Distribution: About To Get Personal

Let’s face it, digital technologies are here to stay and can provide a powerful means to interact with a growing customer base. For small to mid-sized insurers on a budget, an integrated CRM system—once only an expensive pipe dream—today can be a reality. As your company grows and you have more policyholders than you can relate to personally, a CRM system makes it possible to “keep it personal” while providing superior customer service.

Jim Leftwich


Jim Leftwich has more than 30 years of leadership experience in risk management and insurance. In 2010, he founded CHSI Technologies, which offers SaaS enterprise management software for small insurance operations and government risk pools.

Misconception That Leads to Opioids

We physicians are not applying the right treatment to the right patient to the right body part at the right time.

No physician wants to create an addicted patient. In almost all cases, they simply want to mitigate patients' pain. Good intentions with a bad strategy. The breakdown in the system stems from a poor understanding of pain and how to diagnose and classify it correctly. In effect, you have to match the treatment to the patient's condition, which means you need to possess a reliable method of diagnosing pain.

Human beings experience three types of pain:
  1. Thermal pain -- quite rare, and produced only in very ill and systemically sick patients.
  2. Chemical or inflammatory pain -- pain that is mediated through a release of chemicals at a site of injury. This pain lasts five to seven days, occurs when trauma happens and is present in only 2% to 5% of all patients in pain.
  3. Mechanical pain -- pain that is mediated by distortion of or pressure on tissue. This accounts for more than 90% of all pain that humans experience.

Bend your finger back as far as you can until pain is produced, and you have just experienced mechanical pain in its purest form. A bulging or herniated disc in 95% of all patients produces pain because the wall of the disc is being distorted or strained, just like your finger was when it was hurting.

See also: Opioids: Invading the Workplace

You can't treat mechanical pain with a chemical intervention (pills and injections). You can't treat chemical pain with a mechanical intervention. Makes sense, right? The problem is that we have a system built around using chemicals to manage pain and providers who receive less than two weeks of education in medical school on how to adequately assess and diagnose patients in this space. The evidence is overwhelming.
There are dozens of studies that show little influence on back or joint-related pain (less than one point on a 10-point pain scale, and that's in only 30% of the cohort) when using opioids, analgesics, muscle relaxants and steroids, yet every PCP and specialist in the land has them as the first stop for MSK (musculoskeletal) patients. When the simple analgesics and muscle relaxants don't work, the next step is to escalate to opioids. Numerous studies show that less than 5% of patients experience any change in back pain when epidural steroids or transforaminal injections are used to put the medicine at the supposed source of the symptom.

Why are these studies struggling to find a treatment effect on patients in pain, with some of the best-trained examiners and physicians in the world conducting them? It's simple. We don't train providers to assess patients in a reliable way and to match chemical patients with chemical interventions and mechanical patients with mechanical interventions (surgery and movement-based strategies).

See also: 6 Shocking Facts on Opioid Abuse

90% of opioids are prescribed for back or chronic joint pain. The solution to the crisis is to teach providers to reliably sub-group patients into their appropriate pain group. Mechanical patients get only mechanical solutions, and chemically dominant or inflammatory patients get chemical treatment. Our failure to do this has allowed us to continue to use treatment methods long ago determined to be ineffective in this population and also forces providers to become inventive. We blame the patient; we claim they are gaming the system; we think the problem is psychosomatic or a construct in their mind -- when in reality we are not applying the right treatment to the right patient to the right body part at the right time.

Chad Gray


Chad Gray has been a clinical practitioner for two decades and is a widely recognized entrepreneur, health-benefit design consultant and concierge practitioner, focused on groundbreaking innovations in musculoskeletal triage, health care and self-care. He is a thought and practice leader in group health, workers’ compensation and disability outcomes optimization, and he has a proven track record of performance improvements in health benefits design, clinical residency programs, employer-based clinics, primary care practices, orthopedic triage facilities, sub-acute rehabilitation centers, skilled nursing facilities and physical therapy clinics.

A Test Case on Sanity of Drug Prices

A new Hepatitis C drug is better for the patient, more effective and costs far less. What will it say if the drug doesn't succeed?

In both traditional healthcare and pharmaceuticals, the phrase “value-based purchasing” is all the rage. Rightfully so: We want to spend our precious healthcare dollars on the care that is most valuable. In other words, we want to pay for care and drugs that are effective and not pay for those that aren’t. Like everything else, the shortest path to value is a truly competitive market. The gorilla in the room is that healthcare, and especially pharmaceuticals, severely lacks this fundamental capitalist feature that we have benefited greatly from.

American healthcare dwells in never-never land. We have neither explicit price controls through regulation nor implicit controls through a functional market, resulting in the worst of all possible worlds: a system that’s entrenched, opaque and dysfunctional. It gets worse when we narrow our focus to the drug market. We don’t even understand what it is that we are purchasing, because buyers neither spend much time understanding drug effectiveness in the real world nor tie effectiveness to payment.

Instead, in an attempt to save dollars, employers, health plans and the government have turned to intermediaries—pharmacy benefit managers—to manage the problem on their behalf. PBMs’ efforts to manage pharmacy costs rely on typical buzzwords like “formulary management,” “prior authorization” and “step therapy.” And PBMs are, as Bloomberg News explains, “the middlemen with murky incentives behind their decisions about which drugs to cover, where they’re sold and for how much.”

See also: 9 Key Factors for Drug Formularies

This leads us down an unintelligible labyrinth of perverse financial incentives, with zero transparency for the payer or patient on the actual costs, alternatives for therapy and individual outcomes. That’s a problem especially in specialty pharmacy, the fastest-growing sector of pharmacy spending. Only a few years ago, specialty drugs composed a reasonable-sounding 10% of our overall drug spending.
Last year, it bloated to 38%, and by 2018 it will be an astounding 50%, which is an increase of $70 million a day! Contrary to what we often think, there are better options even for many specialty drug therapies. Mavyret, manufactured by AbbVie, is the first new brand-name Hepatitis C drug that is actually better for patients and costs far less since Sovaldi hit the market at a price point of $1,000 a pill (never mind that you can purchase it for $4 per pill in India). Eighty percent of patients with Hep C can do an eight-week course versus alternatives manufactured by companies like Gilead and Merck, which generally require 12 weeks. Mavyret is the only drug that works for genotypes 1-6, and it has a list price that is less than half of what competitors charge, even after factoring in middleman shenanigans such as rebates. The final cost to cure a patient of Hep C is approximately $26,000. If that sounds high, consider that specialty medications for chronic conditions such as psoriasis are now $60,000 to $120,000 or more per year.

If you’re like most payers, our current system locks you into paying more for drugs for your members that are less effective than proven, cheaper alternatives like Mavyret. For starters, your PBM may only provide more expensive drugs on its formulary because of large manufacturer rebates, the majority of which it retains. Formulary decisions, of course, are not based on what is most effective for the patient or cheapest for you, the payer. We feel the financial pain of this broken system every day, but it doesn’t have to be this way.

Two decades ago, the internet revolution made the travel agency obsolete for most Americans. Uber and Lyft have done the same to parts of the transportation industry, and Amazon continues to do this to many others. What have these disruptive innovations taught us? That we might, in fact, be able to make better decisions ourselves, without non-value-added middlemen.
It is time for this type of disruptive innovation to hit the pharmacy world. Today’s system focuses on controlling suppliers through PBMs, which in reality just limit our choices and prevent the functioning of a real market. Instead, if we were to focus on value, we could use patient data to give us an objective understanding of whether the patient was getting the right outcome at the right price. This scenario represents an opportunity for better health outcomes and savings compared with the status quo. Here’s the catch: To enter this world, we have to start saying “no” to the current "travel agents" and their obsolete model.

See also: Opioids: Invading the Workplace

In many ways, Mavyret is the canary in the coal mine. If this drug isn’t successful – we know it is better for the patient, more effective and costs less – what signal does that send pharmaceutical companies? Don’t bother discovering better drugs that cost less, because they won’t sell! We salute AbbVie for doing what is right for patients and payers. America is the leader in driving innovation and investment in new drug discovery, and our inability to make the right choice reduces therapy choices not only for millions of Americans and their physicians but also for billions of others around the world who depend on us for leadership. Now is the time for payers to demand a functional market and stop overpaying for less effective therapeutic options.

Pramod John


Pramod John is the founder and CEO of VIVIO Health, a startup that’s solving out-of-control specialty drug costs, a vexing problem faced by self-insured employers. To do this, VIVIO Health is reinventing the supply side of the specialty drug industry.

Creative Ideas Needed for Solving Opioid Epidemic

Journalists are going to be naming names on the opioid crisis and telling the stories of those killed or crippled. McKesson, Cardinal Health and AmerisourceBergen are already being singled out.


The opioid crisis in the U.S. burst into full view over the weekend, based on an investigation by the Washington Post and "60 Minutes" showing that drug distributors co-opted a few members of Congress to pass a law that, beginning a year and a half ago, neutered any attempts by the Drug Enforcement Administration to halt even wildly suspicious shipments of the narcotics. This, even though the opioid crisis has already claimed more than 200,000 lives and created addictions that wrecked far more, in what the Post calls "the deadliest drug epidemic in U.S. history." (Here is the main Washington Post article and a transcript of the "60 Minutes" piece, including an interview with a whistleblower.)

Already, Rep. Tom Marino (R-PA), one of the three members of Congress singled out for pushing the legislation, has withdrawn as the nominee to be the drug czar in the Trump administration, and this story feels like it has legs. The president, who said in early August that he would declare opioids a national crisis, now says he will do so next week, adding official impetus to what will surely be a major effort among journalists.

When I was an editor at the Wall Street Journal, someone once described the ultimate story by an investigative journalist. It would begin: "There are a lot of bad people in the world. Here are their names...."

Well, journalists are going to be naming names on the opioid crisis and telling the stories of those killed or crippled. McKesson, Cardinal Health and AmerisourceBergen are already being singled out as the three biggest drug distributors, but there will be many more.

That coverage will create a platform for the many in the insurance world, especially workers' comp and healthcare, who have been sounding the alarm on opioids. We at ITL have been supportive, most notably in a manifesto back in February by Joe Paduda. Warning: "Pill-pushers" is the nicest term he uses to describe the drug distributors, which he says should have to pay to solve the crisis that he believes they created. If you're interested, search on "opioids" at the website, and you'll find many more articles, on various aspects of the problem.

We will now move into high gear, to try to take the opportunity to make headway on this huge problem. We welcome any thoughts you'd like to publish with us and will do all we can to help spread the word on ways to attack the crisis.

More generally, you'll also see us focus more on healthcare. I never believed that Washington, as dysfunctional as it is, would come up with some wonderful, clean solution to health insurance, but the drama needed to play out. Now that it has, we'll be publishing more pieces on ways that the private sector can both improve care and tame costs. The problem is daunting, but I assure you, just based on conversations I'm having, that an awful lot of smart people have a huge number of creative ideas. We'll bring as many as we can to the fore.

Cheers,

Paul Carroll,
Editor-in-Chief


Paul Carroll


Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.