5 Top Challenges Carriers Face In A Rapidly Changing Industry

Are We In A Hard Market?
According to MarketScout, the average property/casualty rate increased by 5% from 2011 to 2012, with this same upward trend continuing into 2013. And yet, just last week the Council of Insurance Agents & Brokers (CIAB) reported that rate increases in the second quarter did not keep pace with the previous two quarters. CIAB believes the hardening market is moderating. Fitch Ratings also weighed in, noting that the increased premiums in Q1 and Q2 are helping, but they “believe this trend is likely to diminish as strong capital levels and ample underwriting capacity promote market competition.” However, it’s unlikely that short-term price increases will be enough to make up for several years of pricing inadequacy. The reality is that, while carriers are seeing much lower investment returns, total surplus dollars are increasing relative to total premium. This dynamic makes sustaining a hard market difficult, and pricing competition for the best risks therefore continues to be fierce.

From an underwriting perspective, there’s some good news to report as well as mixed results when you look more closely at specific lines of business. Overall, the Property & Casualty market saw improvement in underwriting performance, with combined ratios falling to 103.2 in 2012 from 108 in 2011. Within specific lines of business, workers’ compensation also improved, to a combined ratio of 109 in 2012 compared to 115 in 2011. The homeowners market, a historically volatile line with wide performance swings year-to-year, had an average combined ratio of 113 from 2008 to 2011. Because a combined ratio above 100 means a carrier is paying out more in losses and expenses than it earns in premium, the bottom line is that there’s still more work ahead to make underwriting profitable.
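For readers less familiar with the metric, here is a minimal Python sketch of how a combined ratio translates into an underwriting gain or loss. The dollar figures are invented purely to reproduce a 103.2 ratio; they are not taken from any carrier's filings.

    # Illustrative only: how a combined ratio maps to an underwriting result.
    def combined_ratio(incurred_losses: float, expenses: float, earned_premium: float) -> float:
        """Combined ratio expressed as a percentage of earned premium."""
        return 100 * (incurred_losses + expenses) / earned_premium

    # Hypothetical book with $100M of earned premium
    premium = 100_000_000
    losses = 75_200_000
    expenses = 28_000_000

    print(f"Combined ratio: {combined_ratio(losses, expenses, premium):.1f}")  # 103.2
    print(f"Underwriting result: ${premium - losses - expenses:,.0f}")         # a $3.2M loss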

The gains in underwriting performance may signal an intentional focus from carriers to counter the significant losses in investment income. But, making up for these losses is a long-term proposition that cannot be remedied quickly. Recently, the CEO of a global insurer compared declining investment returns to one of the biggest catastrophic weather events. “The lack of investment income continues to be an issue that the industry hasn’t fully addressed,” the CEO said. “We call it ‘the hidden catastrophe,’ equivalent to more than a Katrina-sized hit to profitability every year relative to the long-term baseline.”

Tackling Economic Stagnancy
A sluggish economic recovery affects insurance premiums. For commercial lines carriers, the slow growth in payroll means that overall exposure is not increasing at the same rate as medical inflation. Various estimates put medical inflation in the 4% range for 2012 and payroll growth under 2%. This puts pressure on both claims and underwriting. Underwriting must be more stringent and selective to avoid unnecessary loss on the front end while mitigation strategies need to improve in claims. These new challenges require advanced tools and methodologies that provide real-time information and relevant data in order to reduce the insurer’s risk, and simultaneously generate more profit per policy.

Regulation Woes
According to the 2013 KPMG survey, 60% of executives stated that regulatory and legislative pressures served as the most significant inhibitor of growth in the coming year, a 13% increase from the 2012 survey and a 19% increase from the 2011 survey. By contrast, 59% of executives named cost as their primary growth concern in 2011. Healthcare and tax reform are two of the most significant regulatory pressures weighing on insurers, with over half of those surveyed naming the Affordable Care Act as the most significant individual measure.

The potential non-renewal of the Terrorism Risk Insurance Act (TRIA) poses a similar challenge, as the probable addition of a terrorism premium to policies would put added pressure on discretionary pricing, especially for the better risks.

“Insurers have experienced a significant shift in the marketplace; in just two years, industry executives have abruptly diverted their attention from pricing concerns to regulatory matters,” said Laura Hay, national leader of KPMG LLP’s insurance practice, as reported in the 2013 KPMG survey. “This turnabout is even more significant when you consider that economic conditions have only slightly improved during this time period, so the combination of these two factors creates an exceptionally challenging market.”

Further to this point, the head of KPMG’s U.S. insurance regulatory group, David Sherwood, added to the survey news saying, “Regulators continue to ask tough questions and regulatory intrusion is set to increase in the coming years. More than ever, regulations and agendas established internationally, in Washington, as well as in local jurisdictions, have as much influence on the industry as market conditions and consumer confidence.”

Data Access & Literacy
Plain and simple: big data helps carriers leverage empirical evidence in their decision-making. Combining data with analytics allows underwriters to take a holistic approach to their craft, which markedly improves the accuracy and consistency of risk selection and pricing.

According to the same KPMG study, only 55% of executives claimed that their company demonstrated advanced data and analytics literacy. That means just under half of U.S. insurance companies are still not using big data to its full potential, crippling their ability to improve underwriting performance. If the other 45% want to stay competitive, they need to make analytics a top priority moving forward.

As carriers increase their “data literacy”, they will become more concerned about issues like selection bias. When carriers are limited to a selective or small data sample, it is impossible to draw accurate conclusions. As an example, if a carrier does a great job selecting the best roofing companies to insure and they model future policy performance using only their own data, the analysis will conclude that all roofing companies are good risks. Intuitively we all know that’s not true — it’s a simple example to illustrate the importance of a diverse and large data set when making important business decisions.
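A rough simulation makes the point concrete. This Python sketch (with invented numbers) compares the claim frequency of an industry-wide population of roofing accounts with the frequency a selective carrier observes in its own book; a model trained only on the carrier's book would understate the risk of the class.

    # Minimal simulation of the selection-bias problem described above.
    # All numbers are invented; each "account" is just a true annual claim frequency.
    import numpy as np

    rng = np.random.default_rng(42)

    # Industry-wide population of 10,000 roofing accounts with varying risk
    true_claim_freq = rng.gamma(shape=2.0, scale=0.15, size=10_000)  # expected claims/year

    # A disciplined carrier writes only the best 20% of the risks it sees
    carrier_book = true_claim_freq[true_claim_freq < np.quantile(true_claim_freq, 0.20)]

    # One simulated year of claim counts for each group
    industry_claims = rng.poisson(true_claim_freq)
    carrier_claims = rng.poisson(carrier_book)

    print(f"Industry average claims per account:     {industry_claims.mean():.2f}")
    print(f"Carrier-book average claims per account: {carrier_claims.mean():.2f}")
    # A model fit only to the carrier's book "learns" that roofing is low risk,
    # because the risky accounts were never written in the first place.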

Big data is now a board level conversation, and carriers are being asked: “What is your big data strategy?” When that question arises in your meeting, will you have a good answer?

Talent Crisis
Not only do you need advanced data and analytics to meet the financial challenges of the current insurance climate — you need these more sophisticated tools to keep your company competitive in the employment race. In addition to intensifying competition on risk pricing, the competition for the mindshare of the best agents is increasing substantially. The industry is estimated to have 400,000 positions to fill by 2020, and 20 percent of underwriters will retire in the next few years. This up-and-coming generation of workers expects to be equipped with sophisticated tools and advanced technologies in the workplace. Young people have been raised in a technologically driven world and are inherently tech savvy; the industry must match their technological expectations if we hope to recruit the next generation’s best and brightest. The more technology-savvy carriers will be able to use this as a recruiting tool.

Of course, implementing data and analytics across your organization is not something that can be done overnight. Do not be afraid to start small. Whether your company already has some form of predictive analytics in place or you’re just at the conceptual stage, you can build on early wins to deliver measurable results and secure organizational buy-in. Start small and build from there rather than crossing your fingers, going all-in, and hoping for the best. Above all, begin now; this is no time to sit back and wait while the fast-paced changes in the industry pass you by.

Insurers Should Deploy Predictive Analytics Across The Enterprise

Ovum Publishes Report On Creating An Insurance Predictive Analytics Portfolio
If the past four years are a reliable guide, the insurance industry will face complexity and even chaos in the coming months and years. Insurers need to be concerned about a future which could promise “black swan” events (those that are unpredictable, infrequent, and have a severe effect), the quickening pace of customer-driven commerce, the continuing spread of the digital economy, and the annual occurrence of severe weather events historically forecast to happen only once a century.

Insurers already know that the future holds tightening regulation, demanding customers who expect a better-quality experience, aging populations, and economies still weakened by the financial crisis. Ovum's recently published report Creating an Insurance Predictive Analytics Portfolio discusses the importance of insurers using predictive analytics to prepare for future market challenges and opportunities. We also discuss the types of quantitative professionals that insurers need, data sources and data management issues, and the areas in which insurers should apply predictive analytics.

Many Opportunities Exist For Insurers To Improve Their Competitive Position With Predictive Analytics
Where to use predictive analytics is limited only by the creative imagination of the staff responsible for its application. Ovum suggests that insurers consider creating predictive analytics initiatives in the following areas, each paired with an example objective. More objectives and units of analysis for each initiative are detailed in the report.

  • The insurance company itself as the focus of the initiative: To determine which markets to enter or leave.
  • Marketing: To create a portfolio of customized marketing offers.
  • Product development: To create the best product for each channel.
  • Channel management: To determine which insurance agencies to appoint.
  • Customer acquisition/retention: To estimate each customer's lifetime value to shape target market initiatives.
  • Customer services: To estimate retirement income for each life insurance customer.
  • Litigation management: To estimate litigation costs for each claim as it is reported.
  • Claim management: To determine the best way to reduce loss expenses/combined ratio for each line of business and each selling agent or claims adjuster.
  • Risk management: To estimate potential losses for the book of business as each new customer is added.
  • Cost control: To determine how the cost levers might change for different company governance structures.
  • Underwriting management: To estimate how many underwriters of what level of experience by line of business to have on staff.

The Insurance Industry Exists In A World Of Increasingly Rich Data
The insurance industry exists in a world of increasingly rich data. More and more data is available from existing sources (e.g. third-party providers offering information about weather events and forecasts, attributes of geographic locations, and consumer credit behavior), newer sources (e.g. social media), and those that are largely still conceptual (e.g. from machine-to-machine communications, also known as the “Internet of Things” — specifically from vehicle telematics).

In particular, insurers can access (although not necessarily free of charge) a never-ending torrent of (mostly) semi-structured and structured data from sources such as:

  • insurance business systems
  • social media
  • embedded sensors (e.g. vehicle telematics)
  • insurance company portals
  • mobile apps
  • location intelligence
  • complementary insurance information (e.g. FICO scores, building repair cost, and business formation data).

The Data Scientist Role Is Emerging As Equally Important As The Data Miner Role In The Insurance Industry
The data miner role is no longer the only one to use predictive analytics in the insurance industry; a new role of data scientist is emerging. A growing number of insurance companies are creating new departments of these types of quantitatively skilled professionals.

A data scientist and a data miner could be the same person. But the two roles should have different perspectives regarding the scope of predictive analytics initiatives and the time horizon of predictive analytical models. Moreover, data scientists may need different skills to fulfill their responsibilities. An insurer should expect a data scientist to approach a predictive analytics initiative by first collecting data — although not necessarily all the data required to complete the initiative — and then investigating the data on an iterative basis until a coherent hypothesis emerges.

Furthermore, Ovum believes that data scientists should be responsible for models that support short, medium, and long-term corporate objectives. Data miners, however, should be primarily involved with predictive analytics initiatives that support short and medium-term corporate objectives.

Tackling Underwriting Profitability Head On

For many years, insurance companies built their reserves by focusing on investment strategies. The recent financial crisis changed that: insurers became incentivized to shift their focus as yields became more unpredictable than ever. As insurance carriers look to the future, they know that running a profitable underwriting operation is critical to their long-term stability.

Profitable underwriting is easier said than done. Insurers already have highly competent teams of underwriters, so the big question becomes, “How do I make my underwriting operation as efficient and profitable as possible without creating massive disruptions to my current processes?”

There are three core challenges that are standing in the way:

  • Lack of Visibility: First, the approach most companies take to data makes it hard to see what's really going on in the market and within your own portfolio. Although you may be familiar with a specific segment of the market, do you really know how well your portfolio is performing against the industry, or how volume and profit tradeoffs are impacting your overall performance? Without the combination of the right data, risk models, and tools, you can’t monitor your portfolio or the market at large, and can't see pockets of pricing inadequacy and redundancy.
  • Current Pricing Approach: The agents that underwriters engage with every day want the right price for the right risk, and delivering it is not easy. In fact, it's nearly impossible. Underwriters are often asked to make decisions based on limited industry data and a limited set of risk characteristics that may or may not be properly weighted. Reviewing submission after submission, an underwriter has to decide, “How much weight do I assign to each of these risk characteristics (severity, frequency, historical loss ratio, governing class, premium size, etc.)?” Imagine how hard it is to do that mental math on each policy and fully understand how the importance of the class code relates to the importance of the historical loss ratio or any other of the most important variables (see the sketch after this list).
  • Inertia: When executives talk about how to solve these challenges around visibility and pricing, most admit they're concerned about how to overcome corporate inertia and institutional bias. The last thing you want to do is lead a large change initiative and end up alienating your agents, your analysts, and your underwriters. What if you could discover pockets of pricing inadequacy and redundancy currently unknown to you? What if you could free your underwriters to do what they do best? And what if you could start in the way that makes the most sense for your organization?
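To make that mental math explicit, here is a minimal Python sketch of a weighted risk score. The characteristics, weights, and intercept are hypothetical placeholders; in a real predictive model they would be estimated from a large loss dataset rather than set by hand.

    # Hypothetical weighted risk score for a single submission (sketch only).
    import math

    def risk_score(features: dict, weights: dict, intercept: float = -2.0) -> float:
        """Return a 0-1 score: the modeled likelihood the policy underperforms."""
        z = intercept + sum(weights[name] * value for name, value in features.items())
        return 1 / (1 + math.exp(-z))  # logistic transform to a probability-like score

    # Placeholder weights; a fitted model would supply these
    weights = {
        "claim_frequency": 0.9,         # normalized prior claim frequency
        "claim_severity": 0.6,          # normalized prior claim severity
        "historical_loss_ratio": 1.2,
        "governing_class_hazard": 0.8,  # hazard grade of the governing class code
        "premium_size": -0.3,           # larger accounts assumed slightly more stable
    }

    submission = {
        "claim_frequency": 0.7,
        "claim_severity": 0.4,
        "historical_loss_ratio": 1.1,
        "governing_class_hazard": 0.9,
        "premium_size": 0.5,
    }

    print(f"Modeled risk score: {risk_score(submission, weights):.2f}")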

There's a strong and growing desire to take advantage of new sources of information and modern tools to help underwriters make risk selection and pricing decisions. The implementation of predictive analytics, in particular, is becoming a necessity for carriers to succeed in today's marketplace. According to a recent study by analyst firm Strategy Meets Action, over one-third of insurers are currently investing in predictive analytics and models to mitigate the problems in the market and equip their underwriters with the necessary predictive tools to ensure accuracy and consistency in pricing and risk selection. Dowling & Partners recently published an in-depth study on predictive analytics and said, “Use of predictive modeling is still in many cases a competitive advantage for insurers that use it, but it is beginning to be a disadvantage for those that don't.” Predictive analytics uses statistical and analytical techniques to develop models that enable accurate predictions about future event outcomes. With the use of predictive analytics, underwriters gain visibility into their portfolio and a deeper understanding of their portfolio's risk quality. Plus, underwriters get valuable context so they understand what is driving an individual predictive score.

Another crucial capability of predictive modeling is the mining of an abundance of data to identify trends, patterns, and relationships. By allowing this technology to synthesize massive amounts of data into actionable information, underwriters can focus on what they do best: examining an insured's management or safety programs, or anything else they consider valuable. This is the artisan piece of underwriting. This is the critical human element that computers will never replace. As soon as executives see how seamlessly predictive analytics can be integrated into the underwriting process, the issue of overcoming corporate inertia is oftentimes solved.

Just as insurance leaders are exploring new methods to ensure profitability, underwriters are eager to adopt the analytical advancements that will solve the tough problems carriers are facing today. Expecting underwriters to take on today's challenges using yesterday's tools and yesterday's approach to pricing is no longer sustainable. Predictive analytics offers a better and faster method for underwriters to control their portfolio's performance, effectively managing risk and producing better results for an entire organization.

Predictive Analytics And Underwriting In Workers' Compensation

Insurance executives are grappling with increasing competition, declining return on equity, average combined ratios sitting at 115 percent and rising claims costs. According to a recent report from Moody’s, achieving profitability in workers’ compensation insurance will continue to be a challenge due to low interest rates and the decline in manufacturing and construction employment, which makes up 40% of workers’ comp premium.

Insurers are also facing significant changes to how they run underwriting. The industry is affected more than most by the aging baby boomer population. In the last 10 years, the number of insurance workers 55 or older has increased by 74 percent, compared to the 45 percent increase for the overall workforce. With 20 percent of the underwriter workforce nearing retirement, McKinsey noted in a May 2010 report that the industry will need 25,000 new underwriters by 2014. Where will the new underwriters come from? And more importantly, what will be the impact on underwriting accuracy?

Furthermore, there’s no question that technology has fundamentally changed the pace of business. Consider the example of FirstComp reported by The Motley Fool in May 2011. FirstComp created an online interface for agents to request workers’ compensation quotes. What they found was remarkable. When they provided a quote within one minute of the agent’s request, they booked that policy 52% of the time. However, their success percentage declined with each passing hour that they waited. In fact, if FirstComp waited a full 24 hours to respond, their close rate plummeted to 30 percent. In October 2012, Zurich North America was nominated for the Novarica Research Council Impact Award for reducing the time it takes to quote policies. In one example, Zurich cut the time it took to quote a 110-vehicle fleet from 8 hours to 15 minutes.

In order to improve their companies’ performance and meet response time expectations from agents, underwriters need advanced tools and methodologies that provide access to information in real-time. More data is available to underwriters, but they need a way to synthesize “big data” to make accurate decisions more quickly. When you combine the impending workforce turnover with the need to produce quotes within minutes, workers’ comp carriers are increasingly turning toward the use of advanced data and predictive analytics.

Added to these new industry dynamics is the reality that both workers’ compensation and homeowners are highly unprofitable for carriers. According to Insurance Information Institute’s 2012 Workers’ Compensation Critical Issues and Outlook Report, profitable underwriting was the norm prior to the 1980s. Workers’ comp has not consistently made an underwriting profit for the last few decades for several reasons including increasing medical costs, high unemployment and soft market pressures.

What Is Predictive Analytics?
Predictive analytics uses statistical and analytical techniques to develop predictive models that enable accurate predictions about future outcomes. Predictive models can take various forms, with most models generating a score that indicates the likelihood a given future scenario will occur. For instance, a predictive model can identify the probability that a policy will have a claim. Predictive analytics enables powerful, and sometimes counterintuitive, relationships among data variables to emerge that otherwise may not be readily apparent, thus improving a carrier’s ability to predict the future outcome of a policy.
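As a purely illustrative sketch of that idea, the Python snippet below fits a simple logistic regression on synthetic policy data and produces a per-policy claim probability. The features, coefficients, and data are invented; this is not a description of any carrier's or vendor's actual model.

    # Sketch: a claim-propensity model on synthetic data (illustration only).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5_000

    # Synthetic policy features: payroll, class-code hazard grade, prior claim count
    payroll = rng.lognormal(mean=12, sigma=1, size=n)
    hazard = rng.integers(1, 5, size=n)
    prior_claims = rng.poisson(0.3, size=n)

    # Synthetic "truth": claim likelihood rises with hazard grade and prior claims
    logit = -3 + 0.6 * hazard + 0.8 * prior_claims
    had_claim = rng.random(n) < 1 / (1 + np.exp(-logit))

    X = np.column_stack([np.log(payroll), hazard, prior_claims])
    model = LogisticRegression(max_iter=1000).fit(X, had_claim)

    # Score a new submission: the predicted probability that the policy has a claim
    new_policy = np.array([[np.log(250_000), 3, 1]])
    print(f"Predicted claim probability: {model.predict_proba(new_policy)[0, 1]:.2f}")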

Predictive modeling has also led to the advent of robust workers’ compensation “industry risk models” — models built on contributory databases of carrier data that perform very well across multiple carrier book profiles.

There are several best practices that enable carriers to benefit from predictive analytics. Large datasets are required to build accurate predictive models and to avoid selection bias, and most carriers need to leverage third-party data and analytical resources. Predictive models allow carriers to make data-driven decisions consistently across their underwriting staff and to use evidence-based decision making rather than relying solely on heuristics or human judgment to assess risk.

Finally, incorporating predictive analytics requires an evolution in terms of people, process, and technology, and thus executive level support is important to facilitate adoption internally. Carriers who fully adopt predictive analytics are more competitive in gaining profitable market share and avoiding adverse selection.

Is Your Organization Ready For Predictive Analytics?
As with any new initiative, how predictive analytics is implemented will determine its success. Evidence-based decision-making provides consistency and improved accuracy in selecting and pricing risk in workers’ compensation. Recently, Dowling & Partners Securities, LLC, released a special report on predictive analytics and said that the “use of predictive modeling is still in many cases a competitive advantage for insurers that use it, but it is beginning to be a disadvantage for those that don’t.” The question for many insurance executives remains: Is this right for my organization, and what do we need to do to use analytics successfully?

There are a few important criteria and best practices to consider when implementing predictive analytics to help drive underwriting profitability.

  • Define your organization’s distinct capability as it relates to implementing predictive analytics within underwriting.
  • Secure senior management commitment and passion for becoming an analytic competitor, and keep that level of commitment for the long term. It will be a trial and error process, especially in the beginning.
  • Dream big. Organizations that find the greatest success with analytics have big, important goals tied to core metrics for the performance of their business.

8% Reduction In Claims Costs Spells Success For Workers' Compensation Pilot Program

Physician-Guided Managed Care Achieves Better Results

Ever wondered why managed care costs more every year but the results seem about the same? For decades, the most expensive portion of a claim was the indemnity payments. Today, with medical advances, it’s the medical expenses, which in workers’ compensation alone have increased nationwide by an annual average of 8 percent over the past six years, nearly double the 4.3 percent medical consumer price index over the same period.
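The gap compounds quickly. The short Python calculation below uses only the two growth rates cited above to show roughly how far medical costs outrun the medical CPI over six years; it is a back-of-the-envelope illustration, not a projection of any particular book of business.

    # Back-of-the-envelope compounding using the growth rates cited above.
    med_growth = 0.08   # average annual growth in workers' comp medical costs
    med_cpi = 0.043     # medical consumer price index

    years = 6
    print(f"Medical costs after {years} years: {(1 + med_growth) ** years:.2f}x")  # ~1.59x
    print(f"Medical CPI after {years} years:   {(1 + med_cpi) ** years:.2f}x")     # ~1.29x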

Although managed care services vary somewhat from company to company, they are more or less delivered as commodities, with each service providing similar capabilities regardless of vendor. Upfront fees are the selling point, and price is the primary differentiator. Some service providers may be more efficient than others, but only because their technology underpinnings are better (or better managed). Either way, technology-based processes often define the service, with poor accommodation for human intervention.

In this typical managed care model, medical bill reviews sail through software systems as fast as possible, grabbing savings along the way based on automated business rules and built-in triggers. Experienced nurses conduct utilization reviews (URs), but generally in a rubber-stamp role, and escalation of questionable utilization reviews to physicians can slow the review process by days, or even weeks. Similarly, case management is a nurse-based service in which physicians come into play only on an exception basis. And finally, there are the networks of doctors and hospitals that discount fees. Because the managed care vendors that build these networks absorb part of the discounts as payment for network access, they have little incentive to choose these providers selectively.

In this standard managed care model, one service provider might boast the lowest price for medical bill review, another for utilization review, and both will attract buyers on price alone. But insurance entities that choose providers based on upfront fees are sacrificing a higher level of savings — one that can only come with a more holistic view of managed care services.

Current Managed Care Model
Many insurance companies use managed care services to find the obvious savings (or “low hanging fruit”) through case management, bill review, utilization review of patient treatment plans, and provider networks at discount prices.

Yet most managed care service providers seem powerless to arrest medical costs and have been unable to develop or adopt a different approach. They continue to use nurses and clerical review staff to oversee the medical component of a claim, when their valuable input often doesn't reach the treating physician in any meaningful way. And when a physician finally does become involved, the case is often already derailed by out-of-control treatment plans and costs.

Instead of charging fees to catch problems after the fact, industry innovators want a new, more effective model to lower costs and influence the quality of care from the beginning of a claim.

A New Model: Physician-Guided Managed Care Services
What is needed is a managed care infrastructure that leverages the credibility and expertise of doctors at key points in every service.

Physician-Guided Care (PGC), a ground-breaking approach to managed care, combines knowledgeable individuals with predictive analytics and systems to measure and influence medical care. It's a model where treatment is led by doctors, not clerical review staff or nurses.

Broad in reach and “holistic” in nature, the Physician-Guided Care model informs the overall delivery of all managed care services. Put another way, Physician-Guided Care can be defined as supporting the right treatment at the right time by the right professional, all at the right cost to workers' compensation programs. And this model has been proven to deliver better results, including:

  • 11% faster return to work for injured persons; and,
  • 8% reduction in overall claims costs.

The Right Treatment At The Right Time — By The Right Professional
To understand the value of “right” in this context, consider the prevailing practice of nurse-conducted utilization reviews (UR). Customers pay for the nurse's review, and then pay again, at a higher incremental price, each time a utilization review case is escalated to a doctor for specialized medical advice.

Alternatively, if the nurse chooses to call the treating physician to discuss the matter, there's no guarantee the call will be returned quickly, if at all, and nothing preventing the provider from proceeding with the planned treatment. Either way, relying on nurses at the initial stage of less-routine utilization review cases can increase costs, slow turnaround times, and prolong the life of the claim.

With the Physician-Guided Care model, only physicians conduct utilization reviews. The collaborative nature of physicians, who are trained to work together, brings greater efficiency and better outcomes to the process. In fact, using physicians at the appropriate level of every service has upended the commodity-based service model favored by the managed care industry. As trained clinicians, these reviewing physicians pinpoint problems, negotiate with treating physicians, and arrive at fair resolutions more quickly and effectively. Physicians are used in the following ways:

  • Medical Bill Review: The Physician-Guided Care model combines the expertise of senior-level bill analysts with proprietary quality assurance technology that flags possible violations of medical procedure coding, PPO network discounts, and state fee schedules. Level of Physician Involvement: Questionable treatment, billing codes, and charges for medical services are escalated to physicians for clinical review.
  • Utilization Review: The Physician-Guided Care model uses staff physicians to review medical treatment plans and collaborate with treating physicians on patient care. Level of Physician Involvement: All utilization reviews are conducted by physicians.
  • Rx Utilization Management: The Physician-Guided Care model reviews prescriptions before they're filled, specifically Class II and III drugs, special requests, and prescriptions flagged by specifically configured triggers as potentially out of scope or harmful to the patient. Level of Physician Involvement: All requests are reviewed by physicians.
  • Case Management: The Physician-Guided Care model for case management combines physician and field nurse case managers who work with treating physicians and families to ensure the best possible patient care without incurring undue costs. Level of Physician Involvement: In the Physician-Guided Care model, physicians are assigned to any claim that meets at least one of dozens of critical factors and is anticipated, based on predictive modeling, to involve six weeks or more of lost work time.
  • Physician on Call: The Physician-Guided Care model makes physicians available via an 800 number to help claims examiners resolve medical issues quickly, especially when they're under pressure. Level of Physician Involvement: All calls are handled by physicians.
  • 24/7 Nurse Triage: The Physician-Guided Care model uses phone-based triage-trained registered nurses to guide accident victims to the right treatment option the moment an accident occurs. Level of Physician Involvement: Nurse triage operations are overseen by a physician certified in internal and emergency medicine.
  • Claim Analysis: The Physician-Guided Care model helps claims examiners resolve persistent issues and move toward settlement of difficult or long-term claims. Level of Physician Involvement: All analyses are performed by physicians.
  • Medicare Set-Asides (MSAs): The Physician-Guided Care model helps claims staff forecast Medicare Set-Asides more accurately, expedite reporting, and comply with Medicare's Secondary Payer Act for case settlements. Level of Physician Involvement: Physicians oversee the work of analysts and forecasters.

Delivering Better Results For Claims Organizations
Over the last few years, Physician-Guided Care has confirmed its value for businesses by reducing medical costs, accelerating patient recovery, and minimizing appeals of managed care decisions.

Many workers' compensation carriers choose to first pilot the Physician-Guided Care model in order to evaluate results and confirm the benefits of the approach. One such pilot involved an insurance company specializing in workers' compensation. This organization chose to evaluate the Physician-Guided Care program in order to measure the success of using physician case managers, specifically on cases that involved severe injuries.

This pilot program ran between July 1, 2010 and May 31, 2011, during which time physicians were assigned as case managers to any claim that met the following criteria: involved an injury with certain critical factors and had at least six weeks of anticipated lost work time due to temporary total disability (TTD), based on predictive modeling.
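Expressed as code, the assignment rule described above looks roughly like the Python sketch below. The critical-factor list, the Claim structure, and the lost-time model feeding predicted_lost_weeks are placeholders; the pilot's actual criteria and model are not public at this level of detail.

    # Sketch of the pilot's physician-assignment rule (placeholders throughout).
    from dataclasses import dataclass

    @dataclass
    class Claim:
        critical_factors: set        # e.g. {"surgery likely"}; the real list has dozens of factors
        predicted_lost_weeks: float  # output of a lost-time (TTD) predictive model

    CRITICAL_FACTORS = {"spinal injury", "traumatic brain injury", "surgery likely"}  # illustrative
    LOST_TIME_THRESHOLD_WEEKS = 6

    def assign_physician_case_manager(claim: Claim) -> bool:
        """Assign a physician case manager when at least one critical factor is present
        and the model anticipates six or more weeks of lost work time."""
        has_critical_factor = bool(claim.critical_factors & CRITICAL_FACTORS)
        return has_critical_factor and claim.predicted_lost_weeks >= LOST_TIME_THRESHOLD_WEEKS

    example = Claim(critical_factors={"surgery likely"}, predicted_lost_weeks=8.5)
    print(assign_physician_case_manager(example))  # True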

By any measure, the results were impressive. During this pilot program, the use of physician case managers resulted in:

  • Medical expenses dropped by 8 percent.
  • Compared with the 2 percent increase in the medical cost inflation rate for workers' compensation insurance in 2010, that is a 10-point improvement.

The Physician-Guided Care Model: Making an Impact
One thing is certain: the traditional model for managing medical costs and care is outdated and no longer generates sustainable improvements. The new Physician-Guided Care model has been tested with thousands of claims, and shown to deliver measurable improvements in claims outcomes and costs.

Physician-Guided Care is the groundbreaking approach successfully leveraging the credibility and expertise of doctors at critical points in every managed care service. The Physician-Guided Care model is successful due in large part to its foundation — the collegial and collaborative nature of physicians. In an environment where doctors have historically been trained to work together, the Physician-Guided Care model harnesses the peer-to-peer relationship to manage patient care from the start and throughout the entire claims process. The result: the treatment plan is set on the right course to get the injured person back to health quickly, and unnecessary medical procedures, costs, and prescriptions are avoided.