
Fixing Illinois' Outdated Workers' Comp

Illinois’ system has not evolved to meet the needs of the modern workplace; it works more for special interests than for employers and employees.

The American workplace has changed dramatically since Illinois created its workers’ compensation system in 1911. But the workers’ compensation system, especially in Illinois, has not kept pace. Not only does the current system do a poor job of serving the majority of workers, especially parents and other workers who need flexibility to work hours outside the traditional workday and in off-site locations such as their own homes, but it also prioritizes the financial interests of groups such as lawyers and workers’ compensation doctors over the needs of both workers and employers. The system needs to be reformed. Illinois policymakers should allow workers and employers to opt out of the state-run workers’ compensation system and to craft their own agreements around their particular circumstances – rather than forcing all workers and employers to adhere to rigid regulations that often no longer serve their purpose.

The early 20th century origins of workers’ compensation

At the turn of the 20th century, increasing numbers of Americans found themselves in new, hazardous working conditions in the jobs created by the Industrial Revolution. But few protections existed for workers who might be unable to support their families if they became injured at work. Workers’ compensation was designed to remedy that situation by providing medical care and income replacement to injured workers. The system, however, has not evolved to meet the needs of today’s workers and employers and is ill-suited to address the problems of the modern workplace.

Changes in the modern workplace

Far fewer people work in inherently risky jobs today. The industrial sector employed nearly a third of the workforce in 1900, but employed just 19% in 1999. And even today’s dangerous jobs have become less hazardous. Deaths per 100,000 workers fell more than 93% to just four by the end of the 20th century, down from 61 deaths per 100,000 workers at the start.

But workers also face new challenges. In the middle of the 20th century, just 30% of women were part of the workforce. That number has risen to nearly 60%. Increasing numbers of Americans must now balance work responsibilities with caring for a child or elderly relative: 82% of parents are in families where both parents work. Many employers have met those challenges by offering more flexible work environments such as telecommuting and flexible schedules. But workers’ compensation – a system supposed to protect workers – increasingly stands in the way of new work arrangements to meet workers’ needs.

See also: How Should Workers’ Compensation Evolve?

Workers’ compensation was designed for an industrial workplace. Yet it applies equally to a telecommuter working from home. A professor who slips on papers in his home office or an interior designer who trips on her dog can claim workers’ compensation. That makes businesses less likely to give workers flexibility to work at home or, when employers do, to let workers set their own hours. A worker who answers email at night, after taking time to pick up children from school and prepare dinner, could still be considered in the workplace, as though the distinction between work and home could be drawn as simply as when workers punched a time card. Employers have little control over possible costs if the employee is injured at home, and the broken workers’ compensation system gives employers an incentive to take away flexible working arrangements for fear of legal liability.
These problems are not unique to Illinois, but the Prairie State is unusual both in having one of the most costly workers’ compensation systems in America and in not having exemptions for small businesses or domestic workers. The absence of an exemption for domestic employees hurts increasing numbers of workers who must balance work with child or elder care. As with telecommuting, this can affect all workers, but it disproportionately affects women, who tend to spend more time caring for children. And, while not everyone can afford a live-in nanny, reducing impediments to hiring domestic help makes it easier for women to hold more senior positions.

Opting out of the state-run workers’ compensation system

While Illinois has one of the most restrictive workers’ compensation systems, Texas has one of the least restrictive, even allowing employers to opt out entirely. Critics of the Texas system allege this has led employers to cut services, but the evidence suggests employers prefer to save money by cutting areas prone to fraud, while often increasing benefits that employees value. Employers often provide better benefits than required for the same reason they offer flextime: to recruit the best employees at the lowest cost.

Special interests benefit from the current workers’ compensation system to the detriment of workers and employers

The government-imposed workers’ compensation system has also been far more susceptible to co-option by special interests. While workers and employers use the workers’ compensation system only when there is an injury, lawyers interact with workers’ compensation every day. As a result, although the workers’ compensation system is supposed to provide quick resolution to workers’ claims, the powerful lawyers’ lobby helped create a system that can stretch claims out over years. This costs businesses money and denies injured workers rapid settlement of their medical bills. Medical providers, too, have benefited from a system that unnecessarily prolongs treatment and facilitates the overprescription of certain medications, including addictive opioids.

See also: The Pretzel Logic on Oklahoma Option

Employers and workers both have an incentive to design a better system, but the false presumption that the government-run system is better prevents them from doing so. Interestingly, Texas employers who opted out of the state-run workers’ compensation system have all but eliminated opioid overprescription. Fixing Illinois’ workers’ compensation system means government must step back and allow workers and employers to reach agreements that make sense in their specific situations – arrangements that suit the needs of workers and employers, rather than line the pockets of special interest groups benefiting financially from the current system.

Mark Adams

Mark Adams is the director of regulatory reform at the Illinois Policy Institute. He is working to find solutions to legal, economic and regulatory problems in Illinois with a focus on identifying alternative approaches to policies that disproportionately affect opportunities for low-income, working families.

Why Sustainability Is Becoming Big

Evidence is mounting that companies with lower greenhouse gas emissions perform better on average. But why?

Overview

In recent years, there has been a global shift toward more environmentally sustainable ways of working. The world’s biggest companies are also increasingly disclosing their greenhouse gas emissions and other energy metrics – and being judged on them by consumers – with 71% of the world’s top 500 companies opting to externally audit their environmental impact numbers. Although most countries don’t yet require companies to disclose such information, this is likely to change. China has recently issued a draft environmental tax law, and the U.S. has announced plans to cut carbon dioxide emissions; both are aimed at encouraging businesses to become more green and to prove it to the government. The E.U.'s Directive 2014/95/EU entered into force in December 2014, requiring companies with more than 500 employees operating in the E.U. to report on a range of non-financial (including environmental and sustainability) issues by the end of 2016. While there is some debate about the business benefits of shifting to a more sustainable model, evidence is mounting that companies with lower greenhouse gas emissions perform better on average. So is being environmentally friendly about good PR — or just good business?

In-Depth

Tracking your company’s environmental impact

The challenge in assessing both your business’s environmental impact and the potential benefits of becoming more sustainable lies in working out the true extent of your operations. While some aspects are relatively easy to identify — the amount of recycling, office energy efficiency or the number of flights taken by employees, for example — the connections among our increasingly globalized supply chains and business operations can make monitoring broader ramifications incredibly complex. Though there are no universal standards for environmental business reporting and impact analysis, there are a number of initiatives under way to encourage more transparency and provide guidance:

  • The United Nations Global Compact devotes three of its 10 principles to environmental issues and boasts more than 10,000 corporate signatories. It promotes taking a precautionary approach to environmental challenges, encourages businesses to promote environmental responsibility and is pushing for the development and adoption of environmentally friendly technologies.
  • The Global Reporting Initiative has produced guidelines for sustainability reporting that have now been adopted by more than 7,500 companies. With 30 environmental indicators, the focus is on energy, biodiversity and emissions.
  • The Carbon Disclosure Project offers guidance on the kinds of data needed to identify ways to reduce negative environmental impact, with more than 5,000 corporate signatories by the end of 2014.
  • The Leadership in Energy and Environmental Design (LEED) program takes a more focused approach, offering guidance and certification for the development and running of more environmentally friendly buildings. The program operates in more than 30 countries and has 20,000 organizations signed up, and LEED-certified buildings are not only better for the environment but also more cost-effective because of the reduction in energy use.

The benefits to the bottom line

Analyzing your environmental impact may not yet be universally mandated, but it can be worthwhile. The detailed analysis of true costs that thorough environmental reporting necessitates can not only help you avoid being accused of a “greenwashing” PR exercise but can also help identify potential savings.
The guidelines provided by these organizations can serve as a handy blueprint for identifying more sustainable ways of working. With energy price volatility “the new normal,” making your operations more energy-efficient and less wasteful can reduce the unpredictable impact of shifts in costs. At the same time, the cost of installing on-site sources of renewable energy is decreasing as technologies improve. Some governments offer subsidies and tax breaks to implement renewable energy, and others (like the U.K.) even pay for renewable energy generated. Even without subsidies, installing renewable energy sources can prove to be a good investment, depending on your location. In the U.S., solar panels may still be expensive to install, but they tend to pay for themselves within 10 to 20 years.

Making your supply chain more sustainable is also a sensible long-term investment, albeit one that is considerably harder to develop. When Puma became the first company to publish the cost of the carbon emitted and water used throughout its supply chain back in 2011, it helped identify ways to reduce water, energy and fuel consumption by 60%, resulting in potential savings of millions of dollars. You may not need to invest as much as Swedish furniture giant Ikea, with its plan to invest €1 billion in projects to encourage sustainability, or Google, with its $2 billion investment in solar and wind projects. With the climate challenge too big for any one company (or country) to tackle alone, every little bit helps — and, at a big enough scale, even the smallest changes can make a huge difference.

The starting point for identifying ways to reduce your environmental impact and maximize your efforts’ business benefits is understanding what you’re currently doing through detailed analysis and reporting. Only then can you identify what you can do and what impact this can have on both your business and the planet.

Talking Points

“The fact is… big businesses assess risk and opportunity at a global level, which means that their actions can reverberate across the planet. They develop systems that not only scale up, but require stability and continuity to be good investments.… These days, it is Big Business – not governments or consumers – that is stepping up… because they know their own corporate futures are at stake.” – National Geographic

“Understand that for markets to grow, and for your own future prospects to be successful, it makes sense to integrate, in your strategic thinking and operations, environmental, social and governance issues.” – Georg Kell, executive director, UN Global Compact

“Planners, presidents and prime ministers might sign sweeping ‘deals,’ but the CEO is where the real power lies, and they will not move a muscle unless change makes sense financially – nor should they.” – Robert Clarke, entrepreneur-in-residence, School of Business and Entrepreneurship, Bath Spa University

“The challenge is to distinguish between the (environmental, social and governance) factors that have a material influence on company performance and those that do not. But the data that companies currently report are inadequate to enable investors to make this distinction.” – Laura Tyson, Haas School of Business

Mark Fishbaugh

Mark Fishbaugh is Aon's National Power Practice Leader. He has overall responsibility for strategic planning and execution for utility risk services in the U.S. He is responsible for ensuring that industry expertise and resources are available and used to enhance growth opportunities in this sector.

Bridging From Today to Tomorrow, Part 1

There are six major ways that insurers are gearing up to prepare for the challenges and bridge to tomorrow.

Change is constant; we all know that. External forces are combining with emerging technologies and changing our whole world, including the business of insurance. And changes bring challenges. How insurers face these challenges and bridge their current world to the future is the key to success.

Our perceptions of change vary. For some, the pace feels like it is accelerating beyond comprehension, and keeping up may feel like an insurmountable challenge. Sometimes, it may seem almost impossible to fathom. As you walk into work today or attend a meeting, the business itself may feel the same on the surface: the same people, similar processes. It appears to be business as usual … but is it?

What we at SMA know is that, just below the surface, insurers are working hard to respond to external changes in a variety of ways and to transform their businesses in response to these stimuli. We have observed patterns of change adoption in insurance that, at times, may seem very scattered, unstructured and, depending on the size of the insurer, very erratic. Insurance is embracing innovation; it is really happening. It is still difficult to apply absolutes and see consistent patterns, but the data is there to gather.

See also: 100 Ideas That Changed Insurance

No one has a crystal ball to predict the pace or time frame for these developments. But one thing is clear: Even now, the insurance industry is starting to see the implications of emerging technologies, and there is likely to be significant change over the next several years. Many insurers recognize this and are working to position their companies to be more agile and responsive. At the same time, insurers have complex operations, organizations and products that require intense focus every day. Managing operations and improving competitive position are challenging enough even before considering the ways that emerging technologies will transform insurance.

So, how are insurers responding? SMA has been tracking six major ways that insurers are gearing up to prepare for the challenges and bridge to tomorrow:
  • Establishing venture capital arms to actively invest in startups – focused on new technology solutions, new agent/broker firms and even insurance company startups. The focus is within the insurtech and fintech spaces.
  • Creating new partnerships with global companies such as Facebook and LinkedIn, and global technology companies like Google, Microsoft and even IBM. These partnerships are also expanding into auto and other industries as the digitally connected world matures.
  • Developing new roles to match the digitized world and capabilities – like digital officer, customer experience officer and chief data scientist. These new roles are attracting new talent and also motivating seasoned insurers to develop new skills.
  • Embarking on new strategic initiatives like digital transformation, customer experience and even big data analytics. These initiatives are a result of the outside pressure from the digitally connected world, the changing customer experience and the advancement of analytics and data sources.
  • Creating innovation labs within their businesses and outside the operations, encouraging a culture of innovation. These innovation labs promote ideas from within and are transforming the way in which individuals work and are rewarded.
  • Transforming their core operations – underwriting, billing and claims – with core systems initiatives that establish a platform for modernization, optimization and innovation.
At SMA, we focus on what all of this means for insurance, when the impacts are likely to be felt and how insurers can become Next-Gen Insurers positioned for success in the future.

See also: The 4 Major Sources of Change for Insurance

Bridging the traditions of insurance with new technological innovations and transformations is essential for success, as well as being a major focus of the SMA Summit this September. Insurers large and small are all facing this transformative time in different ways, and some of them will be sharing their experiences and insights at the Summit.

Step one is identifying what insurers face and knowing that the time to act is now. Step two is determining “the hows.” How to build the bridge between today and the future. How to take action. How to ensure that you are competitive in today’s world. I’ll address those steps in the next blog: Bridging Today to Tomorrow, Part 2.

Deb Smallwood

Deb Smallwood, the founder of Strategy Meets Action, is highly respected throughout the insurance industry for strategic thinking, thought-provoking research and advisory skills. Insurers and solution providers turn to Smallwood for insight and guidance on business and IT linkage, IT strategy, IT architecture and e-business.

How Technology Breaks Down Silos

It's easy to talk about collaboration but hard to act. What tools and strategies in the C-suite bring about successful coordination?

Overview

New digital technologies and the data they are producing have forced collaboration among senior business leaders across all levels of all organizations. To obtain insights from data to drive decision-making and embed a data-driven approach within a company’s culture, it is critical for the C-suite to lead the way. It’s easy to talk about collaboration, but much harder to act. Analyzing information, deriving insights and responding with effective strategies requires an understanding of the analytical tools themselves, as well as collaboration. As technologies get smarter and various functional groups collaborate, simply moving to single systems can give broader teams greater visibility into inefficiencies and broken processes. But how does a business get to such a place? What tools and strategies bring about successful coordination of activities in such dynamic situations? And what are the challenges of working together that C-suite executives should anticipate?

In Depth

Just about every functional group within an organization can now collect, connect and analyze data. But big data – from keyword searches, social sites, wearables, mobile devices, customer feedback and so on – presents challenges as well as opportunities for business leaders. One of the biggest is how to transcend organizational silos to unlock the data’s true potential. Technology is also transforming how businesses develop and deliver goods and services and is placing enormous new demands on those responsible for the strategies to navigate these challenges. These are the people who need to apply institutional knowledge, implement changes and allocate resources toward new ways of working on a day-to-day basis.

Paul Mang, Global CEO of Analytics and leader of the Aon Center for Innovation and Analytics in Singapore, says there are two types of data analysis that can be leveraged to accomplish this: business analytics and enterprise analytics. Business analytics focus on the use of established tools and capabilities, while enterprise analytics “create new product or value propositions for existing clients or new client segments altogether.” In the short term, enterprise analytics can lead to disruptive innovation, while also contributing to improved long-term performance. “Business and enterprise analytics should work side-by-side and complement each other” to support decision making, Mang says.

The Changing Role of the CIO

The need to become an effective data-driven organization has dramatically increased the importance of the chief information officer (CIO), a role that John Bruno, chief information officer at Aon, says is that of “an integrator – someone who works across the entire organization to embed data within the business.” He sees the value that information technology (IT) brings, and notes that “IT is less about bits and bytes of data, but more about bringing them together to extract specific insights.” The need to centralize and mine big data for market opportunities and to parse out weaknesses is also prompting some firms to create a C-suite-level position of chief data officer (CDO). This role is responsible for working with business managers to identify both internal and external data sets that they may not even realize exist, as well as continually looking for new ways to experiment with and apply that data.
The chief marketing officer (CMO) is equally critical, both for communicating changes in customer preferences and behaviors and for leveraging insights from customer purchase patterns to develop new products and services. Like the CMO, the effective CIO needs an intimate understanding of how current technology can increase the company’s sales. However, Bruno says, “in any large organization, there are multiple leaders in different parts of the organization who address different elements of the same challenges. It’s the CEO who can see the whole view and works to have teams bring forward integrated solutions to distributed problems.” He sees the role of the CEO as one who looks beyond short-term disruptions and organizational adjustments to seize opportunities that ensure long-term growth.

This is why, increasingly, the role of the CIO/CDO is about balancing business needs against an incoming stream of opportunities – and risks. This broad cross-business knowledge can only come from constant and deliberate collaboration with the rest of the C-level executive suite. Above all, the CIO has to be able to effectively show how technology, and the data it brings, are assets rather than cost centers. For CIOs to really succeed, this means informing C-level colleagues about technology and the opportunities it can create.

Making Collaboration Count: Finance and HR

The role of the CFO is increasingly about analyzing data to give it meaning and partnering across the organization to make the information actionable. One area that is seeing CFOs use data to drive real results is collaboration with the chief human resources officer (CHRO). Eddie Short, Aon Hewitt’s managing director, Global Data & Analytics, says that in most organizations the C-suite has not been getting sufficient insight into people-related business issues, which are typically owned by human resources (HR) teams. Today, with the CIO’s help, digital tools are increasingly being used by leading organizations to measure employee performance, reduce attrition and cultivate talent through a better understanding of the workforce data they can gather and analyze.

“People analytics,” as this emerging field is known, attempts to bridge the gap between HR and the rest of the organization by providing specific insights into an organization’s talent. “People analytics is all about connecting the value of your people to the strategic goals and objectives of the business,” Short says. “This approach represents a major opportunity for HR and finance leaders to take a road centered on the greatest asset that organizations have – their people – and start to shape the value-add they will create for the business over the next five to 10 years using predictive analytics.” With skills shortages an increasingly pressing issue for many organizations around the world, gaining this kind of insight can help a business identify and meet its future talent needs.

Aligning for Agility

As technology continues to disrupt, CEOs and the C-suite in general must accept that there may not be a set playbook to follow to adapt and evolve. Flexibility is paramount, and organizations must often invent and reinvent as they move forward. Intelligently applying analytics tools to derive value from big data can help them navigate this new terrain. “Today, CXOs want predictive insights,” Short says.
“They want answers to the predictive ‘what could I do?’ questions as well as the prescriptive ‘what should I do?’ questions.” Yet most tools and programs currently available are merely descriptive – deriving true insight requires additional interpretation from people who really understand the business. This is where C-suite collaboration becomes so vital.

Organizations thrive when there are diverse and complementary personnel and systems working together. Sharing insights from the analysis of big data across the C-suite and across functions can position businesses to draw valuable insights from this data, harmonize planning around it, align their actions and understand the full value this brings both to their own divisions and to the organization as a whole. And the more that data is shared, the more leading businesses discover that they can find answers to today’s – and tomorrow’s – questions. With the measurable business benefits this data sharing can bring, the business case for breaking down silos within organizations is stronger than ever. Where this may once have been a C-suite aspiration, the make-or-break implications of insights drawn from this data have made it a business imperative.

Talking Points

“In every industry, our analysis and our work with clients would suggest technology at a minimum is going to be a tremendous accelerant. So if you have a business model, the opportunity to scale it more effectively, grow it more effectively gets… amplified.” – Greg Case, CEO, Aon

“The way that big data pervades most organizations today creates a dynamic environment for C-level executives to explore how it can and should be used strategically to add business value.” – Economist Intelligence Unit
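To make the descriptive-versus-predictive-versus-prescriptive distinction Short draws above a little more concrete, here is a minimal, illustrative Python sketch. The records, weights, threshold and recommended action are all invented placeholders (nothing here comes from Aon or Aon Hewitt); the point is only to show how the three kinds of questions differ once you put them in code.

```python
# Hypothetical workforce records: (weekly overtime hours, engagement score 0-10, left company?)
records = [
    (12, 4.0, True), (2, 8.5, False), (9, 5.0, True),
    (1, 9.0, False), (7, 6.0, False), (14, 3.5, True),
]

# Descriptive: what happened? Summarize the data we already have.
attrition_rate = sum(left for *_, left in records) / len(records)
avg_overtime = sum(overtime for overtime, *_ in records) / len(records)
print(f"Descriptive: attrition rate {attrition_rate:.0%}, average overtime {avg_overtime:.1f} h/week")

# Predictive: what could happen? Score a current employee with a toy model
# (a hand-tuned rule standing in for a model trained on historical HR data).
def attrition_risk(overtime_hours, engagement):
    """Return a rough 0-1 risk score; purely illustrative weights."""
    score = 0.05 * overtime_hours + 0.08 * (10 - engagement)
    return min(score, 1.0)

current_employee = (11, 4.5)  # hypothetical: 11 h overtime, engagement 4.5
risk = attrition_risk(*current_employee)
print(f"Predictive: estimated attrition risk {risk:.0%}")

# Prescriptive: what should we do? Turn the prediction into a recommended action.
action = "offer a retention interview and rebalance workload" if risk > 0.5 else "no action needed"
print(f"Prescriptive: {action}")
```

In practice the predictive step would be a model fitted to historical data rather than a hand-tuned rule, but the division of labor among the three questions stays the same.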

John Bruno

John G. Bruno serves as Aon’s chief operating officer as well as chief executive officer of Aon’s data and analytic services solution line, which includes the firm’s technology-enabled affinity and human capital solutions businesses.

Bad-Faith Claims: 4 Ways to Avoid Them

Bad-faith claims are often the result of an oversight or simple miscommunication, and can be avoided.

An allegation of bad faith in claims handling can have far-reaching effects, including drawn-out legal battles resulting in potentially sizable settlements and damage to the organization's reputation. But bad-faith claims are not always the result of an organization's deliberate attempt to avoid paying a claim. Rather, they're often the result of an oversight or miscommunication. It's this latter category that claims professionals should focus on.

If an insurer is intentionally underpaying its customers or denying claims without valid reason, best practices are not going to improve the situation. But taking a step back and looking at the claims process at an organizational level is an effective way to identify gaps in knowledge or processes that can and do lead to bad-faith claims. Before looking at some specific best practices for avoiding bad-faith claims, it's worth reviewing the seven primary elements of good-faith claims handling, straight from The Institutes' Associate in Claims (AIC) designation course materials:
  • Thorough, timely and unbiased investigation
  • Complete and accurate documentation
  • Fair evaluation
  • Good-faith negotiation
  • Regular and prompt communication
  • Competent legal advice
  • Effective claims management
Using these seven keys as a baseline, organizations can further improve the claims process and reduce the risk of bad-faith claims by focusing on the following four best practices:

See also: Should Bad Faith Matter in Work Comp?

1. Exercise due diligence when investigating claims.

Claims representatives and their insurers' special investigative units have a lot of experience detecting and investigating fraudulent claims and are trained to watch for specific triggers and red flags. However, a suspicious claim is not always a fraudulent one, and claims representatives must still conduct a fair and balanced investigation. Although this may be difficult, waiting until a definite determination is made is the most prudent way to go. Even if a claim appears to be fraudulent, it still requires the same level of due diligence throughout the investigation – interviewing witnesses, inspecting property damage, reviewing medical records, etc. Proper documentation goes hand in hand with proper investigation techniques. Claims professionals should encourage the claimant to submit all relevant documentation or evidence, even if the claim seems fraudulent. This documentation may help clear up any uncertainties.

And the investigation must be timely as well as thorough. Often, the timeline for an investigation is mandated by regulations or the specific terms and conditions of the policy. Sticking to this schedule is crucial to meeting requirements and maintaining your reputation with the insured. More and more, claimants expect timely updates and faster resolutions. It's hard to blame them – people want payment for their medical bills or repairs to their homes. Insurers need to stick to the timeline they promised.

2. Rely on a solid claims system.

A good claims system that documents a claim's progress is one of the best ways to protect your organization should bad-faith allegations arise. Claims representatives usually have a lot on their plates; a formal yet easy-to-use framework makes it easier to comply with regulations and the specifics of individual policies. A robust claims system also helps maintain consistency. A lot of different people may access a claim or contribute to it, such as supervisors, auditors, underwriters and attorneys. Online systems that prevent anyone from changing information once it's been entered help guarantee that everyone who touches the claim is up to date and on the same page.

3. Make use of experts and mentors to stay informed.

Having a strong support network is essential to anchoring the claims process. Any time a claims representative is unsure of how to proceed when processing a claim, she should know exactly where to go to get an answer and get the claim moving again. That includes an up-to-date claims manual with set procedures and a chain of command with decision makers who can resolve uncertainties during the claims process. Sharing this information should be a top priority during onboarding for claims professionals.

Continuing education is also key. Webinars, designations and state-specific resources detailing evolving regulations and case law are essential. Individual claims representatives should work to expand their knowledge in areas they frequently handle. If you primarily adjust residential claims, become an expert in that field, then use that knowledge to mentor other employees or act as the go-to source of knowledge on that topic.
The National Association of Insurance Commissioners, your state's insurance department and insurance commissioner, your insurer's legal and training departments and your direct supervisor are all good sources of information on regulatory standards. States have different laws and court rulings regarding bad-faith claims, and insurers have their own company-specific standards as well. For larger organizations, or for individuals in the field who cover a large territory, it may be necessary to keep up with several states' standards. One possible source: United Policyholders, which provided a survey and an overview of bad-faith laws and remedies for all 50 states in 2014.

See also: Power of ‘Claims Advocacy’

4. Have the right attitude.

Claims representatives can often facilitate the claims process simply by listening. Never lose sight of the fact that you may be talking to people on some of the worst days of their lives. Sometimes, a person will call, upset and frustrated, and start talking about legal representation. It may be best simply to listen; it doesn't mean that you'll pay the claim or agree to everything they want, but you can offer some compassion and avoid becoming aggressive in turn. Rarely will all parties agree during the claims process. The key is striking a balance: follow established procedures that rely on best practices while leaving enough room in the process to treat each claim uniquely and provide a personal touch for customers.

Interested in learning more about good-faith claims handling? Take a look at The Institutes' Good-Faith Claims Handling course. For broader claims knowledge, learn about The Institutes' AIC and Associate in Claims Management (AIC-M) designation programs.

Susan Crowe

Susan Crowe, MBA, CPCU, ARM, ARe, AIC, API, is a director of content development at The Institutes. She is also a member of the Philadelphia CPCU Society Chapter and of the Reinsurance Interest Group committee.

5 Misunderstandings on Home Insurance

The relationship between brokers and homeowners is getting more strained. These misunderstandings are probably the reason.

Hiring an insurance broker should mean ease, speed and extra security. But not everything about putting a middleman in the process of buying insurance is great. Mistakes and mishaps are bound to happen at some point, and misunderstandings between homeowners and insurance brokers aren’t uncommon. The insurance industry has become a lot more chaotic, and more clients are finding it hard to trust agents and brokers, who do sometimes use unethical tactics to earn a living. Let’s take a look at some of the most common misunderstandings.

1. Conflict of interest

Insurance brokers are remunerated through a fee or commission for their services. They can be paid by the insurer for bringing a large volume of business to the company. They can also earn a commission from their clients for finding the best deal and insurance for them. The risk of conflict arises when the broker favors his personal gain over his duty to his client. This can result in the client agreeing to higher prices or extra coverage he doesn’t really need.

See also: A Wakeup Call for Benefits Brokers

2. Nondisclosure and negligence

Before a client signs up for insurance, it is his responsibility to divulge all pertinent information, including his income, medical history, home values and details of his home security. Failure to disclose all this information can render him uninsured when he files a claim. There are cases, however, where even forthright and honest clients can forget pieces of information, and having an insurance broker handle all the processing can make such omissions more likely. Negligence by a broker can result in a costly misunderstanding.

3. Failure to understand exclusions

Clients mostly shop around based on price and reputation without realizing the other important factors that can affect their coverage. Insurers are slowly cutting back on coverage and increasing their deductibles in an attempt to increase profits. While insurance brokers may do their best when discussing the exclusion clauses buried in lengthy policies, they can still miss critical details, and one word or phrase can mean thousands of dollars when it’s time to make a claim. A carport, for example, does not technically fall into the category of a building, which means that a client should not expect his insurance to cover a collapse.

4. Underinsurance

When doing an assessment, a typical insurance broker needs the help of real estate appraisers or an online program to determine how much coverage a homeowner should get. If the broker is fairly new and untrained, he may even obtain figures by directly asking the homeowner how much he expects to get. This lack of knowledge can mean that homeowners are greatly underinsured. Yet they will have a false sense of assurance and only realize their problem in the wake of a disaster, such as a tornado or flash flood.

Another common misunderstanding between homeowners and insurance brokers involves replacement cost and market value. Most homeowners expect to receive coverage that equates to their home’s market value. Replacement cost, on the other hand, is generally higher than the amount a buyer is willing to pay for a house; it’s based on a lot of factors, including the materials used, the cost of labor for demolition and repair, etc. An agent or a broker needs to be very thorough in discussing these details so that he and his client can determine the right insurance and coverage.

See also: A ‘Perfect Storm’ of Opportunity (Part 3)
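To see why the replacement-cost-versus-market-value distinction above matters, here is a small, purely hypothetical calculation; the figures are invented for illustration and do not come from the article. The gap arises because rebuilding after a total loss is priced off materials and labor, not off what a buyer would pay for the house.

```python
# Hypothetical figures for illustration only.
market_value = 250_000         # what a buyer might pay for the home
replacement_cost = 310_000     # estimated cost to rebuild: materials, labor, demolition
coverage_limit = market_value  # homeowner insured "to market value"

shortfall = max(replacement_cost - coverage_limit, 0)
print(f"Rebuild cost: ${replacement_cost:,}")
print(f"Coverage limit: ${coverage_limit:,}")
print(f"Out-of-pocket gap after a total loss: ${shortfall:,}")  # $60,000 in this example
```

A broker who walks a client through this kind of arithmetic up front can head off the underinsurance surprise described above.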
5. Change of policies

It’s the insurer’s obligation to notify its clients about any changes in their insurance coverage. It’s also part of the broker’s responsibility to let his client know the terms of renewal, cancellation and expiration of the insurance he’s offering and to make sure the client understands them. But sometimes clients don't get the message and are underinsured or even uninsured when they file a claim. In cases like this, a client can take legal action against the broker. He may also file a case against the insurer if it changes the insurance without its client’s consent.

Rose Cabrera

Rose Cabrera is the lead content writer for Top Security Review. It has always been her passion to spread awareness and the right information on keeping homes and families safe.

How to Bulletproof Regulatory Risk

The complexity across literally hundreds of jurisdictions makes compliance daunting for even the most seasoned claim professionals.

Compliance has become a top priority for insurers as technology emerges to make the process easier and more cost-effective. Perhaps even more importantly, these solutions have demonstrated the ability to improve operational results while also boosting productivity and staff and policyholder satisfaction.

Brief History of Insurance and Regulation

For thousands of years, insurance was basically unregulated – until the 1940s. In 1944, with a focus on protecting consumers from “unscrupulous” insurers, a U.S. Supreme Court ruling put into motion the regulation of insurance, holding that insurance companies were subject to the Sherman Antitrust Act. Shortly thereafter, in 1945, Congress enacted the McCarran-Ferguson Act, creating the regulatory framework that has been guiding the insurance industry ever since and providing for individual states to regulate insurance.

Fast Facts about U.S. State Insurance Regulation in 2015
  • Total revenue collected by states from the insurance industry increased 3.4% to $22.6 billion
  • Total projected fiscal year 2017 budgets for all state insurance regulatory agencies total more than $1.4 billion
  • State insurance departments received 299,625 official complaints and nearly 1.9 million inquiries
  • Total full-time insurance department staff was 11,304 (down 7.3% from 2007)
  • Market conduct examiners and analysts numbered 497 and represented 4.4% of total staffing
  • Of the 880 market-conduct-only examinations completed, 714 resulted in administrative orders (fines)
  • Fines and penalties against insurers totaling $224 million represented one-sixth of the total annual budget for all state insurance regulatory agencies
See also: The Coming Changes in Regulation

Top 5 Market Conduct Actions Against P&C Insurers

According to research shared by Wolters Kluwer Financial Services, claims handling continues to be among the top areas of market conduct criticism:
  1. Failure to acknowledge, pay, investigate or deny claims within specified timeframes
  2. Using unapproved/unfiled rates and rules or misapplying rating factors
  3. Failure to provide required compliant disclosures in claims processing
  4. Failure to cancel or non-renew policies in accordance with requirements
  5. Failure to process total loss claims properly
Compliance Challenges in Claims Management

“Claims management has consistently been one of the top three compliance challenges for insurers over the last several years, and once again was the top compliance challenge in 2014 across all lines of business,” said Kathy Donovan, senior compliance counsel at Wolters Kluwer Financial Services. “Insurance claims professionals have to manage a variety of internal and external factors when processing claims, including claimant communications and mandatory disclosures, all within established timeframes. The targeted end result is providing proper payment in accordance with policy provisions and state law.”

Compliance Solutions

Software and technology are particularly well-suited to enable carriers to avoid fines and penalties, and total loss claim payments – the fifth most frequent source of market conduct fines – are a prime candidate. The U.S. auto insurance industry manages approximately 3.2 million total loss claims annually and is required to get those complex calculations 100% right every time or face fines of as much as $10,000 per claim. The vast majority of total loss claims payments are based on clearly stated rules, regulations, taxes and fees, but the complexity and frequent changes across literally hundreds of relevant state, municipal and other jurisdictions make the task daunting for even the most seasoned claim professionals, let alone the growing number of newer and less experienced staff.

Sophisticated, purpose-built software supported by a dedicated, expert research team can not only perform the majority of calculations with 100% accuracy every time but can also maintain an all-important audit trail for use in future market conduct exams, as well as provide claim staff with instant access to relevant regulations and references for those few files where interpretation and judgment are required. For example, the issue of whether to include sales tax and partial refunds of title and registration fees has vexed claims handlers for years. State departments of insurance regularly cite insurers for failing to include or properly calculate tax on automobile total loss claim payments. Worse yet, a large number of insurance departments have either remained silent or issued ambiguous directions about what amounts must be paid and how they should be calculated.

See also: Increasing COI Compliance

While increasing numbers of large, well-established information technology firms and some new early-stage entrants offer enterprise solutions broadly defined as risk and compliance management solutions, few are specifically insurance-centric, and fewer yet are focused on specific areas of high exposure. In my practice, I have become familiar with some highly innovative insurance compliance solutions that are focused on solving a significant specific need in a major area of complexity and exposure. One such solution is a cloud-based total loss workbook that provides automated settlement calculations for a high percentage of passenger vehicles of all types and sizes, including motorcycles, in all jurisdictions. The software can be integrated with third-party claims systems and with information providers of total loss valuations and other relevant services to provide a truly bulletproof, seamless, end-to-end solution supplemented by a complete, up-to-the-minute reference library.
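As a rough illustration of the kind of calculation such a workbook automates, here is a minimal Python sketch of a total loss settlement. The vehicle value, tax rate, fees and pro-rating rule are hypothetical placeholders; actual treatment of sales tax and of title and registration refunds varies by jurisdiction, which is exactly the complexity described above.

```python
def total_loss_settlement(actual_cash_value, sales_tax_rate, deductible,
                          title_fee_paid, registration_paid, months_registration_left,
                          registration_term_months=12):
    """Illustrative total loss settlement under hypothetical jurisdiction rules.

    Assumes (for this sketch only) that the jurisdiction requires paying sales tax
    on the vehicle's actual cash value, refunding the title fee in full and
    refunding the unused portion of the registration fee.
    """
    sales_tax = actual_cash_value * sales_tax_rate
    registration_refund = registration_paid * (months_registration_left / registration_term_months)
    gross = actual_cash_value + sales_tax + title_fee_paid + registration_refund
    return round(gross - deductible, 2)

# Hypothetical claim: $14,500 vehicle, 6.25% sales tax, $500 deductible,
# $75 title fee, $120 annual registration with 8 months remaining.
payment = total_loss_settlement(14_500, 0.0625, 500, 75, 120, 8)
print(f"Settlement check amount: ${payment:,.2f}")  # $15,061.25 under these assumptions
```

A production system would pull the tax and fee rules for the loss state (and sometimes the county or city) from a maintained reference library and log every input for the audit trail, rather than hard-coding them as above.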
I encourage insurance carriers and claim departments to take the time to regularly review all available solutions and, in so doing, refocus on their compliance strategies and results. I am available and glad to answer questions and discuss this topic with interested industry participants.

Stephen Applebaum

Stephen Applebaum, managing partner, Insurance Solutions Group, is a subject matter expert and thought leader providing consulting, advisory, research and strategic M&A services to participants across the entire North American property/casualty insurance ecosystem.

Back to the Drafting Table on Work Comp

The spate of recent, important decisions raises a question: How do state legislatures wind up passing such complex laws?

Recent Supreme Court decisions in Oklahoma, Florida and — to a far lesser extent — Utah have touched off a firestorm of debate over the so-called interference of the judiciary in the administration of state workers’ compensation systems. But the real issue in these cases is not the specific interpretations of complex laws; it is how state legislatures wind up passing such complex laws to begin with. Consider two older Supreme Court decisions: Hayes v. Continental Ins. Co. (1994) 178 Ariz. 264, 872 P.2d 668, and Smothers v. Gresham Transfer (2001) 332 Or. 83, 23 P.3d 333.

In late 1985, the Arizona Court of Appeals recognized a civil action for bad faith by an injured worker against an insurance company. That opinion was not unanimous and seemed to be at odds with prior case law on this issue. Arizona’s legislature generally has short sessions, and, while this cannot be proved, it is likely that a solution to this new case law would have been difficult to arrive at by the May 14, 1986, sine die date in Phoenix. By 1987, the legislature did adopt a bad faith statute, giving the industrial commission the authority to resolve issues regarding unfair claims processing and bad faith actions by claims administrators. The statute begins: “The commission has exclusive jurisdiction as prescribed in this section over complaints involving alleged unfair claim processing practices or bad faith by an employer, self-insured employer, insurance carrier or claims processing representative relating to any aspect of this chapter. The commission shall investigate allegations of unfair claim processing or bad faith either on receiving a complaint or on its own motion.”

That pretty much should have resolved the issue. Why would the legislature adopt a bad faith statute after 75 years other than to make clear that exclusive remedy barred an action by an injured worker for bad faith in the handling of claims and to effectively nullify recent judicial decisions saying otherwise? In 1994, the Arizona Supreme Court had the opportunity to answer that very question in Hayes. The court began its analysis by noting, “Although the trial and appellate courts assumed that the statute preempts and divests all state courts of jurisdiction over workers' compensation bad faith cases, Plaintiff correctly notes that the statute does not explicitly say this. In fact, it does not mention either common-law damage actions or divestiture of court jurisdiction.”

See also: Where the Oklahoma Court Went Wrong

That observation signaled where the court landed on the issue, and Arizona workers’ compensation claims administrators — and, recently, the legislature — have tried and failed ever since to limit the ability of an injured worker to access the courts under the theory of tortious bad-faith claims handling by insurers. The issue is far more complicated than simply one of judicial interpretation. The Arizona Supreme Court has yet to reach ultimate state constitutional issues regarding bad faith claims by injured workers because of its somewhat strained interpretation in Hayes of the 1987 law. At about the same time, the Oregon legislature was addressing a Supreme Court case, Errand v. Cascade Steel Rolling Mills, Inc. (1995) 320 Or. 509, 525, 888 P.2d 544, that called into question the scope of the exclusive remedy of workers’ compensation.
The legislation enacted in Salem in response to this opinion included language stating, “(t)he exclusive remedy provisions and limitation on liability provisions of this chapter apply to all injuries and to diseases, symptom complexes or similar conditions of subject workers arising out of and in the course of employment whether or not they are determined to be compensable under this chapter.”

In 2001, the Oregon Supreme Court issued its opinion in Smothers. The court took under consideration the cumulative effect of the very comprehensive exclusive remedy legislation enacted in 1995 and the requirement that workplace conditions be a “major contributing cause” of a claim for compensation arising out of an occupational disease. After an exhaustive analysis, the Supreme Court held that the exclusive remedy statute violated Article 1, Section 10 of the Oregon Constitution, which guarantees every Oregonian a “remedy by due course of law for injury done him in his person, property, or reputation.”

The legislature promptly responded to the court’s decision and in 2001 addressed the ability of an injured worker who failed to meet the major contributing cause standard to bring a civil negligence action against the employer. The legislative resolution preserved the rule of law in Smothers, was constitutionally firm and ultimately resulted in little (if any) of the expected fallout from the court’s decision. As noted by the Oregon Department of Consumer and Business Services in its 2010 Report on the Oregon Workers’ Compensation System, “Although it was estimated that the Smothers decision could affect as many as 1,300 cases per year and cost up to $50 million per year, there have been no known cases in which workers have prevailed at trial; in a few cases workers have received settlements.”

So what do these cases have to do with Maxwell v. Sprint PCS (in Oklahoma), Castellanos v. Next Door Company (in Florida), Westphal v. City of St. Petersburg (in Florida) and Injured Workers Association of Utah v. State of Utah? Everything. Workers’ compensation legislation has generated volumes of appellate case law across the ages and in all jurisdictions. There are a host of reasons for this, but one major factor is the very nature of workers’ compensation public policy. Rarely is the system going to be reviewed by legislators unless there is a crisis — historically, in the form of high insurance premiums, but more recently when self-insured employers call for change and labor is more than willing to sit down with them and negotiate. This leads to prophylactic laws designed to ameliorate a specific situation, combined with long-term benefit increases. Even Oregon, whose Management-Labor Advisory Committee (MLAC) is a model for workers’ compensation public policy development, is not immune from these pressures, as Smothers demonstrated more than a decade ago.

See also: Appellate Court Rules on IMR Timeframes

There is no justification to suggest that every element or iteration of workers’ compensation laws passed for well over a century is somehow immune from judicial scrutiny. Indeed, most states have, within their body of case law, important decisions redefining the law that are the result of appeals from employers rather than injured workers. These frequently result from interpretation of laws by administrative tribunals, as is noted in the lengthy line of cases over the past 10 years in California of appeals from decisions of the Workers’ Compensation Appeals Board.
Furthermore, the courts cannot be held to a legislative agenda that is the result of one particular group or another successfully negotiating the political winds of the time. The stakeholders of the system are not immune from suggesting ill-conceived laws any more than legislatures are immune from passing them. None of this is to suggest that the majority opinions in these recent cases represent good legal scholarship. It is to say, however, that when going back to the respective state legislatures to address these cases, a more careful consideration of policy — even at the expense of losing a bit of the singular focus on costs — could lessen the possibility of unintended consequences.

And, as for “due process,” we should all remember that New York Central Railroad Co. v. White (1917) 243 U.S. 188 also contained the following language when holding that the New York compensation scheme under review met due process standards: “This, of course, is not to say that any scale of compensation, however insignificant, on the one hand, or onerous, on the other, would be supportable. In this case, no criticism is made on the ground that the compensation prescribed by the statute in question is unreasonable in amount, either in general or in the particular case. Any question of that kind may be met when it arises.”

The recent challenges and questions raised over workers’ compensation reform throughout the states over the past 20 years suggest we are closer to “the question of that kind” arising. Whether it does depends in large part on what stakeholders and policymakers determine should be done, rather than on what one side or the other knows it can do simply because it has the votes.

Mark Webb

Mark Webb is owner of Proposition 23 Advisors, a consulting firm specializing in workers’ compensation best practices and governance, risk and compliance (GRC) programs for businesses.

New Healthcare Brawl, Different This Time

Payers and health systems are blending in provider-sponsored organizations, driving toward integrated care in smaller pockets of populations.

For the thousands of healthcare consumers reading this post...and the millions in attendance across this great country...lllllllllllllllet's get ready to rummbulllllll!

In this corner - fighting over a period of 68 years, with a track record of unaffordable and ever-skyrocketing premiums, causing long-term wage stagnation, plus lower rates of savings for individuals all over the land. The largely unchallenged, reigning champion in U.S. healthcare coverage for nearly all Americans under 65.......the third-party payers!

And in this corner - fighting for more than 25 years. They've captured and controlled healthcare populations, acquired and limited provider competition, all the while driving up costs, consumer medical debt and personal bankruptcies. With a long history of mass overutilization, lower care quality and high administrative salaries......the hospital and health systems!

Ladies and gentlemen...this same fight took place back in the 1990s, for the purse strings of nearly all the private pay healthcare market. The hospital and health systems took on the risk of creating traditional health plans, and many of them took it on the chin. There were too many operational nuances, such as claims and underwriting administration, and the plans tended to attract sicker patient pools. Not to mention the ire of health payer executives.

The environment is different today. The stakes are far higher. The outcome may determine the direction of more than $1 trillion per year in consumer and employer-directed healthcare payments, transforming the model of U.S. healthcare into one of needed, sustainable long-term growth.

Today's payer profits are limited not only by the medical loss ratio rule but now also by the ACA's requirement to accept all patients with pre-existing conditions. On the other side, health and hospital systems have successfully acquired a significant and well-diversified care "umbrella" over large populations. Many achieve greater leverage in securing higher payer reimbursement rates, all the while still capturing local practices, doctors and newly minted med school graduates, who willingly trade off past, present and future administrative headaches for more patient engagement and a steady paycheck.

See also: Keep the Humanity in Healthcare

Yet within their growing mini-monopolies, hospitals and health systems, like payers, are also having a come-to-Jesus moment. Medicare quality scores give only 2.8% of all hospitals the highest, 5-star rating. Nearly 70% received only two or three stars! There is argument over the factors behind these ratings, yet administrators clearly understand that future Medicare payments will be based on the value of care, where quality of care is very important. This is especially true as health systems and accountable care organizations (ACOs) continue to dominate healthcare delivery in the U.S. And yes -- future private insurer payments are likely to follow suit.

Let's Give Health Plans Another Shot

More than ever, we're seeing health systems creating their own provider-sponsored plans (PSPs) and simply becoming their own payer, even where they can compete for covered lives in the growing Medicare Advantage and Medicaid managed care business. PSPs come in multiple varieties, depending on the questions asked and the resulting strategies formed: Starting fresh or acquiring an existing plan? Partnering or not partnering on risk with existing insurers and provider networks? Covering care only in their own care system or also with other systems and providers?
Though previously unsuccessful, PSP results appear more promising today. Atlantic Information Services (AIS) data shows there are more than 270 PSPs in existence, up from 107 just two years ago, and more than one-third have more than 10,000 members. If the trend continues, predictions are that about 70 million Americans could be enrolled in PSPs within five years (a back-of-the-envelope check on those figures appears at the end of this section).

While that is happening, we are seeing payers such as Harvard Pilgrim and others seeking to go the way of Kaiser, adding medical facilities to create integrated care systems. Hence, both payers and health systems are blending more than ever, driving toward integrated care in smaller pockets of populations. These smaller pockets of integrated care appear to offset more risk, especially when they seek to merge. We are then likely to run into a microcosm of the same anti-competitive pricing fears raised by Anthem-Cigna and Aetna-Humana.

Let's Bypass the Problem...With Direct Contracting

This option allows self-insured employers to work around health payers by contracting directly with large, geographically concentrated health systems to deliver care to their employees. By using third-party administrators (TPAs), contracted transparent care and drug fees, and in-house actuaries and risk managers, employers can also lower claims administration costs. Plus, employers gain other savings by working around payers.

Just a little wrinkle here... What about the many millions of individual members who remain under fully insured payer plans? Well, we have the growth of health insurance captives, which pool smaller companies together so they can gain the benefits of self-insurance. But the question still remains: When health plans get further cut out of the self-insured employer client loop, what happens to pricing for the rest of the remaining payer risk pool? The revenues and profits for payers will need to come from somewhere.

See also: Healthcare: Time for Independence

We know payers are not getting onto ACA exchanges to acquire more customers. That leaves government subsidization to fill the affordability gap for the fully insured. Other options are: 1) creating a single-payer government plan; 2) providing government incentives for PSPs to be competitive in more local pockets; or 3) offering incentives for the formation of fully insured and PSP plans, so payers and health systems can cover more people with greater risk sharing.

Healthcare 3.0: Increased quality, better technology, higher taxes, greater unemployment and remaining unaffordability

Health systems have grown by implementing effective leadership, making strong IT investments, reducing geographic competition and employing better risk solutions and strategy. They are going to get stronger through direct contracting, mergers and acquisitions, the growth of PSPs, improved care coordination and the use of new technologies in the emergence of value-based care. Solutions in areas such as predictive analytics, mHealth, patient-generated health data, diagnostic accuracy, supply chain management, population health, chronic disease management, telehealth and artificial intelligence will promote greater efficiency, better outcomes and increased patient satisfaction, and they will drive down cost across key healthcare industries.

But driving down cost DOES NOT mean pricing for care, coverage and drugs will plummet for consumers. I expect mass unaffordability will largely remain. Look, the first order of business in any for-profit sector is to make a profit, grow customers and remain competitive.
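As a quick aside, here is the back-of-the-envelope check on the PSP figures cited earlier, using only the plan counts in the AIS data; the extrapolation method is my own illustrative assumption, not AIS's projection methodology.

    # Back-of-the-envelope growth math using the PSP counts cited above
    # (107 plans two years ago, 270+ today). The five-year extrapolation is an
    # illustrative assumption, not AIS's projection methodology.
    start_count, end_count, years = 107, 270, 2
    annual_growth = (end_count / start_count) ** (1 / years) - 1
    print(f"Implied annual growth in PSP count: {annual_growth:.0%}")         # ~59%

    projected_count = end_count * (1 + annual_growth) ** 5
    print(f"PSP count in five years at that pace: {projected_count:,.0f}")    # ~2,700

Whether that plan-count trajectory actually delivers the predicted 70 million covered lives depends on average enrollment per plan, which the AIS figures above only hint at (more than one-third of PSPs exceed 10,000 members).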
While healthcare consumers and advocates believe in reforming our system into a fairer, more affordable solution for care, coverage and medications, equally for all Americans...businesses don't. And they're not going to succumb to guilt or public shaming, or be willing to give themselves significant salary haircuts to do so. In fact, I would expect that early cost-reduction successes will translate into healthcare companies largely funneling the differences back into themselves as reinvestments or profits, while holding prices steady to claim consumer-friendly positioning: "Hey, at least we've put the brakes on higher prices. We'll try to figure out how to do more...but look, this is great news for now. Be in touch soon!"

The only path I see toward future consumer affordability is to push for, and provide incentives for, blending our three-party system into a two-party one. With enough healthcare players, greater transparency and relatively equal levels of care quality, free market forces will ultimately create a greater downward push on consumer pricing.

A free market system would be painful in the beginning. Healthcare players would not only have to invest in and implement new technology but also use augmented and artificial intelligence solutions. This would drive efficiency and accuracy to the obvious point where industry leaders would greatly reduce and replace their greatest expense: employees. By the end of 2016, the healthcare sector will be the largest employment pool in the U.S. In a free market, you cannot have bloated employment alongside technology capable of creating massive efficiencies that drive down cost and consumer price. Today's price-regulated market, however, allows exactly that, because excess expenses are simply passed down to healthcare consumers and employers in the form of higher prices.

Free market healthcare is, for now, a far-off dream. So, as the market slowly transforms and reshapes itself, we will likely see personal and corporate taxes going up.

See also: Is Transparency the Answer in Healthcare?

With no foreseeable surge in GDP, new jobs or average worker wages, we're seeing the middle class slipping. Reductions in healthcare costs are not likely to translate into significant consumer price decreases. Add to that the past, current and future growth of healthcare subsidies through the ACA exchanges and ever-expanding Medicaid programs. Folks...that big nut will have to be covered at some point.

Parting Thoughts...

Woodrow Wilson once said, "The seed of revolution is repression." Healthcare has operated on a model outside of free market forces, and consumers have paid the price, literally, for decades. In the next era of healthcare, consumers carry an obligation not just to continue funding this juggernaut but to take on greater responsibility for their health choices and results. No matter how this emerging fight changes healthcare, the patient, through greater engagement and care, should be at the center. I see population health management as both educating and empowering healthcare consumers: not only to recognize poor past choices and grow from new, healthier ones, but to appreciate and value how much they truly need the healthcare services, coverage and medications that are simply out of their financial reach.
Perhaps their own transformations will turn large-scale frustration into massive, targeted determination, demand and revolution, where elected politicians begin to cower and capitulate not to special interests but to a population of healthy Americans who recognize the importance of an affordable and sustainable healthcare system. Then they and future generations can embrace the American dream and live healthier, less stressful lives while doing it. I hope to live to see that day.

Stephen Ambrose


Steve Ambrose is a strategy and business development maverick, with a 20-plus-year career across several healthcare and technology industries. A well-connected team leader and polymath, his interests are in healthcare IT, population health, patient engagement, artificial intelligence, predictive analytics, claims and chronic disease.

Solvency II: Still Missing Buy-in

The new prudential framework is not yet "business as usual." Many insurers are far away from using the framework.

|
The first quantitative reporting templates (QRTs) and the opening information ("Day 1 Reporting") have been filed. Insurers are "on track" with Solvency II, at least from a pure regulatory compliance point of view! However, my day-to-day observation suggests a more mixed picture. The new prudential framework is not yet "business as usual" (BAU), as the large majority of insurance organizations are still far from using it as a risk- and capital-based decision framework. The first "real world" ORSA process has been (more or less) launched, but it continues to be treated as a reporting exercise, even though EIOPA has launched an EU-wide stress test and a major "stress event" occurred in June with the UK's Brexit vote. See also: The Right Way to Test for Solvency   It is useful to look at a target operating model underlying Solvency II:
  • Solvency II Risk and Capital Management represents the core of the new model.
  • The model requires policies to meet regulatory and organizational targets; these policies were designed and approved in 2015 or early 2016 and now need to be applied progressively.
  • The two ORSA preparation projects, from 2014 and 2015, should now become BAU and "only" need to be run as a real process, putting into practice the ORSA policy designed in 2015 or early 2016.
  • The 2017 deadline for the first SFCR requires preparation and strategic decisions on how to meet the regulatory requirements and how to communicate with stakeholders, analysts, competitors and any other third party eager to make use of the new transparency.
  • The AMSB (administrative, management or supervisory body) needs to be ready to demonstrate that it takes decisions based on the risk- and capital-based principles governing Solvency II! (A minimal numeric sketch of such a check follows this list.)
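To make "risk- and capital-based" concrete, here is a minimal, purely hypothetical sketch of the kind of check an AMSB discussion might revolve around: the basic Solvency II coverage metric is eligible own funds divided by the Solvency Capital Requirement (SCR), and an ORSA-style stress re-evaluates that ratio under an adverse scenario. The own-funds, SCR and shock figures below are invented; a real calculation follows the standard formula or an approved internal model.

    # Hypothetical Solvency II coverage check under a stress scenario.
    # Own funds, SCR and shock sizes are invented for illustration only; real
    # figures come from the standard formula or an approved internal model.
    def coverage_ratio(eligible_own_funds, scr):
        return eligible_own_funds / scr

    base_own_funds, base_scr = 1_800, 1_000   # EUR millions, assumed
    print(f"Base coverage: {coverage_ratio(base_own_funds, base_scr):.0%}")              # 180%

    # ORSA-style adverse scenario (e.g. a Brexit-like market shock), assumed sizes:
    stressed_own_funds = base_own_funds * 0.80   # 20% fall in eligible own funds
    stressed_scr = base_scr * 1.10               # 10% rise in required capital
    print(f"Stressed coverage: {coverage_ratio(stressed_own_funds, stressed_scr):.0%}")  # ~131%

A board that genuinely uses the framework would tie decisions such as dividends, reinsurance or asset allocation to how that stressed ratio compares with its risk appetite, rather than treating the numbers as a filing exercise.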
See also: Solvency 2: An Outcome Very Different Than Planned   We certainly face enough challenges, expected and unexpected, to want to apply and test the new framework under BAU conditions. Let's take the time, or, if necessary, a pause, to really make it happen. This experience will be crucial input to the 2018 review of the Solvency II framework by EIOPA and the national competent authorities (NCAs).

Hans Willert


Hans Willert is the insurance practice partner of Magellan Partners, a Paris and Luxembourg-based consulting group. Willert has 26 years of experience in the insurance industry. He is a thought leader in risk and capital management, insurance digitization and transformation.