AI: The Next Stage in Healthcare

Many medical professionals fear that AI will cost them their jobs, but costs must somehow decline, and errors must plunge.

sixthings
In the great tradition of discoveries such as germ theory, X-rays, DNA and penicillin, the next stage of growth in healthcare will come from artificial intelligence (AI). AI encompasses machine learning and neural networks, and with them the promise of better pattern recognition, data analysis, "general thinking," decision-making and efficiency. Until recently, a large share of healthcare data was not digitized, and we lacked the storage and processing power to exploit it. Now that we have both, we will soon see significant steps forward in AI innovation and in neural-network and deep-learning technology, including:
  • Medical imaging and diagnostics;
  • Wearables;
  • Mental health;
  • Virtual assistants;
  • Risk management;
  • Drug discovery;
  • ER and hospital monitoring;
  • Health and lifestyle management;
  • Biotechnology; and
  • Genomics.  
Take a moment to see what's on the horizon: smart visionaries and emerging companies such as Merge Healthcare, Zephyr Health, Lumiata, Ginger.io, Apixio, Zebra Medical Vision, Babylon Health, Sentrian, AICure and HealthAware. Venture funding will continue to increase in the digital health sector, both in dollars invested and in deals made. Look for greater M&A activity in many of healthcare's largest industries. These initiatives appear to be putting our country's health in a strong position for the future. Many of them are intertwined with the healthcare industry's pursuit of the Triple Aim: improving the patient's experience of care (quality and satisfaction), bettering the health of populations and reducing healthcare costs.

See also: How to Think About the Rise of the Machines

Thanks to abundant processing power that was previously available only on the world's most robust supercomputers, AI is developing faster than most predicted. As medicine faces many existing and new challenges, AI will be applied to the many areas of healthcare rife with inefficiency and waste. By the end of 2016, the healthcare sector is set to have the largest workforce of any U.S. industry — including government. At nearly 19% of U.S. GDP (and growing), healthcare must recognize that technologies such as AI can serve a major purpose in improving human-powered efficiency.

When leaders think about AI in healthcare, they are immediately drawn to innovating, reducing medical error, improving management of patient health and making discoveries in drugs and biotechnology. But the greatest benefit may lie in reducing human capital, a large direct and indirect contributor to cost. Many medical professionals fear that, because of AI, they may lose their jobs. Early AI companies, especially those courting large players such as IBM, recognize this and have been assuring medical clients it will not happen.
Clearly, we must walk before we can run, and augmented intelligence, as a form of AI, is that needed first step. However, our healthcare and political leaders must recognize the importance of the last leg of the Triple Aim. Until cost drops significantly, we will continue to see great financial strain on many Americans. Outcome-based payments are meant to ease this burden, though they might not be enough; publicly owned, for-profit companies still need to satisfy their shareholders. Meanwhile, a recent Johns Hopkins study put medically caused deaths as the No. 3 killer in America today, at nearly 251,000 deaths a year (nearly 10% of the total), showing that even the best-intentioned and best-trained humans have limitations. When AI truly arrives, we must hold our healthcare and political leaders to doing what is best for the health and well-being of American workers and taxpayers. If greater efficiency and accuracy from replacing jobs with AI can advance the Triple Aim and make the healthcare system more sustainable, we must not be afraid to move forward. Healthcare is not only about business and profit but about serving what Abraham Lincoln called "the better angels of our nature" by passing on a better system to the next generation.

Stephen Ambrose


Steve Ambrose is a strategy and business development maverick, with a 20-plus-year career across several healthcare and technology industries. A well-connected team leader and polymath, his interests are in healthcare IT, population health, patient engagement, artificial intelligence, predictive analytics, claims and chronic disease.

How to Lose $7 Billion a Year

Bad valuations cost underwriters $7 billion a year on business interruption insurance -- but third-party data can end the problem.

It is estimated that commercial property writers lose out on more than $7 billion annually in business interruption insurance, a line that could deliver increased and sustainable earnings upside but has often struggled to do so. This loss represents not only undervalued policies but also income lost to premium calculations that are not commensurate with risk. As with other property business, the No. 1 culprit is the decades-old difficulty insurance companies face in establishing adequate coverage limits for property lines -- and business interruption insurance (BII) often fares worse than insurance on buildings and contents. For the past 10 years especially, the property insurance industry worldwide has been buzzing with concerns about coverage adequacy for BII. The problem affects both business owners policies (BOPs) and the larger package policies (CPP/SMPs). Caroline Woolley, senior vice president at Marsh's Business Interruption Center, wrote a comprehensive report in 2015 summarizing the challenges the industry faces in making BII coverage profitable. Woolley lays out five major obstacles that agents, companies and brokers face when underwriting this line. She says the No. 1 problem is simply "getting the values right" when policies are first written and again at renewal. The valuation concern stems from the fact that there has been no standardized, simple-to-learn-and-use insurance-to-value (ITV) system for BII coverages comparable to what exists today for buildings and contents.

No. 1: Getting the values right

According to a survey conducted by the Chartered Institute of Loss Adjusters in 2012 (as quoted by PMWBG), 40% of declarations were deemed too low by about 45%. More recently, PMWBG research shows as much as 58% of BII coverages are undervalued by 48%, suggesting the problem is getting worse at a time when demand for property insurance is in decline and competition is fierce.
Inadequate coverage shortchanges consumers, and improper valuation undermines providers. In a very competitive marketplace, where too much supply is chasing dwindling demand, carriers losing on the valuation front lose reputation, financial advantage and long-term revenue. From the inception of BII coverage in the 1930s, calculating risk-specific BII limits has not been easy. BII coverage addresses shortfalls in the margins corporations face when loss occurs, so underwriters, brokers and agents should understand key variables in the insured's financials. Unfortunately, not enough industry professionals are proficient in this area, leading to costly exposure errors, pricing mistakes and the age-old dilemma of undervaluation. Just as important, unlike with other lines, there has been very little third-party data to aid insurers with BII calculations. When losses occur, it's too late in the game to correct undervaluation problems. The impact, especially in today's economy, where wildfires, storms and other disasters routinely happen, has caused companies like Marsh to look again at the coverage line, suggesting the need for industry-standard ITV calculation tools.

Now, modern web-enabled technology offers both substantive raw data on businesses, which actuaries will want to work with to improve pricing models, and a web-based ITV system carriers can use to calculate detailed BII coverage reports for the majority of businesses found anywhere in the U.S. Virtually any enterprise can be valued, with complex insurance-specific data sets searched automatically on behalf of the user both to pre-fill input and to create BII reports.

First Step to Success

Vast amounts of insight about corporations and their supply chains can be aggregated to estimate BII limits in seconds, accessible anywhere via the Internet.
In the case of the BOP sector, actuaries and pricing managers have instant access to large amounts of aggregated data for the various sizes and types of business insured, to develop more representative and localized pricing models. Users can also adjust models automatically for the business opportunity rather than offer one-size-fits-all pricing. Additionally, because core data changes annually, savvy users can also upgrade model variables.
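The valuation arithmetic behind a BII limit can be illustrated with a simplified sketch. The formula (gross profit over the indemnity period, plus increased cost of working) is a common textbook simplification, and all figures below are hypothetical; this is not the ITV methodology discussed in the article:

```python
def bii_limit(annual_revenue, variable_costs, indemnity_months, icw=0.0):
    """Rough business-interruption limit estimate (illustrative only).

    Gross profit is taken as revenue minus the costs that stop when the
    business stops; the limit covers that profit over the indemnity
    period, plus any increased cost of working (icw), such as temporary
    premises or expedited shipping.
    """
    gross_profit = annual_revenue - variable_costs
    return gross_profit * (indemnity_months / 12) + icw

# Hypothetical retailer: $4M revenue, $2.5M variable costs,
# an 18-month indemnity period and $100,000 increased cost of working.
print(bii_limit(4_000_000, 2_500_000, 18, icw=100_000))  # → 2350000.0
```

The undervaluation in the survey data above amounts to underestimating inputs like these at policy inception; third-party data aims to pre-fill them accurately instead.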

Peter Wells


Peter M. Wells is a 30-year veteran of the P&C insurance industry and banking community who is known for creating key technical innovations that are used every day by millions of business professionals in the financial services and other industries.

Competing in an Age of Data Symmetry (Pt. 3)

Consumers will no longer be at a data disadvantage; they will be able to test brand promises -- and insurers must be ready to react.

The Internet is a mirror of sorts — a data mirror. Right now, it is a fuzzy one, but the picture grows clearer as the available data grows. Soon, the image of an insurer's customer service, pricing and claims experiences will grow crisp. How will it happen? How will insurers respond and remain competitive? In Part 1 and Part 2 of our series, we discussed data symmetry — the leveling of the playing field that is happening as insurers gain access to many of the same streams of data. The trend runs counter to data asymmetry, which allowed insurers to comfortably differentiate themselves by being good at analyzing their own in-house data. As insurers use more and more of the same data and some of the same analytics tools and methodologies, they will find themselves in a pool of sameness. Differentiation by price and service will be less about introspective analysis and more about finding and delivering on real brand promises. So, in today's blog we are crossing a bridge of sorts. We are going to look at how the consumer will achieve data symmetry by gaining a clear view of the real insurer.

See also: Data Science: Methods Matter

Changes in scrutiny are causing data symmetry

Insurers are the subjects of constant scrutiny. The NAIC, the Federal Insurance Office, the Department of Labor, every state and every consumer protection organization have an interest in watching insurers. Yet all of that scrutiny may pale in comparison to the impact of the coming wave of individual consumer scrutiny. Consumers are using ratings, stars, comments and shopping patterns to give instant feedback to all service providers. Feedback (real experience) is a sales tool for aggregators and retailers. It is a reason for consumers to choose particular channels or pipelines. Amazon and eBay don't have to build trust for any one product. They only have to facilitate feedback and let the products, services and suppliers speak for themselves.
These outside views are the result of symmetrical data availability. Prospects are now able to compare any product or service, including insurance, with more real data, drawn from both verifiable sources and unstructured ones. Consumers may look at an insurer through the lens of an insurance aggregator, such as Insure.com or The Zebra, or through simple search terms such as "worst auto claims experience in my entire life." They may also witness an insurance interaction through their relationships with friends on social media. Reputation analysis will hold tremendous power to validate or invalidate brand promises. Does the insurer make it simple to file a claim? Does it have a poor track record in paying claims? Are renewal rates much higher or lower than competitors'? These bits of information weren't as public in the past. Today, they are common and easy to find.

See also: What Comes After Big Data?

Data symmetry's effect on the insurer will operate much like a looking glass. The insurer will begin to see itself, not as it has attempted to portray its brand, but as it is perceived during real interactions. This will lead some insurers to make course corrections. The good news is that data symmetry will supply healthy doses of reality. Insurers will know and understand their competition. They will have an unprecedented, timely idea of what customers really want and how well they are supplying it. If they are prepared for the coming levels of data symmetry, insurers will also be able to make agile shifts and meaningful steps toward selling insurance through many different channels. Many of these details are still food for our insurance visions. One thing is certain, however: Data and analytics will continue to unlock the secrets of market positioning to keep insurers competitive. Data's relevance to business decisions will only grow.

John Johansen


John Johansen is a senior vice president at Majesco. He leads the company's data strategy and business intelligence consulting practice areas. Johansen consults to the insurance industry on the effective use of advanced analytics, data warehousing, business intelligence and strategic application architectures.

AI: Everywhere and Nowhere (Part 2)

Detractors of artificial intelligence say it hasn't lived up to the big promises. In fact, AI is everywhere -- and just getting started.

This is part two of a three-part series. Part 1 can be found here. As we saw in that previous post, defining artificial intelligence is like trying to hit a disappearing target. As soon as any aspect of AI gains widespread adoption, people stop distinguishing it as an AI technology, and it dissolves into the sea of general technology. As a result, most detractors of AI, at least until recently, have questioned its real-world applications. In turn, AI never gains the respect and recognition it needs to evolve and reach its full potential. The beauty (and bane) of AI is that it is everywhere and yet nowhere – it is becoming ubiquitous in all of our interactions (at least all of our virtual interactions), yet most people fail to recognize and respect it.

Artificial Intelligence Is Ubiquitous Intelligence

You wake up in the morning and from your bed ask your digital assistant, "What is the weather like today?" It replies, "We have an 80% chance of snow in Lexington later in the evening – with accumulations of one to three inches." The voice recognition, the natural language understanding of your question, the search through the Internet for the right answer and the translation of that answer into speech are all AI. You get into your office and open your email. Your email gets automatically sorted into "Social," "Forums," "Private" or whatever categories you have created, gets flagged as important or not, or gets marked with whatever tags you have set up to make it easier to read and clear. The classification of your email based on the To, From, Subject and Content fields, the natural language processing to extract the right keywords and the machine learning to determine what is spam or who is important are all AI. You open up your online newspaper to check on the stock market performance from yesterday.
You get a description of the overall stock market performance and the movement of your favorite stocks. The news is personalized to the topics, sources and authors you want to read, and the newspaper recommends what is trending among the sources or people you follow. The natural language generation based on structured stock market performance data, the curation of articles based on personal preferences and the recommendation engine for suggested articles are all AI. You open up your favorite search engine, and, as you type your query in the search box, the system suggests possible completions. Then the system recommends the right websites from billions of documents on the Internet and the right ad that matches your query, fulfilling the best bid for your search term among competing advertisers who want to personalize their message to you. The statistical inference in suggesting completions, the PageRank-style algorithm that computes the relevant pages to display and the selection of the right ad through a real-time ad exchange are all AI.

See also: How to Think About the Rise of the Machines

The list goes on and on. In fact, there is very little in our day-to-day life that is not affected by AI in some way. Yet the real power of AI is the insight it provides us, without our being aware of it. The intelligence hidden behind many of our day-to-day interactions is powered by AI algorithms related to machine learning, natural language processing (or, more generally, unstructured data processing), intelligent search, intelligent agents and robotics. And, while AI is ubiquitous, we have only scratched the surface of what it can mean for us.
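The email-sorting step described above (classifying a message from its fields and content) can be sketched as a toy word-frequency classifier. This is purely illustrative: production filters are trained statistical models over vastly larger feature sets, and the sample messages here are made up.

```python
# Toy spam scorer (illustrative only): scores a message by how strongly
# its words lean toward "spam" vs. "ham" in a handful of labeled examples.
import math
from collections import Counter

SPAM = ["win cash now", "free prize claim now", "cash prize win"]
HAM = ["meeting agenda attached", "lunch tomorrow", "project status update"]

def word_counts(docs):
    counts = Counter()
    for doc in docs:
        counts.update(doc.lower().split())
    return counts

spam_counts, ham_counts = word_counts(SPAM), word_counts(HAM)
spam_total, ham_total = sum(spam_counts.values()), sum(ham_counts.values())

def classify(message):
    # Smoothed log-likelihood ratio over the message's words:
    # a positive total favors "spam," a negative total favors "ham."
    score = 0.0
    for word in message.lower().split():
        p_spam = (spam_counts[word] + 1) / (spam_total + 1)
        p_ham = (ham_counts[word] + 1) / (ham_total + 1)
        score += math.log(p_spam / p_ham)
    return "spam" if score > 0 else "ham"

print(classify("claim your free cash prize"))  # → spam
```

Real mail systems layer many such signals (sender reputation, link analysis, user feedback) on top of far more capable learned models, but the principle of scoring learned features is the same.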

Anand Rao


Anand Rao is a principal in PwC’s advisory practice. He leads the insurance analytics practice, is the innovation lead for the U.S. firm’s analytics group and is the co-lead for the Global Project Blue, Future of Insurance research. Before joining PwC, Rao was with Mitchell Madison Group in London.

Gene Testing: Time Is Ripe in Work Comp

Gene testing is showing promise as a tool to get the right medication at the right dose to each workers' compensation patient.

Pharmacogenetic testing (PGT) has the potential to help clinicians improve outcomes for injured workers and reduce costs for payers. While research showing the clinical value of PGT continues to grow rapidly, evidence of the return on investment in the workers' comp space is just beginning to emerge. Practitioners can benefit from the technology without falling victim to the hype of some proponents by becoming better educated about PGT and those providing it. Because the use of PGT in the workers' comp population is relatively uncommon, practitioners may find it challenging to realize the true value of the tests. "A few of our customers are trying PGT on select claimants," said Dianne Tharp, pharmacist and executive clinical liaison for pharmacy benefit manager Healthcare Solutions, an Optum company. "This is a complex area; everything is evolving. It's relatively new for the industry, and we are all still learning." One growing area of interest is in genetic tests that can identify injured workers most at risk for addiction and abuse. However, there are many challenges with such tests, including uncertainty about their predictive performance in clinical settings, which must be overcome before clinicians can use them to help identify whether an injured worker may misuse or abuse a prescribed opioid. While PGT could be a welcome tool, the science is not yet at a level where clinical application is appropriate. "On the other hand, pharmacogenetic testing for drug response is often more — and in some cases highly — predictive,” said Naissan Hussainzada, senior director of genetics strategy and commercialization at Millennium Health. “For example, certain genetic variations can change how an individual metabolizes some opioid medications.
Using this information, clinicians can identify patients at higher risk for medication failure and/or side effects, which may help them make more informed and tailored treatment decisions.” Injured workers with preexisting conditions, or those who develop comorbid conditions post-injury, may especially benefit from PGT, as they may be receiving multiple medications that could elevate their risk for drug-drug and gene-drug interactions. PGT information could also help the clinician better understand whether drugs prescribed for comorbid conditions will be effective. "In the workers' comp space, PGT could be used to help the clinician optimize medication prescribing and avoid trial and error,” Hussainzada said. “This has the potential to translate to faster recovery, less time away from work and shorter claim duration for the injured worker.”

See also: Genetic Testing: The New Wellness Frontier

Polypharmacy challenges

Multiple medication regimens and comorbid conditions are frequently present in workers who are injured on the job. The inability to work and the presence of pain can result in additional comorbidities, especially depression. Metabolism can play an important role in how patients respond to medications, particularly antidepressants, opioids, certain anticoagulants and cardiovascular medications. Mental health providers, in fact, were among the first to recognize the value of PGT in guiding medication therapy and dosing. "Mental health disorders are often assessed subjectively, and drug therapy can be lengthy, unpredictable and suboptimal,” Hussainzada said. “It may take several months to stabilize a patient on an effective antidepressant using trial and error.” PGT can be especially useful for antidepressants. “There are actionable PGT results with good evidence for the antidepressants,” Tharp said.
“That would be an instance where PGT may be useful [among injured workers].” In addition to antidepressants, Tharp said PGT is also being used to help determine a patient's ability to properly metabolize warfarin, which is used to prevent blood clots.

Drug-drug interactions

Individuals metabolize medications differently, depending partly on a person's genetic makeup and partly on clinical factors, such as hepatic (liver) disease, lifestyle factors and administration of other medications. For example, introducing a new medication may change how existing drugs are metabolized, which can change their effectiveness or tolerability. Conversely, an existing medication may have an impact on the metabolism of a new medication. “There are well-documented drug-drug interactions between opioid analgesics and certain antidepressants,” Hussainzada said. “This is because some antidepressants can inhibit or ‘turn off’ the enzymes responsible for metabolizing opioids. This can lead to the opioid becoming less effective, or in some cases, intolerable or potentially toxic. Making matters more challenging, there are some individuals who carry certain genetic variations that can make them more susceptible to a phenomenon called ‘phenoconversion,’ which can elevate their risk for certain types of drug-drug interactions. For injured workers receiving polypharmacy, PGT may help clinicians identify these higher-risk individuals and help mitigate some of the risks of phenoconversion.” There are four categories of metabolizer type that correspond to how individuals may metabolize certain medications via hepatic enzymes. Individuals classified as “extensive” metabolizers possess fully functional enzymes and are able to metabolize medications normally. However, some individuals carry genetic variations that lead to reduced or significantly reduced enzyme function, and are classified as “intermediate” or “poor” metabolizers.
Finally, some people may have genetic variations that lead to significantly increased enzyme function and are classified as “ultra-rapid” metabolizers. What that means is: Two people taking the same drug at the same dose can have very different responses because of their metabolizer status. Individuals susceptible to phenoconversion can “switch” metabolism type, for example, from an intermediate or extensive metabolizer to a poor metabolizer. The trigger for these conversions is non-genetic extrinsic factors, such as administering a drug that inhibits the enzyme pathway. Certain metabolizer types are associated with higher risk of phenoconversion and risk of drug-drug interactions. "Intermediate metabolizers may be at higher risk for phenoconversion compared to normal metabolizers," Hussainzada said. "However, it can be difficult to identify these patients because they may display normal or typical response to a medication, even if they are metabolizing that drug at a reduced rate. However, if an inhibitor of the drug is added to their regimen, this can shift the individual from intermediate to poor metabolism and lead to medication failure and/or potentially serious side effects.” For some claimants who take medications for pre-existing conditions, adding a pain medication can increase the risk for drug-drug interactions and phenotypic conversion. "So a claimant who has been taking antidepressants for years is now also prescribed an opioid because of his injury," Hussainzada said. "If he is an intermediate metabolizer for the opioid, the antidepressant may convert him to a poor metabolizer. This could lead to inadequate pain relief, which may delay recovery and increase risk of poor outcomes.” In another scenario, an injured worker who is taking opioids for his injury and who later develops depressive symptoms may be treated with concomitant antidepressant therapy. 
“In this case, the opioid may have been initially effective, but certain opioids would lose analgesic potency once the inhibitor, or antidepressant, is added," Hussainzada said. PGT can also help a clinician identify patients who may need to be started with atypical or non-standard doses of certain analgesics. One particular enzyme responsible for the metabolism of a large number of medications is cytochrome P450 2D6, or CYP2D6. Claimants who are reduced metabolizers for the pathway may not respond adequately to a standard dose of oxycodone. “If you are a CYP2D6 poor metabolizer, standard doses of oxycodone or hydrocodone may not effectively control your pain,” Hussainzada said. “However, without knowing this type of genetic information beforehand, it may appear to the clinician that these individuals are drug-seeking if they continue to ask for higher doses.” Some poor metabolizers may not get any pain relief, even with very high doses of a medication. Identifying these patients through PGT can lead the clinician to prescribe a different pain medication from the start, something that can be critical to getting an injured worker back to function. According to a recent position paper from Healthcare Solutions, the rates of comorbidity and polypharmacy are on the rise in workers’ comp and can lead to increased medical costs, delayed returns to work and longer claim durations. Clinical depression is a common comorbidity, and the use of antidepressants is prevalent; however, both are associated with poor recoveries and outcomes. "For patients taking multiple medications, there may be multiple enzymes that are recruited to metabolize and eliminate these drug combinations from the body,” Hussainzada said. "Some recent data indicates that when you look across multiple enzymes, genetic variation is much more common than when you look at a single enzyme. 
So for the claimant receiving polypharmacy, it may be even more important to understand how their genetics will contribute to their medication response, since it is likely that at least one enzyme system may be variant.” Clinicians can use PGT information at the beginning of a claim to optimize initial prescribing and dosing of opioids and other medications, which may hasten the recovery time. "In workers' comp, the data are pretty clear: The faster we can facilitate post-injury recovery and get the claimant back to work, the better their overall prognosis,” Hussainzada said. "Particularly with opioid therapy, we want to use these drugs judiciously and effectively.”

See also: Urine Drug Testing Must Get Smarter

The future

Researchers and workers’ comp practitioners continue to monitor the clinical evidence for testing in an effort to help clearly identify those injured workers who would benefit most from PGT — in terms of better outcomes and lower costs. For now, there are several types of injured workers who may be good candidates for testing. "A claimant taking multiple medications from several therapeutic classes, one who has failed several therapies and changing dosages or a patient on ultra-high daily morphine equivalent doses may be a good candidate for PGT,” Healthcare Solutions' Tharp said. Ultimately, proponents hope PGT can be a useful tool in getting the right medication at the right dose to each patient. If test interpretations are based on firm clinical evidence, PGT can provide clinicians with a road map for navigating prescribing decisions that can often be complex and subjective. However, providers are advised to become familiar with PGT and, especially, the companies marketing these services. "Payers, clinicians and patients need to be aware that not all pharmacogenetic testing is equal.
Ask questions about the evidence for specific genes and drugs and make sure there are clinical standards in place for how results are interpreted,” Hussainzada advised. "Some tests may not be ready for clinical use, so it’s important to be informed."
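The four metabolizer categories and the phenoconversion effect described above can be sketched as a small lookup. This is strictly a conceptual illustration, not clinical logic; the category ordering and the two-step downgrade rule are simplifying assumptions made for the example:

```python
# Illustrative only — not clinical decision logic. Models the four
# metabolizer categories and "phenoconversion," in which a co-prescribed
# enzyme inhibitor shifts a patient's effective metabolism downward.
METABOLIZER_ORDER = ["poor", "intermediate", "extensive", "ultra-rapid"]

def effective_metabolizer(genotype_status, inhibitor_present=False):
    """Return the phenotype a clinician might actually observe.

    genotype_status: one of METABOLIZER_ORDER, as reported by a PGT
    result. inhibitor_present: True if a co-prescribed drug (e.g. some
    antidepressants taken with certain opioids) inhibits the pathway.
    """
    if genotype_status not in METABOLIZER_ORDER:
        raise ValueError("unknown status: " + genotype_status)
    if not inhibitor_present:
        return genotype_status
    # Assumed rule: a strong inhibitor pushes intermediate or extensive
    # metabolizers down toward poor metabolism, as the article describes.
    idx = METABOLIZER_ORDER.index(genotype_status)
    return METABOLIZER_ORDER[max(0, idx - 2)]

# The article's scenario: an intermediate metabolizer prescribed an
# opioid alongside a long-standing antidepressant (an inhibitor).
print(effective_metabolizer("intermediate", inhibitor_present=True))  # → poor
```

The point of the sketch is that the observed phenotype is a function of both the genotype and the current drug regimen, which is why PGT results are read alongside the full medication list.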

Nancy Grover


Nancy Grover writes Workers' Compensation Report, a national newsletter published 18 times per year. Grover is also a regular columnist for WorkCompCentral and has contributed an article to NCCI's Annual Issues Report for the past five years.

Now, Everything Can Be 'As-a-Service'

Start-ups can buy resources incrementally, as the business grows -- a game changer. And every company can offer products "as-a-service."

Everything-as-a-service is transforming the economics of establishing and running a company. Product-as-a-service will fundamentally change insurance product design and delivery. Before recently joining insurance fintech start-up Instanda, I spent the last 15 years working within the insurance industry for U.K. FTSE 250 companies — such as Hiscox, Capita and, most recently, Xchanging. For the up-and-coming executive, there is something very comforting about working for a big, established company during the early part of your career. You are able to immediately plug in to a brand, revenue flow and customer base that is already well-established. There are lots of people with well-defined roles to support you, and you will undoubtedly benefit from a significant investment in physical infrastructure (whether that be a branch network of offices around the globe or big, heavy IT infrastructure sitting in your own data centers). But that level of comfort comes at a price for the big corporation. There is an enormous amount of capital in the business tied up in “stuff” (office furniture, leases, servers, etc.), and there is an inevitable restriction in the ability to move quickly to respond to changing customer needs. We all know that when a company gets bigger, it becomes more unwieldy and bureaucratic. What has really struck me since joining Instanda is how technology and service provision have moved on to such an extent that you can gain access to the same benefits and capabilities of big-company infrastructure at a fraction of the cost — and without losing your agility and flexibility to respond to the needs of the business. As a business, Instanda is a firm believer in consuming "everything-as-a-service." We are a technology company that does not own a server; all of our IT infrastructure is procured from Microsoft Azure, which gives us access to almost instantaneous unlimited storage and processing power from our desktop dashboard.
For our office and email suite, we use Microsoft 365, where we are able to tap into the many years and millions of dollars of Microsoft's investment for a small monthly sum per employee. "As-a-service" is often thought of as a software service provided out of the cloud, but, of course, it can just as easily be physical infrastructure. The sharing economy is full of examples where physical infrastructure is available to be purchased at a fractional cost. Uber is "transport-as-a-service," and through the good offices of property services firm WeWork, we are able to procure very high-quality workspace as "property-as-a-service." Our newly built offices sit on the edge of London, close to our customer base, and are fitted out to the highest standards. In the past, for a small company like Instanda, these offices would simply have been beyond our means, but in the new "as-a-service" economy, we can purchase as many (or as few) desks as we like — with only a monthly notice period required to add seats or to exit the space, all while still benefiting from the full range of office facilities of a multimillion-pound company. Similarly, our accounting, payroll and CRM systems are all consumed as cloud-based services where we pay only for what we consume. Yet it was not long ago that the idea of placing your key customer data on a system and servers you didn’t own or control would have been seen as a crazy business risk. Imagine going to your CEO today and saying, “I want to build our own bespoke CRM system, buy some physical servers and store them in a data center we operate ourselves.” You would soon be shown the door. So, what was considered risky and unthinkable in the past can very quickly become business-as-usual when the competitive advantages become undeniable. 
See also: How to Insure the Sharing Economy So what all this means is that a relatively new business like Instanda can purchase all the key services it needs to operate as a business on demand with "everything-as-a-service" and, most importantly, at an incremental cost completely aligned to the size of the business. The ability to buy all these capabilities "as-a-service" fundamentally shifts the cost dynamics of operating a business and allows a much smaller business to compete effectively with much bigger, longer-established businesses on an equal footing. In fact, it gives you a strong competitive advantage because you can operate at a price point and with a degree of flexibility that bigger companies cannot match because of their past significant investment in physical infrastructure. In the insurance industry, capital is becoming increasingly commoditized as surplus capital seeks better returns in this sector. Underwriting and insurance products have become harder to differentiate because of increasing competition, so the battleground is now in distribution. Whether you are a reinsurer moving into insurance, an insurer opening new global offices or trying to dis-intermediate your broker channel by going direct, a broker establishing your own branded products or an MGA reaching into new markets, the overriding business challenge is: “How do I get my products out to my customers quickly and cost-effectively?” So what we have done at Instanda is take all the benefits and advantages of “everything-as-a-service” and apply the same concepts to "product-as-a-service," establishing a platform to facilitate the manufacture and global distribution of insurance products. The benefit of this approach is that we can get our customers to market anywhere in the world — 10 times quicker and 10 times cheaper than the traditional approach of building products within an installed back-office software system. 
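The incremental-cost argument above can be made concrete with a toy model. All figures here are hypothetical illustrations, not Instanda's or any vendor's actual pricing: the point is only that an as-a-service spend scales with headcount, while an owned-infrastructure spend is dominated by the up-front build.

```python
# Toy comparison of owned infrastructure vs. "as-a-service" spend.
# All numbers are invented for illustration.

def owned_infrastructure_cost(years, upfront=500_000, annual_upkeep=50_000):
    """Total cost of buying and running your own servers, offices, etc."""
    return upfront + annual_upkeep * years

def as_a_service_cost(years, seats, per_seat_monthly=300):
    """Total cost when every capability is bought per seat, per month."""
    return per_seat_monthly * seats * 12 * years

# A 10-person start-up over its first three years:
small_team = as_a_service_cost(years=3, seats=10)      # 300 * 10 * 12 * 3 = 108,000
owned = owned_infrastructure_cost(years=3)             # 500,000 + 3 * 50,000 = 650,000
assert small_team < owned  # cost tracks headcount, not an up-front build
```

Under these assumptions the crossover only arrives at a much larger headcount, which is the "equal footing" point: the small firm never pays for capacity it is not using.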
Our configurable toolkit allows our customers to quickly assemble any type of insurance product and completely control the look and feel of the online and mobile product. Our customers can build their products themselves without the need to code or deploy IT staff — combined with a commercial model completely aligned with the success of the products on our platform. See also: Is That Opportunity Calling in the ‘Sharing Economy’? (Part 2) By fundamentally changing the cost dynamics of insurance product manufacture and distribution, "product-as-a-service" opens up new sales opportunities that simply were not possible or justifiable before. Do you want to create a different look and feel for the same product for each of your agents or distribution channels? Do you want to launch a micro-insurance site for single items that are bought by the hour? Do you want to offer a short-term insurance product for a single event? Do you want to test the attractiveness of a new product before investing in worldwide distribution? All these become simple and cheap when utilizing a "product-as-a-service" platform. Of course, the real test is whether this "product-as-a-service" approach delivers the tangible benefits promised to the customer. Already, large insurance organizations such as Sompo Canopius and U.K. retail insurer LV are utilizing the benefits of "product-as-a-service" to shorten the time and cut the costs of getting to market. The approach also works for smaller organizations like Compass Underwriting.

Max Pell

Profile picture for user MaxPell

Max Pell

Max Pell joined Instanda after an extensive management career in technology and business process outsourcing. Pell spent five years as managing director of Xchanging's U.K. Insurance division, which provided central technology and processing services to the London market.

The Latest Charts on Internet Statistics

Mary Meeker gave her always-anticipated, annual presentation on the state of the Internet this week.

sixthings

Mary Meeker gave her always-anticipated, annual presentation on the state of the Internet this week, and I thought I should share with you. Here is the link to her massive, 213-slide presentation: http://www.kpcb.com/internet-trends.

You will certainly see numbers or even whole slides proliferate in coming days and weeks, but I encourage you to at least skim through this. Meeker, a partner at venture capital firm Kleiner Perkins Caufield Byers, has become an institution in Silicon Valley because of this presentation, which serves as a reference point for many innovators.

You won't find huge surprises -- unless I've missed something -- but I thought a few things were worth noting:

-- The main one for me is the maturing of voice recognition, which she covers starting on slide 112. Just when you thought you were starting to figure out how to move to mobile, Silicon Valley starts to move to another disruptive technology for you to cope with....

See also: Solution to Brain Drain in Insurance?

Meeker has a chart showing that voice recognition is now about 90% accurate even in a noisy environment with speakers who have a variety of accents. That is up from 70% just six years ago. She shows essentially straight-line improvement since 1970 and says that, once voice recognition hits 99% accuracy, the human interface with computers will quickly move to voice, with all kinds of implications.

Now, she hasn't always been accurate. Back in the late 1990s, when she was a securities analyst at Morgan Stanley, she was one of the main characters pumping air into what turned out to be a bubble of valuations for Internet start-ups. Personally, I wouldn't assume that her chart will continue to show straight-line growth for voice recognition. It's a lot easier, typically, to get from 70% to 90% than it is to get those last few percentage points of improvement for any technology.
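The straight-line claim is easy to sanity-check. Here is a back-of-the-envelope extrapolation using the two figures cited above (roughly 70% accuracy six years before the 2016 presentation, 90% at it) — these are approximate readings, not exact numbers from the slides:

```python
# Linear extrapolation of voice-recognition accuracy, assuming the
# straight-line trend in Meeker's chart were to hold.
acc_2010, acc_2016 = 70.0, 90.0
slope = (acc_2016 - acc_2010) / (2016 - 2010)  # ~3.3 accuracy points per year

# Year at which a straight line would cross the 99% threshold:
year_99 = 2016 + (99.0 - acc_2016) / slope
print(round(year_99, 1))  # → 2018.7
```

A straight line would put the 99% threshold less than three years out; per the skepticism above, the last few points are usually the hardest, so the real crossover is likely later than a naive extrapolation suggests.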

I'm also a bit jaded because I've been hearing about voice recognition for a good 25 years, at least since I saw a demonstration at a conference I attended during my days as a technology reporter at the Wall Street Journal. A gentleman was supposedly chosen at random from the audience and, despite a heavy Russian accent, had his speech recognized almost perfectly when he spoke into a microphone. Yet here we are 25 years later, and uses of voice recognition are almost always part of phone trees where the choices of response are quite limited -- and where the system doesn't seem to hear you when you demand a live person, no matter how loud or distinct you are when you say the word "representative."

Still, anyone who has used an Amazon Echo or similar device knows how great it can be to just call out a question about the weather or the score of a baseball game. And the change caused by voice recognition will be disruptive enough that any thinking about new user interfaces should at least include some experimenting with voice recognition. You need to figure out how close it is now to being useful for your purposes and to stay on top of developments in coming years. I don't think it'll be 25 years before I write again about voice recognition, and, when I do, I'll probably dictate to my computer.

-- Starting on slide 137, she does a nice job laying out the latest stats on the connected car. Nothing startling, if you've been following along, but lots of good material.

-- Beginning on slide 185, she offers some trend lines about the Internet that include some names you won't know and might want to note -- I certainly didn't know some of them, and I follow this stuff pretty closely. She singles out Slack, a communication system that is becoming popular in some circles, especially among younger types. She also mentions Looker, an interesting data platform, plus Mapbox, Datadog, Ionic Security and so on.

-- The last 10 slides or so contain some good stats about cyber security.

See also: Best Practices in Cyber Security  

There is plenty of other good material, including about opportunities in China and India, but I wanted to single out the sections that touch most closely on the themes we've been hitting about innovation at Insurance Thought Leadership.


Paul Carroll

Profile picture for user PaulCarroll

Paul Carroll

Paul Carroll is the editor-in-chief of Insurance Thought Leadership.

He is also co-author of A Brief History of a Perfect Future: Inventing the Future We Can Proudly Leave Our Kids by 2050 and Billion Dollar Lessons: What You Can Learn From the Most Inexcusable Business Failures of the Last 25 Years and the author of a best-seller on IBM, published in 1993.

Carroll spent 17 years at the Wall Street Journal as an editor and reporter; he was nominated twice for the Pulitzer Prize. He later was a finalist for a National Magazine Award.


2 Novel Defenses to Hacking of Browsers

Start-ups Authentic8 and Ntrepid have clever ways to keep thieves from hacking into a computer, then corporate systems, through a browser.

sixthings
Cyber attackers continue to exploit a significant security gap found in a familiar tool used pervasively in all company networks: the common web browser. Mozilla Firefox, Google Chrome, Microsoft Internet Explorer and Apple Safari all use an architecture that makes it relatively easy for an attacker to embed malicious code on an employee’s computer — and then use that infected machine as a foothold to probe deeper into the breached network. Here’s the good news: There is a growing cottage industry of security vendors developing sophisticated technology specifically to plug this gaping exposure. Browser security vendors first appeared on the scene around 2010; leading innovators include Invincea, Bromium, Spikes Security and Menlo Security. ThirdCertainty recently visited with two new entrants, Ntrepid and Authentic8. Here is what each brings to the table: The morphing of browser usage Authentic8 recently introduced a service called Silo, which isolates web browser malware code from the targeted computer — and the rest of the company network — by routing all employees’ browsing sessions to dedicated servers. Authentic8 CEO Scott Petry has a long history helping companies keep intruders out of their networks. Petry founded email-filtering company Postini, which was bought by Google and folded into the search giant in 2007. Petry, who co-founded Authentic8 with another Postini alum, Ramesh Rajagopal, observes that the arrival of sophisticated browser security tools (like Silo) is a reflection of how web browser usage in corporate settings has morphed over the past couple of decades. In the 1990s, IT departments “would control how you compute, when you compute and what applications you access,” Petry recalls. Steadily, the web browser “became such a massive focal point or gravity center for how people consumed different web services,” Petry says. 
“It became extremely compelling for employees to access the web for personal use and for businesses to start taking advantage of the web as a way to perform business functions.” Amazon pioneered e-commerce, and Google got businesses and consumers accustomed to quickly searching for, and pinpointing, desired information. All of this leveraged the browser’s capacity to execute code on individual computers in response to users’ clicks. “As soon as that happened, business data that IT departments used to control in their environment was suddenly scattered across third-party websites that they didn’t control,” Petry says. Then social media, including Facebook and Twitter, appeared, and all bets were off. See also: 3 Steps to Improve Cyber Security Routing malware to silos The environment “is now a mess,” Petry says. “If you think about how the browser is used, it’s a one-size-fits-all solution. People use the same browser with a tab opened to get to Facebook, a tab opened to get to Dropbox and a tab opened to get to wherever. It’s a mix of personal use and business activity, and it’s no wonder that the browser is such a point of vulnerability.” Venture capitalists are funding tech entrepreneurs and are coming forward with new systems to lock down browsers — because, going forward, how we have come to use browsers is not likely to change. “I’m sure at some point we will move away from a monolithic browser,” Petry says. “It might change over time, but people have been predicting the death of email for 10 or 15 years, and it is still the most common form of business communication. So, no, I don’t think the browser is going anywhere any time soon.” Authentic8’s Silo product isolates all web code in a secure, remote container in the cloud, giving users a benign display of web content. Nothing reaches the user’s device except pixels. “The attack surface area is now ours, and that’s where we deal with it,” Petry says. 
Virtual sessions Instead of moving browser sessions into isolated servers, Ntrepid addresses the problem by inserting a virtual browser into every employee’s computer. Any malicious code arriving via a web browsing session is isolated from the hard drive or memory of the targeted computer. The machine, in essence, is inoculated against browser malware and cannot be used by the attacker as a beachhead to go deeper into the company’s network. Web browsers, by design, execute code over which network administrators have zero control. This code execution enables all of the cool, interactive things we can do on our browsers. Trouble is, criminal hackers can all too easily slip malware into this mix. Like Authentic8’s isolated servers, Ntrepid’s virtual browsers protect the organization from “all web-based attacks, including web-delivered malware, watering hole attacks, spear phishing, passive information leakage and drive-by downloads,” according to Ntrepid. Ntrepid’s technology, called Passages, enables employees to “safely browse anywhere,” providing them “the freedom to surf online without the risk of infecting their machines or compromising valuable enterprise data.” To activate Passages, a user simply clicks on it on the desktop instead of Internet Explorer, Firefox or another conventional browser. See also: How to Measure Data Breach Costs Any malware encountered on a website is "trapped” inside Passages’ virtual machine and can’t infect anything else on a user’s computer, says Lance Cottrell, Ntrepid’s chief scientist. The malware is destroyed when the browser session is over. While, for the moment, browser security technology is being marketed to small- and medium-sized businesses and large enterprises, Ntrepid and Authentic8 are both developing marketing efforts to serve individual consumers. “We’re starting off on enterprises — our early adopters — but they are always saying, ‘What about my wife, what about my kids, can I get this at home?’” Cottrell says. 
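The common idea behind both products — run untrusted page code in a disposable environment and hand the user only an inert rendering — can be sketched in a few lines. This is a conceptual toy, not either company's actual architecture; the class and function names are invented:

```python
# Toy model of browser isolation: page code executes in a throwaway
# container (remote server or local VM), and only a benign rendering
# ever reaches the user's machine.

class DisposableContainer:
    """Runs untrusted page code away from the endpoint."""

    def __init__(self):
        self.infected = False

    def render(self, page_code):
        # All active content executes here; any exploit lands in the container.
        if "malware" in page_code:
            self.infected = True  # the container absorbs the attack
        return "rendered pixels"  # inert display output, no executable code

def isolated_browse(page_code):
    container = DisposableContainer()
    output = container.render(page_code)
    del container  # session over: the (possibly infected) container is destroyed
    return output  # only the benign rendering reaches the endpoint

# A booby-trapped page never touches the user's machine:
assert isolated_browse("<script>malware</script>") == "rendered pixels"
```

The difference between the two vendors is where the disposable container lives: on Authentic8's cloud servers (Silo) or inside a virtual machine on the employee's own computer (Passages).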
Cognizant of a massive data breach last year at the U.S. Office of Personnel Management — when hackers accessed personal information of more than 21.5 million employees, family members and others — Ntrepid is accelerating its marketing efforts to consumers, Cottrell says. ThirdCertainty’s Gary Stoller contributed to this report. More stories about browser security:
-- Spikes Security isolates malware, keeps it from hijacking Web browsers
-- More organizations find security awareness training is becoming a vital security tool
-- Managed security services help SMBs take aim at security threats

Byron Acohido

Profile picture for user byronacohido

Byron Acohido

Byron Acohido is a business journalist who has been writing about cybersecurity and privacy since 2004, and currently blogs at LastWatchdog.com.

#1 Affliction Costing Businesses Billions

Healthcare is "managed" as a large operational expense — instead of as a strategic asset that delivers a sustainable competitive advantage.

sixthings
Preserving the status quo (PTSQ) is repeatedly the cause of lost revenue, missed opportunities and even bankruptcies. The pace of innovation and change in business keeps accelerating. Organizations with good leadership decide to move forward scared, rather than remain frozen with fear. Recently, I had the pleasure of spending the day with Peter Diamandis, the founder of the X Prize Foundation and the best-selling author of Abundance and BOLD. Diamandis was named by Fortune as one of “The World’s 50 Greatest Leaders.” He spoke about why we are living in a world of abundance and about how to recognize the future direction of technology and business opportunities. All of his examples and stories were directed at educating the audience on how to stay ahead of the competition and how to create disruptive innovation in any industry. (And, did I mention he graduated from medical school and has very strong opinions about the future direction of medicine? I’ll save those comments for another day.) See also: 10 Reasons Why Healthcare Varies His talk on the “6 D’s of Exponentials” was exceptional. It’s a way of thinking about how exponential technologies are affecting our world. He proceeded to describe business examples in robotics, artificial intelligence, 3-D printing, biotechnology, self-driving cars, space exploration and medicine. So, how is it, in the face of exponentially increasing change and all this business opportunity, that managing healthcare for so many organizations represents a slow, linear decision-making process characterized by rigid thinking, detached leadership and high costs? Even more confounding is that many corporate C-suites have abdicated responsibility of a multimillion-dollar division (healthcare) to internal managers who inadvertently make the problem worse each year. The typical corporate culture talks about innovation, but it only reinforces and encourages business as usual and the preservation of the status quo. 
Business as Usual, No Disruption Here For the majority of mid-sized organizations, healthcare is "managed" as one of the largest operational expenses on the balance sheet — instead of as a strategic asset that delivers a sustainable competitive advantage. Continuing to manage healthcare as an expense while somehow expecting a different result will continue to be costly. If you don’t disrupt your business practices, then someone will disrupt you. A study from the John M. Olin School of Business at Washington University estimates that 40% of companies in today’s S&P 500 will not exist in 10 years. The world of healthcare is changing rapidly. The Affordable Care Act was merely a catalyst that has triggered a tsunami of endless change in the future of healthcare in America. The challenge is to recognize new opportunities and to implement them effectively. In PwC’s latest CEO study, almost 75% of those surveyed said they are concerned their companies lack the skills needed to meet future competition. Think about all the job duties, responsibilities and competing demands a typical healthcare manager experiences. These day-to-day competing priorities mean a manager must focus time and energy on eliminating the tallest fire first, regardless of the schedule, all while managing the second-largest capital allocation for the organization: healthcare. Is it any wonder that incrementalism, fear of change and choosing the path of least resistance run rampant in corporate America? See also: Healthcare at the Tipping Point The C-suite needs to get involved and apply its business finance skills to healthcare. The C-suite must wake up to the fact that linear thinking — like the illusion that vendor size creates market leverage in healthcare — is as outdated as flip phones and fully insured insurers who refuse to substantiate annual billing and rate increases with financial transparency. 
According to an Aon Hewitt survey of more than 1,000 companies, 77% of respondents said the actions of their peers influence their organization's healthcare strategy. Comparing your results with other organizations trapped by the same poor outcomes of legacy best practices and groupthink is like listening for an echo and expecting a different answer. The survey results should make the C-suite sheepish. If you’re a CEO or CFO, you have to ask yourself these questions: With all the evidence about increasing change in business, do you honestly believe managing healthcare to preserve the status quo is the prudent thing to do? Can your organization’s stakeholders afford the cost of doing nothing?

Craig Lack

Profile picture for user CraigLack

Craig Lack

Craig Lack is "the most effective consultant you've never heard of," according to Inc. magazine. He consults nationwide with C-suites and independent healthcare broker consultants to eliminate employee out-of-pocket expenses, predictably lower healthcare claims and drive substantial revenue.