Tag Archives: UI

Emerging Technology in Personal Lines

Personal lines insurers are investigating emerging technologies and developing strategies and plans related to individual new technologies. Technology is advancing so rapidly that it is even difficult to define what should be considered an emerging technology. For the past several years, SMA has been tracking 13 technologies that many consider to be emerging. These include technologies such as autonomous vehicles, AI, wearables and the Internet of Things. In our recent research, five of these technologies have emerged as “power players” for personal lines insurers, based on the level of insurer activity and the potential for transformation. The specific plans by insurers for these and other technologies are detailed in the SMA report, Emerging Tech in Personal Lines: Broad Implications, Significant Activity.

See also: 2018’s Top Projects in Personal Lines  

Some big themes for emerging tech in personal lines stand out:

  • Artificial Intelligence dominates. AI is often a misunderstood and misused term. However, when specific technologies that are part of the AI family are evaluated, much activity is underway – by insurers, insurtech startups and mature tech vendors. Chatbots, robotic process automation (RPA), machine learning, natural language processing (NLP) and others are the subjects of many strategies, pilots and implementations.
  • The Autonomous Vehicle frenzy is cooling. There is still an acute awareness of the potential of autonomous vehicles to dramatically alter the private passenger auto insurance market. But there is also the realization that, despite the hype, the transition is likely to be a long one, and the big implications for insurers are probably 10 or more years out.
  • The IoT is going mainstream. Discussions continue about the transformational potential of the IoT for all lines of business. But rather than just talking about the possibilities, there is now a great deal of partnering, piloting and live implementation underway. We are still in the early stages of incorporating the IoT into strategies and insurance products and services, but its use is becoming more widespread every day.
  • UI Options are dramatically expanding. The many new ways to interact with prospects, policyholders, agents, claimants and others should now be considered in omni-channel plans. Messaging platforms, voice, chatbots and more are becoming preferred ways to communicate for certain customer segments.

See also: Insurtech and Personal Lines  

Certainly, other trends and much emerging tech activity are happening outside these main themes. Wearables, new payment technologies, drones, blockchain and other technologies are being incorporated into strategies, pilots and investment plans. The next few years promise to be quite exciting as advancing technologies spark more innovation in the industry.

Top Mega-Trends With Big Implications

It used to be common to say that “technology is marching forward, improving business and society.” But today, it would be more accurate to say that technology is sprinting forward – with progress at breakneck speed and breakthroughs happening in multiple fields on a regular basis. There are so many technologies – some new, some just emerging – that it is virtually impossible to track the progress of all of them, let alone explore all their implications. This may put insurers in an uncomfortable position. Insurance is an industry based on historical data and long-term predictions. However, technologies are now inundating the world with real-time data, and the pace of change is so accelerated that it is difficult to make predictions. Fortunately, SMA’s new research report, The Emerging Tech Landscape: 10 Mega-Trends for 2018 and Beyond, assists by taking a big-picture view of the key developments in the tech world.

To paraphrase George Orwell’s quote from Animal Farm, “All technologies are equal, but some technologies are more equal than others.” Every technology has a role to play in the business and personal spheres. Mature technologies such as telephony and email still matter. More recent technologies like mobile and social media have become mandatory, foundational technologies and have been instrumental in transforming the world. Emerging technologies such as autonomous vehicles, the Internet of Things (IoT), wearables and many others are poised to anchor the next wave of global transformation – affecting the way we live, work and play. It is these emerging technologies that require rapt attention from insurers now. The earlier technologies are still very important, but insurers have already built those into their business and have had lots of experience with them. But the emerging technologies now have more potential to fundamentally change the insurance industry than anything else at any other time in history. The risk landscape will change. Many new options are becoming available that will change internal operations. Customer expectations are changing, and new customer segments are coming into view.

See also: Key Insurtech Trends to Watch  

Some of the mega-trends that insurers should be monitoring and considering in terms of strategy implications are:

  • 5G and AI Form the Foundation: 5G communications networks and artificial intelligence will form the key foundations for the digital connected world in the next decade. We will need to move lots of data very fast and automate the analysis and actions surrounding that data.
  • User Interfaces Are Revolutionized: We are witnessing a dramatic expansion of how we interact with computers and the world around us in new and more natural ways. These new UI technologies affect both emerging technologies and incumbent technologies. The technologies are now rapidly maturing to mimic and capitalize on all of our senses as well as the movements of our bodies.
  • Mobility is Hot: Autonomous vehicles are enjoying a great deal of press these days. But this is only one aspect of a complex picture of the evolution of mobility. The notion of mobility encompasses many innovative technologies and approaches to moving people and goods from place to place.

These are just a few of the mega-trends that are important for insurance. Expect these and others to be dominant themes over the next few years. Taken as a whole, the change wrought by emerging technologies is likely to rock the insurance industry for the next decade. That said, insurance is still insurance, and the industry has many strengths to build on. The great challenge (and opportunity) for senior management teams is to double down on traditional insurance strengths while building a highly adaptive organization to respond to changes and prosper in the new era.

See also: 5 Trends for Employers to Watch in 2018  

Click here for more information on SMA’s recent research report, The Emerging Tech Landscape: 10 Mega-Trends for 2018 and Beyond.

Healthcare Data: The Art and the Science

Medicine is often considered part science and part art. There is a huge amount of content to master, but there is an equal amount of technique regarding diagnosis and delivery of service. To succeed, care providers need to master both components. The same can be said for the problem of processing healthcare data in bulk. In spite of the many standards and protocols for healthcare data, translating and consolidating data across many sources of information in a reliable and repeatable way is a tremendous challenge. At the heart of this challenge is recognizing when quality has been compromised. The successful implementation of a data quality program within an organization, like medicine, combines science with art. Here, we will run through the basic framework that is essential to a data quality initiative and then cover some of the lesser-understood processes that need to be in place in order to succeed.

The science of implementing a data quality program is relatively straightforward. There is field-level validation, which ensures that strings, dates, numbers and lists of valid values are in good form. There is cross-field validation and cross-record validation, which checks the integrity of the expected relationships to be found within the data. There is also profiling, which considers historical changes in the distribution and volume of data and determines significance. Establishing a framework to embed this level of quality checks and associated reporting is a major effort, but it is also clearly an essential part of any successful implementation involving healthcare data.
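The field-level and cross-field checks described above can be sketched in a few lines of Python. This is a minimal illustration, not a real healthcare validation library: the claim fields (`member_id`, `gender`, `billed_amount`, `service_date` and so on) and the rules attached to them are hypothetical.

```python
from datetime import date

VALID_GENDERS = {"M", "F", "U"}  # illustrative valid-value list

def field_level_errors(claim: dict) -> list:
    """Check that individual fields are well-formed."""
    errors = []
    if not claim.get("member_id", "").strip():
        errors.append("member_id: missing or blank")
    if claim.get("gender") not in VALID_GENDERS:
        errors.append(f"gender: {claim.get('gender')!r} not in valid-value list")
    if not isinstance(claim.get("billed_amount"), (int, float)) or claim["billed_amount"] < 0:
        errors.append("billed_amount: must be a non-negative number")
    return errors

def cross_field_errors(claim: dict) -> list:
    """Check expected relationships between fields within one record."""
    errors = []
    if claim.get("service_date") and claim.get("paid_date"):
        if claim["paid_date"] < claim["service_date"]:
            errors.append("paid_date precedes service_date")
    if claim.get("paid_amount", 0) > claim.get("billed_amount", 0):
        errors.append("paid_amount exceeds billed_amount")
    return errors

claim = {
    "member_id": "M123", "gender": "F",
    "billed_amount": 250.0, "paid_amount": 300.0,
    "service_date": date(2018, 3, 1), "paid_date": date(2018, 2, 1),
}
print(field_level_errors(claim))  # []
print(cross_field_errors(claim))  # two cross-field violations
```

The record above passes every field-level check yet fails two cross-field checks, which is exactly why both layers are needed.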

Data profiling and historical trending are also essential tools in the science of data-quality management. As we go further down the path of conforming and translating our healthcare data, there are inferences to be made. There is provider and member matching based on algorithms, categorizations and mappings that are logic-based, and then there are the actual analytical results and insights generated from the data for application consumption.

See also: Big Data? How About Quality Data?  

Whether your downstream application is analytical, workflow, audit, outreach-based or something else, you will want to profile and perform historical trending of the final result of your load processes. There are so many data dependencies between and among fields and data sets that it is nearly impossible to anticipate them all. A small change in the relationship between, say, the place of service and the specialty of the service provider can alter your end-state results in surprising ways.
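Profiling the final result against history can be as simple as comparing today's distribution of a categorical output to a historical baseline and flagging large shifts. The sketch below is illustrative: the `flag_shifts` helper, the 5% tolerance and the place-of-service counts are all assumptions, not a prescribed method.

```python
def distribution(counts: dict) -> dict:
    """Convert raw category counts into percentage shares."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def flag_shifts(history: dict, current: dict, tolerance: float = 0.05) -> dict:
    """Return categories whose share moved by more than `tolerance`."""
    hist_pct, curr_pct = distribution(history), distribution(current)
    flagged = {}
    for cat in set(hist_pct) | set(curr_pct):
        delta = curr_pct.get(cat, 0.0) - hist_pct.get(cat, 0.0)
        if abs(delta) > tolerance:
            flagged[cat] = round(delta, 3)
    return flagged

# Place-of-service mix: office visits dropped sharply, ER rose.
history = {"office": 700, "er": 100, "inpatient": 200}
current = {"office": 550, "er": 250, "inpatient": 200}
print(flag_shifts(history, current))
```

Here the office and ER shares each moved 15 points, so both are flagged, while the unchanged inpatient share is not.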

This is the science of data-quality management. It is quite difficult to establish full coverage – nearly impossible – and that is where “art” comes into play.

If we do a good job and implement a solid framework and reporting around data quality, we immediately find that there is too much information. We are flooded with endless sets of exceptions and variations.

The imperative of all of this activity is to answer the question, “Are our results valid?” Odd as it may seem, there is some likelihood that key teams or resident SMEs will decide not to use all that exception data because it is hard to separate the relevant exceptions from the irrelevant ones. This is a more common outcome than one might think. How do we figure out which checks are the important ones?

Simple cases are easy to understand. If the system doesn’t do outbound calls, then maybe phone number validation is not very important. If there is no e-mail generation or letter generation, maybe these data components are not so critical.

In many organizations, the final quality verification is done by inspection, reviewing reports and UI screens. Inspecting the final product is not a bad thing and is prudent in most environments, but clearly, unless there is some automated validation of the overall results, such organizations are bound to learn of their data problems from their customers. This is not quite the outcome we want. The point is that many data-quality implementations are centered primarily on the data as it comes in, and less on the outcomes produced.

Back to the science. The overall intake process can be broken down into three phases: staging, model generation and insight generation. We can think of our data-quality analysis as post-processes to these three phases. Post-staging, we look at the domain (field)-level quality; post-model generation, we look at relationships, key generation, new and orphaned entities. Post-insight generation, we check our results to see if they are correct, consistent and in line with prior historical results.
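The three-phase flow with a quality gate after each phase can be sketched as follows. This is a toy pipeline under stated assumptions: the phase functions operate on simple strings rather than claim records, the insight-generation phase is omitted for brevity, and the gate names are made up for illustration.

```python
def run_pipeline(raw_records, phases):
    """Run each (name, phase_fn, check_fn) triple; stop early on a failed gate."""
    data = raw_records
    for name, phase_fn, check_fn in phases:
        data = phase_fn(data)
        ok, message = check_fn(data)
        if not ok:
            # Failing early avoids paying for downstream phases only to
            # discover bad results at the end of a long run.
            return None, f"{name} gate failed: {message}"
    return data, "all gates passed"

def stage(records):          # staging: normalize raw input
    return [r.strip().lower() for r in records]

def check_staging(records):  # post-staging: field-level quality
    blanks = sum(1 for r in records if not r)
    return blanks == 0, f"{blanks} blank records"

def model(records):          # model generation: de-duplicate entities
    return sorted(set(records))

def check_model(records):    # post-model: keys and orphans (here: non-empty)
    return len(records) > 0, "model produced no entities"

phases = [("staging", stage, check_staging), ("model", model, check_model)]
result, status = run_pipeline([" A ", "b", "a"], phases)
print(result, "|", status)  # ['a', 'b'] | all gates passed
```

The structural point is that each quality check is attached to the phase whose output it judges, rather than bolted on at the very end.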

If the ingestion process takes many hours, days or weeks, we will not want to wait until the entire process has completed to find out that results don’t look good. The cost of re-running processes is a major consideration. Missing a deadline due to the need to re-run is a major setback.

The art of data quality management is figuring out how to separate the noise from the essential information. Instead of showing all test results from all of the validations, we need to learn how to minimize the set of tests made while maximizing the chances of seeing meaningful anomalies. Just as an effective physician would not subject patients to countless tests that may or may not be relevant to a particular condition, an effective data-quality program should not present endless test results that may or may not be relevant to the critical question regarding new data introduced to the system. Is it good enough to continue, or is there a problem?

We need to construct a minimum number of views into the data that represents a critical set and is a determinant of data quality. This minimum reporting set is not static, but changes as the product changes. The key is to focus on insights, results and, generally, the outputs of your system. The critical function of your system determines the critical set.

Validation should be based on the configuration of your customer. Data that is received and processed but not actively used should not be validated along with data that is used. There is also a need for customer-specific validation in many cases. You will want controls by product and by customer. The mechanics of adding new validation checks should be easy and the framework should scale to accommodate large numbers of validations. The priority of each verification should be considered carefully. Too many critical checks and you miss clues that are buried in data. Too few and you miss clues because they don’t stand out.
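A per-customer, severity-aware check registry along these lines is one way to realize that configuration. Everything here is hypothetical: the registry shape, the `active_fields` config and the two sample checks are invented for illustration, not drawn from a real product.

```python
CHECKS = []  # registry of (name, field, severity, check_fn)

def register(name, field, severity, fn):
    """Adding a new validation is one line, so the framework scales."""
    CHECKS.append((name, field, severity, fn))

# Illustrative checks with different priorities.
register("phone_format", "phone", "low", lambda v: v is None or v.isdigit())
register("npi_present", "provider_npi", "critical", lambda v: bool(v))

def validate(record: dict, customer_config: dict) -> list:
    """Run only the checks for fields this customer actively uses."""
    active = customer_config["active_fields"]
    failures = []
    for name, field, severity, fn in CHECKS:
        if field in active and not fn(record.get(field)):
            failures.append((severity, name))
    # Critical failures sort first, so relevant clues are not buried.
    return sorted(failures, key=lambda f: f[0] != "critical")

# This customer does no outbound calls, so phone data is not validated.
config = {"active_fields": {"provider_npi"}}
print(validate({"phone": "bad!", "provider_npi": ""}, config))
# [('critical', 'npi_present')]
```

Note that the malformed phone number produces no failure at all, because the customer's configuration says that field is never used downstream.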

See also: 4 Ways to Keep Data Quality High  

Profiling your own validation data is also key. You should know, historically, how many errors of each type you typically encounter, and flag statistically significant variation just as you would when you detect variations in essential data elements and entities. Architecture matters here: you will want the ability to profile and report on anything, which implies a common, centralized mechanism rather than a different approach for each area you want to profile.
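Flagging "statistically significant variation" in your own error counts can be done with a simple deviation test over the historical counts. The sketch below is one minimal way to do it, assuming a daily load and a made-up feed of bad phone numbers; the three-standard-deviation threshold is a common rule of thumb, not a requirement.

```python
import statistics

def is_anomalous(history: list, today: int, z_threshold: float = 3.0) -> bool:
    """Flag today's error count if it deviates sharply from history."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        return today != mean
    return abs(today - mean) / stdev > z_threshold

# A feed that normally yields ~50 bad phone numbers a day suddenly
# yields 400: the spike itself is the signal, even for a "low" check.
daily_bad_phone_counts = [48, 52, 50, 49, 51, 50, 47, 53]
print(is_anomalous(daily_bad_phone_counts, 400))  # True
print(is_anomalous(daily_bad_phone_counts, 51))   # False
```

The point is that the absolute error count matters less than its departure from the historical norm for that check.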

Embedding critical validations as early in the ingestion process as possible is essential. It is often possible to provide validations that emulate downstream processing. The quality team should have incentives to pursue these types of checks on a continuing basis. They are not obvious and are never complete, but are part of any healthy data-quality initiative.

A continuous improvement program should be in place to monitor and tune the overall process. Unless the system is static, codes change, dependencies change and data inputs change. There will be challenges, and with every exposed gap found late in the process there is an opportunity to improve.

This post has glossed over a large amount of material, and I have oversimplified much to convey some of the not-so-obvious learnings of the craft. Quality is a big topic, and organizations should treat it as such. Getting true value is indeed an art, as it is easy to invest and not get the intended return. This is not a project with a beginning and an end but a continuing process. Just as with the practice of medicine, there is a lot to learn about the science of constructing the proper machinery, but there is an art to establishing policies and priorities that actually deliver.

Top 10 Lists From CES2018

CES (formerly known as the Consumer Electronics Show) has become the biggest tech event in the world. CES2018 was so massive that there could probably be 50 different Top 10 lists. Here are just a few of mine that I hope you will find interesting and useful.

First, my top 10 tech trends:

  1. 5G and AI are the top enabling technologies for the connected world of the next decade.
  2. Voice assistants are everywhere – incorporated into every smart device possible.
  3. Everyone is talking about mobility (driverless vehicles, the sharing economy, smart cities reshaping transport).
  4. The next big user interface (UI) trend will be Augmented Reality for All (AR for All).
  5. Cutting the cord is a big trend (wireless power, untethered virtual reality (VR), wireless audio, etc.).
  6. Biometrics gain steam for security (facial recognition, fingerprint, voiceprint, iris scan, etc.).
  7. Smart Cities are gaining more visibility (they even had a special agenda and exhibit focus this year).
  8. AI is not only an enabler for the next decade, it is becoming dominant today.
  9. Specialized chips and sensors abound – for LiDAR, AI, visioning, and many other applications.
  10. Smart-home tech continues to proliferate, and winning platforms and companies are starting to emerge.

Next, my 10 favorite products:

  1. BYTON vehicle: New car company with an awesome vision for a “smartphone on wheels.”
  2. Flexound: Sensation of touch added through sound waves.
  3. Bellus3D: 3D modeling of human face/head to create avatars, etc. (To see mine, click here).
  4. Aflac robotic duck: Cuddly animatronic, AI-based duck given to kids with cancer.
  5. IV-Walk: Vest that administers IV fluids, giving patients more mobility.
  6. Foldimate: Automatic clothes-folding machine.
  7. LG Display’s roll-up TV: Ultra-thin 65” OLED TV display that can be rolled up.
  8. SapientX: Movie-quality avatars for conversational AI.
  9. Monuma: Blockchain-based app to record and estimate the value of costly objects.
  10. Guardian by Elexa: Water monitoring system with leak detection and automatic shut-off capability.

And finally, 10 of the quirkiest products:

  1. Tennibot: Autonomous tennis ball collector.
  2. Robomart: World’s first self-driving store.
  3. Phrame: Smart license plate frame.
  4. 90Fun Puppy 1: Self-driving luggage (yes, it follows you around).
  5. B-Hyve: Smart yard (monitors watering systems).
  6. Milliboo: Smart couch.
  7. Kohler: Connected, talking toilet, enabled by Alexa.
  8. Somnox: Small robot that you can cuddle and sleep with.
  9. Velco: Connected handlebars.
  10. Kuri: Robot that acts like a digital pet.

See also: Collaborating for a Better Blockchain  

What, you may ask, does this have to do with insurance? It turns out that many of these are relevant for insurance in one way or another (okay, maybe not the talking toilet). But overall, these lists give a small glimpse of the era of unprecedented innovation that is sweeping the world. The things that the insurance industry insures, the way insurers communicate with prospects and policyholders, the nature of risk and how insurers improve operations all are being affected by the trends in emerging technologies, and we are only at the beginning of the digital, connected world.

CES: User Interface Is Front and Center

This year’s CES is no less mind-boggling than in prior years. With 2.7 million square feet of exhibition space, about 4,000 exhibitors, hundreds of sessions and 180,000 people, it is virtually impossible to take it all in. However, there are a few big themes that always emerge, along with a variety of interesting new products – some are potential game-changers while others are head-scratchers. But I’ll save a more in-depth analysis for another blog and concentrate here on one overarching theme from CES2018 – the prominence of the user interface (UI).

This emphasis on the UI is especially interesting because CES has historically been considered a “hardware” show, with the latest and greatest statistics touted by tech companies. Metrics related to speeds, capacities, pixels, size (some devices keep getting bigger while others keep getting smaller) and other units of measure dominate the discussions and marketing materials. But one prevalent thread throughout much of CES2018 is the dramatic expansion and innovation regarding how we interact with computers and the world around us.

See also: Rise of the Machines in Insurance  

Start with the fact that voice assistants are increasingly embedded into new solutions – Alexa, Google Assistant, Cortana, Bixby and others from prominent tech brands are leveraged in home devices, vehicles, mobile devices and many other smart things. Next, consider that haptics and gestures are becoming more advanced and being used to control more devices. New car company BYTON unveiled a car that allows interactions via five simple hand gestures (that vehicle also has a 49-inch touchscreen and integrates Amazon’s Alexa). Also, interactions based on our movements continue to be enhanced in the VR world.

Another area in which interaction is rapidly advancing is the use of biometrics. Fingerprints are already broadly used to unlock devices and to gain access to other digital assets, but we increasingly see solutions based on iris scans, facial recognition, hand geometry and other unique aspects of human physiology. We can all hope that the days of the password are numbered (YAY!).

Chatbots are emerging in many places, and people are getting used to interacting with them for sales advice, customer service and tech problems. Many still need to be infused with more AI to perform at a higher level, but there is a distinct trend toward more chatbot use. It is also likely that we will see a resurgence of avatars to give more personality to chatbots. At CES2018, I had my face scanned, and a highly accurate 3D model of my head was created in less than a minute. While the early applications of these types of 3D digital capture and creation tools are designed for virtual reality, using the tools for customer interaction is a natural extension.

See also: Cyber Threats: Big One Is Out There  

Add to this mix the amazing advances in augmented and virtual reality, the appearance of all manner of screens (every size, shape and location possible), tech that adds the sensations of touch and smell to our virtual interactions with machines, and you get a formula that engages all our senses. The digital, connected world is in its infancy and is poised to transform our daily lives. One thing is clear: the way we interact with the world around us will be based on the types of UI advancements that are so front and center at CES2018.

One final wish: Imagine a world without passwords, where TV remotes are a thing of the past, using your fingers to type on a keyboard is rare, and mice only show up in the barn. Sounds like nirvana to me.