
4 AI Payoffs in Commercial Insurance

The commercial insurance industry is in an early and exciting stage of adopting artificial intelligence. With widespread acceptance, AI is set to become a major source of value and a key driver of competitive advantage. However, there is still confusion about what AI actually means and how it will affect the bottom line.

Certainly, futuristic perceptions of AI, complete with human-like robots, have been fueled by Hollywood films. To push the dialogue into meaningful territory and truly consider how AI can benefit insurance, we must first remove the shroud of mystery that surrounds it.

What Is AI?

Simply put, AI is an intelligent computer program that strives to work much like a human does. As such, it’s usually defined by two main characteristics:

1. Ability to interact in a natural, human-like way. AI solutions (as opposed to traditional software solutions) strive to interact with users much as humans interact with one another. One new and increasingly popular mode of interaction is voice commands. A user might ask Alexa to turn on the lights in a room or instruct Siri to call “mom.” The same technology can be used to transcribe conversations with claimants and perceive the sentiment in them.

Free text is another form of interaction, where users don’t need to provide precise data to get a system to react. For instance, claim notes entered by adjusters have been traditionally difficult to process, yet they are a gold mine of insights for AI-based systems.

Images serve as another source of input. An AI system could use a picture of a car after a collision to assess the level of damage much faster and more easily.

2. Aptitude to learn. Whereas traditional software has to be highly defined and programmed, an AI solution can be given a few parameters and learn on its own. AI is still quite far from being able to think as efficiently as human beings do, but these learning machines are improving by leaps and bounds every day. Increasingly sophisticated capabilities are important because data is ever-changing and noisy, particularly in commercial insurance.

With interactive and learning faculties, AI is evolving to exhibit the “smarts” we need to improve profitability in insurance. Executives should consider how and where to deploy AI to maximize the financial upside, and they should realize that AI is fundamentally different from the typical business intelligence (BI) approaches of the past. The BI infrastructure (i.e., data warehouses and reports) that most organizations have is a great foundation, but it does not have the agility and self-learning capabilities offered in the new wave of AI-based solutions.
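To make the free-text example above concrete, here is a minimal sketch, in Python, of scoring a claim note for sentiment. The word lists are hypothetical simplifications; a production system would rely on trained language models rather than hand-built lexicons.

```python
# Minimal sketch: scoring free-text claim notes for negative sentiment.
# The lexicons below are hypothetical illustrations, not a production model;
# real systems would use trained NLP models instead.

NEGATIVE_TERMS = {"angry", "frustrated", "delay", "denied", "lawyer", "pain"}
POSITIVE_TERMS = {"satisfied", "resolved", "recovering", "improving"}

def sentiment_score(note: str) -> float:
    """Return a crude score in [-1, 1]: negative values suggest dissatisfaction."""
    words = [w.strip(".,!?") for w in note.lower().split()]
    neg = sum(w in NEGATIVE_TERMS for w in words)
    pos = sum(w in POSITIVE_TERMS for w in words)
    total = neg + pos
    return 0.0 if total == 0 else (pos - neg) / total

if __name__ == "__main__":
    note = "Claimant is frustrated by the delay and mentioned calling a lawyer."
    print(round(sentiment_score(note), 2))  # -1.0, i.e. strongly negative
```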

See also: Leveraging AI in Commercial Insurance  

Why Does It Matter?

The reason insurers are taking the time to consider AI investments is that these systems are increasingly showing the potential to dramatically improve their bottom line. There’s little doubt in a CEO’s mind that AI will redefine the competitive landscape in the years to come. Those with a strategic approach will thrive. Here are a few examples of how AI is being applied to enhance profitability:

Identifying the best doctors to provide care to injured workers. In workers’ compensation, selecting the right doctor — particularly on a complex claim like a spinal injury — can have a dramatic impact on claims costs and outcomes. AI can analyze and rank providers into tiers based on a variety of performance measures, such as claims duration, medical expenses and return-to-work results, better than ever before. A low-ranked physician can potentially drive claims costs five to 10 times higher than a high-ranked doctor. The cost differential speaks to the quality of the provider and the treatment approach used.

High-ranked doctors typically take a holistic approach to care. They consider all aspects of the injured worker’s health that affect recovery, including comorbidities such as high blood pressure or diabetes. These doctors are usually aggressive in getting injured workers the care they need to recover and return to work. Many have a long history in the occupational health setting, so they understand that being able to return an employee to work — even in a modified capacity — can play a key role in the healing process.

Low-ranked doctors, on the other hand, often inexplicably drive up the number of office visits. They may unnecessarily prescribe opioids and other drugs. In essence, AI detects a trend of over-treatment rather than a focus on getting injured workers back on the job.

The insurance industry previously relied on traditional statistical approaches to select physicians, but those models required a significant number of claims per doctor to yield a meaningful assessment. Today, AI can assess a doctor from just a few claims because it can delve deeper, drawing on unstructured data such as notes and descriptions, and it does so highly efficiently.
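As a rough illustration of the tiering idea described above, the sketch below computes a composite provider score from claims duration, medical expenses and return-to-work results and buckets providers into tiers. The field names, weights and cut-offs are hypothetical; a real system would learn them from data rather than hard-code them.

```python
# Rough sketch of ranking providers into tiers from claim-level outcomes.
# Field names, weights and tier cut-offs are hypothetical illustrations.
from statistics import mean

claims = [
    {"provider": "A", "duration_days": 30, "medical_cost": 4000, "returned_to_work": True},
    {"provider": "A", "duration_days": 45, "medical_cost": 6500, "returned_to_work": True},
    {"provider": "B", "duration_days": 120, "medical_cost": 22000, "returned_to_work": False},
]

def provider_scores(claims):
    by_provider = {}
    for c in claims:
        by_provider.setdefault(c["provider"], []).append(c)
    scores = {}
    for provider, cs in by_provider.items():
        # Lower duration/cost and a higher return-to-work rate raise the score.
        duration = mean(c["duration_days"] for c in cs)
        cost = mean(c["medical_cost"] for c in cs)
        rtw = mean(c["returned_to_work"] for c in cs)
        scores[provider] = 0.4 * rtw - 0.3 * (duration / 180) - 0.3 * (cost / 50000)
    return scores

def tier(score):
    return "high" if score > 0.2 else "mid" if score > 0.0 else "low"

for p, s in provider_scores(claims).items():
    print(p, tier(s), round(s, 2))
```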

Improving litigation outcomes. AI can also help to reduce litigation costs significantly. This is an expense that affects all lines of insurance, but litigation rates are particularly high in workers’ compensation, and in many cases litigation is unnecessary. So, attacking this trend can be a significant source of savings.

AI can identify early on which claimants are likely to seek legal representation. By applying proactive communication and careful management to these cases, insurers can often avoid litigation. When attorneys must be involved, AI can help identify the best lawyer to handle the case, much as it can determine the best physician. It makes attorney recommendations based on many factors, including the type of case, jurisdiction and judge. AI can also indicate whether it’s best to settle and, if so, for what amount.
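The workflow described above can be sketched, under assumptions, as a simple classifier trained on closed claims and applied to open ones. The feature names, training data and 0.5 threshold below are hypothetical illustrations, not any insurer’s actual model.

```python
# Hedged sketch: flagging claims at risk of attorney involvement with a simple
# classifier. Features and training data are hypothetical; the point is the
# workflow (train on closed claims, score open ones), not the model itself.
from sklearn.linear_model import LogisticRegression

# Features per claim: [days_since_injury_without_contact, prior_claims, injury_severity_1_to_5]
X_train = [[2, 0, 1], [30, 1, 4], [1, 0, 2], [45, 2, 5], [10, 0, 3], [60, 1, 4]]
y_train = [0, 1, 0, 1, 0, 1]  # 1 = claimant retained an attorney

model = LogisticRegression().fit(X_train, y_train)

open_claims = {"CLM-1001": [25, 1, 4], "CLM-1002": [3, 0, 1]}
for claim_id, features in open_claims.items():
    risk = model.predict_proba([features])[0][1]
    if risk > 0.5:  # threshold chosen purely for illustration
        print(f"{claim_id}: elevated litigation risk ({risk:.0%}), prioritize outreach")
```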

See also: 3 Steps to Demystify Artificial Intelligence  

Avoiding costly, unnecessary surgical procedures. AI can also help to identify claims that may be on a treatment track to surgery. By detecting this risk early, case managers can seek a better approach to care and possibly avoid the need for such an expensive, invasive procedure. This would yield savings, while also avoiding other potential medical complications that could be costly and, ultimately, affect quality of life for the injured worker.

Aggressively managing auto claims with the potential to explode. Other lines of insurance also have a burning need for AI. For example, with competitive pressures driving down premiums, auto insurers are experiencing tight profit margins, and AI can help hold down costs. One way is by identifying claims with a high likelihood of exploding in cost. This might be due to a number of factors, including the type of injury and vehicle damage. With this early warning, examiners can pay closer attention to these cases and manage them aggressively in the hope of keeping them on track.

As first published in Digital Insurance.

Are Insurers Ready for Voice Search?

Who do property and casualty insurance customers turn to when they need help?

In the past, answers have included insurance agents, customer helplines and company websites. Today, however, customers are increasingly likely to consult Alexa, Siri or Cortana.

As voice assistants gain popularity in homes, in cars and on smartphones, they’re also gaining traction as a marketing tool. Here, we look at the ways in which insurance companies are using voice assistants as part of their marketing and sales strategy, as well as what to expect in the near future.

How Voice Assistants Are Changing Marketing

Voice assistants commonly come in one of two forms: wireless speakers that can be placed in the home or office, or built-in tools on smartphones. iPhones and various Android devices have had them for a few years now.

In some ways, voice assistants work similarly to visual or text-based tools like smartphone apps and Google search bars. The user asks a question or enters a command, and the device responds to it. Voice assistants like Alexa even offer apps, or “skills,” that work similarly to smartphone apps — except they rely on audio rather than visuals to share information, TechCrunch’s Sarah Perez writes.

The audio-based approach changes the ways in which both search results and apps work on voice assistant devices. A text-based Google search, for instance, returns a list of links from which the user can choose. A voice-based search, however, tends to return the single response the AI thinks best fits the user’s query.

Some experts praise this option for its speed and flexibility. “Since voice flattens menus, it will make daily tasks far easier to complete,” Jelli CEO Mike Dougherty says. Yet it also puts additional pressure on marketing teams to ensure that their content gets chosen by the various search engines that inform each voice-based device, says Richard Yao, senior associate of strategy and content at IPG Media Lab.

Voice assistants haven’t just changed how search results are presented. They have also changed how users launch searches in the first place, says More Visibility’s Jill Goldstein. While text-based searches tend to focus on two or three keywords, voice-based searches use full, natural-language sentences. These often start with question words like “what,” “how” or “when.”

See also: Insurtech Starts With ‘I’ but Needs ‘We’  

These questions give marketers insight into where shoppers are in their buying journey and how best to meet their needs — but only if marketing teams are collecting and using this information, says Tyler Riddell, vice president of marketing for eSUB Construction Software.
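As a loose illustration of how a marketing team might mine those question words, the sketch below maps the opening word of a voice query to a rough buying-journey stage. The mapping is a hypothetical simplification, not a marketing-grade model.

```python
# Illustrative sketch: inferring a rough buying-journey stage from the question
# word that opens a natural-language voice query. The mapping is hypothetical.
STAGE_BY_QUESTION_WORD = {
    "what": "awareness",      # "What is umbrella insurance?"
    "how": "consideration",   # "How much does home insurance cost?"
    "which": "consideration",
    "when": "decision",       # "When can an agent call me back?"
    "where": "decision",
}

def journey_stage(query: str) -> str:
    words = query.strip().lower().split()
    return STAGE_BY_QUESTION_WORD.get(words[0], "unknown") if words else "unknown"

print(journey_stage("How much does renters insurance cost in Ohio?"))  # consideration
```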

Not only are marketing teams learning to adapt to the differences between audio and visual, but they’re also learning how to adapt to a search tool that adapts itself.

Because voice assistants use artificial intelligence and machine learning, they can adapt to changes in search terms, says Gartner analyst Ranjit Atwal. The onboard AI is designed to learn over time, gaining a better sense of how users frame their queries and the sort of information they may be looking for.

‘Alexa, Find Me Auto Insurance’: The Rising Demand for Voice Search

Based on recent sales trends, 55% of U.S. households are expected to have a smart home speaker with voice assistance enabled by the end of 2019, Dara Treseder at Adweek reports. Voice assistants are also a mainstay of many smartphones, from Apple’s Siri to Google’s voice search option triggered by saying, “OK, Google.”

Insurance customers increasingly prefer to include digital channels in their search for property and casualty insurance. With voice assistants occupying millions of smartphones and a wide range of other devices, customers are increasingly relying on these tools, as well.

Nearly half (46%) of insurance customers already use voice search tools at least once per day, according to Shane Closser at Property Casualty 360. One in four want their voice assistants to be able to give them more information on insurance agents and products. One in three want to use voice assistants to book appointments with a particular insurance agent.

Service-based companies that offer “highly complex and highly personal” services are uniquely suited to thrive in the voice search era, says Adweek’s Julia Stead. While Stead focuses on travel, finance and healthcare, her analysis applies to P&C insurers, as well, because these companies also offer services that have long been accessed via voice (phone), are tailored to the needs of each customer and often require access at odd locations or hours.

And while the conversation about tech innovation often focuses on younger users, voice assistants are increasingly popular with older insurance customers.

See also: Future of Insurance Looks Very Different  

Lauryn Chamberlain at GeoMarketing.com says that 37% of consumers age 50 and older say they use a voice assistant, often because simply speaking to a smart speaker or phone is easier than tapping, swiping or reducing a question to its key search terms. In other words, older users can think of their voice assistants as a helpful background entity rather than as a device.

In short, voice assistants are cutting across demographics. They’re entering more homes and workspaces. And insurance customers want to use them to secure coverage.

How P&C Insurers Are Incorporating Voice Into Their Marketing

Several insurance companies are already experimenting with voice assistant tools as part of their own marketing process, according to Danni Santana at Digital Insurance. For instance, Nationwide, Liberty Mutual (and its subsidiary Safeco) and Farmers have all launched Alexa skills for the Amazon Echo.

Progressive, meanwhile, joined Google Home in March 2017, the first insurance carrier to do so, according to Rachel Brown at Mobile Marketer.

Other insurance companies have experimented with different approaches. Amica Mutual Insurance, for example, launched an Alexa skill that doesn’t connect users to their individual accounts. Rather, it offers information in more than a dozen categories to help users better understand billing, discounts, storm preparation and more.
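In the same spirit as the Amica example, here is a hedged sketch of an informational Alexa skill handler that answers a general billing question without touching a customer account. It assumes the ask-sdk-core Python package; the intent name and response text are hypothetical.

```python
# Hedged sketch of an informational Alexa skill handler. Assumes the
# ask-sdk-core Python package; "BillingInfoIntent" and the speech text are
# hypothetical placeholders, not any insurer's actual skill.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_intent_name

class BillingInfoHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        # Fires when the user asks the billing question this intent models.
        return is_intent_name("BillingInfoIntent")(handler_input)

    def handle(self, handler_input):
        speech = ("Most policies can be paid monthly, quarterly or annually. "
                  "Paying annually usually avoids installment fees.")
        return handler_input.response_builder.speak(speech).response

sb = SkillBuilder()
sb.add_request_handler(BillingInfoHandler())
handler = sb.lambda_handler()  # entry point when the skill is hosted on AWS Lambda
```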

With the development of Alexa skills and similar tools, brands are thinking about how a voice assistant’s sound affects their brand development, says Jennifer Harvey, VP of branding and communications at Bynder. The choice of voice tone, pitch and speed can all send a powerful message about an insurer’s brand and culture, whether it’s reassuring, serious, cheerful or anything in between.

One of the big opportunities for insurance companies and voice assistants is access. Currently, voice assistants can take on many simple tasks but can’t always handle a transaction as complex as ensuring a customer receives the right home or auto coverage for their needs. Yet developments in AI and voice recognition indicate this may change. “Alexa is already capable of placing a complicated pizza order,” says Inbal Lavi, CEO of Webpals Group, “underscoring that voice assistants will act as more than middlemen.”

For now, however, even the digital middleman approach can benefit potential and current P&C insurance customers and the companies that serve them. “We want to enable easy access for our customers,” says Alexander Bernert, head of brand management at Zurich Insurance. “Consumers do not necessarily think of taking out disability insurance between 9 am and 5 pm, but maybe even shortly before midnight.”

It can be tough to reach an insurance agent shortly before midnight. But a voice assistant can find one, provide information and even schedule an appointment — making it easier for potential customers to turn into actual purchasers.

In a world where insurance customers already do research and contact insurers via multiple channels, voice assistants are a natural frontier for insurance marketing.

Removing Language Barriers for Insurers

One of the bigger issues at InsureTech Connect this year, I expect, is a result of advances in globalization and technology: How can providers more efficiently address multilingual needs without, say, a lot of Rosetta Stone? The challenges aren’t confined to insurers operating internationally; increasingly, linguistic diversity presents this challenge within national borders.

For example, what happens when an American customer traveling in Spain needs help urgently but is routed to the nearest call center, in Germany, and that office doesn’t have a translator?

Or perhaps the matriarch of a small family business in the Midwest, run by non-English speakers, needs to purchase a policy. How do you upsell? Google Translate alone won’t work in such a complex situation; we need technology that quickly understands tone, slang and cultural context in addition to intent.

Even Facebook is just now getting into language-barrier solutions. How we communicate with potential customers affects the way they receive the information–and that can go well, or not.

This need extends to the point people for customer acquisition: brokers. But after underwriting, it is the provider, not the broker, that does all the servicing. If brokers aren’t adopting strong translation technology, those customers are lost. (Which is why we’re seeing insurance companies provide their brokers with materials in different languages.)

Luckily, advances in automation, machine learning and artificial intelligence have been shown to cut costs, boost efficiency and improve capabilities at scale, across industries. For insurers, technology can now provide multi-language support through an automated customer service system, a capability we didn’t have just five years ago.

See also: 26 Most Important Words in Business  

In legacy automated customer service systems, the program looks for individual keywords within an interaction. Today, though, technology can absorb 100 paragraphs at a time, a whole library of text reflecting, say, a displeased customer. Agents can even see, for example, whether the customer hung up mid-call or used profanity.

Simply put, each of those 100 paragraphs is assigned a numerical representation, some as long as 100 digits, each with a label that reflects the human details that can get lost in automated service: anger, happiness or anxiety. In effect, the technology becomes sentiment-aware.

Codifying language as numeric representations speeds up processing by making it easier to find complex combinations of words, compared with the previous method of searching for keywords.

Another weakness of traditional keyword search is that figurative language can throw off the algorithms. If a customer declares, “I’m red in the face with your service!” the program will not be able to interpret the statement for what it is, especially when translating across languages.
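Here is a minimal sketch of the numeric-representation idea, assuming the sentence-transformers package: label a few example utterances, embed everything as vectors and classify new text by its nearest labeled neighbor instead of by keyword overlap. The phrases and labels are hypothetical.

```python
# Hedged sketch: classify an utterance by nearest labeled neighbor in embedding
# space rather than by keyword matching. Assumes the sentence-transformers
# package; example phrases and labels are hypothetical.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

labeled = [
    ("I am furious about how my claim was handled", "anger"),
    ("Thanks, the adjuster resolved everything quickly", "happiness"),
    ("I am worried my policy will not cover this", "anxiety"),
]
label_vectors = model.encode([text for text, _ in labeled], convert_to_tensor=True)

def classify(utterance: str) -> str:
    query = model.encode(utterance, convert_to_tensor=True)
    scores = util.cos_sim(query, label_vectors)[0]
    return labeled[int(scores.argmax())][1]

# Keyword search sees no overlap with "angry"; a decent embedding model
# typically still places this utterance near the anger example.
print(classify("I'm red in the face with your service!"))
```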

Insurers can no longer keep up with the old model of staffing a few well-versed interpreters, because consumers are opting for text-based communications, like email and messaging apps, over call centers. Translation needs have shifted to digital text, which demands an entirely new solution.

Artificial intelligence can digest a call in one language, translate it and transcribe it as a ticket, even assigning a score to each conversation. This score is important, because it determines the routing and triage process. If Company X decides to challenge itself and escalate any conversation that scores under 80%, it can do that, and it can all be automated.
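A minimal sketch of that scoring-and-routing rule follows: any conversation scoring under a configurable threshold gets escalated to a human. The ticket fields mirror the hypothetical Company X example; the scores themselves would come from the AI pipeline.

```python
# Minimal sketch of score-based routing and triage. The 0.80 threshold and the
# ticket fields are hypothetical, mirroring the "Company X" example above.
ESCALATION_THRESHOLD = 0.80

tickets = [
    {"id": "T-501", "language": "es", "score": 0.92},
    {"id": "T-502", "language": "ru", "score": 0.61},
]

def route(ticket):
    # Low scores indicate a low-confidence or unhappy conversation.
    if ticket["score"] < ESCALATION_THRESHOLD:
        return "escalate_to_human"
    return "automated_follow_up"

for t in tickets:
    print(t["id"], route(t))
```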

For the past few years, Microsoft has been at the forefront of this sector, but its technology still has drawbacks: Users must train the system over time, because the program is industry-agnostic in its application. When the same cutting-edge technology is built exclusively for the insurance industry, it becomes far more powerful. Over the last three years, we’ve seen that 90% of customer questions are the same, with only 10% unique to the system’s “brain.” That promises better accuracy and a smoother move across the pipeline.

See also: Language and Mental Health

Because this is a translation issue, it’s no surprise that we are seeing much more demand for multilingual support in the travel industry, particularly among EU carriers, which serve a diverse range of demographics. A close second is the demand we’ve seen from small business property and casualty insurers. Research has shown that Russian is an important language to accommodate, as that population is traveling more.

The value of investing in advanced translation technology is clear. Policies are sold in a process that involves around eight touchpoints: the referral from a friend, the initial phone call, the in-person visit, the medical exam and follow-up calls, to name some. Customer experience determines wins and retention. When a broker selling a life insurance policy can deftly communicate in the customer’s own language, it makes sense that customer affinity (and thus sales) would go up. Add in the ability to do all of this across platforms (messaging apps, email, chatbots, Alexa or social media), and it will make a difference in the bottom line.

CES: User Interface Is Front and Center

This year’s CES is no less mind-boggling than in prior years. With 2.7 million square feet of exhibition space, about 4,000 exhibitors, hundreds of sessions and 180,000 people, it is virtually impossible to take it all in. However, a few big themes always emerge, along with a variety of interesting new products; some are potential game-changers, while others are head-scratchers. I’ll save a more in-depth analysis for another blog and concentrate here on one overarching theme from CES2018: the prominence of the user interface (UI).

This emphasis on the UI is especially interesting because CES has historically been considered a “hardware” show, with the latest and greatest statistics touted by tech companies. Metrics related to speeds, capacities, pixels, size (some devices keep getting bigger while others keep getting smaller) and other units of measure dominate the discussions and marketing materials. But one prevalent thread throughout much of CES2018 is the dramatic expansion and innovation regarding how we interact with computers and the world around us.

See also: Rise of the Machines in Insurance  

Start with the fact that voice assistants are increasingly embedded into new solutions: Alexa, Google Assistant, Cortana, Bixby and others from prominent tech brands are leveraged in home devices, vehicles, mobile devices and many other smart things. Next, consider that haptics and gestures are becoming more advanced and are being used to control more devices. New car company BYTON unveiled a car that allows interaction via five simple hand gestures (that vehicle also has a 49-inch touchscreen and integrates Amazon’s Alexa). Also, interactions based on our movements continue to be enhanced in the VR world.

Another area in which interaction is rapidly advancing is the use of biometrics. Fingerprints are already broadly used to unlock devices and to gain access to other digital assets, but we increasingly see solutions based on iris scans, facial recognition, hand geometry and other unique aspects of human physiology. We can all hope that the days of the password are numbered (YAY!).

Chatbots are emerging in many places, and people are getting used to interacting with them for sales advice, customer service and tech problems. Many still need to be infused with more AI to perform at a higher level, but there is a distinct trend toward more chatbot use. It is also likely that we will see a resurgence of avatars to give more personality to chatbots. At CES2018, I had my face scanned, and a highly accurate 3D model of my head was created in less than a minute. While the early applications of these types of 3D digital capture and creation tools are designed for virtual reality, using them for customer interaction is a natural extension.

See also: Cyber Threats: Big One Is Out There  

Add to this mix the amazing advances in augmented and virtual reality, the appearance of all manner of screens (every size, shape and location possible) and tech that adds the sensations of touch and smell to our virtual interactions with machines, and you get a formula that engages all our senses. The digital, connected world is in its infancy and is poised to transform our daily lives. One thing is clear: the way we interact with the world around us will be based on the types of UI advancements that are so front and center at CES2018.

One final wish: Imagine a world without passwords, where TV remotes are a thing of the past, using your fingers to type on a keyboard is rare, and mice only show up in the barn. Sounds like nirvana to me.