Smart Home Devices: The Security Risks

Smart devices have become a popular topic in the P&C insurance world. Tools like smart thermostats, smoke detectors and water sensors offer the potential to halt property damage before it starts, protecting insurance customers from injury, property loss or both. Yet these devices come with risks.

Smart devices often represent the most vulnerable point on any given network, exposing customers and insurers alike to potential risks. Insurance companies that understand these risks are better poised to protect both customers and themselves.

The Rising Trend of Smart Device Use

Smart home devices were a wildly popular gift during the 2018 holiday season. Amazon broke records for sales of its Echo and Alexa devices, Voicebot’s Bret Kinsella says. Sales of smart sensors, security systems, wearable devices and smart toys were also strong.

Currently, the most common smart devices in private homes are televisions and digital set-top boxes, says Gartner research director Peter Middleton. Tools like smart electric meters and security cameras, initially more common among businesses, are now gaining ground with homeowners.

As more people use smart devices, insuring these devices becomes more important. Even Amazon has announced an interest in offering homeowners insurance to complement its smart devices like Alexa speakers and Ring Alarm systems, says Julie Jacobson at CEPro.

Growing Security Concerns for the Internet of Things

As reports of data theft, hacking and other malfeasance reach the news, concerns about security and privacy in the smart device realm grow. For instance, a distributed denial of service (DDoS) attack in 2016 incapacitated websites for internet users across the East Coast of the U.S. The attack was launched from an army of smart devices conscripted by malware, says Lisa R. Lifshitz, who works in internet law and cybersecurity. Many of the devices' owners didn't even know they were involved.

These events have raised concerns about device security among both government regulators and private device owners. Insurers seeking to offer smart devices to customers can play a role, as well.

See also: Smart Home = Smart Insurer!  

Laws and Regulations Address Smart Device Security

Most laws and regulations to address smart device security are still in their infancy. Although the U.K. introduced guidelines for improving IoT security in 2018, the guidelines remain voluntary. This means that not all manufacturers will adhere to them, says Rory Cellan-Jones, a technology correspondent for the BBC.

In September 2018, California became the first U.S. state to pass a law addressing smart device security. The bill sets minimum security requirements for smart device manufacturers selling their devices in California. It takes effect Jan. 1, 2020.

Rather than listing specific requirements, the California law sets a standard for determining whether security is reasonable. For instance, the security features must be appropriate to the device’s nature and function. They must also be designed to protect the device and its information from unauthorized access, modification or other forms of tampering, say Jennifer R. Martin and Kyle Kessler at Orrick.

Customer Interest in Security Has Increased

As smart devices become more popular, so does demand for stronger security and privacy regulation. A 2018 study by Market Strategies International found that people who use smart devices at home or at work are twice as likely to believe that governments should regulate the devices.

“We believe that these workers have already seen the massive potential of the IoT and recognize that the risks – data security, privacy and environmental – are very real,” explains Erin Leedy, a senior vice president at Market Strategies. With a sense of both the potential and the risks, smart device users become more interested in stronger regulations to protect privacy.

A 2017 study by digital platform security firm Irdeto polled 7,882 smart device users in six countries. Researchers found that 90% of those polled believe smart devices need built-in security. Yet respondents also said they, too, had a role to play in keeping themselves secure: 56% said that users and manufacturers share responsibility for preventing their devices from being hacked, security director Mark Hearn says.

Consumers understand that their devices can pose risks, and they’re willing to join the fight to protect their privacy and data security. Insurance companies can help them do so by providing the information they need to make smart decisions with smart devices.

Who Controls Your Customers’ Devices?

When today’s smart home devices were designed, the main goal was to simplify tasks and make life more efficient. Security took a backseat to functionality, Fortinet’s Steve Mulhearn says. To function well, smart home devices must integrate seamlessly with other devices — meaning they’re often the weakest security point on a network.

Hackers have noticed these weaknesses and are taking advantage of them. In August 2018, the Federal Bureau of Investigation issued a public service announcement warning that IoT devices could be hacked and conscripted into malicious or illegal online activity.

“Everything from routers and NAS devices to DVRs, Raspberry Pis and even smart garage door openers could be at risk,” says Phil Muncaster at Infosecurity Magazine. While some devices are at higher risk than others, no smart device is totally safe from attempts to use it for ills like click fraud, spam emailing and botnet attacks.

Helping Customers Understand and Address Smart Device Risks

Most smart device users want to play a role in preventing privacy and security breaches. Yet, they don’t always know how to participate effectively.

Helpnet Security managing editor Zeljka Zorz recommends that homeowners adopt smart devices only after asking and answering two questions:

  • Will the device improve the quality of my life/fill a need I have?
  • Am I satisfied with the level of security and privacy the manufacturer provides users?

Insurers seeking to incorporate smart devices into their business and their customers’ lives can help by providing answers to both questions.

As Steve Touhill explains on the Resonate blog, demonstrating the usefulness of smart devices can help insurers attract new customers. Smart device owners are 42% more likely to change insurance companies in the coming year. They’re also more open to embracing insurers that offer smart device discounts or support.

Insurers can help customers protect themselves by providing information on privacy and security issues. Options include comparisons of security features across devices, guidance on changing default usernames and passwords, how-to guides for installing regular updates and checklists for spotting signs of cyber tampering.

When presented as best practices for using smart home devices, these steps can help homeowners and insurers address security risks without raising undue alarm.
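As one concrete illustration of the kind of check such a how-to guide might describe, here is a minimal TypeScript (Node.js) sketch that tests whether devices on a home network still expose Telnet, a service that often ships enabled with factory-default credentials. The IP addresses are placeholders, and the script is illustrative only, not a substitute for a vendor's own security tooling.

    import { Socket } from "net";

    // Attempt a TCP connection to port 23 (Telnet). An open Telnet port on a
    // consumer device is a common sign it is still running with default,
    // easily guessed credentials.
    function checkTelnetOpen(host: string, timeoutMs = 2000): Promise<boolean> {
      return new Promise((resolve) => {
        const socket = new Socket();
        socket.setTimeout(timeoutMs);
        socket.once("connect", () => { socket.destroy(); resolve(true); });
        socket.once("timeout", () => { socket.destroy(); resolve(false); });
        socket.once("error", () => resolve(false));
        socket.connect(23, host);
      });
    }

    // Placeholder addresses; in practice these would come from the customer's
    // own router or device list.
    const hosts = ["192.168.1.10", "192.168.1.20"];

    Promise.all(hosts.map(async (h) => ({ host: h, telnetOpen: await checkTelnetOpen(h) })))
      .then((results) => console.table(results));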

Property and casualty insurers that encourage smart device use play an important role in influencing how customers use their devices. While this relationship can be beneficial for both insurers and customers, insurers that enter it face further privacy and security complications.

Protecting Customer Privacy

Insurance companies will need to consider how to protect customer privacy while still gathering relevant data from smart home devices.

This is because smart devices offer the potential to provide more data to insurance companies, changing everything from policy recommendations to underwriting accuracy, Mobiquity’s Sydney Fenkell says.

See also: How Smart Is a ‘Smart’ Home, Really?

Gathering this data requires insurance companies to be smart about protecting the privacy of customers and the security of the information received.
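One widely used safeguard is to pseudonymize device identifiers before telemetry is stored or analyzed, so that a leaked record cannot easily be tied back to a household. The TypeScript sketch below illustrates the idea with a hypothetical telemetry record and a keyed hash; the field names and key handling are assumptions for illustration, not any particular insurer's implementation.

    import { createHmac } from "crypto";

    // Hypothetical shape of a smart-home telemetry record an insurer might receive.
    interface TelemetryRecord {
      deviceId: string;   // hardware identifier (e.g., serial number)
      sensor: string;     // "water_leak", "smoke", ...
      value: number;
      recordedAt: string; // ISO 8601 timestamp
    }

    // Pseudonymize the device identifier with a keyed hash (HMAC-SHA-256) so the
    // stored record cannot be tied back to a household without the secret key.
    function pseudonymize(record: TelemetryRecord, secretKey: string): TelemetryRecord {
      const hashedId = createHmac("sha256", secretKey)
        .update(record.deviceId)
        .digest("hex");
      return { ...record, deviceId: hashedId };
    }

    // Example: the raw identifier never reaches long-term storage.
    const raw: TelemetryRecord = {
      deviceId: "SN-00042-A",
      sensor: "water_leak",
      value: 1,
      recordedAt: new Date().toISOString(),
    };
    console.log(pseudonymize(raw, process.env.TELEMETRY_HMAC_KEY ?? "dev-only-key"));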

“It is not a matter of if but when these systems will be compromised, and the consequences could be much more severe than lost Social Security numbers,” says Dimitri Stiliadis, chief technology officer at Aporeto.

Moreover, P&C insurers will need to protect their own internal networks, since communication with these devices can introduce a weak point.

Being Smart About Smart Device Data Use

The use of smart device data was recently brought to light by an announcement from the insurer John Hancock, which made public its intention to incorporate information from fitness wearables like Garmin and Fitbit devices into life insurance premium calculations.

This raised a number of concerns among customers, says Chris Boyd, a Malwarebytes senior threat researcher who goes by the pseudonym paperghost. Boyd notes that these devices often have weak security, which means a user's personal data could be altered, affecting insurance premiums.

Similar concerns arise for users seeking to link smart devices with their auto, homeowners or renters insurance. A hacked or malfunctioning device that reports loss events that never occurred, or fails to report ones that did, could affect customers' insurance rates unless a human steps in to verify what actually happened.

For insurers, one of the best early principles to adopt may be one of transparency, says Chris Middleton at Internet of Business. When consumers know what information their smart home device collects and transmits, and under what security protocols or safeguards, they are better equipped to understand and use the device in a way that benefits both their interests and those of their insurer.
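In practice, that transparency could be as simple as publishing a plain-language data-collection disclosure alongside each supported device. The sketch below is a hypothetical example of such a disclosure expressed as a TypeScript constant; every field and value is illustrative, not a description of any real program.

    // Hypothetical disclosure for an illustrative device program; all values are examples.
    const dataCollectionDisclosure = {
      device: "water-leak sensor (example)",
      dataCollected: ["leak events", "battery level", "firmware version"],
      dataNotCollected: ["audio", "video", "location history"],
      transmission: "encrypted in transit (TLS) to the insurer's ingestion endpoint",
      retention: "retained 36 months, then deleted",
      sharedWith: ["claims processing only; no third-party advertisers"],
    } as const;

    console.log(JSON.stringify(dataCollectionDisclosure, null, 2));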

Do Health Apps Threaten Privacy?

The growing use of smartphone apps and wearable devices to generate personal health and lifestyle data poses a dilemma for privacy. While individuals have much to gain using apps to help them manage health concerns, the privacy of the data itself may be at risk.

Consumer-grade devices that connect over the internet are particularly vulnerable to hacking. The level of security consumers will tolerate falls short of what enterprise networks enforce. The portability of wearables and smart devices, carelessness with passwords and a lack of encryption mean confidential data is at far greater risk of being stolen.

See also: 5 Apps That May Transform Healthcare  

Apps use an application programming interface (API) to access sensors in the devices themselves — GPS, messages, even the camera — and to collect data. Many apps combine data to draw conclusions (accurate or otherwise) about the user's health. Some insurers are already using activity data from fitness trackers to enhance products. It seems likely the trend will continue as apps become more sophisticated and hardware develops broader appeal.
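As a small example of sensor access through a standard API, the browser TypeScript sketch below reads a single GPS position via the Geolocation API. It is a minimal illustration of how routinely an app can obtain data users may consider sensitive, assuming the user grants the permission prompt.

    // Browser-only sketch: read one GPS position through the standard Geolocation API.
    function readPosition(): void {
      if (!("geolocation" in navigator)) {
        console.log("Geolocation is not available on this device.");
        return;
      }
      navigator.geolocation.getCurrentPosition(
        (pos: GeolocationPosition) => {
          // A single successful read already yields data many users consider sensitive.
          console.log(`lat=${pos.coords.latitude}, lon=${pos.coords.longitude}`);
        },
        (err: GeolocationPositionError) => {
          console.log(`Permission denied or error: ${err.message}`);
        },
        { enableHighAccuracy: false, timeout: 10_000 }
      );
    }

    readPosition();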

U.S. federal and state laws require published policies concerning the use, disclosure and safeguarding of personal data by mobile apps, and health data are subject to special restrictions. EU data protection directives and national laws restrict the sale and disclosure of all personal data collected by apps and impose further requirements for health data, such as explicit consent. Apps must therefore comply with all applicable legal requirements for processing personal data, with consent obligations that vary in specificity and explicitness depending on the type of data and how it is used or disclosed.

It may not occur to most users of a fitness app that their personal data will be disclosed to the device manufacturer, which may sell it to third-party advertisers or share it with data aggregators. Users do not always read apps' terms and conditions, and developers may be based beyond national legal boundaries. The relatively short life cycle of many apps also means personal data may end up lost as apps become defunct.

A survey by the Global Privacy Enforcement Network found that, in 85% of the 1,200 apps reviewed, the owners failed to clearly explain how they were collecting, using and disclosing personal information. IMEI (unique serial) numbers of smartphones make identification of individuals simple, and many app users mistakenly believe their information stays private.

See also: Wearable Tech Raises Privacy Concerns  

I have previously written about how wearables and apps that use smartphones as a hub can play an important role in life and health insurance (see my slideshare: The Growing Impact of Wearables on Digital Health and Insurance). Research in the U.K. shows half the population now monitors their health problems this way, and 95% of doctors see more patients bringing their own data to appointments. The trend is expected to continue, with more than 140 million wearables forecast to be sold in 2020, up from around 70 million in 2014.

Underwriters and claims assessors will process increasing levels of digital health data in their day-to-day work. However, if patients cannot trust that the health data they store in apps is private, they may resist calls from clinicians to use them. It's important to address concerns over data privacy and failures to protect individuals' sensitive information, so that patients' resistance does not stall this innovation.

© Reproduced with the permission of General Reinsurance AG, 2017.

Smart Things and the Customer Experience

The inanimate world around us is coming alive, powered by smart things and AI. It is difficult to name an object for which there is not a smart version.

Garage doors, thermostats, doorbells, appliances? Check.

Shoes, belts, hats, shirts? Check.

Cars, trucks, boats, drones? Check.

Just about anything you can imagine, and some bizarre things that would probably never cross your mind, have smart versions that connect to the internet and can be controlled by mobile apps or even take action on their own. The potential is great, and the implications for insurance are many. But one thing about smart things that has a mixed record so far is how humans communicate with them. In some cases, the customer experience is well-thought-out and will contribute to adoption. In other cases, the experience is downright awful.

Without naming specific companies, here are a few examples of good and bad experiences with smart things.

  • Smart TVs:
    I am starting here because some of these are terribly frustrating. Many require interaction via remote control devices, pop-up keyboards on the TV screen and the down-down-over-over-over maneuvering on the keyboard for EACH LETTER. It reminds me of the early texting days with triple taps.
  • Smart tags:
    Small devices that attach to keys, slide into wallets or get packed into suitcases are widely available. I’ve tried many of these devices and have discovered that some are simple, fast and easy to install and use, while others are a nightmare. One device I ordered was extremely hard just to get out of the package! Another one required you to slide it open to install a battery, but I almost gave up trying to pry it open. Alternatively, I have some that I use that took me less than a minute to set up, and they just work.
  • Telematics devices:
    There seems to be a migration away from dongles, which is a good thing. In some cars, you have to be a contortionist to get your body into position to plug the dongle into the OBD port. Mobile-app-based telematics are easier to set up, and the user interfaces are usually modern.
  • Wearables:
    I’ve had three different fitness wearables. Generally, the experience is good, although sometimes the data entry to set up a profile and do regular logging gets tedious.
  • Vehicle information/entertainment systems:
    The ability to initiate a phone call or change the radio station with a voice command is great – when it works. There are some commands that are just never interpreted correctly, or never interpreted at all.

See also: How to Make Smart Devices More Secure  

I could continue with examples of smart home devices, virtual reality/augmented reality headsets and glasses and other smart objects. Many of you can relate from your own experiences: some are slick, easy and fun – and others tedious and frustrating. There are several lessons here that insurers should keep in mind in any venture where they are providing or leveraging smart devices to policyholders.

  • Recognize that customer experience goes beyond the mobile app.
    Ordering, shipping, opening the box and reading the initial instruction booklet are all part of the experience. Some insurers discovered how important this can be after sending out dongles for telematics devices.
  • Make sure it works!
    I have returned more than one smart item, including a bathroom scale that was supposed to sync with a fitness wearable and never worked, even after several calls to tech support. It is the ultimate poor customer experience when something does not work as advertised.
  • Resist the urge to collect too much information.
    Especially during setup, collect only what is minimally required to get the device going, not extra information you want for marketing and other purposes. When people buy a smart device, they are eager to get it up and running.
  • Ensure that tech support is accessible.
    “Fill out this form, and we will contact you within the next 48 hours” is not a good way to go. Most people are excited about their new device and don’t want to wait this long for a response. At the very least, provide a live chat session.

See also: ‘It’s the Customer Experience, Stupid’  

The connected world of smart things is exciting and offers many possible ways to enrich our daily lives, improve business operations and make the world safer. The functionality of a smart device is very important. But don’t forget that the customer experience will play a large role in the adoption of smart things.

How to Make Smart Devices More Secure

Smart-television maker Vizio agreed to pay a penalty this month for spying on 11 million customers. According to the Federal Trade Commission, the company captured second-by-second information on what customers viewed, combined it with their gender, age and income and sold it to third parties.

How much was the fine for Vizio, which has sales in excess of $3 billion? It was $2.2 million — barely a slap on the wrist.

These kinds of privacy breaches are increasingly common as billions of devices now become part of the “Internet of Things” (I.o.T.). Whether it be our TV sets, cars, bathroom scales, children’s toys or medical devices, we are already surrounded by everyday objects equipped with sensors and computers. And the companies that make them can get away with being careless with consumer security — and with stealing customer data.

Vizio has been accused of exposing its customers to hackers before. In November 2015, security researchers at Avast demonstrated how easily hackers could gain complete access to the Wi-Fi networks Vizio's TVs were connected to, and showed that the company recorded customer data even when customers explicitly opted out of its terms of service.

See also: ‘Smart’ Is Everywhere, but…  

On Black Friday in 2015, hackers broke into the servers of Chinese toymaker VTech and lifted personal information on nearly five million parents and more than six million children. The data haul included home addresses, names, birth dates, email addresses and passwords. Worse still, it included photographs and chat logs between parents and their children. VTech paid no fine and changed its terms of service to require that customers acknowledge their private data “may be intercepted or later acquired by unauthorized parties.”

Regulations and consumer protections are desperately needed.

One option would be to hold the manufacturers strictly liable for these hacks, to financially motivate them to improve product security. In the same way that seat belt manufacturers are responsible for the safety of their products, I.o.T. device makers would be presumed to be liable unless they could prove that they had taken all reasonable precautions. The penalties could be high enough to put a company out of business.

But this would be inequitable. One of the factors enabling such hacking is that users don’t use sufficiently complex passwords and thus leave the front door unlocked. It could also stifle innovation, with the big players avoiding the possibility of extreme penalties by becoming averse to innovations, and small players avoiding entering the market because they lack the resources to handle possible litigation.

Duke School of Law researcher Jeremy Muhlfelder says that copyright law has a history of Supreme Court cases that have ruled on this exact principle of not wanting to curb the “next big thing” by holding innovators liable for their innovations. Innovators themselves wouldn't, and shouldn't, be liable for how carelessly their innovations are incorporated into new products. Imposing strict liability on manufacturers, then, might not be legally realistic either, because it would indirectly cancel the rewards of innovation.

A more reasonable solution may be along the lines of what attorney Matt Scherer recommends in a paper on regulating artificial intelligence systems published in the Harvard Journal of Law and Technology: impose strict liability, but with the potential for pre-certification that removes the liability. I.o.T. devices would be deemed inherently dangerous, and thus the producer would be strictly liable for faults unless an independent agency certifies the devices as secure. This would be similar to the UL certification provided by Underwriters Laboratories, a government-approved company that carries out testing and certification to ensure products meet safety specifications.

See also: Why 2017 Is the Year of the Bot  

Equipment certification is also one of the recommendations that former Federal Communications Commission chairman Tom Wheeler made in a letter to Sen. Mark R. Warner (D-Va.) regarding the government’s response to the October 2016 attack on the internet. He proposed a public–private partnership that creates a set of best practices for securing devices, the certification or self-certification of products, and labeling requirements to make consumers aware of the risks. Wheeler proposed “market-based incentives and appropriate regulatory oversight where the market does not, or cannot, do the job effectively.”

As Wheeler also noted, addressing I.o.T. threats is a national imperative and must not be stalled by the transition to a new president. This is beyond politics. It is a matter of national security and consumer safety.