
Big Data in Insurance: A Glimpse Into 2015

Bernard Marr is one of the big voices to pay attention to on the subject of big data. His recent piece “Big Data: The Predictions for 2015” is bold and thought-provoking. As a P&C actuary, I tend to look at everything through my insurance-colored glasses. So, of course, I immediately started thinking about the impact on insurance if Marr’s predictions come to pass this year.

As I share my thoughts below, be aware that the section headers are taken from his article; the rest of the content consists of my thoughts on, and interpretations of, the impact on the insurance industry.

The value of the big data economy will reach $125 billion

That’s a really big number, Mr. Marr. I think I know how to answer my son the next time he comes to me looking for advice on a college major.

But what does this huge number mean for insurance? There’s a potential time bomb here for commercial lines because this $125 billion means we’re going to see new commerce (and new risks) that are not currently reflected in loss history – and therefore not reflected in rates.

Maybe premiums will go up as exposures increase with the new commerce – but that raises a new question: What’s the right exposure base for aggregating and analyzing big data? Is it revenue? Data observation count? Megaflops? We don’t know the answer to this yet. Unfortunately, it’s not until we start seeing losses that we’ll know for sure.

The Internet of Things will go mainstream

We already have some limited integration of “the Internet of Things” into our insurance world. Witness UBI (usage-based insurance), which can tie auto insurance premiums not only to miles driven, but also to driving quality.

Google’s Nest thermostat keeps track of when you’re home and away, whether you’re heating or cooling, and communicates this information back to a data store. Could that data be used in more accurate pricing of homeowners insurance? If so, it would be like UBI for the house.

The Internet of Things can extend to healthcare and medical insurance, as well. We already have health plans offering a discount for attending the gym 12 times a month. We all have “a friend” who sometimes checks in at the gym to meet the quota and get the discount. With the proliferation of wearable biometric devices (FitBit, Nike Fuel and so on), it would be trivial for the carrier to offer a UBI-style discount based on the quantity and quality of the workout. Of course, the insurer would need the policyholder’s permission to use that data, but, if the discount is big enough, we’ll buy it.
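
As a rough sketch of how such a wearable-based discount might be computed: the data fields, heart-rate threshold and discount tiers below are all hypothetical assumptions for illustration, not any carrier’s actual rating rule.

```python
# Hypothetical wellness discount computed from wearable data. The data
# fields, heart-rate threshold and discount tiers are illustrative
# assumptions, not any carrier's actual rating rule.

def wellness_discount(workouts: list) -> float:
    """Return a premium discount factor for one month of wearable data.

    Each workout record is assumed to carry a duration in minutes and an
    average heart rate -- a crude proxy for workout quality.
    """
    # Count "quality" workouts: at least 30 minutes at an elevated heart rate.
    quality = [w for w in workouts
               if w["minutes"] >= 30 and w["avg_heart_rate"] >= 120]

    if len(quality) >= 12:   # meets the familiar 12-visits-a-month quota
        return 0.10          # 10% discount
    if len(quality) >= 8:
        return 0.05          # partial credit for a partial effort
    return 0.0

# Example: a dozen genuine half-hour sessions earns the full discount.
month = [{"minutes": 35, "avg_heart_rate": 130} for _ in range(12)]
print(wellness_discount(month))  # 0.1
```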

Machines will get better at making decisions

As I talk with carriers about predictive analytics, this concept is one of the most disruptive to underwriters and actuaries. There is a fundamental worry that the model is going to replace them.

Machines are getting better at making decisions, but within most of insurance, and certainly within commercial lines, the machines should be seen as an enabling technology that helps the underwriter to make better decisions, or the actuary to make more accurate rates. Expert systems can do well on risks that fit neatly into a standard underwriting box, but anything outside of that box is going to need some human intervention.

Textual analysis will become more widely used

A recurring theme I hear in talking to carriers is a desire to do claims analysis, fraud detection or claims triage using analysis of text in the claims adjusters’ files. There are early adopters in the industry doing this, and there have emerged several consultants and vendors offering bespoke solutions. I think that 2015 could be the year that we see some standardized, off-the-shelf solutions emerge that offer predictive analytics using textual analysis.
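
To make the idea concrete, here is a minimal sketch of the core of such a solution using off-the-shelf Python tooling. The adjuster notes and triage labels are invented for illustration; a production system would need far more data and feature engineering than this.

```python
# Minimal claims-triage sketch: TF-IDF text features plus logistic
# regression. The adjuster notes and labels below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "claimant reports minor strain, returned to work next day",
    "attorney retained, disputed injury, no witnesses on site",
    "slip on wet floor, treated and released, employer confirms",
    "prior claims history, inconsistent statements about incident",
]
high_risk = [0, 1, 0, 1]  # hypothetical adjuster-assigned triage labels

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                      LogisticRegression())
model.fit(notes, high_risk)

# Score a new adjuster note for triage or fraud-referral review.
new_note = ["claimant retained attorney, story has changed twice"]
print(model.predict_proba(new_note)[0, 1])  # estimated high-risk probability
```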

Data visualization tools will dominate the market

This is spot-on in insurance, too. Data visualization and exploration tools are emerging quickly in the insurance space. The lines between “reporting tool” and “data analysis tool” are blurring. Companies are realizing that they can combine key performance indicators (KPIs) and metrics from multiple data streams into single dashboard views. This leads to insights that were never before possible using single-dimension, standard reporting.

There is so much data present in so many dimensions that it no longer makes sense to look at a fixed set of static exhibits when managing insurance operations. Good performance metrics don’t necessarily lead to answers, but instead to better questions – and answering these new questions demands a dynamic data visualization environment.

Matt Mosher, senior vice president of rating services at A.M. Best, will be talking to this point in March at the Valen Analytics Summit and exploring how companies embracing analytics are finding ways to leverage their data-driven approach across the entire enterprise. This ultimately leads to significant benefits for these firms, both in portfolio profitability and in overall financial strength.

There will be a big scare over privacy

Here we are back in the realm of new risks again. P&C underwriters have long been aware of “cyber” risks and control these through specialized forms and policy exclusions.

With big data, however, comes new levels of risk. What happens, for example, when the insurance company knows something about the policyholder that the policyholder hasn’t revealed? (As a thought experiment, imagine what Google knows of your political affiliations or marital status, even though you’ve probably never formally given Google this information.) If the insurance company uses that information in underwriting or pricing, does this raise privacy issues?

Companies and organizations will struggle to find data talent

If this is a huge issue for big data, in general, then it’s a really, really big deal for insurance.

I can understand that college freshmen aren’t necessarily dreaming of a career as a “data analyst” when they graduate. So now put “insurance data analyst” up as a career choice, and we’re even lower on the list. If we’re going to attract the right data talent in the coming decade, the insurance industry has to do something to make this stuff look sexy, starting right now.

Big data will provide the key to the mysteries of the universe

Now, it seems, Mr. Marr has the upper hand. For the life of me, I can’t figure out how to spin prognostication about the Large Hadron Collider into an insurance angle. Well played.

Those of us in the insurance industry have long joked that this industry is one of the last to adopt new methods and technology. I feel we’ve continued the trend with big data and predictive analytics – at least, we certainly weren’t the first to the party. However, there was a tremendous amount of movement in 2013, and again in 2014. Insurance is ready for big data. And just in time, because I agree with Mr. Marr – 2015 is going to be a big year.

Grave Threat to the Electric Grid and the Internet

On July 23, 2012, a plasma cloud of gigantic proportions raced toward Earth at nearly 2,000 miles per second, about four times the normal speed for a solar eruption. Had the gigantic cloud hit the Earth – as a similar eruption did in the mid-1800s – it would have devastated much of the world’s power grids and many satellites, taking down much of our telecommunications systems and GPS. The damage would have run into the trillions of dollars and would have taken months or even years to reverse.

The good news: While the cloud passed through the Earth’s orbital path and hit NASA’s STEREO-A satellite, it missed us.

The bad news: Not by much. If the storm had happened just nine days earlier, the Earth would have been directly in its path.

The worse news: We are overdue for a direct hit. Such huge storms are expected to hit the Earth every 150 years, and the last one struck 155 years ago.

“Had [the 2012 magnetic storm] hit Earth, it probably would have been like the big one in 1859, but the effect today, with our modern technologies, would have been tremendous,” UC Berkeley research physicist Janet G. Luhmann said in March, when researchers from UC Berkeley and China reported their analysis.

Daniel Baker, of the Laboratory for Atmospheric and Space Physics at the University of Colorado, said that, if the storm had hit, “we would still be picking up the pieces.”

A severe space weather event that causes major disruption to the U.S. electricity network and to communications would have obvious implications for the insurance industry. If businesses, public services and households are without power and, thus, many forms of communication, for a sustained period, insurers may be exposed to unprecedented claims for business interruption and many other issues.

So, every insurer, broker and business that relies on the power and telecommunications grids – in other words, almost everybody these days – should consider space weather scenarios and start planning for them.

We have some historical record of the devastation that would occur: For five days beginning on Aug. 28, 1859, astronomers observed an enormous outbreak of sunspots on the sun’s surface, and on Sept. 1, Richard Carrington and Richard Hodgson independently witnessed a massive solar flare. The activity produced two major coronal mass ejections, or CMEs, which typically unleash energies equal to that of about a billion hydrogen bombs, according to scientists. The second CME was so much more powerful than the norm that it traveled to Earth in 17.6 hours, while such journeys usually take three to four days.

The huge storm – now known as a Carrington Event – caused major disruptions to the nascent telegraph system, and the effects continued into September. The New York Times reported on Sept. 5 that a telegraph operator by the name of Frederick Royce said, “My forehead grazed a ground-wire which runs down the wall near the sounder. Immediately, I received a very severe electric shock, which stunned me for an instant. An old man who was sitting facing me, and but a few feet distant, said that he saw a spark of fire jump from my forehead to the sounder.”

As late as Sept. 19, the Gettysburg Compiler reported that a telegraph operator “chanced to touch the wire and was thrown across the room by the violence of the shock he received.”

Telegraph lines melted or suffered spotty service. “The French telegraph communications at Paris were greatly affected, and on interrupting the circuit of the conducting wire strong sparks were observed. The same thing occurred at the same time at all the telegraphic stations in France,” the Illustrated London News reported.

The Philadelphia North American & United States Gazette noted: “The telegraph operators throughout the East report a very brilliant display of auroral light, which though very fine to look at, has as usual greatly hindered the transmission of messages over the wires.”

If such a storm were to happen today, a study by the National Academy of Sciences calculates, the total economic impact could top $2 trillion. That would be 20 times the cost of Hurricane Katrina. Huge power transformers, fried by a Carrington Event, would take years to repair. Lloyd’s of London put the figure at as much as $2.6 trillion in a 2013 document called “Solar Storm Risk to the North American Electric Grid.”

The effect of a Carrington Event would be so much greater today than it was in 1859 because we are so much more connected, supported by the electrical grid and satellites. The schematic of the U.S., below, highlights how New England, the Middle Atlantic, the Upper Midwest and the Northwest would feel significant fallout from a direct solar blast. The outlined sections indicate areas of probable power system collapses. Those areas are home to more than 130 million people.

[Schematic of the U.S.: outlined areas indicate probable power system collapse]

Our various infrastructures – e.g., electric power, transportation, water and banking – are intimately connected. That means that a serious impact on our power grid would unleash a domino effect on the remaining systems. This, in turn, would have a large effect on each of our daily activities.

The Lloyd’s of London study in 2013 concluded that:

• A Carrington-level geomagnetic storm is practically inevitable. While the odds of an extreme storm at any one time are relatively low, it’s virtually certain one will occur eventually. Historical auroral records suggest a return period of 50 years for severe storms and 150 years for very extreme storms, like the Carrington Event.
• The risk of intense geomagnetic storms is greater when we approach the peak of a solar cycle. Solar activity follows an 11-year cycle. For the current cycle, the geomagnetic storm risk is projected to peak in early 2015.
• As the North American power infrastructure ages, the risk of a catastrophic outage grows with each solar cycle peak. The potential exists for long-term, widespread power outages.
• Weighted by population, the highest risk of storm-induced power outages in the U.S. is along the Atlantic corridor between Washington and New York City. This takes into account risk factors such as magnetic latitude, distance to the coast, ground conductivity and transmission grid properties. Other high-risk regions are the Midwest states, such as Michigan and Wisconsin, and regions along the Gulf Coast.
• The total U.S. population at risk of extended power outage from a Carrington-level storm ranges from 20 million to 40 million. The outages could last from 16 days to as long as two years. The duration of outages will depend mainly on the availability of replacement transformers. If new transformers must be ordered, the lead time is likely to be at least five months.
• Storms weaker than Carrington-level could produce a small number of damaged transformers – from 10 to 20 – but the potential damage to densely populated regions along the Atlantic Coast is still significant. The total number of damaged transformers is less relevant for prolonged power outages – their concentration is what matters. The failure of a small number of transformers serving a highly populated area could trigger a prolonged outage.

Insurers may want to consider scenario planning that looks at two issues, in particular, to be ready for a Carrington Event. First, they may want to examine the operational constraints that could occur – insurers, for instance, might be unable to receive premium payments or issue policies and invoices while addressing policyholder claims. Second, insurers may seek to assess their potential overall exposure to claims for business interruption or other losses that a Carrington Event could cause.

Developing contingency plans for the possibility that a disaster will zap plants, buildings and equipment may also prove beneficial. This plan could focus on how companies might continue to operate even during a long-term interruption. This could include backup generators for critical systems, redundant computer systems and, for companies that are big enough, locations in areas that are less likely to be affected by a major storm and that can be staffed up in an emergency. It’s worth remembering that other companies will be scrambling to do the same thing – so options for relocating critical functions may be more limited in densely populated areas.

Companies may look to engage their full risk-management expertise, including a discussion about business-interruption insurance. Directors and trustees likely need to devote particular focus, because of their positions of responsibility. And, if you’re in management, you may want to ask yourself: “Is our board informed about the risks of potential exposure to a Carrington Event, or something similar?”

If not, it may be time to act.

The materials referenced herein are prepared by the author and as such do not necessarily represent the views or opinions of OneBeacon Professional Insurance. OneBeacon Professional Insurance makes no claims concerning the completeness or accuracy of these materials and takes no responsibility for supplementing, updating or correcting such materials.

This document is for general informational purposes only and does not constitute and is not intended to take the place of legal or risk management advice. Parties should contact their own personal counsel for any such advice.

How the ‘Internet of Things’ Affects Strategic Planning

When it comes to technology, the boardroom has been learning a new language: mobile, social, cloud, cyber security, digital disruption and more. Recently the National Association of Corporate Directors released an eight-part video series on the board’s role: The Intersection of Technology, Strategy and Risk. We have spent much of the past year focused on cyber security, an essential discussion given the widespread theft of intellectual property, privacy invasions and data breaches. A report on cyber crime and espionage by the Center for Strategic and International Studies (CSIS) in Washington, D.C., last year estimated that cyber crime costs the global economy $300 billion a year – an entire industry is growing around hacking! Research by PwC shows cyber insurance is the fastest-growing specialty coverage ever – around $1.3 billion a year in the U.S. As our boardroom agendas often get filled with discussions on risk, I asked Frontier Communications board director Larraine Segil how to shift the conversation to strategy. Larraine has a keen focus on opportunity and suggested we delve into solutions for governing “The Internet of Things.”

What exactly is the Internet of Things, and what are the implications for business strategy?

Think about connecting any device with an on and off switch to the Internet and to each other. This includes everything from cell phones, thermostats and washing machines to headphones, cameras, wearable devices and much more. This also applies to components of machines – for example, the jet engine of an airplane. If the device has an on and off switch, then chances are it can be a part of the Internet of Things. The technology research firm Gartner says that by 2020 there will be more than 26 billion connected devices. Think about Uber, the company that connects a physical asset (car and driver) to a person in need of a ride via a website. That simple connection has disrupted the taxi industry.

Airbnb has done the same for the lodging industry by directly connecting people with spaces to rent to those in need of accommodations.

What does this mean for our companies? Larraine, what are you thinking when you hear about the Internet of Things for business opportunities? As a director, how can you help fellow directors govern in this fast-moving digital age?

Frontier Communications provides connectivity services to a national customer base primarily in rural areas and is integrally involved in the Internet of Things. Frontier has a number of strategic alliances with companies that develop and market those very devices – or “things” – such as the Dropcam camera, a cloud-based WiFi video monitoring service with free live streaming, two-way talk and remote viewing that makes it easy to stay connected with places, people and pets, no matter where you are. Other alliances expanding the “things” will be introduced in the rest of 2014.

As a director, it is critical to be educated constantly about new trends, products and opportunities – competition is fast-moving, and customers are better-educated about their options than ever before. Strategically, the board has to think way ahead of the present status quo – and with the help of management and outside domain experts, explore opportunities for alliances. This requires using strategic analysis at every board meeting (not just at one offsite a year) and welcoming constant director education and brainstorming both within and outside of the company’s industry. The board should continually identify and evaluate strategic directions to keep the company fresh and nimble.

Remembering that we’ve only just begun, here are some critical questions boards should be asking about technology and the Internet of Things:

1. Are you including strategic discussions around technology at every board meeting?
2. Do your strategic directions include alliances within and outside of your industry?
3. How would you assess your current level of interaction with the chief information officer and chief technology officer? What can be done to improve the effectiveness of communications with them?
4. As a board, how are you helping to guide your company in innovative directions, taking into consideration disruptive technologies, competitor alliances and new ideas or skills coming from outside your industry?

Look Up, Look Out, Think New!

“The stalking weasel has its nose to the ground. It never hears the descent of the hawk. Until…”

-Andrew Vachss, author

Are you like the weasel? Are you so focused on what you’re doing that you don’t hear the hawk that will soon take you out of the marketplace?

You may very well be staring at your hawk as you read this article!

You see, one of the “hawks” in today’s world is technology — iPhones, iPads, other mobile devices, the Internet, social media, Siri, artificial intelligence, big data, 3-D printing, etc. These “hawks,” along with a global economy, shifting demographics and new power players, have made the world a very dangerous place for “weasels” like you and me.

From Mohan Nair’s book, Strategic Business Transformation, we learn that the following companies were all profitable in 2007 and by 2010 were dead:

American Home Mortgage
Bombay Company
Comp USA
Circuit City
Lehman Brothers
Levitz Furniture
Linens and Things
Mervyn’s
Sharper Image
Wachovia
Ziff Davis
Bearing Point
Charter Communications
KB Toys
Monaco Coach
R.H. Donnelley
Silicon Graphics
Hollywood Video

From this same source, we also learn that in yesterday’s world 80% of change was cyclical and 20% was structural or transformational. Tomorrow, the opposite will be true – 80% will be structural and only 20% cyclical.

Time, place and pace have changed. Today we live in a 24/7/365 world without borders and with an expectation of instant gratification. We want what we want, and we want it now! Don’t believe me? Google it!

In the days of Ozzie and Harriet, big threes dictated to a mass market. The auto industry was defined by GM, Ford and Chrysler. Broadcast television was owned by CBS, NBC and ABC. The magazine industry was controlled by Time, Life and Look.

Fast forward two or three decades, and power has fragmented. Today, more than 40 automobile manufacturers sell hundreds of models of cars in the U.S. Visit any newsstand in your town or the equivalent on your computer, and you can find magazines specializing in everything from fly-fishing to quilting to cigars to Sudoku. You can view hundreds of broadcast, cable or other channels — and from any screen you own, not just from a TV.

We are no longer a mass market but rather a series of niches served by specialty manufacturers and distributors using mass customization to meet the demands of each niche. In fact, as we walk toward the horizon of unlimited possibilities that is tomorrow, we see that each of us is a niche of one whose needs will be served uniquely.

Big data allows manufacturers and distributors (and others) to know our wants and needs before we even express them. (Yes, we are sacrificing privacy for convenience.) Innovation is now at the point where 3-D printers can manufacture body parts for us.

The market has switched from selling to facilitating buying: A Wall Street Journal headline in July 2012 read, “The Customer as a God.”

The Baby Boomers (a.k.a. hippies) are finally on the “center stage” of life but are being told to exit stage left so Gen Xers can have their turn. Waiting behind the curtain with the Gen Xers are the Millennials and the Gen C — and they will not be as patient as the Gen Xers have been. The Millennials and Gen C are the new world. They don’t want to intern under us. They want to do their own thing. Now.

The boomers and their world of analog are about yesterday. Selling products, developing relationships, drinks at the City Club, civic and church groups, letters, prospecting and cold calls: These are not tomorrow’s world.

As Scott Walchek wrote on this site, “The Last Analog Generation — let’s call them LAGards — are departing, and in their wake a fascinating new world is emerging.”

Gen C — the newest generation and the only digital natives currently on the planet — were born into tomorrow and don’t give a damn how we did it back in the day. As a Booz & Co. study says, “They are Generation C (born after 1990) — connected, communicating, content-centric, computerized, community-oriented, always clicking. By 2020, they will make up 40% of the population in the U.S., Europe and the BRIC countries, and 10% of the rest of the world, and by then they will constitute the largest group of consumers worldwide.”

Their biggest impact is that, as teenagers, they are not learning from us (their parents and grandparents) how to be good consumers. They are teaching us how to use the power of technology and social media to be great consumers. They are teaching us how to buy in a digital and nonverbal world.

As a result, every manufacturer, distributor and salesperson will be changed sooner rather than later and much more deeply than they would change if left to their own devices.

Today’s world is driven by products and services and by those who sell them. Tomorrow’s will be defined and driven by clients. Siri will know more about products and services (even those you offer) than you do. Product knowledge will not be as important as understanding a client — keeping a finger on their pulse. Tomorrow, you must be an expert in your client and their industry. You must build intimacy with each client and affinity with his or her world. Within this focus and framework, they will choose to buy from you — you won’t sell to them.

Each of us is who, what and where we are today because of the way we think. If we want to change, survive, prosper and enjoy longevity, we must think new! We must innovate or evaporate.

Make Your Data a Work-in-Process Tool

Heard recently: “Our organization has lots of analytics, but we really don’t know what to do with them.”

This is a common dilemma. Analytics (data analysis) are abundant. They are presented in annual reports and published in colorful graphics. But too often the effort ends there. Nice information, but what can be done with it? 

The answer is: a lot. It can change operations and outcomes, but only if it is handled right. A key is producing an analytics delivery system that is self-documenting.

Data evolution

Obviously, the basic ingredient for analytics is data. Fortunately, the last 30 years have been primarily devoted to data gathering.

Over that time, all industries have evolved through several phases of data collection and management. Mainframes and minicomputers produced data, and, with the inception of the PC in the '80s, data gathering became everyone’s business. Systems were clumsy in the early PC years, and there were significant restrictions on screen real estate and data volume. Recall the Y2K debacle, caused by limiting year data to two characters.

Happily for the data-gathering effort, progress in technology has been rapid. Local and then wide area networks became available. Then came the Internet, along with ever more powerful hardware. Amazingly, wireless smartphones today are far more powerful computers than were the PCs of the '80s and '90s. Data gathering has been successful.

Now we have truckloads of data, often referred to as big data. People are trying to figure out how to handle it. In fact, a whole new industry is developing around managing the huge volumes of data. Once big data is corralled, analytic possibilities are endless.

The workers’ compensation industry has collected enormous volumes of data — yet little has been done with analytics to reduce costs and improve outcomes.

Embed analytic intelligence

The best way to apply analytics in workers’ compensation is to translate the intelligence and deliver it to the operational front lines, to those who make critical decisions daily. Knowledge derived from analytics cannot change processes or outcomes unless it is embedded in the work of adjusters, medical case managers and others who make claims decisions.

Consulting graphics for guidance is cumbersome: Interpretation is uneven or unreliable, and the effects cannot be verified.  Therefore, the intelligence must be made easily accessible and specific to individual workers.

Front-line decision-makers need online tools designed to easily access interpreted analytics that can direct decisions and actions. Such tools must be designed to target only the issues pertinent to individual claims. Information should be specific.

When predictive modeling is employed as the analytic methodology, only certain claims are flagged as risky, typically at the point when they are scored. A better approach is to monitor all claims data electronically and continuously. If all claims are monitored for events and conditions predetermined by analytics, no high-risk claims can slip through the cracks. Personnel can be alerted to all claims with risky conditions.
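
Here is a minimal sketch of what continuous, condition-based monitoring could look like. The claim fields, thresholds and rules are invented for illustration; the real conditions would come from a carrier’s own analytics.

```python
# Sketch of continuous claims monitoring: every claim is checked against
# conditions predetermined by analytics, and any match raises an alert.
# All field names, thresholds and rules here are hypothetical.

RULES = [
    ("attorney_involved",   lambda c: c.get("attorney_involved", False)),
    ("lost_days_over_30",   lambda c: c.get("lost_days", 0) > 30),
    ("opioid_prescription", lambda c: "opioid" in c.get("medications", [])),
]

def monitor(claims):
    """Yield (claim_id, triggered_rules) for every claim that matches."""
    for claim in claims:
        hits = [name for name, test in RULES if test(claim)]
        if hits:
            yield claim["id"], hits  # alert the adjuster or case manager

claims = [
    {"id": "WC-1001", "lost_days": 45, "medications": ["ibuprofen"]},
    {"id": "WC-1002", "lost_days": 3, "attorney_involved": True},
]
for claim_id, hits in monitor(claims):
    print(claim_id, hits)
# WC-1001 ['lost_days_over_30']
# WC-1002 ['attorney_involved']
```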

Self-documenting

The system that is developed to deliver analytics to operations should automatically self-document; that is, keep its own audit trail to continually document to whom the intelligence was sent, when and why. The system can then be expanded to document what action is taken based on the information delivered.
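
In code, self-documentation can be as simple as appending a record for every delivery and every resulting action. The sketch below uses assumed field names; a production audit trail would live in an append-only database table rather than an in-memory list.

```python
# Sketch of a self-documenting delivery log: to whom the intelligence
# was sent, when, why, and what action followed. Field names are
# illustrative assumptions.
from datetime import datetime, timezone

audit_trail = []  # in production: an append-only database table

def record_delivery(claim_id, recipient, reason):
    audit_trail.append({
        "claim_id": claim_id,
        "recipient": recipient,
        "reason": reason,          # why: the rule or model score that fired
        "delivered_at": datetime.now(timezone.utc),
        "action": None,            # filled in when the recipient responds
    })

def record_action(claim_id, recipient, action):
    for entry in audit_trail:
        if entry["claim_id"] == claim_id and entry["recipient"] == recipient:
            entry["action"] = action

record_delivery("WC-1001", "adjuster_smith", "lost_days_over_30")
record_action("WC-1001", "adjuster_smith", "referred to nurse case manager")
```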

Without self-documentation, the analytic delivery system has no authenticity. Those who receive the information cannot be held accountable for whether or how they acted on it. When the system automatically self-documents, those who have received the information can be held accountable or commended.

Self-documenting systems also create what could be called Additionality. Additionality is the extent to which a new input adds to the existing inputs without replacing them and results in something greater. When the analytic delivery system automatically self-documents guidance and actions, a new layer of information is created. Analytic intelligence is linked to claims data and layered with directed action documentation.

A system that is self-documenting can also self-verify, meaning results of delivering analytics to operations can be measured. Claim conditions and costs can be measured with and without the impact of the analytic delivery system. Further analyses can be executed to measure what analytic intelligence is most effective and in what form and, importantly, what actions generate best results.
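
Because the audit trail ties intelligence to specific claims, verification reduces to comparing outcomes for claims that triggered alerts against those that did not. The sketch below uses invented columns and numbers, assuming claim outcomes sit in a pandas DataFrame.

```python
# Sketch of self-verification: compare average claim cost for claims
# that received analytic alerts against those that did not. Columns
# and numbers are invented for illustration.
import pandas as pd

claims = pd.DataFrame({
    "claim_id":   ["WC-1", "WC-2", "WC-3", "WC-4"],
    "alerted":    [True, True, False, False],
    "total_cost": [18_000, 22_000, 35_000, 41_000],
})

print(claims.groupby("alerted")["total_cost"].mean())
# alerted
# False    38000.0
# True     20000.0
```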

The analytic delivery system monitors all claims data, identifies claims that match analytic intelligence and embeds the interpreted information in operations. The data has become a work-in-process knowledge tool while analytics are linked directly to outcomes.