
3 Skills Needed for Customer Insight

While working in Amsterdam, I was reminded how insight analysts and leaders can shine brightly in very different contexts.

In the Netherlands, a mixture of training and facilitation was helping an events business. What struck me was the similarity of the challenges faced by their insight teams to the challenges I see in the U.K.

The more I work with insight leaders across sectors and geographies, the more I see how much they benefit from highly transferable skills. Here are three that are relevant to very different businesses and locations:

Prioritization

I’ve yet to work with a company where this isn’t a challenge, at least to some extent. As more and more business decisions require considering the customer, it’s not surprising that demand for data, analysis and research continues to rise. Most insight teams are struggling to meet the demands of both regular reporting (“business-as-usual”) tasks and the range of questions or projects coming in from business leaders. There have been many attempts to solve this struggle, including “projectizing” all requests (which tends to come across as a bureaucratic solution to reduce demand for information) and periodic planning sessions (using an Impact/Ease Matrix or similar tools). In today’s fast-changing businesses, I’ve found that local prioritization within “the bucket method” works best.

What I mean by the “bucket method” is the identification of the silos (mainly for decision-making) that are most powerful in your business. This often follows your organizational design, but not always. Is your business primarily structured by channel, product, segment or some other division of profit and loss accounts? Each silo should be allocated a “bucket” with a notional amount of insight resource, based on an appropriate combination of profit potential, strategic fit and proven demand (plus acted-on results). Regular meetings should be held between the insight leader and the most senior person possible within that silo; where possible, the insight leader should meet with the relevant director.

The bucket principle relates to the idea that, when something is full, it’s full. So, in reviewing progress and any future requirements with the relevant director, you challenge that director to make local prioritization calls. Going back to the bucket metaphor, adding more requires removing something else—unless the bucket wasn’t already full. Due to human nature, I haven’t seen the bucket principle work company-wide or group-wide. However, it can work very well in the local fiefdoms that exist in most businesses. In fact, it can support a feeling that the insight team is close to the business unit and is in the trenches with them to help meet their commercial challenges.
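For those who like to see the mechanics, here is a minimal sketch in Python of how a bucket forces that local trade-off. Everything in it is invented for illustration: the silos, the capacity in analyst-days and the queued requests are placeholders, not a recommendation of real numbers.

```python
# A minimal sketch of the "bucket method" described above.
# Silos, capacities and effort estimates are all invented for illustration.

BUCKET_CAPACITY_DAYS = {"channel": 40, "product": 60, "segment": 30}

# Requests queued against each bucket: (name, estimated effort in days, local priority).
requests = {
    "channel": [("churn deep-dive", 15, 1), ("campaign uplift analysis", 20, 2), ("ad-hoc survey read-out", 12, 3)],
    "product": [("pricing elasticity study", 35, 1), ("new product concept test", 30, 2)],
    "segment": [("segment migration report", 18, 1), ("lifetime value refresh", 20, 2)],
}

def fill_bucket(capacity, queued):
    """Admit requests in local priority order until the bucket is full.

    Whatever does not fit must wait, or displace an admitted item --
    the trade-off the sponsoring director is asked to make."""
    admitted, deferred, used = [], [], 0
    for name, effort, _priority in sorted(queued, key=lambda r: r[2]):
        if used + effort <= capacity:
            admitted.append(name)
            used += effort
        else:
            deferred.append(name)
    return admitted, deferred

for silo, capacity in BUCKET_CAPACITY_DAYS.items():
    admitted, deferred = fill_bucket(capacity, requests[silo])
    print(f"{silo}: run {admitted}; trade off or defer {deferred}")
```

The point is simply that the bucket, not the insight team, says no: whatever does not fit must be deferred or must displace something the director has already admitted.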

Buy-In

When trying to diagnose why past insight work has stalled or why progress isn’t being made, stakeholders often identify an early stage in the “project.” The nine-step model used by Laughlin Consultancy has a step (prior to starting the technical work) called “buy-in.” It takes a clear plan or design for the work needed and sends it back to the sponsoring stakeholder to ensure it will meet the requirements. Often, this practice is missed by insight teams. Even mature customer insight teams may have mastered asking questions and getting to the root of the real business need behind a brief, but they then just capture that requirement in the brief. Too few interpret that need and provide a clear description of what will be delivered.

There are two aspects of returning to your sponsor to achieve buy-in that can be powerful. First is the emotional experience of the business leader (or multiple stakeholders, if needed) feeling more involved in the work to be done. As Alexander Hamilton famously said, “Men often oppose a thing merely because they have had no agency in planning it, or because it may have been planned by those whom they dislike.” It’s so important in the apparently rational world of generating insight to remember the importance of emotions and relationships within your business. Paying stakeholders the compliment of sharing the planned work with them, and confirming that the intended deliverable will meet their needs, often helps.

The other benefit of becoming skilled at this buy-in stage is learning to manage expectations and identify communication requirements. With regard to expectations, you should set realistic timescales (which, first, requires effective planning and design), along with openly sharing any risks or issues so that they don’t come as a surprise. Communication—and asking how much a sponsor wants to be kept in the loop—can make a real difference to keeping your sponsor happy. Some sponsors will be happy with radio silence until a task is complete or a decision is needed (they value not being disturbed). Others will lose confidence in your work unless they hear regular progress updates. It’s best not to confuse one with the other.

Communication

Training customer insight analysts in softer skills often results in a significant portion of the course focusing on the presentation of findings. This isn’t surprising, because, in many ways, that’s the only tangible product insight teams can point to, prior to driving decisions, actions and business results. Too frequently, I hear stories of frustrated insight teams that believe the business doesn’t listen to them, or I hear from business leaders that their insight team doesn’t produce any real insights.

Coaching, or just listening to others express such frustrations, regularly reveals that too many analytics and research presentations take the form of long, boring PowerPoints, which are more focused on showing the amount of work that’s been done than presenting clear insights. While it’s understandable that an analyst who has worked for weeks preparing data, analyzing and generating insights wants her effort rewarded, a better form of recognition is having the sponsor act on your recommendations. Often, that’s more likely to occur based on a short summary that spares readers much of the detail.

Data visualization, storytelling and summarizing are all skills necessary to master on the road to effective communication. Most communication training will also stress the importance of being clear, concrete, considerate, courteous, etc. Many tabloids have mastered these skills. Love them or hate them, tabloid headline writers are masters of hierarchies of communication. Well-crafted, short, eye-catching headings are followed by single-sentence summaries, single-paragraph summaries and then short words, short paragraphs and other line breaks that present the text in bite-sized chunks.

Transferable skills

Insight analysts and leaders who master such crafts as prioritization, buy-in and communication could probably succeed in almost any industry and in many different countries. Many directors will attest to the fact that sideways moves helped their careers. A CV demonstrating the ability to master roles in very different contexts is often an indication of readiness for a senior general management role.

How to Resist Sexy Analytics Software

Who’s made the mistake of buying apps or sexy analytics software just based on appearance?

Go on, own up. I’m sure at one time or other, we have all succumbed to those impulse purchases.

It’s the same with book sales. Although it should make no difference to the reading experience, an attractive cover does increase sales.

But if you approach your IT spending based on attractiveness, you’re heading for trouble.

Now you may be thinking: hold on, that’s what my IT department is there to protect against. That may be the case in your business, but as Gartner has predicted, by 2017 the majority of IT spending in companies is expected to be made by the CMO, not the CIO.

There are advantages to that change. Software will need to be more accessible for business users and able to be configured without IT help, and the purchasers are likely to be closer to understanding the real business requirements. But, as insight teams increase their budgets, there are also risks.

This post explores some of the pitfalls I’ve seen business decision-makers fall into. Given our focus as a blog, I’ll be concentrating on the purchase of analytics software on the basis of appearance.

1. The lure of automation and de-skilling:

Ever since the rise of BI tools in the ’90s, vendors have looked for ways to differentiate their MI or analytics software from so many others on the market. Some concentrated on “drag and drop” front ends, some on the number of algorithms supported, some on their ease of connectivity to databases, and a number began to develop more and more automation. This led to a few products (I’ll avoid naming names) creating what were basically “black box” solutions that you were meant to trust to do all the statistics for you. They became a genre of “trust us, look the models work” solutions.

Such solutions can be very tempting for marketing or analytics leaders struggling to recruit or retain the analysts/data scientists they need. Automated model production seems like a real cost saving. But if you look more deeply, there are a number of problems. First, auto-fitted models rarely last as long as “hand-crafted” versions and tend to degrade faster, because it is much harder to avoid overfitting the data provided. Related to this, such an approach does not benefit from real understanding of the domain being modeled (which is also a pitfall of outsourced analysis). Robust models benefit from variable and algorithm selection that is appropriate to the business problem, informed by the meaning of the data items and mindful of any likely future changes. Lastly, automation almost always excludes meaningful exploratory data analysis, which is a huge missed opportunity, as that stage more often than not adds to knowledge of the data and yields insights in its own right. There is not yet a real alternative to a trained statistical eye during the analytics and model-building process.
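To make the overfitting point concrete, here is a small sketch using scikit-learn on synthetic data (no particular vendor’s product is implied). An unconstrained, “auto-fitted” tree memorizes the data it was given, while a deliberately constrained one, the kind of choice a trained analyst makes, tends to hold up better on data it has not seen.

```python
# Hedged illustration only: synthetic data, not any vendor's "black box" product.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.4, size=400)   # noisy underlying signal

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

auto = DecisionTreeRegressor(random_state=0).fit(X_train, y_train)                   # no constraints at all
crafted = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X_train, y_train)   # analyst-chosen limit

for name, model in [("auto-fitted", auto), ("hand-constrained", crafted)]:
    print(name,
          "train R2 =", round(r2_score(y_train, model.predict(X_train)), 2),
          "test R2 =", round(r2_score(y_test, model.predict(X_test)), 2))
```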

2. The quick fix of local installation:

Unlike all the work involved in designing a data architecture and an appropriate data warehouse/staging/connectivity solution, analytics software is too often portrayed as a simple matter of install and run. This, too, can be misleading. It is not just the front end that matters with analytics software. Yes, you need that to be easy to navigate and intuitive to work with (but that is becoming a hygiene factor these days). There is more to consider around the back end. Even if the supplier emphasizes its ease of connectivity with a wide range of powerful database platforms, and even if you know the investment has gone into making sure your data warehouse is powerful enough to handle all those queries, none of that will protect you from a lack of analytics grunt.


The problem, all too often, is that business users are originally offered a surprisingly cheap solution that will just run locally on their PCs or Macs. Now, that is very convenient and mobile if you simply want to crunch low volumes of data from spreadsheets or files on your laptop. But the problem comes when you want to use larger data sources and have a whole analytics team trying to do so with just local installations of the same analytics software (probably paid for per install/user). Too many of the current generation of cheaper analytics solutions will, in that case, be limited to the processing power of the PC or Mac. Business users are not warned of the need to consider client-server solutions, both for collaboration and for a performant analytics infrastructure (especially if you also want to score data for live systems). That can lead to wasted initial spending, as a costly server and reconfiguration, or even new software, are needed in the end.
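The sketch below illustrates the underlying issue rather than any specific product: pulling every row back to a laptop versus pushing the aggregation to the database and returning only a summary. The table and column names are hypothetical, and an in-memory SQLite database merely stands in for a proper warehouse.

```python
# Hypothetical schema; SQLite stands in for a real warehouse connection.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (customer_segment TEXT, amount REAL);
    INSERT INTO transactions VALUES ('retail', 120.0), ('retail', 80.0), ('corporate', 900.0);
""")

# Fragile on a laptop: pulls every row into local memory before summarizing.
# all_rows = pd.read_sql("SELECT * FROM transactions", conn)
# summary = all_rows.groupby("customer_segment")["amount"].sum()

# Friendlier: let the server do the aggregation and return only the summary rows.
summary = pd.read_sql(
    "SELECT customer_segment, COUNT(*) AS n, SUM(amount) AS total "
    "FROM transactions GROUP BY customer_segment",
    conn,
)
print(summary)
```

A client-server or in-database setup makes the second pattern the default; a purely local install quietly pushes teams toward the first.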

3. The drug of cloud-based solutions:

With any product, it’s a sound consumer maxim to beware of anything that looks too easy or too cheap. Surely, such alarm bells should have rung earlier in the ears of many a marketing director who has ended up being stung by a large final “cost of ownership” for a cloud-based CRM solution. Akin to the lure of quick-fix local installation, cloud-based analytics solutions can promise something even better: no installation at all. Beyond perhaps needing firewall changes to access the solution, it offers the business leader the ultimate way to avoid those pesky IT folk. No wonder licenses have sold.

But anyone familiar with the history of the market leaders in cloud-based solutions (and even the big boys who have jumped on the bandwagon in recent years) will know it’s not that easy. Like providing free or cheap drugs at first to create an addict, cloud-based analytics solutions have a sting in the tail. Check out the licensing agreement and what you will need to scale. As use of your solution becomes more embedded in an organization, especially if it becomes the de facto way to access a cloud-based data solution, your users, and thus your license costs, will gather momentum. Now, I’m not saying the cloud isn’t a viable solution for some businesses. It is. But beware of the stealth sales model that is implicit.
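A back-of-the-envelope projection shows how quickly that momentum builds. Every figure below is invented purely for illustration; plug in your own quoted price and adoption rate before drawing any conclusions.

```python
# Illustrative only: invented price, pilot size and growth rate.
monthly_fee_per_user = 70      # hypothetical list price per user per month
users = 10                     # pilot team size
quarterly_growth = 1.5         # adoption multiplier once the tool becomes the default

cumulative_spend = 0
for quarter in range(1, 9):    # a two-year horizon
    cumulative_spend += users * monthly_fee_per_user * 3
    print(f"Q{quarter}: ~{int(users)} users, cumulative spend ${cumulative_spend:,.0f}")
    users *= quarterly_growth
```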

4. Oh, abstraction, where are you now I need you more than ever?

Back in the ’90s, the original BusinessObjects product created the idea of a “layer of abstraction,” or what was called a “universe.” This was configurable by the business (typically by an experienced power user or insight analyst who knew the data), but more often than not it benefited from the involvement of a DBA from IT. The product looked like a visual representation of a database schema diagram and basically defined not just all the data items the analytics software could use, but also the allowed joins between tables, etc. Beginning to sound rather too techie? Yes, obviously software vendors thought so, too. Such a definition has gone the way of metadata, perceived as a “nice to have” that is in reality avoided by flashy-looking workarounds.

The most worrying recent cases I have seen of this missing layer of abstraction are today’s most popular data visualization tools. These support a wide range of visualizations and appear to make it as easy as “drag and drop” to create any you want from the databases to which you point the software (using more mouse action). So far, so good. Regular readers will know I’m a data visualization evangelist. The problem is that, without any defined (or controlled, to use that unpopular term) specification of data access and permitted joins, the analytics queries can run amok. I’ve seen too many business users end up confused and facing very slow response times, basically because the software has abdicated this responsibility. Come on, vendors: in a day when Hadoop et al. are making data access ever more complex, there is a need for more protection, not less!
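For the curious, here is a minimal sketch of what even a lightweight “universe”-style layer could look like: declare the tables, exposed columns and allowed joins once, and refuse to generate any query that wanders outside them. The schema and join below are purely illustrative, not a reference to any particular vendor’s implementation.

```python
# Illustrative semantic layer: hypothetical tables, columns and one governed join.
SEMANTIC_LAYER = {
    "tables": {
        "policy": ["policy_id", "customer_id", "premium"],
        "claims": ["claim_id", "policy_id", "paid_amount"],
    },
    "allowed_joins": {
        ("policy", "claims"): "policy.policy_id = claims.policy_id",
    },
}

def build_join_query(left, right, columns):
    """Return SQL only if the join path and columns are declared in the layer."""
    join = SEMANTIC_LAYER["allowed_joins"].get((left, right))
    if join is None:
        raise ValueError(f"No governed join path between {left} and {right}")
    for col in columns:
        table, _, name = col.partition(".")
        if name not in SEMANTIC_LAYER["tables"].get(table, []):
            raise ValueError(f"{col} is not exposed by the semantic layer")
    return f"SELECT {', '.join(columns)} FROM {left} JOIN {right} ON {join}"

print(build_join_query("policy", "claims", ["policy.premium", "claims.paid_amount"]))
```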

Well, I hope those observations have been useful. If they protect you from an impulse purchase without having a pre-planned analytics architecture, then my time was worthwhile.

If not, well, I’m old enough to enjoy a good grumble, anyway. Keep safe! 🙂

Helping Data Scientists Through Storytelling

Good communication is always a two-way street. Insurers that employ data scientists or partner with data science consulting firms often look at those experts much like one-way suppliers. Data science supplies the analytics; the business consumes the analytics.

But as data science grows within the organization, most insurers find the relationship is less about one-sided data storytelling and more about the synergies that occur in data science and business conversations. We at Majesco don’t think it is overselling data science to say these conversations and relationships can have a monumental impact on the organization’s business direction. So, forward-thinking insurers will want to take some initiative in supporting both data scientists and business data users as they work to translate their efforts and needs for each other.

In my last two blog posts, we walked through why effective data science storytelling matters, and we looked at how data scientists can improve data science storytelling in ways that will have a meaningful impact.

In this last blog post of the series, we want to look more closely at the organization’s role in providing the personnel, tools and environment that will foster those conversations.

Hiring, supporting and partnering

Organizations should begin by attempting to hire and retain talented data scientists who are also strong communicators. They should be able to talk to their audience at different levels—very elementary levels for “newbies” and highly theoretical levels if their customers are other data scientists. Hiring a data scientist who only has a head for math or coding will not fulfill the business need for meaningful translation.

Even data scientists who are proven communicators could benefit from access to in-house designers and copywriters for presentation material. Depending on the size of the insurer, a small data communication support staff could be built to include a member of in-house marketing, a developer who understands reports and dashboards and the data scientist(s). Just creating this production support team, however, may not be enough. The team members must work together to gain their own understanding. Designers, for example, will need to work closely with the analyst to get the story right for presentation materials. This kind of scenario works well if an organization is mass-producing models of a similar type. Smooth development and effective data translation will happen with experience. The goal is to keep data scientists doing what they do best—using less time on tasks that are outside of their domain—and giving data’s story its best possibility to make an impact.

Many insurers aren’t yet large enough to employ or attract data scientists. A data science partner provides more than just added support. It supplies experience in marketing and risk modeling, experience in the details of analytic communications and a broad understanding of how many areas of the organization can be improved.

Investing in data visualization tools

Organizations will need to support their data scientists not only with advanced statistical tools but also with visualization tools. There are already many data mining tools on the market, but many of these are designed with outputs that serve a theoretical perspective, not necessarily a business perspective. To serve the business perspective, you’ll want to employ tools such as Tableau, Qlikview and YellowFin, which are all excellent data visualization tools that are key to business intelligence but are not central to advanced analytics. These tools are especially effective at showing how models can be used to improve the business, using overlaid KPIs and statistical metrics. They can slice and dice the analytical populations of interest almost instantaneously.
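As a simple illustration of that “overlaid KPI” idea, the sketch below (using pandas on synthetic data; the loss-ratio figures are invented) slices a scored population into model-score deciles and lays a business KPI over each slice, the kind of table that feeds a bar chart or dashboard tile in whichever BI tool you prefer.

```python
# Synthetic population: invented model scores and an invented loss-ratio KPI.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
scores = rng.uniform(0, 1, 5000)                              # model score per policy
loss_ratio = 0.4 + 0.5 * scores + rng.normal(0, 0.1, 5000)    # KPI loosely tied to the score

population = pd.DataFrame({"score": scores, "loss_ratio": loss_ratio})
population["decile"] = pd.qcut(population["score"], 10, labels=False) + 1

kpi_by_slice = (population.groupby("decile")["loss_ratio"]
                .agg(["mean", "count"])
                .rename(columns={"mean": "avg_loss_ratio", "count": "policies"}))
print(kpi_by_slice)   # ready to visualize in the BI tool of choice
```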

When it comes to data science storytelling, one tool normally will not tell the whole story. Storytelling will require a variety of tools, depending on the various ideas the data scientist is trying to convey. To implement the data and model algorithms into a system the insurer already uses, a number of additional tools may be required. (These normally aren’t major investments.)

In the near future, I think data mining/advanced analytics tools will morph into something that contains better data visualization capabilities than are currently available. Insurers shouldn’t wait, however, to test and use the tools that are available today. Experience today will improve tomorrow’s business outcomes.

Constructing the best environment

Telling data’s story effectively may work best if the organization can foster a team management approach to data science. This kind of strategic team (different from the production team described above) would manage the traffic of upcoming and current data projects. It could include a data liaison from each department, a project manager assigned by IT to handle project flow and a business executive whose role is to make sure priority focus remains on areas of high business impact. Some of these ideas, and others, are dealt with in John Johansen’s recent blog series, Where’s the Real Home for Analytics?

To quickly reap the rewards of the data team’s knowledge, a feedback vehicle should be in place. A communication loop will allow the business to comment on what is helpful in communication; what is not helpful; which areas are ripe for current focus; and which products, services and processes could use (or provide) data streams in the future. With the digital realm in a consistent state of fresh ideas and upheaval, an energetic data science team will have the opportunity to grow together, get more creative and brainstorm more effectively on how to connect analytics to business strategies.

Equally important in these relationships is building adequate levels of trust. When the business not only understands the stories data scientists have translated for them but also trusts the sources and the scientists themselves, a vital shift has occurred. The value loop is complete, and the organization should become highly competitive.

Above all, in discussing the needs and hurdles, do not lose the excitement of what is transpiring. An insurer’s thirst for data science and data’s increased availability is a positive thing. It means complex decisions are being made with greater clarity and better opportunities for success. As business users see results that are tied to the stories supplied by data science, its value will continue to grow. It will become a fixed pillar of organizational support.

This article was written by Jane Turnbull, vice president – analytics for Majesco.

Leveraging the Power of Data Insights

The vast majority of insurance companies lack the infrastructure to mobilize around a true prescriptive analytics capability, and small- and medium-sized insurers are especially at risk, in terms of leveraging data insights into a competitive advantage. Small- and medium-sized insurers are constrained by the following key resource categories:

    • Access and ability to manage experienced data scientists
    • Ability to acquire or develop data visualization, machine learning and artificial intelligence capability
    • Experience and staff to manage extensive and complex data partnerships
    • Access to modern core insurance systems and data and analytics technology to leverage product innovation insights and new customer interactions

Changing customer behaviors, non-traditional competition and internal operational constraints are putting many traditional insurance companies—especially the smaller ones—at risk from a retention and growth perspective. The marketplace drivers create several pain points or constraints for small and medium-sized insurers, as can be seen in the following graphic:

[Graphic: marketplace pain points and constraints for small and medium-sized insurers]
This is excerpted from a research report from Majesco. To read the full report, click here.

Don’t Use a “Me, Too” Strategy on UBI

I recently attended an outstanding industry conference. It really was one of the best-produced conferences I have attended since I started presenting at conferences (too) many years ago. The keynote speakers were all insightful celebrities, and the swag was better than what Santa has delivered to my house for the past two Christmases. Oh, yeah, and the presentations weren’t bad, either.

During some of the breakout sessions, I overheard some conference attendees discussing their overarching strategy for usage-based insurance (UBI). I heard a couple of the usual comments: “We’re waiting for all cars to have embedded connected car technology” and “We’re looking for a smartphone solution.”

However, I did hear something new: a couple of attendees admitted their companies have employed a “me, too” strategy. I was a little caught off guard when I heard it. It is not an ideal strategy for deploying a UBI program, but it is plausible that companies trying to minimize policyholder attrition would make the attempt.

In a “me, too” strategy, company X creates its own version of a product that already exists in the market, so that, when it is mentioned that competitor Y has this product, company X can say, “Me, too!”

This strategy might be more effective when selling hammers. Obviously, UBI is much more dynamic and should be part of a much larger strategy of improving risk management, pricing accuracy and policyholder intelligence. The data and analytics created from gathering valuable driving data have a wealth of utility.

Forgive me, I got lost in my own UBI infomercial…

Let’s get back to the harmful impact of the “me, too” strategy on UBI. Typically, products launched under this strategy are not properly funded beyond product launch, as the program goal and the launching of the product are one and the same. Any long-term goals do not include improvements to the product, and any real value of the product is rarely realized.

The immediate negative impact of employing a “me, too” strategy is seen in the lack of resources to properly distribute, manage and improve the product. The approach also relieves everyone of responsibility for the product (or program), so no one is required to show results or improvement.

In the long term, policyholders bear the brunt of the “me, too” strategy. Their experience with an insufficient UBI product is poor, at best. Participation in the program steadily decreases or stagnates. Either way, participant numbers never come close to those listed in the business case. Moreover, the product and program are disparaged in the market, and the company and its ecosystem receive negative marks from policyholders.

There are certainly better strategies that offer greater returns in the short and long term. I encourage those considering a UBI program to take a long-term approach. Gather information from multiple sources, and pay just as much attention to the back-end management system for the overall program as you do to the bells and whistles. It is the back-end management that will ultimately deliver a best-in-class UBI product, as well as the data analytics and more of the program’s true value.

Here is a bit of information that will be helpful when comparing back-end management systems. Look for the following:

  1. Responsive dashboards that provide visibility into key performance metrics so you can make informed decisions about your program.
  2. Data visualizations and program analytics embedded within the product to help you understand month-over-month program growth and analyze the impacts of marketing campaigns, seasonal effects and geographical adoption (a simple sketch of this kind of view follows the list).
  3. Program diagnostics that provide you with detailed business intelligence to manage program health and identify areas for follow-up, such as potentially fraudulent behaviors.
  4. Flexible reports that support online viewing, scheduling and exporting to fit within your best practices and business needs.
  5. Flexible enrollment capabilities that support all stages of program growth with enrollment interfaces that support single to bulk enrollments, all backed by fully automated enrollment integration.
  6. Integrated device tracking tools that provide full insights into shipping and the cadence of devices reaching your customers.
  7. Comprehensive logistics tracking that is available throughout the account lifecycle, from enrollment through shipping, delivery and installation to continuing data reporting for each user.
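As promised above, here is a simple sketch of the month-over-month growth view mentioned in item 2, computed with pandas from a hypothetical enrollment extract. The dates and counts are invented for illustration.

```python
# Hypothetical enrollment extract; figures are invented.
import pandas as pd

enrollments = pd.DataFrame({
    "policy_id": range(1, 9),
    "enrolled_on": pd.to_datetime([
        "2016-01-05", "2016-01-20", "2016-02-02", "2016-02-17",
        "2016-02-25", "2016-03-03", "2016-03-12", "2016-03-30",
    ]),
})

monthly = enrollments.set_index("enrolled_on").resample("MS").size().rename("new_enrollments")
growth = monthly.pct_change().mul(100).round(1).rename("mom_growth_pct")
print(pd.concat([monthly, growth], axis=1))
```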

I hope this is helpful information. If you have attended any high-quality connected car or UBI conferences, please drop me a note, as I am doing my own planning for 2016.