Are You Reinventing the Wheel on Analytics?

Once your analysts have a clear business question to answer, do they start new analysis each time, potentially reinventing the wheel?

After creating or leading data and analytics teams for many years, I began to notice this pattern of behavior. What we seemed to lack was a consistent knowledge management solution, or corporate memory, that made it easy to spot what should be remembered.

Funnily enough, as I became convinced of the need for holistic customer insight, I found a partial answer among researchers.

Avoiding reinvention is such an important issue for analytics and insight teams that I’ll use this post to share my own experience.

The lack of a secondary research approach for analytics

Researchers do a somewhat better job than analytics teams because of their understanding of the need for secondary research. Experienced research analysts/managers will be familiar with considering the potential for desk research, or searching through past research, to answer the question posed. Perhaps because of the more obvious cost of commissioning new primary research (often via paying an agency), researchers make more effort to first consider whether they already have access to information that could answer the new question.

But, even here, there does not appear to be any ideal or market-leading knowledge management solution. Most of the teams I have worked with use an in-house development in Excel, interactive PowerPoint slides with hyperlinks to file structures or intranet-based research libraries. Whichever end-user computing or groupware solution is used, it more or less equates to an easier-to-navigate/search library of all past research. Normally, a user can search by keywords or tags, as well as through a prescribed structure of research for specific products/channels/segments etc.

See also: Why Customer Experience Is Key  

Some research teams use this very effectively and also recall those visualizations/graphics/VoxPops that worked well at conveying key insights about customers. It is worth investing in this area, as it can save a significant amount of research budget to remember and reuse what has been learned already.

However, while also leading data or analytics teams (increasingly within one insight department), it became obvious that such an approach did not exist for analytics. At best, analysts used code libraries or templates to make coding quicker and more standardized and to present results with a consistent, professional look. Methodologies certainly existed for analysis at a high level or for specific technical tasks like building predictive models, but there was no consistent approach to recording what had been learned from past analysis.

I’ve seen similar problems at a number of my clients. Why is this? Perhaps a combination of less visible additional costs (analysts are already employed) and the tendency of many analysts to prefer to crack on with the technical work conspires to undermine any practice of secondary analytics.

The many potential benefits of customer insight knowledge management

Once you focus on this problem, it becomes obvious that there are many potential benefits to improving your practice in this area.

Many analytics or BI leaders will be able to tell you their own horror stories of trying to implement self-serve analytics. These war stories are normally a combination of the classic problems/delays with data and IT projects, plus an unwillingness from business stakeholders to actually interrogate the new system themselves. All too often, after the initial enthusiasm for shiny new technology, business leaders prefer to ask an analyst than produce the report they need themselves.

So, one potential advantage of a well-managed and easily navigable secondary analytics store is a chance for business users to easily find past answers to the same question or better understand the context.

But the items stored in such an ideal knowledge management solution can be wider than just final outputs (often in the form of PowerPoint presentations or single dashboards).

I have seen teams benefit from developing solutions to store and share across the team (a minimal, illustrative sketch of one such stored record follows this list):

  • Stakeholder maps and contact details
  • Project histories and documentation
  • Past code (from SQL scripts to R/Python packages or code snippets)
  • Metadata (we’ve shared more about the importance of that previously; here I mean what’s been learned about data items during an analysis)
  • Past data visualizations or graphics that have proved effective (sometimes converted into templates)
  • Past results and recommendations for additional analysis or future tracking
  • Interim data, to be used to revisit or test hypotheses (suitably anonymized)
  • Output presentations (both short, executive style and long full documentation versions)
  • Recommendations for future action (to track acting on insights, as recommended previously)
  • Key insights, summarized into a few short sentences and accumulated over time for a specific segment, channel or product
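To make that list more concrete, here is a minimal sketch of what a single record in such a knowledge store might look like. It is purely illustrative: the class, field names and example values are my own assumptions, not a prescribed standard or any particular product’s schema.

```python
# Purely illustrative sketch of one record in a team knowledge store.
# The class and field names below are assumptions, not a prescribed standard.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class KnowledgeItem:
    title: str                    # short, human-readable name
    item_type: str                # e.g. "code", "metadata", "visualization", "output", "key_insight"
    project: str                  # the project or "job bag" it belongs to
    location: str                 # path or URL to the stored artifact itself
    summary: str                  # the key insight, in a few short sentences
    tags: list[str] = field(default_factory=list)      # segment, channel, product, etc.
    created: date = field(default_factory=date.today)


# Example entry: a reusable churn model script, stored with its headline finding
item = KnowledgeItem(
    title="Churn model v2",
    item_type="code",
    project="2023 retention review",
    location="//shared/analytics/churn/model_v2.sql",
    summary="Tenure under 12 months and a recent complaint are the strongest churn drivers.",
    tags=["retention", "churn", "home-insurance"],
)
```

The point of such a record is not the technology but the discipline: whatever artifact is stored, a short summary and a handful of tags are what make it findable again later.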

Given this diversity and the range of different workflows and methodologies used by analysts, it is perhaps not surprising that the technical solutions tried vary as well.

Where is the technology analytics teams need for this remembering?

As well as being surprised that analytics teams lack the culture of secondary analytics, compared with the established practice of secondary research, I’m also surprised by a technology gap. What I mean is the lack of any one ideal, killer-app-type technology solution to this need from insight teams.

Although I have led and guided teams in implementing different workarounds, I’ve yet to see a complete solution that meets all requirements.

See also: Why to Refocus on Data and Analytics  

An insight, data or analytics leader looking to focus on this improvement should consider a few requirements. First off, the solution needs to cope with storing information in a wide variety of formats (from programming code to PowerPoint decks, customer videos to structured data sets, as well as the need to recognize project or “job bag” structures). Next, it has to be quick and easy to store these kinds of outputs in a way that can later be retrieved. Any solution that requires detailed indexing, accurate filing in the right sub-folder or extensive tagging just won’t get used in practice (or at least won’t be maintained). Finally, it also has to be quick and easy to access everything relevant from only partial information or memories.
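To make those three requirements concrete, here is a minimal sketch of the “store fast, recall from partial memories” idea: append a lightweight record with free-form tags, then search every field using whatever fragments the analyst still remembers. The file name, function names and record format are my own assumptions, intended only to illustrate the principle rather than any existing product.

```python
# Minimal sketch of "store quickly, retrieve from partial information".
# File name, function names and record format are illustrative assumptions only.
import json
from pathlib import Path

INDEX = Path("knowledge_index.jsonl")  # one JSON record per line; hypothetical location


def remember(title: str, location: str, summary: str, tags: list[str]) -> None:
    """Append a lightweight record; no sub-folders or detailed indexing required."""
    record = {"title": title, "location": location, "summary": summary, "tags": tags}
    with INDEX.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")


def recall(*fragments: str) -> list[dict]:
    """Return every record that matches all the fragments, searched across all fields."""
    results = []
    if not INDEX.exists():
        return results
    with INDEX.open(encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            haystack = " ".join([record["title"], record["summary"], *record["tags"]]).lower()
            if all(fragment.lower() in haystack for fragment in fragments):
                results.append(record)
    return results


# Usage: a half-remembered question still finds last year's work
remember("Churn drivers 2023", "//shared/analytics/churn/deck.pptx",
         "A complaint in the last 90 days roughly doubles churn risk.", ["churn", "retention"])
print(recall("churn", "complaint"))
```

The design choice that matters here is the low barrier to storing: if filing something takes more than a few seconds it will not happen, so the burden shifts to the retrieval side.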

Imperfect solutions that I have seen perform some parts of this well are:

  • Bespoke Excel or PowerPoint front-ends with hyperlinks to simple folder structures
  • Evernote app, with use of tags and notebooks
  • SharePoint/OneNote and other intranet-based solutions for saving Office documents
  • Databases/data lakes capable of storing unstructured or structured data in a range of file formats
  • Google search algorithms used to perform natural language searches on databases or folders

These can all fulfill part of the potential, but the ideal should surely be as simple as asking Alexa or Siri and having all completed work automatically tagged and stored appropriately. I’m sure it’s not beyond the capabilities of some of the data and machine learning technologies available today to deliver such a solution. I encourage analytics vendors to focus more on this knowledge management space and less on just new coding and visualizations.

Do you see this need? How do you avoid reinventing the wheel?

I hope this plea has resonated with you. Do you see this need in your team?

Please let us know if you’ve come across an ideal solution. Even if it is far from perfect, it would be great to know what you are using.

Share your experience in the comments below, and I may design a short survey to find out how widely different approaches are used.

Until then, all the best with your insight work and remembering what you know already.

4 Ways to Avoid Being a Foolish Leader

April Fools’ Day is just one day a year, but there are common mistakes an insight leader is prone to (and that could end up making them look like a fool) all year ’round.

This isn’t surprising when you consider the breadth of responsibility within the customer insight leadership role. Such leaders have multi-disciplinary technical teams to manage and increasing demand from across all areas of the modern business to improve decisions and performance.

Like most of the lessons I’ve learned over the years, the following has come from getting it wrong myself first. So, there’s no need for any of my clients or colleagues to feel embarrassed.

Beyond the day of pitfalls for the gullible, then, here are four common — but foolish — mistakes I see customer insight leaders still making.

1. Leaving data access control with IT

Data ownership and data management are not the sexiest responsibilities up for grabs in today’s organizations. To many, they appear to come with a much greater risk of failure, or at least blame, than any potential reward. However, doing this work well is often one of the strongest predictors of insight team productivity.

Ask any data scientist or customer analyst what they spend most of their time doing, and the consistent answer (over my years of asking such questions) is “data prep.” Most of the time, significant work is needed to bring together the data needed and explore, clean and categorize it for any meaningful analysis.

But, given the negative PR and the historical role of IT in this domain, it can be tempting for insight leaders to leave control of data management with IT. In my experience, this is almost always a mistake. Over decades (of often being unfairly blamed for anything that went wrong and that involved technology), IT teams and processes have evolved to minimize risk. Such a controlled (and, at times, bureaucratic) approach is normally too slow and too restrictive for the demands of an insight team.

I’ve lost count of how many capable but frustrated analysts I have met over the years who were prevented from making a difference because of lack of access to the data needed. Sometimes the rationale is data protection, security or even operational performance. At the root, customer insight or data science work is, by nature, exploratory and innovative, and it requires a flexibility and level of risk that run counter to IT processes.

See also: 3 Skills Needed for Customer Insight

To avoid this foolish mistake, I recommend insight leaders take on the responsibility for customer data management. Owning flexible provision of the data needed for analysis, modeling, research and database marketing is worth the headaches that come with the territory. Plus, the other issues that come to light are well worth insight leaders knowing well — whether they be data quality, data protection, or something regulation- or technology-related. Data leadership is often an opportunity to see potential issues for insight generation and deployment much earlier in the lifecycle.

2. Underestimating the cultural work needed to bring a team together

Data scientists and research managers are very different people. Data analysts, working on data quality challenges, see the world very differently from database marketing analysts, who are focused on lead performance and the next urgent campaign. It can be all too easy for a new insight leader to underestimate these cultural differences.

Over more than 13 years, I had the challenge and pleasure of building insight teams from scratch and integrating previously disparate technical functions into an insight department. Although team structures, processes and workflows can take considerable management time to get working well, I’ve found they are easy compared with the cultural transformation needed.

This should not be a surprise. Most research teams have come from humanities backgrounds and are staffed by “people people” who are interested in understanding others better. Most data science or analysis teams have come from math and science backgrounds and are staffed by “numbers people” who are interested in solving hard problems. Most database marketing teams have come from marketing or sales backgrounds and are more likely to be motivated by business success and interested in proving what works and makes money. Most data management teams have come from IT or finance backgrounds and are staffed by those with strong attention to detail, who are motivated by technical and coding skills and who want to be left alone to get on with their work.

As you can see, these types of people are not natural bedfellows. Although their technical expertise is powerfully complementary, they tend to approach each other with natural skepticism. Prejudices that are common in society and education often fuel both misunderstanding and a reluctance to give up any local control to collaborate more. Many math and science grads have grown up poking fun at “fluffy” humanities students. Conversely, those with a humanities background and strong interest in society can dismiss data and analytics folk as “geeky” and as removed from the real world.

So, how can an insight leader avoid this foolish oversight and lead cultural change? There really is no shortcut to listening to your teams, understanding their aspirations/frustrations/potential and sharing what you learn to foster greater understanding. As well as needing to be a translator (between technical and business languages), the insight leader also needs to be a bridge builder. It’s worth remembering classic leadership lessons such as “you get what you measure/reward,” and “catch people doing something right.” So, ensure you set objectives that require cooperation and recognize those who pioneer collaboration across the divides. It’s also important to watch your language as a leader — it should be inclusive and value all four technical disciplines.

3. Avoiding commercial targets because of lack of control

Most of us want to feel in control. It’s a natural human response to avoid creating a situation where we cannot control the outcome and are dependent on others. However, accepting that dependency is often the route to greater productivity and success in business.

The myth still peddled by testosterone-fueled motivational speakers is that you are the master of your own destiny and can achieve whatever you want. In reality, collaboration, coordination and communication are key to making progress in the increasingly complex networks of today’s corporations. For that reason, many executives are looking for future leaders who are willing to partner with others and to take risks to do so.

Perhaps it is particularly the analytical mindset of many insight leaders that makes them painfully aware of how often a target or objective is beyond their control. When a boss or opportunity suggests taking on a commercial target, what strikes many of us (at first) is the implied dependency on other areas to deliver, if we are to achieve it.

See also: The Science (and Art) of Data, Part 1

For that reason, many people stress wanting objectives that “measure what they can control.” Citing greater accountability and transparency for their own performance can be an exercise in missing the point. In business life, what customer insight teams can produce on their own is a far smaller prize than what can be achieved commercially by working with other teams. Many years ago, I learned the benefit of “stepping forward” to own sales or marketing targets as an insight leader. Although many of the levers might be beyond my control, the credibility and influencing needed were not.

Many insight leaders find they have greater influence with their leaders in other functions after taking such a risk. Being seen to be “in this together” or “on the spike” can help break down cultural barriers that have previously prevented insights from being acted upon, and in doing so generate more profit or improve more customers’ experiences.

4. Not letting something fail, even though it’s broken

A common gripe I hear from insight leaders (during coaching or mentoring sessions) is a feeling of suffering for “not dropping the ball.” Many are working with disconnected data, antiquated systems, under-resourced teams and insufficient budgets. Frankly, that is the norm. However, as aware as they are of how much their work matters (because of commercial, customer and colleague impact), they strive to cope. Sometimes, for years, they and their teams work to manually achieve superhuman delivery from sub-human resources.

But there is a sting in the tail of this heroic success. Because they continue to “keep the show on the road,” their pleas for more funds, new systems, more staff or data projects often fall on deaf ears. From a senior executive perspective (used to all their reports asking for more), the evidence presents another “if it ain’t broke, don’t fix it” scenario. They may empathize with their insight leader but also know the team is still managing to deliver what’s needed. So, requests get de-prioritized.

In some organizations, this frustration can turn to resentment when insight leaders see other, more politically savvy leaders get investment instead. Why were they more deserving? They just play the game! Well, perhaps it’s time for insight leaders to wake up and smell the coffee. Many years ago, I learned you have to choose your failures as well as your successes. With the same caution with which you choose any battles in business, it’s worth insight leaders carefully planning when and where to “drop the ball.”

How do you avoid this foolish mistake? Once again, it comes back to risk taking. Let something fail. Drop that ball when planned. Hold your nerve. If you’ve built a good reputation, chances are it will also increase the priority of getting the investment or change you need. You might just be your own worst enemy by masking the problem!

Phew, a longer post than I normally publish here or on Customer Insight Leader. But I hope those leadership thoughts helped.

Please feel free to share your own insights. Meanwhile, be kind to yourself today. We can all be foolish at times….

The Lighthouse Family and George Clooney…

We – writers and readers of this site alike – are kindred spirits. We are the “Lighthouse family” because we all want to, or need to, climb to the top of the lighthouse to see what’s over the horizon. We know we can’t always change what’s coming toward us, but we can be ready, perhaps sooner than others, to take action. Every day of my professional career, I have climbed at least some of the spiral steps inside the lighthouse.

When we look over the horizon, we can see all those important issues coming toward us, like digital customers, the impact of regulatory issues, the consequences of the Internet of Things and many others. But these issues don’t come to us as small drips of news but rather as a tsunami of information. Every day – hour – I get new insights and ideas. I have more opinions in one day than I had annually a decade ago.

We shouldn’t complain. The alternative to being at the top of the lighthouse is, for me, pretty gloomy. It’s about being down at the base of the lighthouse, standing on the rocks, being beaten up by the waves of change. It’s like being in the dark when making important decisions, only wetter.

Often coupled with all this text information is the amount of analytics we receive. We are becoming analytics junkies. Perhaps someone should set up “Analytics Anonymous” for those who can’t live without their data.

But don’t we need to take all this information and the analytics with a pinch of salt? They are only relevant when seen in context. Look at any opinion, set of figures, blog even, at face value, and you are a poorer person. It’s only by understanding the context of the information that you gain real insight.

The context might mean understanding how you are doing compared with your nearest competitor. Or even how your insurance customers are behaving at the supermarket checkout – maybe they have less disposable income, and your drop in revenue might be a function of their personal decisions to spend less on insurance so they can feed their families.

So my point is this: Imagine if all the information we received – data, comment, opinion – had an element of context to it. Let’s call that prospect “Insight 2.0.”

Context is everything. At a personal level, I may not be George Clooney, but at least my wife thinks I’m better looking than the next guy. At least I hope she does…