
The Thorny Issues in a Product Recall

In 1982, people in Chicago began dropping dead from cyanide poisoning, which was linked to Johnson & Johnson’s Tylenol in select drug stores. Johnson & Johnson immediately pulled all Tylenol from the shelves of all stores, not just those in Chicago. It was ultimately determined that the product had been tampered with by someone outside of Johnson & Johnson. But the company’s aggressive actions produced a legend: The Tylenol scare was chalked up as the case to review for an effective brand-preserving (even brand-enhancing) product recall strategy.

In 2011, though, the FDA took the extraordinary step of taking over three Johnson & Johnson plants that produced Tylenol because of significant problems with contamination. This time, Johnson & Johnson could not blame a crazed killer, only itself. A company that should have learned from its own celebrated case study had not retained that knowledge 30 years later.

The problems associated with recalls often aren't the recall itself. In a recall, stores pull the products, and the media helps spread the message so that those who have already purchased the product can return it for refund, replacement, repair or destruction.

One problem crops up when companies are too slow to move. It was revealed in the press in June 2014 that GM allegedly knew of its ignition switch problems seven years before it recalled the product. The recall, which began in February 2014, itself became tortuous as new models were added almost daily to the list of cars in danger of electrical shutdown while in motion. The press, the regulators and, of course, the lawyers pounced on GM for allegedly withholding information for so long and for the seemingly endless additions to the recall. In 2015, regulators called meetings with GM and other auto manufacturers mired in what has become an epidemic of recalls to discuss why repairs are dragging on so long.

Denial, lack of information, hunkering down (bunker mentality), secrecy, silo mentality and fears for the impact on the bottom line all contribute to disastrous recalls. With all recalls, there is the cost of the recall, the cost of complete or partial loss or loss of use of certain products, repair costs in some cases (GM), regulatory scrutiny and fines, class action and other lawsuits and the loss of potential income during any shutdown. These can all be big-ticket items, and some companies will not survive these expenses and loss of revenue.

Probably the biggest cost of any recall is the cost to reputation, which can mean loss of existing and future customers. In recent years, lettuce growers and a peanut warehouse did not survive recalls over contaminated products. In the case of primary agricultural producers like these, the processors simply change suppliers, leaving the primary producers without any customers. In the retail market, the competition for shelf space is high. Recalled brands that are new or that lack high customer value are simply denied shelf space, effectively destroying their ability to market their products.

However, there are others that have strong brand following and even cult-like status in local markets. Blue Bell Creameries (famous for its ice cream) is one such company that has secured an almost cult-like following in the Southern and Midwestern states. Blue Bell, founded in 1907, maintains its headquarters in the small town of Brenham, TX (pop. 16,000).

Problems began when hospitals in Arizona, Kansas, Oklahoma and Texas reported patients suffering from an outbreak of listeria-related illness, some as early as 2010. Some reports included patient deaths. According to a May 7 report from the FDA (Food and Drug Administration) and CDC (Centers for Disease Control and Prevention), the South Carolina Department of Health and Environmental Control discovered the contamination during routine product sampling at a South Carolina distribution center on Feb. 12, 2015, but it wasn't until April 2015 that the new listeria outbreak was traced to a common source: Blue Bell Chocolate Chip Country Cookie Sandwiches and Great Divide Bars manufactured in Brenham, TX.

Listeria is a bacterium that can cause fever and bowel-related discomfort, as well as more serious symptoms, especially in the young and elderly. Listeria can kill. It is found naturally in both soil and water and can grow in raw and processed foods, including dairy, meat, poultry, fish and some vegetables. It can persist on processing equipment and restaurant kitchen equipment, and when food comes in contact with contaminated equipment the bacteria finds a ready-made food source and multiplies. The FDA has issued guidance to food processors, preparers and restaurants on how to prevent listeria contamination, covering proper preparation techniques, cleaning techniques, hygiene, testing and manufacturing and processing methodologies.

Once Blue Bell understood that its cookie sandwiches and ice cream bars were implicated, the company immediately recalled the products. But soon it became evident to Blue Bell and others that this outbreak might not be limited to the ice cream bars or cookie sandwiches, and Blue Bell recalled all of its product and, to its credit, shut down all manufacturing operations.

The FDA conducted inspections of Blue Bell plants, and in late April and early May produced reports on three plants, noting issues of cleanliness and process that were conducive to listeria growth. The FDA has also reported that Blue Bell allegedly had found listeria in its plants as far back as 2010 but never reported this to the FDA.

As of this writing, Blue Bell plants are still shut down. The FDA investigation has come to a close, but many questions remain. The company has cut 1,450 jobs, or more than a third of its workforce, and has said it will reenter the market only gradually, after it has proved it can produce the ice cream safely.

The question is whether the steps Blue Bell has taken (the quick recall, first of the problem products and then of all products, and the closure of its plants to address contamination) are enough to save the company from further damage in the eyes of consumers and the stores that sell its products. There are many tough questions to be answered going forward.

In the intervening months, will competitors replace Blue Bell with their own products that consumers feel will compare favorably? If so, when Blue Bell products are returned to stores will consumers return, or has the stigma of listeria and the acceptance of the taste of comparable products weakened the brand? Will stores give Blue Bell adequate shelf space? And, does Blue Bell have enough of a cult following and viral fan base that once product is back in stores customers will return as if nothing had happened? These are the scary questions that affect all food and drug companies when recalls are from contamination in their own plants or those in their supply chain.

The American consumer seems to have become numb to the endless succession of automobile recalls from just about all manufacturers. We dutifully return our vehicles to the dealer to fix a broken or faulty this or that. Even though many recalls involve parts or processes that could cause car accidents, injuries and deaths, it is as if we have come to accept faulty auto products as the norm.

This is not the case with food-borne illnesses. The fact that a faulty car can kill as easily as a contaminated food product seems not to be an issue as people return again and again to buy new cars from the same car manufacturer that issued five recalls on their last purchased model. However, consumers will shun the food brand that made some people ill. This bifurcated approach to risk makes no sense even in the context of protecting children from harm. The faulty car that mom drives the kids around in every day may have the same probability of injuring or killing her child as the recalled food brand. She doesn’t abandon her car, but she bans the recalled food brand from her table.

In 1990, Perrier discovered benzene in its sparkling water product. It quickly recalled all its product but then hunkered down into a bunker mentality. The lack of communication by Perrier about the problem and what it was doing exacerbated the fears of consumers, and the press speculation and outcry ran high. Perrier had always touted the purity of its water, so toxic benzene shattered this claim. Hunkering down reduced consumer confidence, and many left Perrier for suitable alternative products. Perrier has never regained the market share it had previously.

Blue Bell has taken the time to do things right, to find the causes of the problem and take the steps necessary to prevent contamination in the future. But time also means that existing or even new competitors with comparable products will try to fill the shelf space vacated in Blue Bell's absence. You can be sure that other-region favorites with cult followings that could never before gain a foothold in Blue Bell's territory have been pressuring retailers to try them out as a replacement for Blue Bell.

Is the Perrier loss of market share inevitable for Blue Bell even if Blue Bell communicates adequately and with transparency? Time will tell. For now, Blue Bell not only has to fix the problems of plant cleanliness, it also needs to address emerging questions about its past operations, such as its alleged failure to report earlier listeria findings to the appropriate regulators.

While we note the good press that surrounded the 1982 Tylenol (external-tampering) recall and have seen so far a good effort by Blue Bell to resolve its own plant contamination issue, ultimately it is contamination that is the problem. Companies can become complacent, let cleanliness slide, use outmoded procedures, fail to replace older equipment or even ignore warning signs and isolated contamination events. Regional and limited-product-line companies need to be especially cognizant that even though they have carved out a powerful niche in the marketplace, maintaining this niche is tenuous at best in the highly competitive world of food products. Consumers simply assume food products are clean and free of contamination. Food processors and manufacturers must do everything possible to keep that assumption intact.

Where to Start on Cyber Security?

Because of the recent and hugely public spate of cyber "events," the world of cyber security, and consequently cyber insurance, is firmly in overdrive. According to the UK Department for Business, Innovation & Skills, 81% of large businesses and 60% of small businesses suffered a cyber-security breach in the last year, and the average cost of breaches to business has nearly doubled since 2013.

We have all seen the headlines, from Sony last year to British Airways earlier this month to the French TV channel TV5Monde. Each of these events had a material impact not only on the organization's ability to do business but also on its brand and reputation with customers, employees and partners.

The Sony breach was by far one of the biggest and most public to hit the news in a long time. It was all over most news channels, causing an outcry from customers and employees, some of whom threatened to sue their employer or former employer for failing to protect their data. Sony, of course, has suffered many attacks, including one that took down its PlayStation online platform for days on end. As for BA, the first I heard of the breach was an email saying, "Someone has accessed your account. Please come change your password!" This is a brand that I trust with my personal details, my location and much more.

Finally, TV5Monde seems particularly worrying to me. In a scene that reminded me of the wonderfully played Elliot Carver from 007's "Tomorrow Never Dies," the media giant was quite simply disabled: its TV channels taken off the air, its public online presence taken over and more. An attack of this scale and power simply highlights what Hollywood has been portraying for years (remember "Die Hard 2," where the bad guys take over the airport by hot-wiring a few cables nearby?). Interestingly, subsequent reports again point to human error; for instance, a TV interview showed passwords stuck to Post-it notes.

If there is any doubt about the frequency, scale and impact of attacks, I recently found a great website (www.informationisbeautiful.net) that visualizes data breaches by year, industry, size, reason and more; the site hosts the full interactive chart.


Cyber threats have been defined by many; however, as with many other critical business issues, lots of other things are being added to the overall "cyber" definition. The recent UK government report, "UK cyber security: the role of insurance," talks through both the threat and, importantly, the opportunity for insurers.

The World Economic Forum, in its 10th annual Global Risks Report, ranks cyber risk alongside water crises and natural catastrophes and ahead of weapons of mass destruction, infectious disease and fiscal crises (in terms of likelihood of occurrence). Given what we all experienced in the last recession, I don't think we could have a stronger wake-up call.

– Top Global Risks According to the World Economic Forum

For now, and certainly as I write today, there is little correlation between cyber-attacks and loss of human life. However, as we become ever more connected through the IoT (Internet of Things) or IoE (Internet of Everything), future devices will all be connected. In its latest report, the UK government said that 14 billion objects are already connected to the Internet, 40 million of them in the UK. By 2020, there could be as many as 100 billion worldwide.

The upside of being able to monitor your heart pacemaker or your insulin levels from an app is already upon us; "wearables" is the buzzword for 2015. When these devices move from monitoring to controlling, the threat only increases. A cyber-attack at a local level, shutting down a hospital, an airport or a city traffic system, or taking over a driverless car or airplane: it's far too easy to paint a picture here.

What’s the role of the insurer in all of this?

The insurance provider has a huge role here, not only in picking up the pieces when an event occurs but across the entire lifecycle. At the outset, we have an opportunity to better educate the market on cyber risks in general, to create insurance capacity for such events and ultimately to better prepare ourselves for the continuing advancement and frequency of attacks.

This goes far beyond the Cyber Essentials scheme in preparing small and medium-sized enterprises (SMEs) and large enterprises alike. This is not about collecting a badge; it is time to get ready for a battle: not just a battle against cyber threats, but a battle for your reputation and brand, a brand that says to your employees, customers and partners, "You can trust me with your information; I have a plan in place that's tried and tested!" The government scheme covers the bare minimum, which is like passing your driving theory test. We need expert drivers here to navigate roads no one has seen before.

The UK, and the London market specifically, is already well-placed given its deep experience in insuring specialty risks, but capacity in the market will continue to increase as the threats and frequency of events increase, giving rise to new, more tailored products and opportunities for the entire market. How long will it be before we all have our own personal cyber insurance policy?

Move to prevention rather than cure

We need to help organizations better understand the cost of putting things right after the event. As an example, some estimate that the Target breach in the U.S. has cost the company north of $100 million to correct. In an early earnings call after the event, Target executives said, "The breach resulted in $17 million of net expenses in the fourth quarter…, with $61 million of total expenses partially offset by the recognition of a $44 million insurance receivable."
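The arithmetic behind those quoted figures can be checked directly; here is a minimal sketch using only the amounts from the earnings-call quote above (in millions of dollars):

```python
# Figures (in $ millions) quoted from Target's earnings call above.
total_expenses = 61        # gross breach-related expenses in Q4
insurance_receivable = 44  # insurance offset recognized against those expenses

# Net expense is the gross expense less the insurance recovery.
net_expenses = total_expenses - insurance_receivable
print(net_expenses)  # 17, matching the $17 million net figure cited
```

The gap between the $17 million net figure and the $100 million-plus total estimate illustrates how much of a breach's cost falls outside a single quarter's reported expenses.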

Hindsight is wonderful, but perhaps a fraction of this spent upfront would have saved the money and, importantly, provided time to focus on business strategy rather than remedial work.

Reputation, Reputation, Reputation

It's already been widely discussed, but insuring an organization's reputation is challenging for a number of reasons. Of course, almost anything can be insured, but defining the impact and then working out what you need to be covered for will no doubt bring additional challenges for something most would describe as intangible. The Insurance Times has a good piece on this.

More importantly, what are the short-, medium- and long-term impacts of the reputational damage? Take your favorite or most-used retailer, which holds all your personal financial data and shopping habits. It then suffers a breach: how likely are you to use or recommend the retailer again? Maybe you would forgive it one breach; what if it happened again? It's too easy to move. I read that in the UK you are now more "likely to suffer a theft from your bank than a physical burglary."

Does this affect your future choice? How long does it take you to re-establish trust with your customers, employees and partners?

Typically, reputation risk is around 5% to 20% of cyber cost. However, in reality, it’s the gift that can keep on giving, that no one really wants.
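To make that rule of thumb concrete, here is an illustrative calculation; the incident cost below is a hypothetical figure chosen for the example, not one from this article:

```python
# Hypothetical total cost of a cyber incident (assumed figure, for illustration only).
total_cyber_cost = 10_000_000  # $10 million

# Applying the 5%-20% reputation-risk share cited above.
low_estimate = 0.05 * total_cyber_cost
high_estimate = 0.20 * total_cyber_cost
print(low_estimate, high_estimate)  # roughly $500,000 to $2,000,000
```

Even at the low end, the reputational slice is a material sum, and unlike repair costs it can recur for years as customers drift away.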

What if you are an online-only business? What if you were the one who disrupted your market through technology, and now that advantage has been taken away from you? You don't have the luxury of physical outlets as a backup or alternative part of your business plan. Losses such as shoplifting have been part of retail since it began, but those were isolated to individual locations.

SMEs, especially, are not as well-equipped. On one hand, digital makes it possible for anyone to create a new business; on the other, we must now factor in the cost of doing business online, of which cyber security is a business-critical part.

What do you think?

Are we prepared and doing enough across the sector?
Is this at the forefront of your business continuity strategy?
Have you a plan in place to protect your employees, customers and partners?
Do you have adequate cover that is well-enough defined?
Are you investing ahead of the curve to prevent it?

The Power of the Right Prototype

I’m a long-time advocate for leveraging prototypes and demos in the enterprise to explore ideas and emerging technologies. And, there’s a particular reason. It’s that precise moment during a prototype presentation when everything in the room changes. Eyes light up. People sit forward in their chairs. The conversation shifts from potential problems to possibilities. Executives become transfixed with ideas of transformation. The power of prototypes to persuade is undeniable.

The practice of prototyping has typically been isolated in certain pockets of the organization. However, as technology purchasing is distributed across the company, more executives can benefit from the prototype’s ability to put an idea or technology into the context of the business. Through a prototype, a technology once considered the latest consumer fad turns into a vehicle to advance enterprise innovation. Prototypes make a big, amorphous idea personal, relatable and feasible.

I was recently onsite with a client who was keen on prototyping, and the first question I asked was: Why? Not because I was skeptical that prototyping would work, but because understanding the why behind a prototype is imperative for picking the right prototype for your project. Different business drivers demand different types of prototypes, and prototypes often need to evolve as the lifecycle of the idea or technology matures. Prototypes are more than rough first productions of an object. They span the physical, high-technology and digital dimensions and can take various forms; even a workflow diagram can be considered an early prototype.


For example, PwC used digital storytelling via video to communicate our vision of the future of shopping, as well as to portray the potential impact of wearables on the insurance industry. In this case, video was the most effective prototyping medium because the technology is already available in the marketplace; the impetus was more on inspiring executives to realize what is possible than on testing the technology's capability or feasibility. By contrast, when a client asked us to explore the use of sensors for increasing business intelligence, we built a smart refrigerator to test the feasibility and usability of the technological components in physical form, given that the Internet of Things is a nascent technology. A particular area of focus was the transmission of data via the cloud.

Challenges With Getting Started

As businesses expand their prototyping programs, they will face challenges. Here are some of the obstacles they will need to overcome.

1) New Skills. Prototyping requires a combination of creative design skills and rapid, iterative development skills. Most companies have yet to cultivate these skills. Our sixth annual Digital IQ survey of nearly 1,500 business and technology executives found that only 19% of respondents rated their IT organization's prototyping skills as "excellent."

2) New Prototyping Processes. Traditional business processes, like building a business case and determining an ROI, don't apply to prototyping. Businesses must find new ways to plan, fund and evaluate prototyping initiatives, using measures such as: How many ideas were generated? How many advanced to the next stage? How many were taken to market?

3) A New Understanding. Executives need to understand why prototyping is vital in today’s fast-paced business climate, and the word needs to spread like wildfire across the enterprise. The good news is that it only takes one prototype presentation to turn someone into a believer.

Technology is touching every aspect of our lives, and businesses must constantly explore ways to better engage customers, employees and suppliers. Technology is moving too fast for companies to wait until a vendor hands them the next version of their product or service. Prototyping is no longer reserved for a handful of companies that are considered the kings of creativity. It’s necessary for all companies.

3 Keys to Achieving Sound Governance

Of the many definitions of governance, the simplest ones tend to have the most clarity. For the purpose of this piece, governance is a set of processes that enable an organization to operate in a fashion consistent with its goals and values and the reasonable expectations of those with vested interests in its success, such as customers, employees, shareholders and regulators. Governance is distinct from both compliance and enterprise risk management (ERM), but there are cultural and process-oriented similarities among these management practices.

It is well-recognized that sound governance measures can reduce the amount or impact of risk an organization faces. For that reason, among others, ERM practitioners favor a robust governance environment within an organization.

A few aspects of sound governance are worth discussion. These include: 1) transparency and comprehensive communications, 2) rule of law and 3) consensus-building through thorough vetting of important decisions.

Transparency 

Transparency lessens the risk that either management or staff will try to do something unethical, unreasonably risky or wantonly self-serving, because decisions, actions and information are highly visible. An unethical or covert act would stand out like the proverbial sore thumb.

Consider how some now-defunct companies, such as Enron, secretly performed what amounted to a charade of a productive business. There was no transparency about what the company's assets really were, how the company made money, what its real financial condition was and so on.

Companies that want to be transparent can:

  • Create a culture in which sharing of relevant data is encouraged.
  • Publish information about company vision, values, strategy, goals and results through internal communication vehicles.
  • Create clear, task-by-task instructions that can be used to train staff and serve as a readily accessible, up-to-date reference for all positions.
  • Create clear escalation channels for issues or requests for exceptions.

Rule of Law

Good governance requires that all staff know that the organization stands for lawful and ethical conduct. One way to make this clear is to include "law-abiding" or "ethical" among the organization's values. Further, the organization needs to make sure these values are broadly and repeatedly communicated. Additionally, staff need to be trained on the laws that apply to the work they perform. Should a situation arise where there is a question as to what is legal, staff need to know to whom they can bring the question.

The risks that develop out of deviating from lawful conduct include financial, reputational and punitive risks. These are among the most significant non-strategic risks a company might face.

Consider a company that is found to have purposefully misled investors in its filings about something as basic as the cost of its raw materials. Such a company could face fines and loss of trust by investors, customers, rating agencies, regulators, etc., and individuals may even face jail time. In a transparent organization that has made it clear that laws and regulations must be adhered to, the cost or cost trend of raw materials would likely be a well-documented and widely known number. Any report that contradicted common knowledge would be called into question.

Consider the dramatic uptick of companies being brought to task under the Foreign Corrupt Practices Act (FCPA) for everything from outright bribes to granting favors to highly placed individuals from other countries. In a transparent organization that has clearly articulated its position on staying within the law, any potentially illegal acts would likely be recognized and challenged.

How likely is it that a highly transparent culture wherein respect for laws and regulations is espoused would give rise to violations to prominent laws or regulations? It would be less likely, thus reducing financial, reputational and punitive risks.

The current increase in laws and regulations makes staying within the law more arduous, yet even more important. To limit the risk of falling outside the rule of law, organizations can:

  • Provide in-house training on laws affecting various aspects of the business.
  • Make information available to staff so that laws and regulations can be referenced, as needed.
  • Incorporate the legal way of doing things in procedures and processes.
  • Ensure that compliance audits are done on a regular basis.
  • Create hotlines for reporting unethical behavior.

Consensus-Building

Good governance requires consultation among a diverse group of stakeholders and experts. Through dialogue and perhaps some compromise, a broad consensus on what is in the best interest of the organization can be reached. In other words, important decisions need to be vetted. This increases the chance that agreement can be developed and that risks are uncovered and addressed.

Decisions, even if clearly communicated and understood, are less likely to be carried out by those who have not had the chance to vet the idea.

Consider a CEO speaking to rating agency reviewers and answering a question about future earnings streams. Consider also that the CFO and other senior executives in separate meetings with the rating agency answer the same question in a very different way. In this scenario, there has clearly not been consensus on what the future looks like. A risk has been created that the company’s credit rating will be harmed.

To enhance consensus-building, companies can:

  • Create a culture where a free exchange of opinions is valued.
  • Encourage and reward teamwork.
  • Use meeting protocols that bring decision-making to a conclusion so that there is no doubt about the outcome (even when 100% consensus cannot be reached).
  • Document and disseminate decisions to all relevant parties.

During the ERM process step wherein risks are paired with mitigation plans, improved governance is often cited as the remedy to ameliorate the risk. No surprise there. Clearly, good governance reduces risk of many types. That is why ERM practitioners are fervent supporters of strong governance.

Should You Offshore Your Analytics?

There has been much discussion on LinkedIn and Twitter in recent years about the shortfall in analytical talent, especially in the U.S. and UK.

Several years ago, I had the learning experience of attempting to offshore part of my analytics function to India, Bangalore to be precise. It was all very exciting at first, traveling out there and working with the team as they spent some time in the UK. Plus, on paper, offshoring looked like a good idea, to address the peaks and troughs of demand for analysis and modeling.

The offshoring pitch successfully communicated the ease of accessing highly trained Indian graduates at a fraction of UK wages. However, as with all software demos, the experience after purchase was a little different.

I always expected the model to take a while to bed down, and you expect to give any new analysts time to get up to speed with our ways of working. However, after a few months, the cracks began to show. Analysts in India were failing to understand what was required unless individual pieces of work were managed like mini-projects and requirements specified in great detail. There also appeared to be little ability to improvise or deal with “dirty data,” so significant data preparation was still required by my UK team (who were beginning to question the benefit).

Once propensity modeling was attempted, a few months later, it became even more apparent that lack of domain knowledge and rote learning from textbooks caused problems in the real world. Several remedies were tried: further visits to the UK, and upgrading the members of the team to even more qualified analysts (we started with graduates but ended up working solely with those who held master's degrees and, in some cases, PhDs). Even after this, none of my Bangalore team was as able to "think on their feet" as my less qualified analysts in the UK, and there were still no signs that domain knowledge (about insurance, our business, our customer types, previous insight learned, etc.) was being retained.

After 18 months, with a heavy heart (as all those I had worked with in Bangalore sincerely wanted to do the best job they could), I ended this pilot and instead recruited a few more analysts in the UK. Several factors drove the final decision, including:

  1. Erosion of labor arbitrage (the most highly skilled cost more);
  2. Inefficiency (i.e. need for prep and guidance) affecting the UK team;
  3. Cost and effort to comply with data security requirements.

Since that time, a few customer insight leaders have suggested that it is worth trying again (nowadays with China, Eastern Europe or South Africa), but I am not convinced. On reflection, my biggest concerns are around the importance of analysts understanding their domain (business/customers) and doing their own data preparation (as so much is learned during the exploratory data analysis phase). The "project-ization" of analysis requests does not suit this craft.

So, for me, the answer is no. Do you have any experience of trying this?