How We’re Wired to Make Bad Decisions

Business is a contact sport. Some companies win while others lose. That won’t change. There is no way to guarantee success. Make the best decisions you can, and then fight the battle in the marketplace.

Yet research that Paul Carroll and I conducted into more than 2,500 large corporate failures found that many big decisions are doomed as they come off the drawing board—before first contact with the competition. Why?

The short answer is that humans are far from rational in their planning and decision-making. Psychological and anthropological studies going back decades, including those of Solomon Asch, Stanley Milgram, Irving Janis, Donald Brown and, more recently, Dan Ariely, consistently demonstrate that even the smartest among us face huge impediments when making complicated decisions, such as those involved in setting strategy.

In other words, humans are hard-wired to come up with bad decisions. Formulating good ones is very difficult because of five natural tendencies:

1. Fallacious assumptions: If “point of view is worth 80 IQ points,” as Alan Kay says, people often start out in a deep hole.

One problem is the anchoring bias: We subconsciously tend to work from whatever spreadsheet, forecast or other formulation we’re presented with. We tend to tinker rather than question whether the assumptions are right or whether the ideas are even worth considering. Even when we know a situation requires more sophisticated analysis, it’s hard for us to dislodge the anchors.

See also: Downsizing: Common Sense in Decision-Making May Lead to a Trap  

Another strike against expansive thinking is what psychologists call the survivorship bias: We remember what happened; we don’t remember what didn’t happen. We are encouraged to take risks in business because we read about those who made “bet the company” decisions and reaped fortunes—and don’t read about those who never quite made the big time because they made “bet the company” decisions and lost.

2. Premature closure: People home in on an answer long before they have evaluated all the information.

We get a first impression of an idea in much the same way we get a first impression of a person. Even when people are trained to withhold judgment, they find themselves evaluating information as they go along, forming a tentative conclusion early in the process. Premature conclusions, like first impressions, are hard to reverse.

A study of analysts in the intelligence community, for instance, found that, despite their extensive training, analysts tended to come to a conclusion very quickly and then “fit the facts” to that conclusion. A study of clinical psychologists found that they formed diagnoses relatively rapidly and that additional information didn’t improve those diagnoses.

3. Confirmation bias: Once people start moving toward an answer, they look to confirm that their answer is right, rather than hold open the possibility that they’re wrong.

Although science is supposed to be the most rational of endeavors, it constantly demonstrates confirmation bias. Ian Mitroff’s The Subjective Side of Science shows at great length how scientists who had formulated theories about the origins of the Moon refused to capitulate when the moon rocks brought back by Apollo 11 disproved those theories; the scientists merely tinkered with them to try to skirt the new evidence.

Max Planck, the eminent physicist, observed that scientists never really give up their biases, even when those biases have been discredited. The scientists just slowly die off, making room for younger scientists who didn’t grow up with the errant biases. Planck could just as easily have been describing most business people.

4. Groupthink: People conform to the wishes of the group rather than ask tough questions, especially if there is a strong person in the leadership role.

Our psyches lead us to go along with our peers and to conform, in particular, to the wishes of authority figures. Numerous psychological experiments show that humans will go along with the group to surprising degrees.

From a business standpoint, ample research, supported by numerous examples, suggests that even senior executives, as bright and decisive as they typically are, may value their standing with their peers and bosses so highly that they’ll bend to the group’s wishes—especially when the subject is complicated and the answers aren’t clear, as is always the case in strategy setting.

5. Failure to learn from past mistakes: People tend to explain away their mistakes rather than acknowledge them, making it impossible to learn from those errors.

Experts are actually more likely to suffer from overconfidence than the rest of us. After all, they’re experts. Studies have found that people across all cultures tend to think highly of themselves even when they shouldn’t. They also blame problems on bad luck rather than take responsibility and learn from failures. Our rivals may succeed through good luck, but not us. We earned our way to the top.

See also: How to Lead Like a Humble Gardener  

While it’s been widely found that some 70% of corporate takeovers hurt the stock-market value of the acquiring company, studies find that roughly three-quarters of executives report that the takeovers they were involved in were successes.

The really aware decision makers (the sort who read articles like this one) realize the limitations they face. So, they redouble their efforts, insisting on greater vigilance and deeper analysis.

The problem is that this isn’t enough. As the long history of corporate failures shows, vigilant and analytical executives can still come up with demonstrably bad strategies.

The solution is not just to be more careful. Accept that the tendency toward decision-making errors is deeply ingrained, and adopt devil’s advocates and other explicit mechanisms to counter those tendencies.

A Method for Avoiding Groupthink

Have you ever been in a meeting where everyone else rallied around a position you were sure was wrong? You wondered whether you should make waves by being the only one to disagree. Maybe everyone else knew something you didn’t? Chances are good that you kept quiet, especially if the boss was among the supporters.

Extensive research shows that you would not be alone in doing so—and that organizations would be better off if they could keep dissenters like you from buckling under group pressure.

In 1955, Solomon Asch conducted a series of landmark experiments that demonstrated the tendency to acquiesce. Asch put a subject into a small group of people he hadn’t met. The group was taken through a series of visual tests where the answers were obvious, but, after a while, all the participants other than the subject would give unanimous, incorrect answers. Unknown to the subject, the others were all cooperating with the experimenters.

As Asch put it, subjects were being tested to see what mattered more to them: their eyes or their peers.

The eyes had it, but not by much. Asch reported that, in 128 runs of the experiment, subjects gave the wrong answer 37% of the time. Many subjects looked befuddled. Some said they felt the rest of the group was wrong. But they went along.

Interestingly, Asch found that one voice of dissent was all it took: If just one other person in the room gave the correct answer, the subject went along with the majority just 5% of the time.

In organizational settings, the tendency to conform is heightened because the subject matter is complicated and the answers are unclear. Social and economic bonds tie a group together, and there is a very human tendency to yield to authority.

Following in Asch’s footsteps, Stanley Milgram conducted a series of experiments in the 1960s that demonstrated startling levels of obedience to authority.

Executives too often squash dissent because they feel it will keep them from moving quickly. Some argue that allowing disagreement can halt action entirely. Tom Kelley, the renowned innovation expert at design firm IDEO, wrote in The Ten Faces of Innovation:

“Every day, thousands of great ideas, concepts and plans are nipped in the bud by devil’s advocates.”

John Kotter, professor emeritus at Harvard Business School and a highly regarded expert on leadership and change management, captured the frustration of many executives when he said:

“Every visionary knows the frustration of pitching a great idea, only to see it killed by naysayers.”

But how do leaders know whether a contrary view is standing in the way of a bold, visionary stroke or saving them from a disastrous folly? They don’t.

Rather than reinforce conformity and squash dissent, leaders at every level should instead heed the advice of Peter Drucker, who wrote in The Effective Executive:

“Decisions of the kind the executive has to make are not made well by acclamation. They are made well only if based on conflicting views, the dialogue between different points of view, the choice between different judgments.”

More important, Drucker observed, only disagreement can provoke imagination and alternatives:

“A decision without an alternative is a desperate gambler’s throw, no matter how carefully thought out it might be.”

In The Essence of Strategic Decisions, Charles Schwenk reports that numerous field and laboratory studies have found that decision-making improves markedly when someone on the team is brave enough to dissent. In particular, dissenters are most useful when organizations tackle complex, ill-structured problems—such as critical business strategy questions. Constructive questioning and debate improve the quality of assumptions, increase the number of alternatives considered and help decision makers use ambiguous information to make predictions.

But as leaders try to encourage constructive questioning and debate, they must remember that there’s a catch for dissenters. As one executive warned me,

“Devil’s advocates, if occasionally right, will get hunted down and killed by the antibodies in a company. Remember, they just won an argument. That means that someone else lost.”

(I think he meant “hunted down and killed” figuratively, which isn’t as dramatic as when Saddam Hussein personally shot a senior minister in his government after the minister suggested, quite mildly, that Iraq might want to consider a peaceful settlement of its 1980s war with Iran.)

Indeed, just relaying bad news can be hazardous to your career. A study cited by James Surowiecki in The Wisdom of Crowds found that those who delivered bad news in corporations were tainted, even if they had nothing to do with causing the problem and even if their bosses said they knew the messenger wasn’t at fault.

The current emphasis on teamwork can create problems, too. In good conditions, strong teams can function with impressive efficiency. But the bonds of teamwork can make it hard to deliver tough news. Teams tend to be formed of people who resemble each other in many ways, and they become friends. You don’t want to tell your friend that he’s messed up.

So, somehow, a balance must be struck. Constructive debate needs to be encouraged without injecting paralysis into the organization. Always remember, however, that the natural tendency is toward conformity, not debate. And, without debate, the consequences can be disastrous for both the organization and its leaders.

Take Bill Smithburg, who led Quaker Oats’ $1.7 billion purchase of Snapple in 1994. Although analysts warned at the time that the price could be as much as $1 billion too high, Smithburg saw synergies. Those synergies never materialized. Quaker sold Snapple for $300 million just three years after buying it, and Smithburg was out as chief executive. Reflecting on the failed acquisition several years later, Smithburg said,

“There was so much excitement about bringing in a new brand, a brand with legs. We should have had a couple of people arguing the ‘no’ side of the equation.”

Quaker Oats was eaten up by PepsiCo a few years later.

Sometimes the “no side of the equation” is the one that pushes for change. In that case, be careful not to follow the example of Ed Schwinn when he was CEO of the bicycle business that bore his family name. When a Schwinn team looked at the possibilities of mountain bikes in the 1980s, Ed Schwinn felt they were a passing fad and argued against major investment in them. The company was the dominant maker of bikes, and he didn’t see any reason that would change. A senior executive felt otherwise and argued his position vociferously. Ed Schwinn adjourned the meeting and said the group would reconvene on the issue in two weeks. They did—after Schwinn fired the contrarian. That decision turned out to be a catastrophic misjudgment. Schwinn (the company) followed Ed Schwinn’s intuition, never caught up in mountain bikes and went into bankruptcy in 1992.

Better to follow the example of Alfred Sloan, the legendary builder of General Motors. Sloan once said to a meeting of one of his top committees, “Gentlemen, I take it we are all in complete agreement on the decision here?” Everyone around the table nodded.

“Then,” Sloan continued, “I propose we postpone further discussion of this matter until our next meeting to give ourselves time to develop disagreement and perhaps gain some understanding of what the decision is all about.”

Next time you’re about to embark on a major initiative, or decide against one, make sure you have a couple of people arguing the “no” side.