How to Avoid Falling for Groupthink

“In our world, parallel lines do not meet, and you can’t turn an orange wrong side out.” –Joseph Wood Krutch

Yet I regularly see carrier management and agency owners default to wishful thinking in their decision making. While parallel lines do not meet, these people believe the lines should meet, and they make decisions on that assumption. Reality, though, does not change: parallel lines can never meet, no matter how much a person wishes they did. Pretending otherwise, and making decisions in that alternative reality, can only lead to problems, if not disaster. And the fact that disaster does not strike immediately should not be taken as a sign that it won’t.

A famous company once had an executive who touted how he had made parallel lines meet. Because most people did not look too closely and the accounting was opaque, they did not notice that something like a shell game was being played (not literally a con game). While the line-bending benchmarks were being touted, the reported results were mostly due to an entirely new part of the business that did not face a true financial reckoning until the credit crisis. Even after that, it took about 10 more years, until the financial promises were finally and completely broken, for people to admit they had been seeing a mirage.

Always remember: Parallel lines never meet. But you can work to see the lines as they are rather than as you wish them to be.

An example or two might help. Take two carriers of equal size. One has an issue causing its expenses to run $200 million higher than the competitor’s. All else being equal, the first carrier’s loss ratio needs to be the equivalent of $200 million better just for the two carriers’ combined ratios to match. Yet the management of that carrier has proclaimed that its loss ratio only needs to be the same, because there is no way its expense ratio is higher than average. So, the carrier loses money the next year, and the next, and the next, and the next. Parallel lines are straight, not warped. The carrier’s thinking is warped (true story).
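To make the arithmetic explicit, here is a minimal sketch of the combined-ratio math. All figures (the $5 billion premium base and the specific loss and expense amounts) are hypothetical, chosen only to illustrate the offset; nothing here comes from the example above beyond the $200 million gap.

```python
# Minimal sketch: why a $200M expense disadvantage demands $200M better
# losses. The combined ratio is (losses + expenses) / earned premium;
# below 100%, the carrier makes an underwriting profit.

PREMIUM = 5_000_000_000  # hypothetical: $5B earned premium for each carrier


def combined_ratio(losses: float, expenses: float, premium: float = PREMIUM) -> float:
    """Combined ratio = (incurred losses + expenses) / earned premium."""
    return (losses + expenses) / premium


# The competitor: a 65% loss ratio plus a 30% expense ratio.
competitor = combined_ratio(losses=3_250_000_000, expenses=1_500_000_000)

# The first carrier spends $200M more on expenses. To post the same
# combined ratio, its losses must come in $200M lower -- on $5B of
# premium, that is a loss ratio 4 points better (61% vs. 65%).
carrier = combined_ratio(losses=3_050_000_000, expenses=1_700_000_000)

assert abs(carrier - competitor) < 1e-12  # both come out to 95%
print(f"competitor: {competitor:.1%}, carrier: {carrier:.1%}")
```

If the first carrier merely matches the competitor’s loss ratio, as its management insists it need only do, its combined ratio runs 4 points worse every year, which is exactly the pattern of repeated losses described above.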

Or consider the agency owners who think that unmotivated producers will become motivated on their own initiative. That is warped. If the producers had initiative, they would already be motivated (true story, multiplied by thousands).

Yet humans are pre-programmed to believe what they want to believe, reality be damned. St. Augustine wrote something to the effect of, “Do not plan long journeys [pilgrimages, meant to deepen belief], because whatever you believe in you have already seen.” I have read one theory that humans’ strong desire to believe whatever they want to believe is a survival mechanism: if they did not believe in the unreal, they might give up hope. That makes some sense to me. The challenge, then, is to be disciplined enough to recognize when to accept reality, regardless of how sour that reality may be.

See also: Another Reason for Insurers to Embrace AI

The solution, one of the few solutions actually, is to have someone close to you who will always be brutally straight with you.

Another solution is to be away from your kingdom, your organization, when you seek advice. A human’s ability to accept reality often increases the farther one is from home.

Another solution, for larger organizations, is to always have outsiders on the board and to give them extra influence and voice. An organization can come to believe in alternate realities even more rigidly than an individual does; that is groupthink at work. An example of carrier groupthink is everyone at a carrier believing it has great claims service even though the agents, based on their customers’ experience, almost universally say otherwise (again, true story).

Reality really can suck. No bones about that. But reality usually wins, so if you want to be a winner, make a vigorous, ongoing effort to understand and accept reality.

How We’re Wired to Make Bad Decisions

Business is a contact sport. Some companies win while others lose. That won’t change. There is no way to guarantee success. Make the best decisions you can, and then fight the battle in the marketplace.

Yet the research that Paul Carroll and I did into more than 2,500 large corporate failures found that many big decisions are doomed as they come off the drawing board, before first contact with the competition. Why?

The short answer is that humans are far from rational in their planning and decision-making. Psychological and anthropological studies going back decades, including those of Solomon Asch, Stanley Milgram, Irving Janis, Donald Brown and, more recently, Dan Ariely, consistently demonstrate that even the smartest among us face huge impediments when making complicated decisions, such as those involved in setting strategy.

In other words, humans are hard-wired to come up with bad decisions. Formulating good ones is very difficult because of five natural tendencies:

1. Fallacious assumptions: If “point of view is worth 80 IQ points,” as Alan Kay says, then people who start from a flawed point of view begin in a deep hole.

One problem is the anchoring bias, where we subconsciously tend to work from whatever spreadsheet, forecast or other formulation we’re presented with. We tend to tinker rather than question whether the assumptions are right or whether the ideas are even worth considering. Even when we know a situation requires more sophisticated analysis, it’s hard for us to dislodge the anchors.

See also: Downsizing: Common Sense in Decision-Making May Lead to a Trap  

Another strike against expansive thinking is what psychologists call the survivorship bias: We remember what happened; we don’t remember what didn’t happen. We are encouraged to take risks in business because we read about those who made “bet the company” decisions and reaped fortunes, and don’t read about those who never quite made the big time because they made “bet the company” decisions and lost.

2. Premature closure: People home in on an answer prematurely, long before they have evaluated all the information.

We get a first impression of an idea in much the same way we get a first impression of a person. Even when people are trained to withhold judgment, they find themselves evaluating information as they go along, forming a tentative conclusion early in the process. Premature conclusions, like first impressions, are hard to reverse.

A study of analysts in the intelligence community, for instance, found that, despite their extensive training, analysts tended to come to a conclusion very quickly and then “fit the facts” to that conclusion. A study of clinical psychologists found that they formed diagnoses relatively rapidly and that additional information didn’t improve those diagnoses.

3. Confirmation bias: Once people start moving toward an answer, they look to confirm that their answer is right, rather than hold open the possibility that they’re wrong.

Although science is supposed to be the most rational of endeavors, it constantly demonstrates confirmation bias. Ian Mitroff’s The Subjective Side of Science shows at great length how scientists who had formulated theories about the origins of the Moon refused to capitulate when the moon rocks brought back by Apollo 11 disproved their theories; the scientists merely tinkered with their theories to try to skirt the new evidence.

Max Planck, the eminent physicist, said scientists never do give up their biases, even when those biases are discredited. The scientists just slowly die off, making room for younger scientists who didn’t grow up with the errant biases. Planck could just as easily have been describing most business people.

4. Groupthink: People conform to the wishes of the group, especially if there is a strong person in the leadership role, rather than ask tough questions.

Our psyches lead us to go along with our peers and to conform, in particular, to the wishes of authority figures. Numerous psychological experiments show that humans will go along with the group to surprising degrees.

From a business standpoint, ample research, supported by numerous examples, suggests that even senior executives, as bright and decisive as they typically are, may value their standing with their peers and bosses so highly that they’ll bend to the group’s wishes, especially when the subject is complicated and the answers aren’t clear, as is always the case in strategy setting.

5. Failure to learn from past mistakes: People tend to explain away their mistakes rather than to acknowledge their errors, making it impossible to learn from them.

Experts are actually more likely to suffer from overconfidence than the rest of the world. After all, they’re experts. Studies have found that people across all cultures tend to think highly of themselves even if they shouldn’t. They also blame problems on bad luck rather than take responsibility and learn from failures. Our rivals may succeed through good luck, but not us. We earned our way to the top.

See also: How to Lead Like a Humble Gardener  

While it’s been widely found that some 70% of corporate takeovers hurt the stock-market value of the acquiring company, studies find that roughly three-quarters of executives report that takeovers they were involved in had been successes.

The really aware decision makers (the sort who read articles like this one) realize the limitations they face. So, they redouble their efforts, insisting on greater vigilance and deeper analysis.

The problem is that this isn’t enough. As the long history of corporate failures shows, vigilant and analytical executives can still come up with demonstrably bad strategies.

The solution is not just to be more careful. Accept that the tendency toward decision-making errors is deeply ingrained, and adopt devil’s advocates and other explicit mechanisms to counter those tendencies.
