Wednesday, November 30, 2011

What Managers Need to Know About Cognitive Biases
by Neal Beets

“All of us think we see the world as it is, but we see it as we are.”

—Stephen Covey

The field of brain research is hot. Scientists are writing popular books about it.1 Journalists are translating academic research into practical insights for the rest of us.2

One subfield of brain research deals with cognitive biases. Cognitive biases are ways of seeing or deciding that are shaped by our psychology and biology.

What do managers need to know about cognitive biases?

Some cognitive biases are so commonplace they have become part of our language and culture. Things like the bandwagon effect, the herd instinct, and the self-fulfilling prophecy are part of our everyday understanding and vocabulary. So are 20/20 hindsight, Monday morning quarterback, rose-colored glasses, and rationalization.

Less commonly known but just as important for managers are several more cognitive biases.

Managers work in a political environment. This is always the case in the general sense that choices are made and priorities are set by elected officials. In addition, some managers work in explicitly partisan environments where local officials run for office with party labels and party backing. In-group–out-group dynamics are a daily part of both the explicit and implicit political environment.

Even in cities and towns where elections are nonpartisan, political groups and parties still compete to be the powerful, influential, alpha in-group.

One way to build and sustain an in-group is for several elected officials to undercut elected officials from another group and to marginalize staff viewed as sympathetic to the emerging out-group. Consequently, managers must be careful to distinguish policy input that addresses the merits of a public policy question from input that mainly advances or retards the power and influence of a particular group.

Political in-groups and out-groups wage almost constant battle. The intensity of the struggle differs from place to place, time to time, and issue to issue. But, subtle or stormy, the contest for influence and control is always present in organizations, and especially government bodies.

In-group–out-group effects extend beyond politics. Longtime residents of a city, town, or county often begin their public testimony at public meetings by making sure their elected officials know how many decades they have lived in that community. The old-timers discount those who have lived in the community for only a few decades.

In-group–out-group dynamics also play an important part in management and employee relationships. Some employees jockey to be favorites of the manager. Other employees work equally hard to remain aloof or independent; they are proud of marching to the beat of a different drummer. Union bargaining can present classic instances of in-group–out-group allegiance.

Racial, gender, and ethnic discrimination is also a conspicuous and unfortunate reflection of in-group–out-group dynamics.

Whether dealing with malevolent forms of discrimination or more ordinary divisions between people feeling allegiance to one group and not another, we need to know when in-group–out-group dynamics are at play, how to focus on the merits of a problem rather than power struggles, and how to reach across the divide between groups with generosity, understanding, and courtesy.

We all know Garrison Keillor’s mythical Lake Wobegon. It’s the town where all the women are strong, all the men are good looking, and all the children are above average.

Similarly, managers often have to deal with the perception that there just isn’t another county, city, or town as hardworking, intelligent, successful, and distinctive as their own. The Lake Wobegon Effect makes it difficult for managers to change anything in their organization; their community is already better than the rest, or so it thinks.

The Lake Wobegon Effect also shapes ethical issues in government. Michael Josephson reminds us that we tend to judge ourselves by our best intentions, but we seem to judge others by their last worst act.3 We are ethical; it’s those others who are unethical. Confirming this bias, studies show that far more than a mere majority of respondents will indicate that they are more ethical, more hardworking, more intelligent, or, yes, even more attractive than “others,” which of course creates an absurdity. How can almost all of us be superior to everyone else?

Most of us tend to magnify our virtues and minimize our vices and to reverse that perception for others. We need the humility to realize this tendency and the courage to combat it by stepping back and putting ourselves in the other person’s shoes (or in a third party’s position) so we can understand and deal with reality more accurately and fairly.

I wish I had a dollar for every elected official who said he or she didn’t have enough information to make a public policy decision. We all have ways of avoiding tough decisions. And this one—“I don’t have enough information”— is one of the most common.

Like other cognitive biases, the information bias has an element of truth to it. In certain contexts it makes perfect sense and is entirely appropriate to want more information. In contrast with shoot-from-the-hip decision making, evidence-based decision making tends to allow passions to cool and thoughtfulness to prevail.

But we also need to recognize and appreciate the limits of information and factual knowledge. Sometimes the information being sought simply isn’t available. Or maybe the information that is available is not relevant and does not really affect the decision to be made. A surfeit of information clouds rather than clarifies.

More fundamentally, small-group decision making in a council setting often depends as much on values as on facts. For example, what are our priorities? What is most important to accomplish, and are we willing to disclose our preferences to the public? Are we willing to pay the price discipline may require of us to follow our stated priorities? Not all decisions require more facts; often they require the courage of one’s convictions.

Finally and most profoundly, life is a mystery. To postpone making a decision until you have “all the facts” can be like waiting for Godot. Facts don’t make decisions; people do. While attempting first to gather all the relevant facts, we must not shy away from actually reaching and explaining a decision. By making and carrying out decisions based on our values and the best available evidence, we can learn from our experience in implementing each decision and reviewing its consequences, intended and unintended. Making a decision puts us in a better position to make the next decision, and the next, and so on.

The favorite bias and tool of lawyers, national politicians, and salespersons is framing: creating a perceptual boundary or image that focuses attention in ways that serve the framer’s purpose. If you can create the dominant frame of reference for a public policy issue, you are more than halfway to persuading others to your position.

The decision frame helps us and others see (and not see) issues, facts, and values in a certain light. This can be good and bad. It is good in the sense of being practical; we all need some way to organize our thoughts and emotions about a subject. It can be bad if the decision frame seriously distorts reality, such as by ignoring key considerations.

The lesson here is not to reject perceptual frames of reference. We might as well reject our eyesight or hearing. Rather, we must respect that all public policy issues come with frames, and we must always question the accuracy, completeness, fairness, and relevance of the offered frame or frames.

Before concluding with what we can do about cognitive biases, here are some additional biases that affect our professional lives as government managers.

Primacy effect, recency effect. Something is not more important or more true because you perceive it first or last; it’s just that you remember it more strongly.
Repetition bias, emotional bias, authority bias, belief bias, optimism bias, expectation bias, and wishful thinking. Statements don’t become more true through repetition. They don’t become more true as a consequence of the volume or emotion with which they are presented. Statements don’t become more true because of the length of the explanation or because of the title of the speaker. They are not more true because you hope or expect them to be true.
Inevitability effect. Necessity and inevitability are interpretations, not facts. To say something is inevitable or that we have no choice in the matter is often a sign we have stopped thinking or given up. There are always options; maybe not desirable options, but options nonetheless.
Being a manager is a tough job, made tougher by the cognitive biases we all deal with every day in our own thinking and in the thinking of others.

The challenge is to recognize our biases and compensate appropriately. Get information from many sources. Check your own biases by asking for feedback. Be humble about the complexity of what you are dealing with. Don’t think you have, or anyone has, a monopoly on the truth. To the greatest extent possible, test hypotheses on a small scale, where being wrong has less negative impact.

And we certainly need to disabuse ourselves of such illegal and hurtful biases as racial, ethnic, and gender stereotypes. But, despite all these cautions, we need the courage to make decisions so we can learn from the new experience gained by carrying out a decision.

In dealing with others, recognize that bias is part of being human. The question is not whether elected officials, staff, and community members have biases, but how to deal with those biases. Recognition is key. Compensation is critical to make up for the weaknesses and inaccuracies submerged biases can introduce into your decision-making process.

If we do our best to recognize and compensate for our cognitive biases and the cognitive biases of others, our decisions will be more richly informed and more likely to advance the public good.

1David Eagleman, Incognito: The Secret Lives of the Brain (New York: Pantheon Books, 2011); Steven Pinker, The Blank Slate: The Modern Denial of Human Nature (New York: Viking, 2002).
2David Brooks, The Social Animal: The Hidden Sources of Love, Character, and Achievement (New York: Random House, 2011); Malcolm Gladwell, Blink: The Power of Thinking Without Thinking (New York: Little, Brown and Co., 2005).
3Michael Josephson, You Don’t Have To Be Sick To Get Better (Los Angeles: Josephson Institute of Ethics, 2001).

Neal Beets is town manager, Windham, Connecticut.
