Blog

Work skills of the future: constructive uncertainty

In recent years, science has shed a great deal of light on human cognitive bias, but, lamentably, those breakthroughs in our understanding of cognition have, for the most part, yet to make themselves felt in business.

The first step for anyone who wants to counter our built-in biases is to become aware of them, and then to take actions that counter them, to the extent that is possible. That final proviso is itself grounded in science: in many cases, simply being aware of a certain sort of bias is not sufficient to loosen its hold on our reasoning.

Two well-known examples are sharedness bias and preference bias in group decision making (for a longer discussion, see Dissensus, not consensus, is the shorter but steeper path).

  • Sharedness bias is the tendency of a group to judge shared information — information which all or most are aware of prior to entering a discussion — as being more relevant, important, accurate, and influential than unshared information. This leads to unshared information — even when it is in fact relevant, important, and accurate — being less influential.
  • Preference bias is the tendency of group members to stick to their initial — and possibly wrong — preferred solution to an issue or problem. The members will bias their analysis of information to support their initial preference.

But just making members of the group aware of these biases does not extinguish them. That requires other techniques, such as the active use of dissent. Systematic dissent throughout the decision-making process — especially when advocated by a consistent subset of the members — leads to divergent thinking, a viable alternative, and one that sidesteps these biases.

So, we as individuals and as leaders should actively promote dissent in group decision making, not for the sake of dissent itself, but for its ability to spark divergent thinking and to guide people past the sharing and preference biases lurking in the dark shadows of our minds.

Other well-known biases lurk in our world. For example, research into the judgment of baseball umpires clearly demonstrates that All-Star pitchers get a bigger strike zone than pitchers just up from the minors, despite the umpires’ efforts to be fair and unbiased. So there is a social bias at work. But nerves are also a factor. Notably, on the most important plays — when you’d imagine umpires have every incentive to be especially focused — their error rate goes up. Researchers found that umpires were 13% more likely to miss a legitimate strike in the bottom of the ninth inning of a tied game than on the first pitch of the game.

There may be no cure for the umpires, aside from simply relying on the high-speed cameras that Major League Baseball already uses to monitor every pitch, and letting the umpires make the call based on that. But there would probably still be errors. Perhaps IBM could train Watson to do it.

Howard Ross, the author of Everyday Bias: Identifying and Navigating Unconscious Judgments in Our Daily Lives, available in a few weeks, recommends a general technique to counter our natural biases: constructive uncertainty. As he describes it,

It is helpful to begin to practice what I call constructive uncertainty. Learning to slow down decision-making, especially when it affects other people, can help reduce the impact of bias. This can be particularly important when we are in circumstances that make us feel awkward or uncomfortable.

In effect, Ross is suggesting that we slow down so that our preference and social biases don’t take over: by deferring decision making, we keep gathering information instead. We may even go so far as to intentionally dissent from the perspectives and observations we would normally adopt, surfacing our biases in our thinking rather than letting them simply happen to us.

The idea of constructive uncertainty is not predicated on eliminating our biases: they are as built into our minds as language and lust. Rather, constructive uncertainty rests on the recognition that we are confronted with the need to make decisions based on incomplete information. More than ever before, learning trumps ‘knowing’, since the cognitive scientists are teaching us that a lot of what we ‘know’ isn’t so: it’s just biased decision making acting like a short circuit, blocking real learning from taking place.