We write often about how the only thing predictable at a startup is the lack of predictability. Today HBS Working Knowledge has a quartet of essays on how you can anticipate surprises, manage disasters with your team, and, ultimately, learn from failures when they occur. It’s a captivating series based on historic events, and each essay offers great instruction on leadership:
* Ernest Shackleton’s failed expedition across Antarctica in 1914: How disaster changes leadership goals
* The 1996 ascent of Mount Everest (chronicled by Jon Krakauer in Into Thin Air): Effective signaling in a crisis, nurturing confidence, dissent, and commitment in your organization.
* NASA’s failed Mission to Mars in 1999: Lessons from a failure.
But the fourth essay, Planning for Surprises, has the most relevance for founders. Why focus on reacting to such disasters when you can anticipate or even avoid them?
“Predictable surprises [like Enron] happen when leaders had all the data and insight they needed to recognize the potential, even the inevitability, of major problems, but failed to respond with effective preventative action” … Here’s the good news: There are reasons why leaders fail to prevent predictable surprises and there are ways to identify trouble while there is still time to stop it.
A few of these reasons are…
I. The 3 Vulnerabilities:
A few classic factors, really types of vulnerability, “conspire to keep us from dealing with problems that are worthy of our attention,” the authors write.
* Psychological vulnerabilities: have to do with well-recognized biases in the way people think, such as self-serving illusions and overcommitment, as well as the tendency to stick with the status quo and to discount the future. (This is the “I can do this if I just keep trying / have more time” mindset, despite previous results that suggest otherwise.)
* Organizational vulnerabilities: arise because of structural barriers to the effective collection, processing, and dissemination of information, such as the division of organizations into independently operating silos and the filtering of information as it passes up through the hierarchies. (This is the “lack of communication” problem. Are you getting, and giving, the information that you need?)
* Political vulnerabilities: contribute to predictable surprises when a small number of individuals and organizations are able to “capture” the political system or organization for their own benefit. (This is about incentives. Do the people on your team have the same goals? Or are your interests competing?)
II. Cognitive biases
…”people tend to exist in a state of denial that leads us to undervalue risks. In addition, people overly discount the future, reducing our willingness to invest in the present to prevent some disaster that may be quite distant.”
(For more on cognitive bias, see Venture Hacks’ terrific summary of Marc Andreessen’s essay on the same, published earlier this week. Marc’s piece is based on the theories of Charlie Munger, a.k.a. Warren Buffett’s investing partner for 60 years.)
III. Fear of the Unknown / Love of the Status Quo
…”People also try to maintain the status quo [and] are more willing to run the risk of incurring a large but small-probability loss in the future rather than accepting a smaller, yet certain loss now. We don’t want to invest in preventing a problem that we have not experienced and can hardly bear to imagine. Thus, far too often, we only address problems after the surprise has occurred.”
So what can you do?
Step 1: recognize where you are susceptible to the 3 Vulnerabilities
Step 2: identify your cognitive biases
Step 3: become willing to experience small, but certain, losses today. No more wishful thinking that low-likelihood, bigger surprises won’t happen in the future. Deal with them now.
And the BIG ONE…
Step 4: Assume a ‘veil of ignorance’
This sounds strange, but the key issue here is to identify the difference between your hypotheses and your assumptions. Most of us make an assumption where we really should be posing a hypothesis, and then rigorously testing it. To test appropriately, you must set aside your predispositions, your cognitive biases, and your vulnerabilities, and assume a veil of ignorance. Otherwise you won’t read the results correctly.
The authors write:
In practice it is difficult to assume complete ignorance. But it is essential to try to be conscious of the assumptions you are making about what is possible and critically what is not possible. To the extent that you can treat these as hypotheses to be rigorously challenged and tested, rather than as assumptions that are taken for granted, you reduce the potential to be predictably surprised. People always learn with a point of view; the key is to be open to altering that point of view in the face of reality.
For example, we question whether the Bush administration considered the issue of weapons of mass destruction (WMD) in Iraq from an objective standpoint (under a veil of ignorance), or whether the administration started with a partisan perception and then avoided any reasonable challenge to this view. Richard Clarke, the former counterterrorism czar under Presidents Clinton and Bush, made a compelling case that the administration never escaped its desire to see the data as it wanted to see it.
They say ignorance is bliss. Well, now you’ve got a very good reason to make it one of your business tools, too. Following the steps laid out by these authors ought to help you deal with small problems now, and avoid larger, more devastating problems later on.
The complete essay, Planning for Surprises, is based on the new book Predictable Surprises: The Disasters You Should Have Seen Coming and How to Prevent Them by Max H. Bazerman (Harvard) and Michael D. Watkins (INSEAD). Pick it up!