If you want a textbook example of how negative thinking can help prevent errors, look no further than the BP oil spill in the Gulf of Mexico. As an editorial in The New York Times makes clear, “BP’s disjointed response suggested it had given little thought to the possibility of a blowout at 5,000 feet.”
And it’s not just BP that gave little thought to a blowout. The same could be said of many Wall Street firms. They, too, failed to give serious thought to the possibility of a blowout of the U.S. housing market. In testimony earlier this year, for instance, JP Morgan CEO Jamie Dimon made this admission: “In mortgage underwriting,” he said, “somehow, we just missed that home prices don’t go up forever…”
Nobody likes to think negatively (by which I mean thinking seriously and deeply on the front end of a problem about what can go wrong). We prefer to think there will always be blue sky.
But in any serious endeavor, negative thinking is absolutely necessary. Eisenhower thought so. That’s why, when planning the invasion of Normandy, he went so far as to draft, in advance, a letter taking full responsibility should the D-Day landings fail.
More recently, Lloyd Blankfein, the head of Goldman Sachs, testified about the essential importance of negative thinking, which at Goldman takes the form of stress tests.
“…The one thing that we constantly learn from every crisis,” he said, “is the need for more stress tests.”
“What a stress test does is it says, ‘Don’t tell me that this is unlikely. What if it did happen?’

“‘But, it’s not going to happen.’

“‘What if it did?’”
What if it did? That’s the key question every negative thinker needs to ask — and it’s one that BP clearly avoided.
But there’s a natural impediment to negative thinking. Researchers call this impediment confirmation bias. The Wall Street Journal, in a recent article, referred to it as the “yes-man in your head.” Either way, it amounts to the same thing: when we have a decision to make and set out to gather information, our search isn’t neutral. As the Journal piece points out, we are twice as likely to seek information that confirms our original belief as we are to seek information that contradicts it.
In other words, the more we know, the more certain we become that we are right. And we go on believing that — right up to the blowout.