Brain vs. Gut: Making Smart Decisions in the Face of Uncertainty

In a previous blog, I mentioned Daniel Kahneman’s book, Thinking, Fast and Slow. It’s an excellent, in-depth look at how human beings think and make decisions, based on decades of research (much of which was conducted by Kahneman himself with various colleagues). I find this stuff fascinating; in fact, I regularly give a lunch-n-learn talk called, “Should You Trust Your Gut? – Human psychological weaknesses when making decisions in the face of uncertainty.”

The key message in Thinking, Fast and Slow is that we all have two different ways of thinking, which Kahneman calls System 1 and System 2. System 1 is our intuitive, emotional, gut-feel approach to thinking. It’s filled with heuristics which have served us well as a species for many millennia, and which still serve us well – about 99% of the time. Imagine how impossible our lives would be if we had to stop and consciously consider all the pros and cons associated with every mundane choice we make each day. Which shirt should you wear? Should you smile and say hello to the stranger at the coffee shop? What should you order for lunch? We make these decisions quickly and unconsciously, and our lives run much more smoothly as a result. System 1 is always switched on.

System 2 is our rational, calculating, logical brain at work. When we balance our checkbook, or decide whether we’re better off paying $4.19 for the 15 ounce jar of mixed nuts or $7.99 for the 28 ounce jar, System 2 springs into action. As our lives, our businesses, and our world have become more complex, the need for System 2 has increased dramatically. The problem is, System 2 is lazy. Most of us won’t do the math on the mixed nuts problem; it’s too difficult to do in our heads. Instead, we fall back on our intuition and personal experience: usually, larger packages of stuff are the better buy (giant economy size!), so we assume that the 28 ounce jar of nuts is the better deal. In this case, we would be wrong, but not by much.
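For the curious, the arithmetic System 2 is skipping here is tiny. A quick unit-price check (a small Python sketch of my own, using the prices quoted above) settles it:

# Unit-price check for the mixed-nuts example above.
small_price, small_oz = 4.19, 15
large_price, large_oz = 7.99, 28

print(f"15 oz jar: ${small_price / small_oz:.3f} per ounce")   # ~$0.279/oz
print(f"28 oz jar: ${large_price / large_oz:.3f} per ounce")   # ~$0.285/oz
# The smaller jar is actually the better buy, by roughly half a cent per ounce.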

If you’re buying mixed nuts, it probably doesn’t matter much if you end up paying a bit more per nut. If you’re choosing a retirement plan or making a major capital investment for your company, overpaying and/or under-performing may matter a great deal. Andrew McAfee of MIT’s Sloan School of Management recently posted an article entitled, “The Future of Decision Making: Less Intuition, More Evidence.” Dr. McAfee introduces example after example in which number-crunching analysis has outperformed intuition, from predicting the quality of wine vintages to forecasting suicide rates to diagnosing and treating disease (the “evidence-based medicine” we’ve heard so much about in the last couple of years). It’s a very good summary of the subject.

Unfortunately, the cause of evidence-based decision making was set back significantly by Malcolm Gladwell’s best-selling 2005 book, Blink. It’s a well-written, entertaining, and ultimately extremely misleading book. It is essentially a series of anecdotes about situations in which someone made an instantaneous, intuitive, correct decision without being able to consciously explain why they knew what they knew. Evelyn Harrison, an expert on Greek sculpture, knew that a kouros (a type of ancient Greek statue) being acquired by the J. Paul Getty Museum was a fake the instant she saw it, despite the documentation to the contrary. Vic Braden, a well-known tennis coach, realized that he could almost always tell when a player was going to double-fault as soon as they started into their motion on their second serve. From these stories, Gladwell draws the conclusion that we all think too much, that we should simply follow our gut instincts, and that we’ll be better off for it.

This is nonsense. Blink is based on two enormously flawed ideas. First, the book is filled with anecdotes, not research results. One of the few clever things I have heard come out of the mouth of a politician was spoken by Senator Barbara Boxer of California (in a completely different context): “The plural of ‘anecdote’ is not ‘data.’” Stories are great for illustrating concepts, and there is ample evidence that people are far more swayed by descriptive anecdotes than they are by statistical data. But anecdotes should be used to illustrate legitimate statistical research results. When they stand in for those results, they’re simply misleading.

Second, the examples Gladwell cites invariably fall into one of two categories: 1) situations for which evolution has equipped human beings with excellent instincts (like sensing danger or reading another person’s expressions), and 2) experts drawing on decades of experience in their fields.

In contrast to this line of thinking, Daniel Kahneman and his co-researcher, Gary Klein, describe four criteria, all of which must be met before we should trust our intuition:

· Familiarity: Do we have a lot of experience with similar situations?

· Feedback: Did we get consistent, reliable feedback?

· Equanimity: Were the situations emotionally neutral, rather than emotionally charged?

· Lack of Bias: Were we then, and are we now, free of any inappropriate personal interests that could sway our judgment?

If the circumstances fail even one of these tests, Kahneman and Klein recommend using some kind of analysis – i.e., engaging System 2, rather than relying solely on System 1.
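As a rough illustration only (this is my own sketch, not anything from Kahneman and Klein, and the function and argument names are made up for the example), the test behaves like an all-or-nothing checklist:

# Illustrative sketch of the four-part test: every criterion must pass.
def should_trust_intuition(familiarity: bool,
                           reliable_feedback: bool,
                           emotionally_neutral: bool,
                           free_of_bias: bool) -> bool:
    # Failing even one criterion means: slow down and engage System 2.
    return all((familiarity, reliable_feedback, emotionally_neutral, free_of_bias))

# Example: deep experience and good feedback, but a personal stake in the outcome.
print(should_trust_intuition(True, True, True, False))  # False: do the analysis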

Many people – especially experts and/or high-ranking decision makers – find the notion that data analysis is superior to experienced intuition or expert judgment unsettling.

In his article, McAfee describes the resistance he has encountered when he presents the results of these studies. Many people simply refuse to believe the research results. This is a classic example of cognitive dissonance, a phenomenon described in detail by Carol Tavris and Elliot Aronson in their book, Mistakes Were Made (But Not By Me). Cognitive dissonance occurs when someone is presented with evidence that contradicts a deeply held belief or value. Faced with a choice between abandoning the belief or denying the evidence, they deny the evidence. Most experts and high-ranking decision makers hold a strong belief in their own intuition, especially in their areas of expertise. They would sooner deny hard facts than give up that belief.

Sometimes, the denial has even deeper roots. Tavris and Aronson describe how hard it was to get 19th century doctors to acknowledge the evidence that washing their hands between patients would greatly reduce the spread of disease. How do you reconcile “I’m a good doctor; I save people’s lives” with “I’ve contributed to the deaths of many patients in the past by failing to wash my hands?” Many doctors just couldn’t accept this, and so despite the evidence, they denied that hand washing would make any difference. There are also examples of 21st century district attorneys who refuse to believe DNA evidence which exonerates someone they convicted many years ago. How do you deal with “I ruined this innocent man’s life” when you see yourself as one of the good guys, keeping the bad guys off the streets? The short answer is, many people can’t. And so, they deny the evidence.

Finding the correct balance between System 1 and System 2 isn’t easy. The fact is, for most of our day-to-day lives, System 1 works just fine. The problems arise when we insist on sticking with System 1 in a situation which requires System 2. And those situations are becoming more and more common.
