The research and insights of Daniel Kahneman, a psychologist and winner of the 2002 Nobel Prize in economics, are extremely valuable to leaders seeking to improve strategic decision-making. Kahneman’s best-selling book, “Thinking, Fast and Slow”, illuminates how the mind works, based on his years of research. His fascinating results can alert leaders to the unconscious cognitive processes that influence thinking and often cause sub-optimal decision-making. Sound decisions based on the best available data are the hallmark of a leader, but the natural workings of the mind can interfere.
Kahneman describes two brain processes. “Fast thinking” is impulsive, automatic, and intuitive. This legacy of human evolution had inherent survival advantages, allowing humans to take rapid action when needed. “Slow thinking” is thoughtful, deliberate, and analytical. It activates when the mind faces a situation it does not automatically comprehend. Slow thinking involves conscious mental activities such as self-control, choice, and focused attention, which are the tools of emotional intelligence, self-awareness, and strategic decision-making. Awareness of cognitive traps gives you the ammunition you need to avoid them. You will improve decision-making by moving away from fast thinking and the traps described below; thinking slow produces better outcomes.
We have a default tendency to make fast-thinking snap judgments that oversimplify analyses of complex situations. Shortcuts, or “heuristics”, allow for quick decisions, but we tend to overuse them. With the substitution heuristic, we substitute an easier question for the one we actually need to answer. In hiring, for example, the tough question, “Will this person be successful in the job?”, which requires significant study of their background and history of success, is replaced by the easier question, “Does the person interview well?”. The availability heuristic overestimates the importance or probability of whatever is most personally relevant, recently heard, or vividly remembered. Managers working from memory when conducting performance appraisals will weigh most heavily the employee behaviors that are most easily recalled, which are usually dramatic examples, whether good or bad. Furthermore, more recent memories are automatically weighted more heavily.
There is a natural unconscious tendency to seek out and rely on information that confirms our beliefs. This confirmation bias also downplays or dismisses information that might change our minds. The result is decision-making informed by only partial information rather than all that is required. When preparing your proposals and strategies, actively seek out opposing information and viewpoints, and have your team do the same. In meetings, you might have the person making a proposal argue against it, while an opponent of the proposal, in good faith, argues for it. Another unconscious bias, the endowment effect, means that merely owning something makes it more valuable to its owner. Combined with the related effect of loss aversion, people would rather leave things as they are than risk a loss. Most strategists are good at identifying the risks of new businesses, but it is much more difficult to see the risk of failing to change. Analyzing existing businesses, products, and operations with the same scrutiny as a new investment will help avoid this trap.
Avoidable statistical mistakes are made regularly. As with confirmation bias, the resulting decision-making utilizes only part of the needed information. There are numerous types of probability-related errors; let’s examine just one. Base rate neglect occurs when we ignore general information about a population and focus on specific information that applies only in a certain case. Almost half of Harvard medical students made the following base rate error in an exam: A serious but rare disease affects 1 in 1,000 people. How worried should you be upon receiving a positive result from a test with 95% accuracy? Most people would worry that they had a 95% chance of having the disease; almost half the Harvard students thought so as well. However, because the base rate is very low (1 in 1,000), the actual likelihood of having the disease is roughly 2%, and the chance that the positive result is a false positive is roughly 98%. Traps like these can negatively impact your decisions. Avoiding them will give your decisions more power.
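The arithmetic behind the roughly 2% figure follows from Bayes’ rule. A minimal sketch in Python, assuming “95% accuracy” means the test has both 95% sensitivity (detecting real cases) and 95% specificity (correctly clearing healthy people):

```python
# Rare-disease example: base rate 1 in 1,000, test "95% accurate".
base_rate = 1 / 1000        # P(disease)
sensitivity = 0.95          # P(positive | disease)
specificity = 0.95          # P(negative | no disease)

# Two ways to test positive: you have the disease and the test works,
# or you are healthy and the test misfires.
p_true_positive = base_rate * sensitivity                # 0.00095
p_false_positive = (1 - base_rate) * (1 - specificity)   # 0.04995
p_positive = p_true_positive + p_false_positive

# Bayes' rule: probability of disease given a positive result.
p_disease_given_positive = p_true_positive / p_positive
print(f"{p_disease_given_positive:.1%}")  # about 2%, not 95%
```

Because healthy people outnumber sick people 999 to 1, the small 5% error rate applied to the huge healthy population produces far more false positives than true positives, which is exactly the information base rate neglect throws away.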
Would these illustrative statistics ever come to mind? People fear death from sharks, but in America cows or horses are about 40 times more likely to kill you (mostly by kicking), and deer are about 130 times more likely (via auto accidents).