There are people who spend their lives peeling an onion. If they are lucky, it is a sweet bulb, and offers up its layers without too many tears. If they are very lucky, what the peeling reveals is interesting enough that others pay attention. And if they are very, very lucky, well, then the Royal Swedish Academy of Sciences honors their efforts with the most prestigious award that onion peelers can receive.
There were two people peeling this particular onion, but one passed away before their work was fully recognized. In telling the story of their work on the psychology of human decision making, Daniel Kahneman honors his longtime collaborator Amos Tversky while sharing the research that was recognized with the 2002 Nobel Prize in Economics.
Before the 1970s, it was well understood that humans were rational decision makers, and that when we strayed from this rational behavior, it was driven by some strong emotion. Anger, fear, hatred, love — these were the things that pushed us into irrationality. This makes perfect sense. This “Utility Theory” was well known and rarely challenged because it was, well, obvious. Tversky and Kahneman, however, challenged its depiction of rational human decision making in their 1979 paper, “Prospect Theory: An Analysis of Decision Under Risk.” We, it turns out, don’t behave very rationally at times (the Ultimatum Game is a very good example of this). But what made this paper special was that they went beyond documenting the failures of Utility Theory and pointed their fingers at the design of the machinery of cognition as the cause, rather than the corruption of thought by emotion. They argued that heuristics and biases were the key players in judgement and decision making. This was a revolutionary idea. The first layer of the onion had been peeled back, and as you might expect, it revealed more questions.
“Jumping to conclusions on the basis of limited evidence is so important to an understanding of intuitive thinking, and comes up so often in this book, that I will use a cumbersome abbreviation for it: WYSIATI, which stands for what you see is all there is. System 1 is radically insensitive to both the quality and the quantity of the information that gives rise to impressions and intuitions.”
Kahneman describes the cognitive machine with a cast of two players. These are, as he calls them, “System 1” and “System 2.” It would be easy to call these the subconscious and the conscious; this would be a good first approximation, but it wouldn’t be entirely correct. System 1 does operate quickly, automatically, and with no sense of voluntary control, as you would expect the subconscious to do. But it is responsible for feeding System 2 with things like feelings, impressions, and intuitions. System 2 generally takes what System 1 gives it without too much fuss. But System 1 behaves in funny ways sometimes, producing some very interesting results.
Give System 2 a specific task that taxes its cognitive resources and some interesting things happen. For example, watch the video below and count the number of times the players wearing white shirts pass the ball.
Did you get the correct number? Did you see the gorilla? About half of the people who watch the video simply do not see the gorilla. System 2 is busy counting, while System 1 is supporting that task and not distracting it with irrelevant extraneous information. Sometimes, it would appear, we are quite blind to the obvious. And blind to our own blindness. This “tunnel vision” of sorts happens not only when we are cognitively busy, but also in times of stress or high adrenaline.
I experienced this personally a few years ago during an Emergency Response training exercise. I was part of the two-man team entering a room where we had to assess the scene and respond accordingly. The instructor running the exercise had taped a sign to the wall with information that would provide some answers to questions we would have in the room; things that would be obvious in a real situation. Except the sign wasn’t obvious. I didn’t see it. At all. A coworker playing the part of an injured person was lying on the floor, and safely removing him from the scene was all we could think about. As we debriefed, I was asked why I did not address the issues on the sign. I had to go back and read the sign for myself before I could believe they were serious. I was shocked.
After Prospect Theory, discerning the rules of the cognitive machine became a hot research area in cognitive psychology. And what researchers found is astounding. A brief overview of the heuristics and biases can be found online, and these are discussed in detail in the text. Some, like the affect heuristic, make perfect sense: emotion is tied to belief and action, and it is a significant influence on how you form your beliefs about the world. But priming, on the other hand, is downright scary, because we have no conscious knowledge that it is taking place.
One of the interesting things that comes out of this research is that not only are humans not rational thinkers, we aren’t very good statistical thinkers either. Kahneman and Tversky’s first paper together was “Belief in the law of small numbers.” The “law of small numbers” asserts, somewhat tongue in cheek, that the “law of large numbers” applies to small numbers as well. There is much truth in this about how we build very lasting first impressions, quickly finding rules where random chance is a better explanation. Kahneman’s insights into the day-to-day decisions and judgements that we make, with no thought behind them, are priceless.
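To see why trusting small samples is a mistake, consider a quick simulation (my own sketch, not from the book): flip a fair coin in batches of different sizes and count how often a batch looks “lopsided,” with heads making up 70% or more, or 30% or less, of the flips. The threshold and sample sizes here are arbitrary choices for illustration.

```python
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def lopsided_rate(sample_size, trials=10_000):
    """Fraction of samples where heads are >= 70% or <= 30% of flips."""
    lopsided = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        share = heads / sample_size
        if share >= 0.7 or share <= 0.3:
            lopsided += 1
    return lopsided / trials

# Small samples look "streaky" far more often than large ones,
# even though the coin is perfectly fair.
for n in (10, 100, 1000):
    print(f"n={n:4d}: {lopsided_rate(n):.1%} of samples look lopsided")
```

With batches of 10, roughly a third of samples look dramatically unbalanced; with batches of 1000, essentially none do. Our intuitions, Kahneman argues, treat the small batch as if it were the large one, and we read meaning into noise.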
“The idea that large historical events are determined by luck is profoundly shocking, although it is demonstrably true…It is hard to think of the history of the twentieth century, including its large social movements, without bringing in the role of Hitler, Stalin, and Mao Zedong. But there was a moment in time, just before an egg was fertilized, when there was a fifty-fifty chance that the embryo that became Hitler could have been a female. Compounding the three events, there was a probability of one-eighth of a twentieth century without any of the three great villains and it is impossible to argue that history would have been roughly the same in their absence. The fertilization of these three eggs had momentous consequences, and it makes a joke of the idea that long-term developments are predictable.”
It is pleasant to find an academic who can write a general interest book. Too frequently the result of such an effort is a dense tome that is closer to a textbook. Thinking, Fast and Slow is enjoyably readable. But it is more than that. It is a very complete book—a dissection of the machinery of the mind. It pulls back the covers, in 38 chapters, to reveal in plain language the mechanisms that operate our minds every day. The sorts of things that go on behind the scenes in every decision we make. But also the myriad ways that advertising professionals can and do manipulate us.
This is not a weekend quick read. The paperback version weighs in at 512 pages. That shouldn’t hold you back. After all, this is a review of an entire lifetime of Nobel Prize-winning research, in clear language without jargon, told with historical perspective. There is gold on every page and I’m grateful for every one of them.