There are books you enjoy reading, and then there are books you can’t wait to tell your friends about. But the best books are the ones you actually buy for your friends to make sure they read them. Michael Lewis’s The Undoing Project falls in that latter category.1
I admit to being an avid Lewis fan, having read almost all his books. He has a unique talent for explaining complex economic, financial, and statistical concepts in a readily understandable fashion, in part by weaving explanations into compelling tales about intriguing people. The Undoing Project does just that, telling the story of the extraordinarily synergistic collaboration and close friendship between 2 brilliant Israeli psychologists, Daniel Kahneman and Amos Tversky, who quite literally changed the way we think about thinking. The book is also a fascinating story about the sources of human creativity, the power of teamwork, and the ability to discern profound scientific concepts from seemingly mundane and decidedly low-tech observations. The psychologists’ story is made all the more compelling by its dramatic backdrop, set against the tumultuous birth of Israel, and its rich intellectual development accelerated by the ever-present and existential threat of war. But all of that is not why you should read the book.
The Undoing Project should be read (especially by physicians) because it describes, in a lucid and compelling manner, how we think and why we so often make mistakes. It details our highly evolved, very efficient, and incredibly rapid ability to intuitively discern relevant features in complex patterns using associative reasoning. This allows us to reach judgments about our environment (eg, safe vs dangerous) and make choices among possible actions by weighing risk versus reward. Kahneman described this process as System 1 thinking, which employs cognitive shortcuts or heuristics.2 While this process is usually highly efficient and effective (explaining the dominance of Homo sapiens), it occasionally leads to major errors in judgment (eg, becoming dinner for a lion, a tiger, or a bear, or prey for a Bernie Madoff). For physicians, it explains our ability to walk into a patient’s room and make an instant and lifesaving diagnosis, or to make the wrong diagnosis, order the wrong medication, or operate on the wrong side of a patient. Fortunately, we are also endowed, to variable degrees of proficiency, with the ability to think slowly, carefully assembling and analyzing data, testing assumptions, coming to sound conclusions, and making few errors. Kahneman refers to this type of cognition as System 2 thinking.2 But it is the errors inherent in heuristics that are the focus of Michael Lewis’s book, as well as of much of Tversky’s and Kahneman’s early careers.3
Lewis starts off where his book Moneyball ended, showing the power of predictive analytics in assembling winning Major League Baseball teams for less money. But he points out that there are limits to such an approach, and that these limits reflect fundamental errors in the way even seasoned scouts assess potential athletic talent and the probability of future success. He notes that even highly successful scouts often err by focusing on an athlete’s readily identifiable but often irrelevant characteristics because they fit a preconceived bias (eg, scouts missed Jeremy Lin’s potential because Asian American point guards from Harvard didn’t fit their stereotype). This is why so much conventional wisdom is just plain wrong. The rest of the book describes how Kahneman and Tversky discovered the sources of such errors by deconstructing human cognition and decision making. It also details some of the effects of their discoveries on various disciplines, including medicine and economics.
Kahneman survived a harrowing childhood in Nazi-occupied France, emigrating to Israel after the war. Tall, introspective, and deeply analytical, he graduated from Hebrew University with a degree in psychology and was assigned by the Israel Defense Forces to identify personality types likely to succeed in various military fields. Kahneman quickly discerned that military stereotypes were of little value because they were subject to a variety of biases. In contrast, the Israeli-born Tversky was short, voluble, extroverted, energetic, and supremely self-confident; he was a paratrooper and a legitimate war hero. He also studied psychology and became intensely interested in the science of decision making. Both Kahneman and Tversky undertook graduate studies at the University of Michigan, but their collaboration blossomed when they returned to Israel and joined the faculty of Hebrew University in 1969.
The 2 would spend long stretches talking, hypothesizing, shredding each other’s arguments, and proposing clever tests for pet theories, often tried out on unsuspecting undergraduates. They initially focused on errors made by trained statisticians who accepted conclusions based on small Ns, a common mistake even today in medicine. Next they discerned that common errors in judging the probability of events usually resulted from misapplication of basic heuristics. These heuristics include representativeness, availability, and anchoring.
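The small-N problem is easy to demonstrate for yourself. The simulation below is my own rough illustration (the parameters are invented, not from the book): even a perfectly fair coin routinely produces lopsided results when the sample is tiny, which is exactly why conclusions drawn from small Ns mislead.

```python
import random

random.seed(0)  # for reproducibility

def frac_extreme(n, trials=10_000):
    """Fraction of simulated samples in which a fair coin
    comes up at least 80% heads or at least 80% tails."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        if heads / n >= 0.8 or heads / n <= 0.2:
            extreme += 1
    return extreme / trials
```

With n = 5, more than a third of samples are lopsided (the exact probability is 12/32 = 0.375); with n = 100, essentially none are.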
The representativeness heuristic occurs when probabilities are estimated based on a perception of how closely A is representative of B. While this cognitive shortcut is often accurate, it can lead to serious errors because representativeness fails to incorporate other factors that are more germane to accurate probability estimates. Kahneman and Tversky cite as an example leaping to the conclusion that the most likely occupation of someone described as meek and tidy, with a need for order and a passion for detail, is a librarian. But because there are far more farmers, lawyers, salesmen, etc, than librarians, and thus far more individuals with these characteristics in these other professions, such a conclusion is fundamentally flawed. Tversky and Kahneman also detailed factors leading to representativeness errors, including insensitivity to prior probabilities, overweighting of small sample sizes, overestimation of the actual predictability of a given event, and unwarranted confidence in the accuracy of our predictions (ie, the illusion of validity).3 Another cause of representativeness error is failure to appreciate the phenomenon of regression toward the mean, which leads to an overestimate of the value of negative incentives and an underestimate of the value of positive incentives.
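The librarian example can be made concrete with simple base-rate arithmetic. The numbers below are invented purely for illustration; only the logic mirrors Kahneman and Tversky’s point about prior probabilities:

```python
# Invented base rates, for illustration only.
librarians, farmers = 20, 2000        # assume farmers vastly outnumber librarians
p_desc_given_librarian = 0.40         # "meek and tidy" fits 40% of librarians...
p_desc_given_farmer = 0.05            # ...but also 5% of farmers

# Expected number of people fitting the description in each group
fitting_librarians = librarians * p_desc_given_librarian   # 8
fitting_farmers = farmers * p_desc_given_farmer            # 100

# Probability that a person fitting the description is a librarian
p_librarian = fitting_librarians / (fitting_librarians + fitting_farmers)
print(round(p_librarian, 3))  # 0.074: a farmer remains far more likely
```

Even though the description is far more characteristic of librarians, the sheer number of farmers dominates the answer, which is what the representativeness heuristic ignores.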
A second source of error is the availability heuristic, in which we judge the probability of an event simply based on our ability to recall similar events. This introduces all sorts of biases, including overestimating the strength of an association based on the ease with which we can retrieve an example. This, in turn, can reflect how we search for associations.
A third source of error is the anchoring heuristic, which reflects our tendency to estimate final results based on initial data, which may be highly skewed. For example, assume you are given 5 seconds to calculate either 8 x 7 x 6 x 5 x 4 x 3 x 2 x 1 or 1 x 2 x 3 x 4 x 5 x 6 x 7 x 8. Which set of numbers is more likely to result in a quick estimate closest to the correct answer of 40,320? Use of the former set consistently leads to higher, and thus more accurate, estimates, because people anchor on the larger initial products. Because of anchoring biases, people also tend to overestimate the probability of conjunctive events (all of several events occurring) and underestimate the probability of disjunctive events (at least one of several events occurring).
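The arithmetic behind the example is easy to verify. Computing the partial products shows why the descending sequence anchors estimates higher; the idea that people anchor on the first few terms is Tversky and Kahneman’s, while the particular three-term anchor below is my simplification:

```python
from math import prod

seq = list(range(1, 9))
true_value = prod(seq)                 # 8! = 40,320

# Anchors from the first three terms a hurried estimator might compute:
ascending_anchor = prod(seq[:3])                    # 1 * 2 * 3 = 6
descending_anchor = prod(list(reversed(seq))[:3])   # 8 * 7 * 6 = 336

print(true_value, ascending_anchor, descending_anchor)
```

Starting from 336 rather than 6, the subsequent (typically insufficient) upward adjustment lands much closer to 40,320.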
Lewis provides many examples to illuminate these biases, which underpin several newer disciplines ranging from evidence-based medicine to behavioral economics. Chapter 8 addresses the former by telling the story of Dr. Don Redelmeier, who studied the causes of preventable errors in a Canadian trauma hospital. After reading Kahneman and Tversky’s article in Science,3 he grasped that these heuristic errors could also be the source of medical errors. As fate would have it, he then served an internal medicine residency at Stanford after Tversky had been recruited to its faculty. A lunch the two men shared sparked a collaboration that helped lay the foundation of evidence-based medicine. For example, they discovered that physicians made different, and often lower value, management decisions when considering patients as individuals versus as groups.4 They also pointed out that framing is important in accurate medical decision making: the more detailed the description of a condition and the more diagnostic options provided to a physician, the more likely he or she would reach an accurate diagnosis and management plan (aka the unpacking principle).5 Redelmeier also met Kahneman, after the latter had begun to explore emotional influences on decision making. Together they published a paper concluding that patients tended to weigh potential losses more heavily than corresponding gains when choosing management plans, and that patients’ interpretation of events and decision making were strongly influenced by how situations and data were presented to them, ie, how they were framed.6
This understanding of the impact of emotions on decision making also formed the foundation of behavioral economics. Loss aversion is a key concept here as well. In general, people find the loss of X dollars more aversive than they find the gain of X dollars attractive. However, people usually don’t weigh monetary values in simple, dispassionate, numerical terms but rather, assign subjective values or “utilities” to financial outcomes. Thus, risk aversion at a given monetary value tends to decrease with increasing wealth. Lewis details the impact of Tversky’s and Kahneman’s work on the field of behavioral economics through the eyes of Richard Thaler, a University of Chicago behavioral economist who was heavily influenced by their writing and became another one of Kahneman’s collaborators.
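Loss aversion and the subjective weighting of outcomes can be sketched with a prospect-theory-style value function. The exponent (0.88) and loss-aversion coefficient (2.25) below are commonly cited estimates from Tversky and Kahneman’s later cumulative prospect theory work; the function itself is an illustrative sketch, not a formula from the book:

```python
def subjective_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory-style value function: concave for gains,
    steeper (loss-averse) for losses of the same magnitude."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

gain = subjective_value(100)    # ~57.5: $100 gained feels like less than $100
loss = subjective_value(-100)   # ~-129.5: $100 lost feels like much more
print(abs(loss) > gain)
```

The asymmetry (|value of a $100 loss| > value of a $100 gain) is the loss aversion described above, and the concave shape for gains captures why risk aversion at a given monetary value decreases with increasing wealth.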
Throughout the book, Lewis returns to Kahneman’s and Tversky’s personal lives, chronicling their trials and tribulations, academic achievements and frustrations, and, over time, the chilling of their relationship caused by time, distance, and professional jealousy. The story concludes with one of them winning the Nobel Prize, but I will not spoil the ending.
Why is this book so important to read? I believe it gives powerful insights into how we think and why we make errors. Recognizing the sources of human errors may help you avoid them. Understanding how humans think may help you think better. I hope that it will also motivate you to read Kahneman’s New York Times bestseller, Thinking, Fast and Slow,2 and many of Kahneman’s and Tversky’s elegantly crafted papers. But if for no other reason, read The Undoing Project because it’s just a great book.
1. Lewis M. The Undoing Project: A Friendship That Changed Our Minds. New York, NY: W.W. Norton & Company; 2017.
2. Kahneman D. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux; 2011.
3. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185(4157):1124-1131.
4. Redelmeier DA, Tversky A. Discrepancy between medical decisions for individual patients and for groups. N Engl J Med. 1990;322(16):1162-1164.
5. Redelmeier DA, Koehler DJ, Liberman V, Tversky A. Probability judgement in medicine: discounting unspecified possibilities. Med Decis Making. 1995;15(3):227-230.
6. Redelmeier DA, Rozin P, Kahneman D. Understanding patients’ decisions: cognitive and emotional perspectives. JAMA. 1993;270(1):72-76.