Immediately after the trauma center staff called Redelmeier to come to the operating room, they diagnosed the heart problem on their own—or thought they had. The young woman remained alert enough to tell them that she had a past history of an overactive thyroid. An overactive thyroid can cause an irregular heartbeat. And so, when Redelmeier arrived, the staff no longer needed him to investigate the source of the irregular heartbeat but to treat it. No one in the operating room would have batted an eye if Redelmeier had simply administered the drugs for hyperthyroidism. Instead, Redelmeier asked everyone to slow down. To wait. Just a moment. Just to check their thinking—and to make sure they were not trying to force the facts into an easy, coherent, but ultimately false story.
Something bothered him. As he said later, “Hyperthyroidism is a classic cause of an irregular heart rhythm, but hyperthyroidism is an infrequent cause of an irregular heart rhythm.” Hearing that the young woman had a history of excess thyroid hormone production, the emergency room medical staff had leaped, with seeming reason, to the assumption that her overactive thyroid had caused the dangerous beating of her heart. They hadn’t bothered to consider statistically far more likely causes of an irregular heartbeat. In Redelmeier’s experience, doctors did not think statistically. “Eighty percent of doctors don’t think probabilities apply to their patients,” he said. “Just like 95 percent of married couples don’t believe the 50 percent divorce rate applies to them, and 95 percent of drunk drivers don’t think the statistics that show that you are more likely to be killed if you are driving drunk than if you are driving sober apply to them.”
Redelmeier asked the emergency room staff to search for other, more statistically likely causes of the woman’s irregular heartbeat. That’s when they found her collapsed lung. Like her fractured ribs, her collapsed lung had failed to turn up on the X-ray. Unlike the fractured ribs, it could kill her. Redelmeier ignored the thyroid and treated the collapsed lung. The young woman’s heartbeat returned to normal. The next day, her formal thyroid tests came back: Her thyroid hormone production was perfectly normal. Her thyroid never had been the issue. “It was a classic case of the representativeness heuristic,” said Redelmeier. “You need to be so careful when there is one simple diagnosis that instantly pops into your mind that beautifully explains everything all at once. That’s when you need to stop and check your thinking.”
It wasn’t that what first came to mind was always wrong; it was that its existence in your mind led you to feel more certain than you should be that it was correct. “Beware of the delirious guy in the emergency unit with the long history of alcoholism,” said Redelmeier, “because you will say, ‘He’s just drunk,’ and you’ll miss the subdural hematoma.” The woman’s surgeons had leaped from her medical history to a diagnosis without considering the base rates. As Kahneman and Tversky long ago had pointed out, a person who is making a prediction—or a diagnosis—is allowed to ignore base rates only if he is completely certain he is correct. Inside a hospital, or really anyplace else, Redelmeier was never completely certain about anything, and he didn’t see why anybody else should be, either.
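Redelmeier’s “classic but infrequent” point is, at bottom, Bayes’ rule. Here is a minimal sketch in Python with entirely hypothetical numbers (none of the rates below come from the book or from Redelmeier), showing how a cause that strongly predicts a symptom can still be an unlikely explanation for it when its base rate is low:

```python
# A toy Bayes calculation showing how a "classic" cause can still be an
# unlikely one. Every number here is hypothetical, chosen only to
# illustrate base-rate neglect; for simplicity the two causes listed
# are treated as the only candidates.

def posterior(prior, likelihood, alternatives):
    """P(cause | symptom) by Bayes' rule, among the causes considered.

    prior:        P(cause) before seeing the symptom
    likelihood:   P(symptom | cause)
    alternatives: list of (prior, likelihood) pairs for competing causes
    """
    numerator = prior * likelihood
    evidence = numerator + sum(p * l for p, l in alternatives)
    return numerator / evidence

# Hypothetical rates: hyperthyroidism is rare among trauma patients
# (prior 1%) but, when present, often produces an irregular heartbeat
# (90%). A collapsed lung after a car crash is far more common
# (prior 20%) and also frequently causes an irregular heartbeat (60%).
p_thyroid = posterior(
    prior=0.01, likelihood=0.90,
    alternatives=[(0.20, 0.60)],
)
print(f"P(hyperthyroidism | irregular heartbeat) = {p_thyroid:.2f}")  # 0.07
```

Under these made-up rates, the “classic” cause explains the symptom only about 7 percent of the time; the statistically boring cause carries the rest.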
* * *
Redelmeier had grown up in Toronto, in the same house in which his stockbroker father had been raised. The youngest of three boys, he often felt a little stupid; his older brothers always seemed to know more than he did and were keen to let him know it. Redelmeier also had a speech impediment—a maddening stammer he would never cease to work hard, and painfully, to compensate for. (When he called for restaurant reservations, he just told them his name was “Don Red.”) His stammer slowed him down when he spoke; his weakness as a speller slowed him down when he wrote. His body was not terribly well coordinated, and by the fifth grade he required glasses to correct his eyesight. His two great strengths were his mind and his temperament. He was always extremely good at math; he loved math. He could explain it, too, and other kids came to him when they couldn’t understand what the teacher had said. That is where his temperament entered. He was almost peculiarly considerate of others. From the time he was a small child, grown-ups had noticed that about him: His first instinct upon meeting someone else was to take care of the person.
Still, even from math class, where he often wound up helping all the other students, what he took away was a sense of his own fallibility. In math there was a right answer and a wrong answer, and you couldn’t fudge it. “And the errors are sometimes predictable,” he said. “You see them coming a mile away and you still make them.” His experience of life as an error-filled sequence of events, he later thought, might be what had made him so receptive to an obscure article, in the journal Science, that his favorite high school teacher, Mr. Fleming, had given him to read in late 1977. He took the article home with him and read it that night at his desk.
The article was called “Judgment Under Uncertainty: Heuristics and Biases.” It was in equal parts familiar and strange—what the hell was a “heuristic”? Redelmeier was seventeen years old, and some of the jargon was beyond him. But the article described three ways in which people made judgments when they didn’t know the answer for sure. The names the authors had given these—representativeness, availability, anchoring—were at once weird and seductive. They made the phenomenon they described feel like secret knowledge. And yet what they were saying struck Redelmeier as the simple truth—mainly because he was fooled by the questions they put to the reader. He, too, guessed that the guy they named “Dick” and described so blandly was equally likely to be a lawyer or an engineer, even though he came from a pool that was mostly lawyers. He, too, made a different prediction when he was given worthless evidence than when he was given no evidence at all. He, too, thought that there were more words in a typical passage of English prose that started with K than had K in the third position, because the words that began with K were easier to recall. He, too, made predictions about people from mere descriptions of them with a degree of confidence that was totally unjustified—even uncertain Don Redelmeier fell prey to overconfidence! And when asked quickly to guess the product of 1 × 2 × 3 × 4 × 5 × 6 × 7 × 8, he saw how he, too, would think it less than the product of 8 × 7 × 6 × 5 × 4 × 3 × 2 × 1.
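That last question is easy to check by hand: both orderings multiply out to 8! = 40,320. In the original 1974 Science article, Tversky and Kahneman reported a median guess of 512 for the ascending sequence and 2,250 for the descending one, because people anchored on the first few partial products. A few lines of Python make the point concrete:

```python
from math import prod

ascending = [1, 2, 3, 4, 5, 6, 7, 8]
descending = list(reversed(ascending))

# The two sequences are the same multiplication in a different order,
# so both equal 8! = 40,320.
assert prod(ascending) == prod(descending) == 40_320
print(prod(ascending))  # 40320

# Tversky and Kahneman (Science, 1974) reported median guesses of
# about 512 for the ascending order and 2,250 for the descending
# order: people anchor on the small or large early partial products.
```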
What struck Redelmeier wasn’t the idea that people made mistakes. Of course people made mistakes! What was so compelling was that the mistakes were predictable and systematic. They seemed ingrained in human nature. Reading the article in Science reminded Redelmeier of all the times he had made what seemed in retrospect to be an obvious mistake on a math problem—because it was so much like the other mistakes he and others had made. One passage in particular stuck with him—it was in the section on this thing they called “availability.” It talked about the role of the imagination in human error. “The risk involved in an adventurous expedition, for example, is evaluated by imagining contingencies with which the expedition is not equipped to cope,” the authors wrote. “If many such difficulties are vividly portrayed, the expedition can be made to appear exceedingly dangerous, although the ease with which disasters are imagined need not reflect their actual likelihood. Conversely, the risk involved in an undertaking may be grossly underestimated if some possible dangers are either difficult to conceive of, or simply do not come to mind.”
This wasn’t just about how many words in the English language started with the letter K. This was about life and death. “That article was more thrilling than a movie to me,” said Redelmeier. “And I love movies.”