The Undoing Project: A Friendship That Changed Our Minds

Redelmeier had never heard of the authors—Daniel Kahneman and Amos Tversky—though at the bottom of the page it said that they were members of the Department of Psychology at Hebrew University in Jerusalem. To him it was more important that his older brothers had never heard of them, either. Aha, finally. I know something more than my brothers! he thought. Kahneman and Tversky offered what felt like a private glimpse of the act of thinking. Reading their article was like getting a peek behind the magician’s curtain.

Redelmeier didn’t have much trouble figuring out what he wanted to do with his life. As a kid he’d fallen in love with the doctors on television—Leonard McCoy on Star Trek and, especially, Hawkeye Pierce on M*A*S*H. “I sort of wanted to be heroic,” he said. “I would never cut it in sports. I would never cut it in politics. I would never make it in the movies. Medicine was a path. A way to have a truly heroic life.” He felt the pull so strongly that he applied to medical school at the age of nineteen, during his second year of college. Just after his twentieth birthday he was training, at the University of Toronto, to become a doctor.

And that’s where the problems started: The professors didn’t have much in common with Leonard McCoy or Hawkeye Pierce. A lot of them were self-important and even a bit pompous. Something about them, and what they were saying, led Redelmeier to seditious thoughts. “Early on in medical school there are a whole bunch of professors who are saying things that are wrong,” he recalled. “I don’t dare say anything about it.” They repeated common superstitions as if they were eternal truths. (“Bad things come in threes.”) Specialists from different fields of medicine faced with the same disease offered contradictory diagnoses. His professor of urology told students that blood in the urine suggested a high chance of kidney cancer, while his professor of nephrology said that blood in the urine indicated a high chance of glomerulonephritis—kidney inflammation. “Both had exaggerated confidence based on their expert experience,” said Redelmeier, and both mainly saw only what they had been trained to see.

The problem was not what they knew, or didn’t know. It was their need for certainty or, at least, the appearance of certainty. Standing beside the slide projector, many of them did not so much teach as preach. “There was a generalized mood of arrogance,” said Redelmeier. “ ‘What do you mean you didn’t give steroids!!????’” To Redelmeier the very idea that there was a great deal of uncertainty in medicine went largely unacknowledged by its authorities.

There was a reason for this: To acknowledge uncertainty was to admit the possibility of error. The entire profession had arranged itself as if to confirm the wisdom of its decisions. Whenever a patient recovered, for instance, the doctor typically attributed the recovery to the treatment he had prescribed, without any solid evidence that the treatment was responsible. Just because the patient is better after I treated him doesn’t mean he got better because I treated him, Redelmeier thought. “So many diseases are self-limiting,” he said. “They will cure themselves. People who are in distress seek care. When they seek care, physicians feel the need to do something. You put leeches on; the condition improves. And that can propel a lifetime of leeches. A lifetime of overprescribing antibiotics. A lifetime of giving tonsillectomies to people with ear infections. You try it and they get better the next day and it is so compelling. You go to see a psychiatrist and your depression improves—you are convinced of the efficacy of psychiatry.”

Redelmeier noticed other problems, too. For example, his medical school professors took at face value data that should have been inspected more closely. An old man would come into the hospital suffering from pneumonia. They’d check his heart rate and find it to be a reassuringly normal seventy-five beats per minute . . . and just move on. But the reason pneumonia killed so many old people was its power to spread infection. An immune system responding as it should generated fever, coughs, chills, sputum—and a faster than normal heartbeat. A body fighting an infection required blood to be pumped through it at a faster than normal rate. “The heart rate of an old man with pneumonia is not supposed to be normal!” said Redelmeier. “It’s supposed to be ripping along!” An old man with pneumonia whose heart rate appears normal is an old man whose heart may well have a serious problem. But the normal reading on the heart rate monitor created a false sense in doctors’ minds that all was well. And it was precisely when all seemed well that medical experts “failed to check themselves.”

As it happens, a movement was taking shape right then and there in Toronto that came to be called “evidence-based medicine.” The core idea of evidence-based medicine was to test the intuition of medical experts—to check the thinking of doctors against hard data. When subjected to scientific investigation, some of what passed for medical wisdom turned out to be shockingly wrong-headed. When Redelmeier entered medical school in 1980, for instance, the conventional wisdom held that if a heart attack victim suffered from some subsequent arrhythmia, you gave him drugs to suppress it. By the end of Redelmeier’s medical training, seven years later, researchers had shown that heart attack patients whose arrhythmia was suppressed died more often than the ones whose condition went untreated. No one explained why doctors, for years, had opted for a treatment that systematically killed patients—though proponents of evidence-based medicine were beginning to look to the work of Kahneman and Tversky for possible explanations. But it was clear that the intuitive judgments of doctors could be gravely flawed: The evidence of the medical trials now could not be ignored. And Redelmeier was alive to the evidence. “I became very aware of the buried analysis—that a lot of the probabilities were being made up by expert opinion,” said Redelmeier. “I saw error in the way people think that was being transmitted to patients. And people had no recognition of the mistakes that they were making. I had a little unhappiness, a little dissatisfaction, a sense that all was not right in the state of Denmark.”

Toward the end of their article in Science, Daniel Kahneman and Amos Tversky had pointed out that, while statistically sophisticated people might avoid the simple mistakes made by less savvy people, even the most sophisticated minds were prone to error. As they put it, “their intuitive judgments are liable to similar fallacies in more intricate and less transparent problems.” That, the young Redelmeier realized, was a “fantastic rationale why brilliant physicians were not immune to these fallibilities.” He thought back to the errors he had made while trying to solve math problems. “The same problem solving exists in medicine,” he said. “In math you always check your work. In medicine, no. And if we are fallible in algebra, where the answers are clear, how much more fallible must we be in a world where the answers are much less clear?” Error wasn’t necessarily shameful; it was merely human. “They provided a language and a logic for articulating some of the pitfalls people encounter when they think. Now these mistakes could be communicated. It was the recognition of human error. Not its denial. Not its demonization. Just the understanding that they are part of human nature.”

But Redelmeier kept to himself any heretical thoughts he harbored as a young medical student. He had never felt the impulse to question authority or flout convention, and had no talent for either. “I was never shocked and disappointed before in my life,” he said. “I was always very obedient. Law-abiding. I vote in all elections. I show up at every university staff meeting. I’ve never had an altercation with the police.”
