Smarter Faster Better: The Secrets of Being Productive in Life and Business

Just two decades earlier, flying from Rio to Paris had been a much more taxing affair. Prior to the 1990s and advances in cockpit automation, pilots were responsible for calculating dozens of variables during a flight, including airspeed, fuel consumption, direction, and optimal cruising altitude, all while monitoring weather disturbances, discussions with air traffic control, and the plane’s position in the sky. Such trips were so demanding that pilots often rotated responsibilities. They all knew the risks if vigilance waned. In 1987, a pilot in Detroit had become so overwhelmed during takeoff that he had forgotten to set the wing flaps. One hundred and fifty-four people died in the ensuing crash. Fifteen years before that, pilots flying near Miami had become fixated on a faulty landing gear light and had failed to notice that they were gradually descending. One hundred and one people were killed when the craft slammed into the Everglades. Before automated aviation systems were invented, it wasn’t unheard of for more than a thousand people to die each year in airplane accidents, often because pilots’ attention was stretched too thin or because of other human errors.

The plane flying from Rio to Paris, however, had been designed to eliminate such mistakes by vastly reducing the number of decisions a pilot had to make. The Airbus A330 was so advanced that its computers could automatically intervene when problems arose, identify solutions, and then tell pilots, via on-screen instructions, where to direct their focus as they responded to computerized prompts. In optimal conditions, a human might fly for only about eight minutes per trip, during takeoff and landing. Planes like the A330 had fundamentally changed piloting from a proactive to a reactive profession. As a result, flying was easier. Accident rates went down, and airlines’ productivity soared because more customers could travel with smaller crews. A transoceanic flight had once required as many as six pilots. By the time of Flight 447, thanks to automation, Air France needed only two people in the cockpit at any given time.

Four hours into the trip, midway between Brazil and Senegal, the plane crossed the equator. Most of the passengers would have been asleep. There were clouds from a tropical storm in the distance. The two men in the cockpit remarked on static electricity dancing across the windows, a phenomenon known as St. Elmo’s fire. “I’m dimming the lighting a bit to see outside, eh?” said Pierre-Cedric Bonin, the pilot whose wife was in the passenger cabin. “Yes, yes,” the captain replied. There was a third aviator in a small hold behind the cockpit, taking a nap. The captain summoned the third man to switch places, and then left the two junior pilots at the controls so he could sleep. The plane was flying smoothly on full autopilot at thirty-two thousand feet.

Twenty minutes later there was a small bump from turbulence. “It might be a good idea to tell the passengers to buckle up,” Bonin informed a stewardess over the intercom. As the air surrounding the cockpit cooled, three metal cylinders jutting from the craft’s body—the pitot tubes, which measure airspeed by detecting the force of air flowing into them—became clogged with ice crystals. For almost a hundred years, aviators have complained about, and safely accommodated, ice in pitot tubes. Most pilots know that if their airspeed measurement plunges unexpectedly, it’s likely because of clogged pitot tubes. When the pitot tubes on Flight 447 froze over, the plane’s computers lost airspeed information and the auto-flight system turned off, as it was programmed to do.

A warning alarm sounded.

“I have the controls,” Bonin said calmly.

“Okay,” his colleague replied.

At this point, if the aviators had done nothing at all, the plane would have continued flying safely and the pitot tubes would have eventually thawed. But Bonin, perhaps shaken out of a reverie by the alarm and wanting to offset the loss of the autopilot, pulled back a bit on the command stick, causing the plane’s nose to nudge upward and the aircraft to gain altitude. Within one minute, it had ascended by three thousand feet.

With Flight 447’s nose now pointed slightly upward, the plane’s aerodynamics began to change. The atmosphere at that height was thin, and the ascent had disrupted the smooth flow of air over the plane’s wings. The craft’s “lift”—the basic force of physics that pulls airplanes into the sky because there is less pressure above a wing than below it—began deteriorating. In extreme conditions, this can cause an aerodynamic stall, a dangerous situation in which a plane starts falling, even as its engines strain with thrust and the nose points skyward. A stall is easy to overcome in its early stages. Simply lowering the nose so air begins flowing smoothly over the wings prevents a stall from emerging. But if a plane’s nose remains upward, a stall will become worse and worse until the airplane drops like a stone in a well.

As Flight 447 rose through the thin atmosphere, a loud chime erupted in the cockpit and a recorded voice began warning, “Stall! Stall! Stall! Stall!,” indicating that the plane’s nose was pointed too high.

“What’s this?” the copilot said.

“There’s no good…uh…no good speed indication?” Bonin responded. The pitot tubes were still clogged with ice and so the display did not show any airspeed.

“Pay attention to your speed,” the copilot said.

“Okay, okay, I’m descending,” Bonin replied.

“It says we’re going up,” the copilot said, “so descend.”

“Okay,” said Bonin.

But Bonin didn’t descend. If he had leveled the plane, the craft would have flown on safely. Instead, he continued pulling back on the stick slightly, pushing the airplane’s nose further into the sky.



Automation has today penetrated nearly every aspect of our lives. Most of us now drive cars equipped with computers that automatically engage the brakes and reduce transmission power when we hit a patch of rain or ice, often so subtly we never notice the vehicle has anticipated our tendency to overcorrect. We work in offices where customers are routed to departments via computerized phone systems, emails are automatically sent when we’re away from our desks, and bank accounts are instantaneously hedged against currency fluctuations. We communicate with smartphones that finish our words. Even without technology’s help, all humans rely on cognitive automations, known as “heuristics,” that allow us to multitask. That’s why we can email the babysitter while chatting with our spouse and simultaneously watching the kids. Mental automation lets us choose, almost subconsciously, what to pay attention to and what to ignore.

Automations have made factories safer, offices more efficient, cars less accident-prone, and economies more stable. By one measure, there have been more gains in personal and professional productivity in the past fifty years than in the previous two centuries combined, much of it made possible by automation.

But as automation becomes more common, the risk that our attention will fail has risen. Studies from Yale, UCLA, Harvard, Berkeley, NASA, the National Institutes of Health, and elsewhere show that errors are particularly likely when people are forced to toggle between automaticity and focus, and that such errors are unusually dangerous as automated systems infiltrate airplanes, cars, and other environments where a misstep can be tragic. In the age of automation, knowing how to manage your focus is more critical than ever before.

Take, for instance, Bonin’s mindset when he was forced to take control of Flight 447. It is unclear why he continued guiding the plane upward after agreeing with his copilot that they should descend. Maybe he hoped to climb above the storm clouds on the horizon. Perhaps it was an unintentional reaction to the sudden alarm. We will never know why he didn’t return the controls to neutral once the stall warning sounded. There is significant evidence, however, that Bonin was in the grip of what’s known as “cognitive tunneling”—a mental glitch that sometimes occurs when our brains are forced to transition abruptly from relaxed automation to panicked attention.
