The Real Anthony Fauci: Bill Gates, Big Pharma, and the Global War on Democracy and Public Health

Dr. Fauci never doubted that he wanted to be a doctor, commenting that in high school, “[T]here really was no question that I was going to be a physician. I think there was subliminal stimulation from my mother, who, right from the very beginning when I was born, wanted me to be a physician.”6

Dr. Fauci earned his medical degree from Cornell in 1966, graduating first in his class. Like his wife, NIH’s Bioethics Department Director Dr. Christine Grady, Dr. Fauci is a lifelong germaphobe, but he confesses that he went into virology and immunology not so much to kill bugs as to avoid combat service in Vietnam: “I left Cornell and went into my internship and residency in 1966. That was at the exponential phase of the Vietnam War, and every single physician went into military service. I can remember very clearly when we were gathered in the auditorium at Cornell early in our fourth year of medical school. The recruiter from the Armed Forces came there and said, ‘Believe it or not, when you graduate from medical school at the end of the year, except for the two women, everyone in this room is going to be either in the Army, the Air Force, the Navy, or the Public Health Service. So, you’re going to have to take your choice. Sign up and give your preferences.’ So I put down Public Health Service as my first choice and then the Navy. Essentially, I came down to the NIH because I didn’t have any choice.”7

The US Public Health Service was a heavily militarized public health agency led by its uniformed officer corps, including the surgeon general; it had grown out of the young republic’s network of marine hospitals for sick and disabled seamen. NIH, its research arm, expanded dramatically during World War II to support soldiers’ health. As infectious disease mortalities in the US dwindled by the mid-1950s, NIH maintained its relevance by declaring war on cancer.8,9

“I was very lucky because I knew that it was a phenomenal scientific opportunity. I wanted to learn some basic cellular immunology with the ultimate aim of going into what has been my theme for the past twenty-one years—human immunobiology and the regulation of the human immune system.”10

After completing his residency at Cornell Medical Center, Dr. Fauci joined NIH in 1968 as a clinical associate at NIAID, one of NIH’s roughly two dozen sub-agencies. In 1977, he became deputy clinical director of NIAID. His specialty was applied research in immune-mediated illness, a subject of increasingly grave national concern. Oddly, he would spend the next fifty years largely ignoring the exploding incidence11 of autoimmunity and allergic diseases, except to the extent they created profitable markets for new pharmaceuticals. Dr. Fauci became NIAID’s director on November 2, 1984, just as the AIDS crisis was spiraling out of control.

NIAID: A Sleepy, Irrelevant Agency

When Dr. Fauci assumed leadership of NIAID, the agency was a backwater. Allergic and autoimmune disorders were hardly a factor in American life. Peanut allergies, asthma, and autoimmune diseases (e.g., diabetes and rheumatoid arthritis) were still so rare that their occasional occurrences in schoolchildren were novelties. Most Americans had never seen a child with autism; only a tiny handful would recognize the term until the 1988 film Rain Man introduced it into the vernacular. Cancer was the disease Americans increasingly feared, with nearly all the attention at NIH and the bulk of federal health funding going to the National Cancer Institute (NCI).

Worst of all, by the era of Dr. Fauci’s ascendance as an ambitious bureaucrat at NIAID, infectious diseases were no longer a significant cause of death in America. Dramatic improvements in nutrition, sanitation, and hygiene had largely abolished the frightening mortalities from mumps, diphtheria, smallpox, cholera, rubella, measles, pertussis, puerperal fever, influenza, tuberculosis, and scarlet fever.12 The scourges that had decimated earlier generations of Americans had dwindled to a shadow of their former lethality. In 1900, one-third of all US deaths were linked to infectious diseases (e.g., pneumonia, tuberculosis, and diarrhea and enteritis); by 1950, infectious disease mortality had fallen dramatically, apart from the spike of the 1918 Spanish flu, leveling off in the 1950s at roughly the share we see today, about 5 percent of all US deaths.13

Annual deaths from communicable disease dropped from roughly 800 per hundred thousand population in 1900 to around 50 per hundred thousand by the 1980s.14 By mid-century, more Americans were dying of old age and heart attacks than of contagious illnesses.15

At NIAID and at its sister agency, CDC, the bug hunters were sliding into irrelevance. NIAID’s heyday at the forefront of the war against deadly pestilence was a distant memory. NIH had once mobilized scientists to track the epidemics of cholera, Rocky Mountain spotted fever, and the 1918 Spanish flu contagion that infected and killed millions globally.

Today CDC and NIAID promote the popular orthodoxy: that intrepid public health regulators, armed with innovative vaccines, played the key role in abolishing mortalities from these contagious illnesses. Both science and history dismiss this self-serving mythology as baseless. As it turns out, the pills, potions, powders, surgeries, and syringes of modern medicine played only a minor role in the historic abolition of infectious disease mortalities.

An exhaustive 2000 study by CDC and Johns Hopkins scientists published in Pediatrics, the official journal of the American Academy of Pediatrics, concluded, “Thus vaccination does not account for the impressive declines in [infectious disease] mortality seen in the first half of the [20th] century . . . nearly 90 percent of the decline in infectious disease mortality among US children occurred before 1940, when few antibiotics or vaccines were available.”16

Similarly, a comprehensive 1977 study by McKinlay and McKinlay, formerly required reading in almost all American medical schools, found that all medical interventions, including vaccines, surgeries, and antibiotics, accounted for only about 1 percent, and at most 3.5 percent, of the decline.17 Both CDC and the McKinlays attributed the disappearance of infectious disease mortalities not to doctors and health officials, but to improved nutrition and sanitation, the latter credited to strict regulation of food preparation, electric refrigerators, sewage treatment, and chlorinated water. The McKinlays joined Harvard’s iconic infectious disease pioneer, Edward Kass, in warning that a self-serving medical cartel would one day try to claim credit for these public health improvements as a pretense for imposing unwarranted medical interventions (e.g., vaccines) on the American public.

As the McKinlays and Kass18 had predicted, vaccinologists successfully hijacked the astonishing success story of the first half of the twentieth century, the dramatic 74 percent decline in infectious disease mortalities, and deployed it to claim for themselves, and particularly for vaccines, a revered, sanctified, and scientifically undeserved prestige beyond criticism, questioning, or debate.
