The Last Epidemic

10/23/14
 
from The Wall Street Journal, 10/18/14:

Just a few generations ago, progress against infectious disease convinced Americans that modern medicine had won the battle against microbes. Why is the public so skeptical today?

In the winter of 1947, an American tourist arrived in New York City on a bus from Mexico, feeling feverish and stiff. He checked into a hotel and did some sightseeing before his condition worsened. A red rash now covered his body. He went to a local hospital, which monitored his vital signs and transferred him to a contagious disease facility, where he was incorrectly diagnosed with a mild drug reaction. He died a few days later of smallpox.

By this point, the man had infected at least a dozen New Yorkers, one of whom died. Taking no chances, city officials began a massive but voluntary vaccination campaign against a disease that had killed more people than any other in history. Within weeks, several million New Yorkers took the vaccine. Though health experts still disagree about the danger posed by these isolated smallpox cases, one point remains clear: There was precious little panic. Outside schools, fire stations and hospitals, the vaccination lines snaked for blocks. People didn’t worry about the vaccine’s safety; they feared that there might not be enough vaccine to go around.

Sound familiar? Parts of the 1947 smallpox scare—the sick traveler harboring a deadly disease, the missed hospital diagnosis, the quickly spreading infection—strike a disturbing chord. A key difference between that crisis and our current one with Ebola is, of course, the absence of an effective vaccine—and the fact that Ebola is usually transmitted through close, direct physical contact with the bodily fluids of someone infected.

But Americans in the 1940s had a different mind-set as well. Today many Americans doubt that health authorities can handle the crisis. Back then, by contrast, there was a growing confidence in the power of medical research to solve any problem, tame any epidemic, conquer any disease. It was a confidence grounded in the miracle drugs and vaccines beginning to emerge from university and pharmaceutical laboratories, and in the public health apparatus that had served the nation and its troops so well during World War II.

It hadn’t always been this way. What is truly remarkable about the march of modern medicine is how slow the progress was in the preceding centuries. Though the vaccine for smallpox was discovered by the British doctor Edward Jenner in the 1790s, it didn’t trigger a revolution in medical thinking. Until well into the 1850s, the onset of disease was still attributed to foul-smelling clouds of decomposed matter known as “miasmas,” and the most common remedy was to purge ill patients of supposed impurities until the body’s equilibrium was restored.

It’s hard today to imagine such dangerous foolery passing for mainstream medicine, but let one example suffice. In 1799, a Virginia gentleman suffering from a severe throat infection “procured a bleeder in the neighborhood, who took from his arm, in the night, twelve or fourteen ounces of blood.” Feeling no better, the man sent for his doctors. The first to arrive prescribed an enema and then “two copious bleedings.” Seeing no improvement, a second doctor ordered “ten grains of calomel [a devastating mercury-based drug] succeeded by repeated doses of emetic tartar,” causing a massive discharge “from the bowels.”

Then the real bleeding began. Thirty-two ounces were drawn by lancet, while blisters were applied “to the extremities.” (A person giving eight ounces of blood today must wait two months before donating again.) The man finally told his doctors to stop. “Let me go quietly,” George Washington pleaded, and he did.

The great medical breakthroughs in the mid-19th century came mainly from Europe. Among these was the concept of germ theory proposed by Louis Pasteur, Robert Koch and Joseph Lister. Germ theory linked specific germs to specific diseases, like rabies, cholera and tuberculosis. It taught people to accept the peculiar idea that humans shared their communities, their homes, even their bodies with invisible, often dangerous microorganisms. Put simply, what you didn’t see could make you very ill.

Germ theory spurred the development of modern laboratory research. Its impact on pathology and bacteriology can hardly be overstated. In 1900, the life expectancy for an American man was 46, and for an American woman 48. By 1950, the figures had jumped to 65 and 72 respectively.

Some of this increase can be explained by factors such as better nutrition, cleaner water and the passage of pure food and drug laws. But much of it was due to the vaccines, sulfa drugs and antibiotics aimed at the deadly infections that put children at special risk. In the 1870s, one infant in five born in New York City died in the first year of life. Among those fortunate enough to survive childhood, a quarter did not live to see 30.

Progress came in fits and starts, with devastating setbacks along the way. The influenza pandemic of 1918-1919 killed tens of millions around the globe. Approximately one in four Americans took sick, and a half million died. The number of U.S. soldiers lost to influenza during World War I (44,000) rivaled the number killed by enemy fire (50,000). Army virologists waged an all-out (and moderately successful) campaign to develop an influenza vaccine and began to vaccinate GIs for a host of diseases.

In terms of public confidence, America’s golden age of medicine reached its peak in the 1950s.

It was here that the miracle of the laboratory routed the terror of infectious disease in the most dramatic way imaginable. The disease was polio—also known as infantile paralysis—which descended like a plague upon Americans each summer, killing thousands of children and leaving thousands more in leg braces, wheelchairs and iron lungs. Polio in the 1950s, like Ebola today, put everyone at risk. The fear was palpable.

But Americans channeled these fears into a common purpose, much like the smallpox episode of 1947. Uniting behind Franklin D. Roosevelt’s March of Dimes, they raised hundreds of millions of dollars to find an effective polio vaccine. In a move probably incomprehensible to most parents today, they volunteered their children—almost two million of them—for the massive public trials in 1954 that tested Dr. Jonas Salk’s killed-virus injected polio vaccine. When the results came in, showing the vaccine to be “safe, effective, and potent,” the nation celebrated. At a White House ceremony honoring Salk, President Eisenhower fought back tears as he told the young researcher: “I have no words to thank you. I am very, very happy.”

Ebola is currently dominating the news, and for good reason. Part of an entire continent is at risk. Named for the Ebola River in Central Africa, where it first emerged in 1976, the Ebola virus, like polio and influenza, has several different strains. The reservoir for the virus is uncertain, though bats—the flying mammals that harbor dozens of viruses perilous to humans—are the leading suspects. A bat takes a bite of fruit; it falls to the ground; a primate eats the remains; a villager slaughters the primate—there are multiple variations.

What seems most apparent at this early point is the yawning chasm between public health officials and the public at large. We live in a post-Vietnam, post-Watergate, Internet-obsessed culture, where respect for government pronouncements and expert opinion has dramatically eroded. Distrust is now endemic, and a crisis like Ebola, which few saw coming, much less planned for, only fuels this divide.

Health officials strongly believe that the chances of a major outbreak occurring in the U.S. are slim to none. The disease is not transmitted while the carrier is asymptomatic, and it can only be passed from person to person through the exchange of bodily fluids. A robust public health system—unlike the systems in West Africa—should easily contain its spread.

But the public sees something quite different.

A single traveler arrives in Texas from Liberia. He quickly takes ill with a high fever, visits a hospital and is sent home. Feeling worse by the hour, he returns to the hospital, where he dies. When two nurses who treated the man test positive for the disease, it becomes clear that the hospital had no effective plan in place to deal with the situation. To compound matters, one of the nurses had boarded a plane to visit relatives in Ohio. The possible ring of contamination now extends well beyond Dallas, showing the lightning speed with which an infectious disease can spread in the modern world.

Next week marks the 100th birthday of Jonas Salk. Shortly after his vaccine was declared successful, he sat for a nationally televised interview with Edward R. Murrow. “Who owns the patent on this vaccine?” Murrow asked. “Well, the people, I would say,” Salk replied. “There is no patent. Could you patent the sun?”
