A History of Medical Innovation That Doesn’t Ignore the Side Effects


By Cass R. Sunstein

YOU BET YOUR LIFE
From Blood Transfusions to Mass Vaccination, the Long and Risky History of Medical Innovation
By Paul A. Offit

In his 1983 novella “Worstward Ho,” Samuel Beckett wrote his most famous words: “Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.” The history of medicine consists of trying and failing, trying again, failing again and failing better. Many of those efforts, and many of those failures, produced tragedies and deaths. Following Hippocrates, doctors often pledge to “do no harm,” but if they took that pledge seriously, countless lifesaving innovations would never have been possible. This is true whether we are speaking of chemotherapy, antibiotics, heart transplants or vaccines for Covid-19.

In his new book, “You Bet Your Life,” Paul A. Offit wants to understand the failures and tragedies that help pave the way to medical innovation. For most of human history, anesthesia did not exist. Patients had to be forcibly restrained while their limbs were amputated and their cancers were removed, typically amid piercing screams and unbearable agony. Things did not start to change until the 1840s, when a carnival barker named Gardner Colton charged people 25 cents to sniff “laughing gas,” also known as nitrous oxide, which made them fall down in hysterics and then go to sleep for a few minutes. On Dec. 10, 1844, a dentist named Horace Wells attended Colton’s show. Soon after inhaling the gas (and making a fool of himself), he told a friend that a person could probably “have a tooth extracted or a limb amputated and not feel any pain.”

Wells sought out Colton immediately after the show, and the very next day he became the first person to use nitrous oxide as an anesthetic: While under the gas, he had a fellow dentist extract one of his own teeth. The procedure was painless. Over the following weeks, Wells used nitrous oxide on 15 of his patients. It worked every time. In January 1845, he asked if he could demonstrate his method to specialists in a large amphitheater at the Massachusetts General Hospital. The demonstration failed. Wells gave his patient too little of the anesthetic, and the man woke up during the extraction, in intense pain and screaming. Members of the audience shouted, “Humbug!” Wells was disgraced.

While Wells was experimenting with nitrous oxide, a dentist named William Morton was trying a different anesthetic: ether. In October 1846, Morton was given an opportunity to demonstrate its use in the very same amphitheater in which Wells had failed. After he successfully completed his operation, Morton declared, “Gentlemen, this is no humbug.” The amphitheater is now called the Ether Dome, in tribute to the birth of anesthesia in the United States.

But that’s hardly the end of the story. In 1847, James Young Simpson, a Scottish doctor, discovered that chloroform was more potent than ether — and also that it worked more quickly and did not cause vomiting. It had an enthusiastic reception in Europe. But chloroform was hardly risk-free, and by 1863, it was responsible for 100 deaths, often following minor operations. It was not until the end of World War I that chloroform was phased out in Europe — and it was not until the early 1980s that American doctors stopped using ether. Nitrous oxide continues to be deployed in dentistry, but contemporary anesthetics are much safer and better than anything that Wells, Morton and Simpson could have possibly imagined.

The first heart transplant was undertaken by James Hardy, who put a chimpanzee’s heart into Boyd Rush’s chest in 1964. Rush died within two hours. In 1967, Christiaan Barnard transplanted a human heart into Louis Washkansky, who survived for 18 days. Barnard became an international celebrity, and in 1968, more than 100 heart transplants were performed around the world. But half of the patients died within a month, and only 10 percent were alive two years later. In light of that record of failure, the number of heart transplants dwindled to 16 in 1970 and 17 in 1971. But Norman Shumway, Richard Lower and Richard Caves, among many others, developed a series of innovations, including an essential immunosuppressant, cyclosporine, and Caves’s “bioptome,” a thin piano wire with pincers at the end, which allowed doctors to detect early on that a patient was rejecting a transplant. Doctors now perform about 2,300 heart transplants annually in the United States alone. Patients survive an average of 15 years.

Gerhard Domagk, a researcher at a German pharmaceutical company, aimed to kill streptococcus, one of the world’s deadliest bacteria. His work led to the production of sulfanilamide, an early antibiotic for which he won the Nobel Prize. In a short period, the drug saved thousands of lives from pneumonia alone. But it also brought tragedy. It turned out that diethylene glycol, used as a solvent in Elixir Sulfanilamide, caused fatal kidney failure, killing more than 100 people. As Offit explains, the sulfanilamide disaster helped spur enactment of the Food, Drug, and Cosmetic Act of 1938, which required pharmaceutical companies to list all ingredients on product labels and to perform sufficient safety testing in advance.

We celebrate Jonas Salk, inventor of one of the first polio vaccines. But in some cases his vaccine actually caused the disease. One of the companies that made Salk’s vaccine, Cutter Laboratories, had failed to fully inactivate the live virus in some batches, and thus injected live poliovirus into children. The result was a man-made polio epidemic, one of the worst biological disasters in American history: Tens of thousands of children contracted the disease, and dozens were left permanently paralyzed. Offit puts it gently: “Cutter did many things wrong.” It took decades, and a series of medical advances and twists and turns, for the United States to become polio-free. The rise of X-rays was also accompanied by considerable tragedy, including hundreds of deaths from cancer, before the modern period of generally safe use.

Offit is a good storyteller, and he has some terrific stories to tell. He also draws important lessons. In the domain of medical innovation, tragedies cannot be entirely prevented, no matter how many regulations we put in place. Science moves forward in fits and starts, with blunders, failures and losses along the way. New discoveries are rarely complete on arrival; we inevitably learn more over time. Ours is not a risk-free world, which means that we need to choose the lesser risk. New technologies are always a gamble.

All of those claims are true, but I think that Offit also pulls out an even deeper and more provocative moral from this history. In life and in public policy, many people in Europe and the United States are drawn to the “precautionary principle,” which essentially calls for a high degree of risk aversion: Whenever an innovation threatens to cause harm, we should be exceedingly cautious before we allow it. Offit’s examples, and the history of medical advances, demonstrate that in its most extreme forms, the precautionary principle is self-defeating. Simply put, precautions kill. Whether we are speaking of anesthesia, heart transplants, antibiotics, chemotherapy or blood transfusions, the precautionary principle would have vastly slowed down innovations that, yes, carried serious risk and led to real harm, but were ultimately a great boon to humanity.

None of this means that anything goes. To the extent possible, our judgments should turn on the numbers — on a rigorous assessment of the magnitude of the risks from inaction, from action and from everything in between. The challenge is that when we are innovating, we might not have much information. We might have to start with speculation and guesswork, and learn in real time. We might have to roll the dice with our lives. As Offit shows, there’s heroism in that. Beckett put it this way: “No choice but stand. Somehow up and stand. Somehow stand.”
