For 2,400 years patients believed doctors were doing good; for 2,300 years they were wrong. Until the invention of antibiotics in the 1940s, doctors, in general, did their patients more harm than good. Why do bad ideas die hard?
Ed and Ron discussed two important books that illustrate the need for us to constantly challenge our core assumptions about the way the world works: Bad Medicine: Doctors Doing Harm Since Hippocrates, by David Wootton, and The Butchering Art: Joseph Lister's Quest to Transform the Grisly World of Victorian Medicine, by Lindsey Fitzharris.
The Butchering Art
Joseph Lister, April 5, 1827 – February 10, 1912 (aged 84).
Testing ether during surgery in London, 1846, dismissed as the "Yankee dodge" (ether itself was discovered as early as 1275!); first used in surgery in the USA in 1842.
One surgeon could amputate a leg in under 30 seconds (and once, in his haste, took the patient's testicles too).
Between 1843 and 1859, 41 medical students died from infections they contracted.
Lister almost quit medicine to become a preacher after his brother died.
The microscope played a big role in Lister’s work, but most doctors thought it was a toy.
Lister's germ theory was rejected by The Lancet, the leading medical journal.
He created disciples: Listerians, who saw his methods work in hospitals.
People who never doubted Lister’s work: survivors!
Doctors in the USA remained unconvinced until the mid-1870s. Massachusetts General was the first hospital to use Lister's methods, in 1877.
Listerine was invented and marketed by Dr. Joseph Joshua Lawrence in 1879, in Pennsylvania; Johnson & Johnson was also formed around this time, first selling sterile dressings and sutures.
“New Opinions are always suspected, and usually opposed, without any other reason but because they are not already common.” - John Locke
One of Lister’s assistants said, “A new and great scientific discovery is always apt to leave in its trail many casualties among the reputations of those who have been champions of an older method. It is hard for them to forgive the man whose work has rendered their own of no account.”
“When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is almost certainly wrong.” –Arthur C. Clarke
Bad Medicine: Doctors Doing Harm Since Hippocrates
“We know how to write histories of discovery and progress, but not how to write histories of stasis, of delay, of digression. We know how to write about the delight of discovery, but not about attachment to the old and resistance to the new.”
Bad Medicine Drives Out Good Medicine
The history of medicine begins with Hippocrates in the fifth century BC. Yet until the invention of antibiotics in the 1940s, doctors, in general, did their patients more harm than good.
In other words, for 2,400 years patients believed doctors were doing good; for 2,300 years they were wrong.
From the first century BC to the mid-nineteenth century, the major therapy was bloodletting, performed with a special knife called a lancet.
Interestingly enough, that is the title of today’s prestigious English medical journal, The Lancet. Bad ideas die hard. “The lancet was the magician’s wand of the dark ages of medicine,” according to Oliver Wendell Holmes.
Bloodletting had its opponents of course, but the debate was over where in the body to draw the blood from, not over its effectiveness.
Four treatments were used for 2,000 years: emetics, purgatives, bloodletting, and cautery; they remained standard therapies even longer than that.
The Case Against Medicine
The author makes three devastating arguments.
First, if medicine is defined as the ability to cure diseases, then there was very little medicine before 1865. Prior to that—a period the author calls Hippocratic medicine—doctors relied on bloodletting, purges, cautery, and emetics, all totally ineffectual, if not positively deleterious (no matter how efficiently they were administered).
The term iatrogenesis describes how doctors do harm while trying to do good. It is estimated that one-third of the good done by modern medicine is attributable to the placebo effect; since the therapies of Hippocratic medicine were actively harmful, medicine up to 1865 was less effective than a placebo is today.
Second, effective medicine could only begin when doctors began to count and compare, such as using clinical trials.
Third, the key development that made modern medicine possible is the germ theory of disease.
This is not to say that advances in knowledge were not made prior to 1860. Unfortunately, those advances had no pay-off in terms of advances in therapy, or what Wootton calls technology—that is, therapies, treatments, and techniques to cure.
So until the 1860s, doctors had knowledge of what was wrong but could only use it to predict who would live and who would die.
Wootton describes how the advances in knowledge did not change therapies, in perhaps the most devastating conclusion in the book:
The discovery of the circulation of the blood (1628), of oxygen (1775), of the role of haemoglobin (1862) made no difference; the discoveries were adapted to the therapy [bloodletting] rather than vice versa.
...[I]f you look at therapy, not theory, then ancient medicine survived more or less intact into the middle of the nineteenth century and beyond.
Strangely, traditional medical practices—bloodletting, purging, inducing vomiting—had continued even while people’s understanding of how the body worked underwent radical alteration.
The new theories were set to work to justify old practices. [Emphasis added].
In a reversal of the scientific method, the therapies guided the theory, not the other way around.
Diffusing a new theory into a population is no easy task, nor is it quick. Wootton describes in captivating detail how various innovations in medicine were rejected by the medical establishment (the following list is much longer):
Examples of delay and resistance
Joseph Lister is credited with putting germ theory into practice in 1865, yet there was considerable evidence for the theory dating back to 1546, and certainly by 1700. Before then, infections were thought to be caused by stale air and water (even Florence Nightingale believed this).
Wootton says 1865 was a turning point, not a transformation; on his account, medicine only started extending life in the 1950s.
Even though by 1628 it was understood that the heart pumped blood through the arteries, the use of tourniquets in amputations didn’t happen until roughly a century later.
The microscope was invented by 1677 (around the same time as the telescope, which led to new discoveries in astronomy), yet as late as 1820 it had no place in medical research, being regarded as nothing more than a toy.
Penicillin was first discovered in 1872, not in 1941 as popularly believed; its effectiveness was doubted for nearly 70 years.
The theory that bacteria, not stress, cause stomach ulcers was met with considerable resistance for over a decade. This story is told in a fascinating book, The Great Ulcer War, by William S. Hughes.
Anesthesia was discovered to kill pain by 1795; it was first used on animals in 1824, then by dentists. It wasn't used by doctors in surgery until 1846, in London, where it was derisively labeled the "Yankee dodge."
Dentists pioneered anesthesia. One of the first painless dentists, Horace Wells, was driven to suicide by the hostility of the medical profession.
The thermometer was invented in the 17th century, but it was not in common use until 1850 in Berlin, and not until 1865 in New York.
The medical profession resisted the use of statistics and comparative trials for centuries. The first comparative study was conducted in 1575, but it took until 1644 for the next one. Then John Snow's 1855 account of the transmission of cholera through water was rejected for over a decade. The modern clinical trial dates from 1946.
Puerperal fever, or childbed fever, killed between 6 and 9 women in every 1,000 deliveries in the 18th and 19th centuries. In May 1847, Ignaz Semmelweis, a Hungarian doctor, advocated that doctors wash their hands between patient (and cadaver) examinations. The incidence of fever fell dramatically, but he didn't publish his findings until 1860, by which time he was considered an eccentric; he was confined to a lunatic asylum in 1865 and died two weeks later. (Interestingly, even he still believed the disease was caused by stale air.)
Why the delay?
Wootton believes the primary obstacle to progress was neither practical nor theoretical, but psychological and cultural: "it lay in doctors' sense of themselves." Consider the psychological obstacles:
Medicine has often involved doing things to other people that you normally should not do. Think for a moment about what surgery was like before the invention of anesthesia in 1842.
Imagine amputating the limb of a patient who is screaming and struggling. Imagine training yourself to be indifferent to the patient’s suffering, to be deaf to their screams. Imagine developing the strength to pin down the patient’s thrashing body.
Imagine taking pride, above all, in the speed with which you wield the knife, in never having to pause for thought or breath: speed was essential, for the shock of an operation could itself be a major factor in bringing about the patient’s death.
To think about progress, you must first understand what stands in the way of progress—in this case, the surgeon’s pride in his work, his professional training, his expertise, his sense of who he is.
The cultural obstacles, Wootton believes, are based on a somewhat counterintuitive observation: institutions have a life of their own.
Not all actions can be said to be performed by individuals; some are performed by institutions. For instance, a committee may reach a decision that was nobody's first choice.
This is especially true for institutions that are shielded from competition and hermetically sealed in orthodoxy.
In a competitive market, germ theory would have been tested by competing companies and would have diffused into the population much faster than it did within the institutions of the medical community.
Germ theory was adopted because the medical profession knew it was in crisis.
Why is this Relevant to the Professions?
The similarities between bad medicine, the billable hour, timesheets, Frederick Taylor's efficiency metrics, and value pricing are illustrative. Even today, the US Centers for Disease Control reports that 2 million people acquire infections in hospitals, and of those, 90,000 die. The largest cause? Failure to properly wash hands.
In physics the key barriers to progress are most likely theoretical. In oceanography they might be practical. What are the key barriers to progress in the professional knowledge firm?
My VeraSage colleague Tim Williams remarked on Wootton’s book: “It makes me think about Stephen Covey’s premise [in his book, The Seven Habits of Highly Effective People] that if you want to make incremental changes, work on practices. If you want to make significant changes, work on paradigms.”
The problem is, minds are slower to change than markets, especially in the professions.
If a supposed scientific and evidence-based profession is this slow to change, what chance do lawyers, CPAs, and other professionals have to move away from the discredited labor theory of value—the modern-day equivalent of bloodletting?
Will the professions resist change for as long as doctors did? Are the cultural and institutional legacies that entrenched? Do professionals really want to define themselves by how many hours they log on a timesheet?
We do not know, but the evidence seems to suggest the answer is yes. Obviously, burying the billable hour and the timesheet is going to be a very long process indeed. It may not be within reach, but it is certainly within sight.
Other Reading
Ron’s article in the CPA Practice Advisor, “The Diffusion of Value Pricing in the Profession.”