Siege
Irradiated by LabRat
So, as I have been reminded in the comments, a while back I made a note to myself to go over one of those strange quirks of a pre-scientific age of medicine: bloodletting.
“Science” as we understand it now didn’t really exist until relatively recently as an organized, systematic pursuit; for the bulk of human history, the closest thing to it was essentially the occupation of bright rich people with lots of time on their hands and a penchant for writing things down. Anybody who was educated was educated in multiple disciplines in multiple languages, and the net result of human progress was those bits of trial, error, vanity, and sheer initiative that got written down and proved to work, in whatever rough-and-ready fashion, for whatever reasons. Sometimes, when everybody was very fortunate, there was math mapping out why it worked.
Biology in general is not highly susceptible to math, and as such the history of the life sciences, especially medicine, is particularly colorful. The bulk of human energy in discovery tends to be in fields that yield the greatest relevance, and while the lives of barnacles may be of intense interest to people with a Darwinian sort of temperament, the health of human beings and the wide and splendid variety of ways they can become unhealthy has always attracted far more attention. Since human bodies can both break in uncountable ways and also in many ways repair themselves, medicine has always had a very high proportion of trial and error in it, and to a large extent (*coughnutritioncough*) still does.
While there are many bizarre ideas and theories littering history that purported to explain disease, treat it, or promote health, perhaps none is quite as iconic as the long-lived and bizarrely counterintuitive practice of bloodletting. If you have an ill patient languishing about, it strikes no one as a good idea to let off a couple pints of his oxygen- and nutrient-carrying life’s fluid. Unless, perhaps, you’re a medieval physician, trying something seems as good an idea as trying nothing, and in any case the practice is passed around in what passes for the medical literature of the time.
For good reason: in just enough cases to be encouraging*, especially when the likeliest outcome anyway is “patient either dies or gets better in spite of the doctor”, it very probably did work.
The reason for this lies in bacterial infections, and the nature of what it takes for bacteria to successfully invade and perpetuate themselves in a human host. While a flawed analogy in many ways, an infection is much like a war: in order for the invading side to get anywhere, they need enough resources to mount the initial attack, establish positions to defend, and continue to press the assault while keeping those positions secure and continuing to fuel the whole operation.
This basic reality underlies a great deal of what the immune system does when it senses invasion. One of the reasons a great many very different pathogens share a basic common symptom set (fever, lethargy, loss of appetite, aches and pains) is that these aren’t things the pathogen itself is doing; they’re things a mammal’s immune system always does in response to any infection. The fever and pains of the flu aren’t the flu virus’s doing- all the virus wants is to shed into whatever fluid is going to be the primary vector of that particular flu. They’re the fault of interleukin-1.
What this essentially does is shift the body from day-to-day productive mode into a siege mode; it saps your energy and makes everything hurt to discourage you from moving around consuming precious resources, and zaps your appetite to remove motivation to get up and do it anyway. It raises your temperature because, even if it’s not an optimal temperature for you, it’s even less optimal for the invaders; there’s a relatively narrow range of temperatures that enzymes work well at, but human tissues are better equipped to cope with working slowly at sub-optimal temperatures than pathogens are at rapidly reproducing (which they must to sustain the infection) at those same too-high temperatures**.
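To make that asymmetry concrete, here’s a purely illustrative toy sketch (mine, not anything from the physiology literature): treat both host enzyme activity and pathogen replication as processes that peak at 37°C and fall off with temperature, and give the host a much wider tolerance band. Every curve shape and number below is an assumption chosen only to show the shape of the tradeoff.

```python
import math

def relative_rate(temp_c, optimum_c, tolerance_c):
    """Relative rate of a temperature-sensitive process: 1.0 at its
    optimum, falling off as temperature moves away, and falling faster
    for a narrower tolerance band."""
    return math.exp(-((temp_c - optimum_c) / tolerance_c) ** 2)

# Host tissues: broad tolerance (assumed width 5 C).
# Pathogen replication: narrow tolerance (assumed width 1.5 C).
for temp in (37.0, 38.5, 40.0):
    host = relative_rate(temp, optimum_c=37.0, tolerance_c=5.0)
    bug = relative_rate(temp, optimum_c=37.0, tolerance_c=1.5)
    print(f"{temp:.1f} C: host at {host:.2f}, pathogen at {bug:.2f}")
```

The point of the made-up numbers: at 40°C the host is still idling along at roughly 70% of normal, while the narrow-tolerance invader has lost nearly all of its replication rate. That asymmetry is the whole bet a fever is making.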
One of the other things the immune system does in response to any infection that we don’t notice as much as the aches and the fever is give us an artificial case of anemia. The immune system throttles the production of new red blood cells way back, monkeywrenches normal cellular iron metabolism, and specialized cells hoard the iron from defunct red blood cells away. This can cause serious problems on its own for someone with a particularly intractable long-term infection (or some cancers), but in the short term it’s another siege-mode action against invasion: the invaders need the iron for active reproduction more than the body needs it to idle along, for roughly the same reasons that a soldier going about maintaining a siege line and making assaults needs more food than a civilian hiding in a basement does. Better to be a bit thin-blooded when you’re feeling too punk to be doing much anyway than that the invaders have that resource available to them.
This is why bloodletting probably DID work from time to time; if you have a blazing infection in which the responsible pathogens are very active and have a huge force of hard-working, fast-reproducing agents, a big sudden case of critical anemia is a big hit to their supply line. It might be enough of a body blow to give the advantage to the host, who can weather a state of deprivation longer than the invaders can… or, of course, it might weaken the patient enough to kill them outright, as it likely did to George Washington. Leeches were even better, because they could be applied to a specific site of infection in the case of a bacterially compromised wound (not that the people doing it had any idea that’s what it was), and could also be used in a far more calculated and measured way than simply whacking a big hole in a vein and bleeding the patient until it seemed like “enough”.
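The supply-line logic can be sketched as a toy simulation. To be clear, this is a cartoon I’m inventing to illustrate the argument, not a model of any real infection: pathogen growth is assumed to scale with available iron, the immune system slowly sequesters iron on its own, and “bloodletting” removes a big fraction all at once.

```python
# Toy "siege" model: pathogen growth needs iron; the immune system
# slowly sequesters it, and bloodletting removes a big chunk at once.
# Every constant here is invented purely to illustrate the argument.
def simulate(bleed_on_day=None, days=20):
    iron = 1.0        # available iron, arbitrary units
    load = 1.0        # pathogen load, arbitrary units
    for day in range(days):
        if day == bleed_on_day:
            iron *= 0.5                  # assumed fraction lost to the bleed
        growth = 0.6 * iron              # replication scales with iron (assumed)
        clearance = 0.35                 # constant immune clearance (assumed)
        load *= 1.0 + growth - clearance
        iron = max(iron - 0.02, 0.2)     # slow sequestration, with a floor
    return load

print(f"final load, untreated:      {simulate():.2f}")
print(f"final load, bled on day 5:  {simulate(bleed_on_day=5):.2f}")
```

Run as-is, the untreated load keeps climbing across the window, while the bled case tips below the clearance threshold and declines. What the toy deliberately leaves out is the cost to the host; as noted above, that was the half of the gamble Washington lost.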
Of course, the practitioners of bloodletting had not the slightest clue that there was a specific and limited case in which their practice could possibly have any productive result, and thought instead that an excess of blood was actively bad for the patient; in addition to the infections in which the practice may have had some chance of helping as much as it hurt, it was also done for arthritis, rheumatism, headaches, melancholy, and pretty much anything else not defined as a state of ideal health- or even as a health-promoting measure. It deserves its reputation as a senselessly harmful practice that was nonetheless widely believed for centuries to be beneficial… it just may have been encouraged along from rare time to time by an actual success, for an actual reason.
*Variable reinforcement again, much more problematic this time.
**Unless it’s getting so high as to actively endanger you, let the fever burn if you’re sick. You’ll be uncomfortable but you’ll also be better much faster.
December 20th, 2010 at 5:32 pm
Neat!
So, next question… what are some modern examples of “things we do that sometimes work, sometimes don’t, we don’t really know why, but they work often enough to keep doing them”?
I’m assuming the “don’t really know why” is at this point a moving frontier, but it’s always there about *something*. Is that accurate?
December 20th, 2010 at 5:47 pm
Great post, and it raises a couple of things I’ve never thought about in relation to bloodletting, like the siege forcing…
December 20th, 2010 at 7:34 pm
Do Kellogg’s Corn Flakes Help Control Masturbation?
Depends on what you use ‘em for.
December 21st, 2010 at 9:36 am
Fascinating. Thanks for going over it.
December 21st, 2010 at 12:01 pm
Somewhat on point- after I almost died in the ’90s of Zimbabwean falciparum malaria (which incidentally, or maybe not so incidentally, EATS red blood cells- see results below), I developed a fascination with parasitology, the evolution of parasites, the paths of arboviruses, and any science about tropical diseases, and collected many books on those subjects.
In one- might be able to find it later- I encountered an experimental treatment for advanced syphilis — 1920s?- in which patients were infected with malaria! Sounds like a bad joke- but in a significant number of cases the malarial fever (HIGH- mine burned off 25 pounds in a week) killed the syphilis off.
Or, reading you- could it have deprived the bacterium of hemoglobin/iron as well/instead??
December 21st, 2010 at 12:22 pm
Steve: Phage (viruses targeting bacteria) treatment for infection is still being researched and used in Eastern Europe and Russia.
There is also the strange case of yaws… it is a disease similar to syphilis that targets bones. Apparently, wiping it out in Africa left the population very susceptible to syphilis infections.
December 21st, 2010 at 12:28 pm
Jenny- oh, that’s a minefield of a question… but one particular answer may actually be how we clinically approach back pain; there seems to be little difference between patients told to rest and patients told to exercise.
Steve- probably both. Treponemes are bastards in that they bunker up in tissues and wait out tough times, but something as persistent and devastating as malaria could indeed burn them out for either reason, or both. Although, when I threw a couple of terms at Google, it looks like high temperatures seriously limit their growth, with “high” being within the possible range of a high but not brain-destroying fever, so I would bet on the fever having the biggest impact…
December 21st, 2010 at 12:45 pm
Got the refs. There is a detailed account in The Malaria Capers by Robert Desowitz (1991) and a short ref in his 2002 Federal Bodysnatchers and the New Guinea Virus. It was used from 1917 through the ’30s; one Wagner-Jauregg actually won a Nobel for it in 1927. Heat looks to be the mechanism, Desowitz mentions interleukin, and there were some nasty Nazi follow-ups.
Desowitz is both a good writer and a former front-line tropical epidemiologist- you’d enjoy his books (I have four and there are more). He is also my go-to guide to DDT — anti-ban, but for targeted use on walls etc., not for broadcast- he knows evolution ;-)
December 21st, 2010 at 3:47 pm
There is a little bit more to bloodletting that starts to make it sound more logical — in terms of what was known at the time.
Now, I approach this from an 18th Century emphasis, but the Galenic dogma (nigh on 2,000 years old at this point) was still going strong, and the upstart Galvanic dogma (“frog legs jump when you electrify them”) didn’t really change the practice of medicine, only the “theory” — the medical students of physicians from the opposing camps might well have had lethal street brawls around the medical college, but they were both likely to do the same thing.
A Galenic (“Four Humors”) trained physician would try to reduce fever by bleeding you “to reduce excess blood and heat, while reducing the friction of the blood in the veins,” and a Galvanic (“nervous stimulation”) trained physician would often prescribe the SAME treatment, only he is trying to “reduce the overstimulation of the nerves”. . . same symptom, same treatment — they just had different justifications.
1. Medicine was (despite the wonderful Galenic or Galvanic dogmas, depending on how late you are looking) largely empirically based — “I did it, it seems to have had positive results more often than negative. I shall do it again.” Any observation was kitbashed to fit inside the previously promulgated dogma (I cannot call them “theories”, although Galvanism comes closer) by interpreting the observation to fit.
2. Medicine was symptomatically based — “Fever is fever. If you have a fever, we shall treat the fever, because if we make the symptoms go away, you’re cured, right? If the fever comes back in 24 hours, well, that’s a new recurrence of fever, but the old fever went away and you were cured.” When you cannot see the actual cause of disease, you focus on what you can see. You had a fever — I bleed you 8 oz. Your temp drops almost immediately. (Hypovolemic shock will tend to do that to a guy. . . ) That’s a cure, right?!? This explains why EVERYBODY got a quinine solution for ANY fever — it worked so often on malaria, and malaria was so prevalent there were good odds that even if it wasn’t the underlying complaint, a chronic but mostly-in-remission case of it was still contributing by affecting your immune system. (Before you scoff, better look in your medicine cabinet for your cold remedies.)
3. No one had even thought of the idea of epidemiology, so what you had to go on were YOUR memories of what worked and what didn’t, combined with the second-, third-, or worse-hand memories of other doctors as to what THEY remembered working. No one likes to consider the times when you failed. . . especially if you have no idea why. (In other words, your mental “statistical universe” just became the textbook example of the psychological effects of variable reinforcement! Or, think back to treating all fevers with quinine, AKA Peruvian Bark tincture — it worked a lot of the time, but no one noticed it ONLY worked on people with malaria.)
4. If you DID read case histories, they were usually one of:
A. Cases so unusual someone figured there was a money-making book in it (Can you say the phrase, “Hard cases make for bad law”? Can you use it in a sentence? Dr. Benjamin Rush could.)
B. Second- or third-hand (yet usually reported as first-hand) stories that had lost enough detail and context, and/or had had “new” facts that fit the writer’s prejudices inserted “where it makes sense”. (Yay! Medical “textbooks” derived from a Tweet of a blog of a radio news caller talking about something he read in the paper! It’s “Dan Brown Teaches Cardiology.”)
C. People just making shit up because it pays. (Yay! “Neurophysiology, by Jayson Brown in collaboration with Dan Rather, Foreword by Al Gore”)
5. There was great resistance to taking a series (LOTS of bodies) of FRESH human cadavers of various ages, sizes, and stages of health, and doing an ORGANIZED and detailed examination of the body through complete dissection, including measuring the amounts of various fluids and recording what “normal” is supposed to look like. The only way to figure the proper “average” volume of blood would be to take a bunch of neatly killed fresh corpses and carefully drain them, measuring the blood and noting correlations between “patients” and their blood volume — weight, fitness, normal altitude lived at, health immediately before death, etc. They tried to do some estimations by comparing blood volumes of animals that were “about man sized”, but that didn’t work out so well — the working estimate varied, but was assumed to be around 10 quarts (about twice what it really is, on average; there’s a rough sketch of the arithmetic below). (“So, you have PLENTY of blood to spare — 8 oz at waking and noontime won’t really hurt you, see?”)
6. Since blood fluid VOLUME replenishes fairly quickly (this is why the Red Cross tells you to drink at least twice as much as usual for the 24 hours after donating a pint), and they had no awareness that blood was anything other than a homogeneous fluid, they figured the replacement of blood volume in 24 hours meant the replacement of BLOOD. (“Hey, it all grows back in a day or so, so we can take 8 oz every other day until you get better, right?”)
7. So many patients died regardless of treatment or lack thereof (and still no formal epidemiology to quantify that) that you would see nothing wrong with reading an inquest report that said, “Treatment for infection successful, but patient faded away over the several weeks of the treatment and finally died from a lack of vigor shortly after the final treatment. Should have purged the patient more violently to stimulate their Galvanic response.”
8. One hypothesis also revolves around a patient pool that was almost universally susceptible to a disorder that WOULD respond (symptomatically, at least) to bloodletting. The 18th Century European diet and culture were almost IDEAL for producing a population that would be almost universally hypertensive by 30. Given the individual journal records we have of what patients went to get bled for, and the results (often reporting nearly IMMEDIATE relief of symptoms, with a success rate well in excess of what your expectation of placebo effect alone would be), well, a lot of them look an awful lot like acute symptoms of chronic high blood pressure. If you are hypertensive, and I drain 8 oz of blood out of you in a single sitting, I will assure you — your blood pressure WILL drop. If your most noticeable acute symptom is caused by that excessive blood pressure (a hypertensive migraine, for instance), the symptom is probably going to vanish. . . until you replenish the fluid volume and a “brand new” migraine pops up. (The sketch below puts rough numbers on what an 8 oz draw takes out of you.)
In other words — bloodletting APPEARED to work very well. And there is the fact that the patient has EVERY expectation the treatment is valid, maximizing placebo effect odds.
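To put rough numbers on points 5 and 8, here is a minimal sketch using Nadler’s formula, the standard modern estimator of total blood volume. The example patient’s height and weight are invented for illustration, and the quart and ounce conversions are approximate.

```python
# Nadler's formula (the standard modern estimator of total blood
# volume) vs. the old ~10 quart working assumption. The example
# patient's height and weight are invented for illustration.
def nadler_blood_volume_l(height_m, weight_kg, male=True):
    if male:
        return 0.3669 * height_m**3 + 0.03219 * weight_kg + 0.6041
    return 0.3561 * height_m**3 + 0.03308 * weight_kg + 0.1833

QUART_L = 0.946   # US quart in liters (approximate)
DRAW_L = 0.237    # an 8 fluid ounce draw in liters (approximate)

modern = nadler_blood_volume_l(1.75, 80.0)   # hypothetical male patient
assumed = 10 * QUART_L                       # the period working estimate
print(f"modern estimate:  {modern:.1f} L ({modern / QUART_L:.1f} qt)")
print(f"period estimate:  {assumed:.1f} L (10 qt)")
print(f"an 8 oz draw is {100 * DRAW_L / modern:.0f}% of the real volume,")
print(f"but only {100 * DRAW_L / assumed:.1f}% of the assumed volume")
```

With a real volume of around five liters, an 8 oz draw is roughly 5% of the patient’s blood: noticeable but survivable. Against the assumed ten quarts it looks like a trivial 2.5%, which is exactly why repeated daily draws seemed harmless on paper.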
One note — while leeches were preferred for infants, women, and other patients (the extremely aged, for instance) whose delicate, thin skin would be too savaged by steel, the fact is that bloodletting via lancet was usually considered better. The lancet never gets temperamental and loses its appetite. The lancet never dies en route. The leech doesn’t have calibrated markings so we can drain a PRECISE 8 oz (or whatever the Dr. orders) — no more, no less. By the last quarter of the 18th Century, many upper-middle-class and up families had their own bloodletting equipment (usually based on multiple shallow cuts, as it was easier for a layman to use without transecting a major vessel).
December 21st, 2010 at 4:05 pm
Oh, and as for only needing to replace the blood volume, and not worrying about red cell replenishment?
At least the old doctors didn’t KNOW that blood had several components, some of which take weeks to regenerate. They had exceeded their observational capacity and can be forgiven for not realizing what they WEREN’T seeing.
We’ve got medical professionals STILL pumping Lactated Ringer’s into trauma casualties, and figuring that getting the BP up means the patient is stable. Except we already KNOW that salt, water, and salty fermented milk sugar DO NOT TRANSPORT OXYGEN worth a darn.
December 21st, 2010 at 4:20 pm
Excellent comment, which goes much more into the whys and wherefores of why medical history is the way it is. I was just trying to get to the siege metaphor.
I’ll throw a pointer to it in my next post.
December 22nd, 2010 at 8:51 am
(Before you scoff, better look in your medicine cabinet for your cold remedies.)
Given what the FedGov has left me there, it might as well be Quinine.
December 24th, 2010 at 4:04 am
While not usually an immediate health risk, an excess buildup of iron is perfectly well treated by bloodletting.
As a male with a meat-heavy diet and a natural tendency toward high levels of iron in the blood, this is one reason I try to donate blood on a semi-regular basis.
Everybody wins in this case - I get to dump some iron so it doesn’t become a long-term issue from chronic buildup, and the recipient gets nice extra-iron-rich blood to help them heal after whatever major trauma caused them to need a donation.
December 25th, 2010 at 6:02 pm
Random —
Word. Hemochromatosis can call for pulling a pint every week, until iron levels are normal, and then doing regular blood donations at minimum standard intervals for maintenance.
Of course, I don’t know if the initial weekly bleeds of a half liter each are eligible for use in the blood bank, since they violate the 8-week deferral period between standard donations. . .
December 28th, 2010 at 1:41 am
I work with a man who is afflicted with a disease called polycythemia vera. His body produces too many red blood cells. Periodically (he calls it “getting his oil changed”) he goes to a phlebotomist and has a hole punched in a vein and lets them draw off some of his blood. Basically, his blood gets too thick to pump properly. Without bloodletting his blood flow is poor, he gets blood clots, and could die.
Geodkyt, I asked him- seems that his affliction prevents the blood from being useful in transfusions for some reason- it gets flushed down the sewer.