Book Review: At Home by Bill Bryson

For my infrequent book reviews I have tried to focus on books that are explicitly relevant to the issues and themes of this blog. However, I wanted to call attention to a very enjoyable book that, honestly, is only marginally related to science-based and alternative medicine: At Home by Bill Bryson.

It is an entertaining, historical buffet of domestic life that uses, as a very loose framing device, a tour of Mr. Bryson’s house to set up fascinating meanderings through the history of domestic life. What did people eat, wear, sleep on and so forth in the centuries leading up to modern diet, clothing, and furniture? Who really thought up the flush toilet? When was childhood invented? All sorts of intriguing and loosely related questions such as these are raised and at least partially answered. Most of this has little to do with my usual subjects. However, the book does offer a few illustrative examples of medical history which I want to share.

Two of the core problems in getting people to recognize the superiority of scientific methods over traditional ways of investigating health and disease are 1) a lack of appreciation for how dramatically more successful scientific medicine has been compared to the thousands of years of pre-scientific medicine and 2) a failure to understand how unreliable our commonsense and personal experience are when it comes to medicine. A knowledge of history can be a powerful tool in overcoming these problems. And Mr. Bryson provides a few very telling examples of pre-scientific medical theories and practices that persisted for hundreds or thousands of years despite being wildly wrong.

It is very important that I stress the true significance of these examples. It is not simply that previous generations had a lot of stupid or crazy ideas and gosh isn’t it great that now we know better. Since the evolution of at least the Cro-Magnons, people have been just as smart as any of us around today. The wheel and the stirrup required just as much brain power to think up as the semiconductor. Or, looked at from a different angle, we are just as blind and likely to be fooled as our ancestors. The lesson of medical history, how foolish ideas were born, spread, and became intractable dogma, is not that we are smarter now than our ancestors. The lesson is that we are, in fact, the same as the cavemen, or the Sumerians, or the Romans, or the Victorians, and that foolish ideas can just as easily be born, spread, and become intractable dogma now as they ever could if we fail to accept our limitations and use the tools of science bequeathed to us.

What we have now that is fundamentally different from what preceding generations had is not a feature of ourselves. It is a method, an approach that our ancestors discovered and which we are still improving. And we have the information this method has generated, which we can preserve and transmit more easily than ever before and which we can build on. In short, we have science and the technologies it has helped us to produce.

One example of ideas about health that are now well known to be false, but that made perfect sense to former generations, is the relationship between sexual and reproductive functions and health. The Victorians, in particular, despite a technological sophistication greater than had previously been achieved, had some bizarre notions concerning sex and health. For example:

For men, the principal and preoccupying challenge was not to spill a drop of seminal fluid outside the sacred bounds of marriage–and not much there either, if they could decently manage it. As one authority explained, seminal fluid, when nobly retained within the body, enriched the blood and invigorated the brain. The consequence of discharging this natural elixir illicitly was to leave a man literally enfeebled in mind and body. So even within marriage one should be spermatozoically frugal, as more frequent sex produced “languid” sperm, which resulted in listless offspring. Monthly intercourse was recommended as a safe maximum.

It is relatively easy to imagine how such notions could arise, and in the absence of rigorously controlled observations they would be perpetuated by “authorities” in medicine. Case studies (a “sciency” word for anecdotes or testimonials) were used to support such notions, just as they are all too often used to justify medical theories and practices today:

Case studies vividly drove home the risks. A medical man named Samuel Tissot described how one of his patients drooled continuously, dripped watery blood from his nose, and “defecated in his bed without noticing it.”

It may seem obviously ridiculous to assign blame for such symptoms to a history of masturbation, as this “authority” did, but the apparent correlation was undoubtedly just as obvious to medical practitioners of the time, and it is only through systematic, controlled observations that we can weed out such spurious, fanciful connections from true cause/effect relationships.

Another bizarre and quite long-standing notion about health that we have fortunately discarded concerns the subject of cleanliness. The Romans were fond of frequent, lengthy, and complicated bathing practices for many reasons, including the belief that it promoted health. However, with the rise of Christianity in Europe, and the loss of classical knowledge during the Middle Ages, this idea was reversed.

Christianity was always curiously ill at ease with cleanliness anyway, and early on developed an odd tradition of equating holiness with dirtiness. When Thomas à Becket, archbishop of Canterbury, died in 1170, those who laid him out noted approvingly that his undergarments were “seething with lice…”

Then in the Middle Ages the spread of plague made people consider more closely their attitude to hygiene and what they might do to modify their own susceptibility to outbreaks. Unfortunately, people everywhere came to exactly the wrong conclusion. All the best minds agreed that bathing opened the epidermal pores and encouraged deathly vapors to invade the body. The best policy was to plug the pores with dirt. For the next six hundred years most people didn’t wash, or even get wet, if they could help it–and in consequence they paid an uncomfortable price. Infections became part of everyday life. Boils grew commonplace. Rashes and blotches were routine. Nearly everyone itched all the time. Discomfort was constant, and serious illness was accepted with resignation.

Again, it is easy but misguided to snicker at such notions and congratulate ourselves on our more enlightened understanding. It is not our superior intelligence, nor even solely the invention of the microscope and its aid in the discovery of germs, that allows us to scoff at such beliefs. Even with the technological and historical advantages we possess, equally absurd notions concerning hygiene exist today: from colon cleansing to detoxification to the surprising number of chiropractors and other alternative medicine advocates today who still deny the germ theory of disease.

Interestingly, the notion that clogging one’s pores with dirt is healthy was replaced by the equally bogus notion that clogged pores were themselves a cause of disease. This seems eerily familiar to those of us confronted with contemporary theories about “toxins” and “accretions” in our colon promoted by advocates of some CAM methods.

Now instead of it being bad to have pink skin and open pores, the belief took hold that the skin was in fact a marvelous ventilator–that carbon dioxide and other toxic inhalations were expelled through the skin, and that if the pores were blocked by dust and other ancient accretions natural toxins would become trapped within and would dangerously accumulate. That’s why dirty people–the Great Unwashed of Thackeray–were so often sick. Their clogged pores were killing them. In one graphic demonstration, a doctor showed how a horse, painted all over in tar, grew swiftly enfeebled and piteously expired.

Without systematic, controlled methods for observing how healthy and ill patients respond to preventative and treatment measures, we are destined to lurch blindly from one wild theory to another, even accepting mutually incompatible notions in sequence or at the same time, as so often happens in the world of CAM.

One of the classic examples of the critical importance of systematized observations and record-keeping is the discovery of the cause of cholera. When cholera was rampaging through the cities of England in the 19th century, nobody understood what caused it.

“What is cholera?” The Lancet wrote in 1853. “Is it a fungus, an insect, a miasma, an electrical disturbance, a deficiency of ozone, a morbid off-scouring of the intestinal canal? We know nothing.”

The most common belief was that cholera and other terrible diseases arose from impure air.

Many smart, educated people accepted this miasma theory, which had been around at least since classical times. The individual who first identified the real source of the cholera infection was John Snow. But more than this, he was a founding figure of modern epidemiology, a key component of modern evidence-based medicine.

Snow’s lasting achievement was not just to understand the cause of cholera but also to collect the evidence in a scientifically rigorous manner. He made the most careful maps showing the exact distributions of where cholera victims lived. These made intriguing patterns. For instance, Bethlehem Hospital, the famous lunatic asylum, had not a single victim, while people on the facing streets in every direction were felled in alarming numbers. The difference was that the hospital had its own water supply… while people outside took their water from public wells.

Of course, the habit of trusting conclusions derived from systematic observation wasn’t yet established (and isn’t always present today), so Snow’s explanation was dismissed during his own lifetime. And even today, relics of the miasma theory persist in some CAM disciplines, such as homeopathy, which views microorganisms as causing disease not through their effects on the body but through changes in the spiritual “vital force” that are then transmitted to offspring; changes called miasms.

Finally, Bryson paints a picture of childhood that is horrific to the eyes of most modern citizens of prosperous industrial nations, but that represents the reality of the overwhelming majority of human history.

Life was full of perils from the moment of conception. For mother and child both, the most dangerous milestone was birth itself. When things went wrong, there was little any midwife or physician could do. Doctors, when called in at all, frequently resorted to treatments that only increased the distress and danger, draining the exhausted mother of blood (on the grounds that it would relax her–then seeing loss of consciousness as proof of success), padding her with blistering poultices or otherwise straining her dwindling reserves of hope and energy.

Such therapies were not employed because doctors were callous or stupid, but because they had only the authority of their mentors and the evidence of their own experience to guide them, and these things made such treatments look as though they were working even when they were killing the patient.

It is frequently claimed that among children in the pre-modern age, “one third died in their first year of life, and half failed to reach their fifth birthday.” Bryson discusses some more scientific statistics that suggest,

infant mortality was not quite as bad as figures now generally cited would encourage us to suppose. [In one city with detailed records], slightly over a quarter of babies died in their first year, and 44 percent were dead by their seventh birthday…Not until seventeen years had passed did the proportion of deaths…reach 50 percent.

When the most optimistic figures show 25% of infants dead by 1 year, 44% by 7 years, and 50% dead before what we now consider to be adulthood, it is a powerful statement about how dramatically the scientific method, applied to sanitation, nutrition, and healthcare, has changed the world as nothing before it ever did.

It is a cliché to say that those who are ignorant of the past are doomed to repeat it, but it is an apt and applicable cliché when it comes to much of modern unscientific medical theory and practice. The history of medicine has a lot to teach us about the dangers of relying on intuition, tradition, authority, anecdote, and personal experience in trying to understand and influence health and disease. And the dramatic impact of scientific methods on the well-being of humankind is impossible to properly appreciate without an understanding of how people suffered before the advent of scientific medicine, and how ineffectual most approaches to understanding and relieving this suffering were in pre-scientific times. Winston Churchill is credited with saying, more or less, “Democracy is the worst possible system of government, except for all the others that have been tried.” The same is very much true of science and science-based medicine. It is full of flaws and shortcomings and is thoroughly imperfect. Yet it is far more effective than anything else we’ve ever tried, and history can teach us this.

Most of Bryson’s book is not about health and disease, or even science, but the myriad aspects of domestic life that we tend to take for granted. Whether or not one is interested in the history of medicine, it is a worthwhile book to read.


2 Responses to Book Review: At Home by Bill Bryson

  1. Janet Camp says:

    I have read several of Bryson’s books and find them delightful. I’ll be sure to pick this one up as well. I wrote a paper in college once (Anthropology major) in which I described general environmental conditions prior to the industrial revolution, similar in nature to the things you quote from the book, although related more to the state of water, streets, air quality, and working conditions than medicine. I did it because I kept hearing people going on about the environment (this was the late 70’s) in a way that seemed nostalgic for some earlier and purer time. I also recommended that they read Upton Sinclair’s “The Jungle” if they thought it was only London and only before the industrial revolution.

  2. skeptvet says:

    I was a literature major (as well as biology), and I remember being astounded at the depictions of nature and places outside of human settlements in any literature before the 19th century. The un-tamed world was loathed and feared because it was bloody dangerous! Now that we’ve destroyed so much of it, and begun to understand the consequences of doing so, we idealize it, and that’s not an entirely bad thing since it motivates us to try and care better for what’s left (though frankly I doubt we’ll ever be able to achieve meaningful conservation until we get control of our population growth, which doesn’t look all that likely; but that’s another topic). Still, just as we romanticize the health and diet of pre-industrial times, we also romanticize the natural environment because we are isolated from it. That story Into the Wild is a classic example of the dangerous ignorance most people in industrial nations have of real nature.
