Don’t Be Such a Scientist or The Negating Culture of Science and its Costs

I recently finished reading Randy Olson’s book Don’t Be Such a Scientist, and I definitely recommend it. It is an entertaining read and has a number of useful insights into the perennial problem of communicating complex, nuanced, and incomplete scientific information in a way that is engaging and accessible and still in some meaningful sense truthful. I won’t say I agree entirely with Olson’s take on the subject, of course. Sometimes I think he lets the public off too easily, expecting too little from the intelligent lay person. And as always when making generalizations, he seems to stray sometimes into caricature and stereotype. Still, there is no denying that there is currently an enormous divide between mainstream culture and the subculture of science. The days of standing-room-only public science lectures, such as those of the 19th century, are gone. And the days of scientists being seen as respected and trusted white-coated heroes who invented antibiotics, stopped the scourge of polio, and won the Second World War are over too. There is now a profound suspicion of science, and intellectualism generally, in the U.S., and scientists can no longer take for granted that they will be listened to, trusted, or supported by the general public unless they can compete in the busy and bewildering media from which most of us get our understanding of complex issues.

The one concept that struck me most forcefully in Olson’s book was the image of science as fundamentally a negating enterprise. For all the reading and writing I’ve done on the subject of medical research and the dangers of simply seeking to confirm our preconceptions, I never fully appreciated the implications of this for the appearance of science to non-scientists.

It is very difficult to reliably prove an idea true. Certainly informal assessments of our personal experiences almost always confirm our pre-existing beliefs. Confirmation bias, the availability bias, cognitive dissonance, and a host of other such factors make this inevitable. But even scientific research studies, with all their attempts at controlling personal bias, will almost inevitably confirm whatever investigators set out to prove. The best way to get to the truth is to attempt to prove ideas wrong. A negative finding, especially from a source predisposed in favor of the hypothesis, is worth more than a positive finding. Of course, technically one cannot prove a negative. But the failure to disprove an idea despite multiple, vigorous attempts is certainly a more reliable indicator of the idea’s veracity than multiple studies set up to confirm what is already believed to be true.

The implications of this for the culture of science, and the barriers to effective communication between scientists and non-scientists, are profound. Scientists expect criticism and see it as a sign that they and their ideas are being taken seriously. Sure, we are human and so as full of ego and narcissism as anyone else. But by training and experience, most of us acquire relatively thick skins, and we come to see strong challenges to our ideas as a good thing, a kind of intellectual personal trainer that will cause us pain but ultimately make us stronger.

I have been through the peer review process for several publications I have written, and it isn’t pretty. Seeing something I have put months of hard work into torn apart, and facing the prospect of more work to revise what I was sure was perfect to begin with, causes lots of hurt and anger. But at the end of the process, I am generally grateful that the final product is better and that I have been saved the embarrassment of public error. I understand that the criticisms are not personal (the reviews are anonymous, of course, which helps) and I accept the ego bruising as a fair price to pay for weeding out bad work and weak ideas. Likewise, I try as hard as I can to give up beliefs and practices that have been reasonably shown to be wrong, even if I am attached to them and personally convinced of their value. I trust the process, based on the logic of the underlying philosophy and the evidence of history, and this helps me to appreciate the value of the sometimes painful experience of having my ideas and work criticized.

As part of this enculturation, I also feel it normal to respectfully but aggressively criticize the ideas of others. I’ve discussed before how proponents of CAM often resent such criticism and see it as fundamentally unfair and inherently personal. In the culture of faith-based medicine, where truth is judged on the basis of one’s personal experiences or the received wisdom of one’s mentors, challenging someone’s beliefs is the same thing as challenging their intelligence, honesty, or worth. In the culture of science, no one’s beliefs are beyond challenge, at least theoretically (though of course scientists are political animals like all humans, so this principle isn’t always followed). This is one example of the clash between the negating culture of science and other, non-scientific ways of looking at health and disease.

Olson also makes a big point of talking about how unlikeable scientists can seem to the rest of the world. This is particularly a concern for an endeavor like this blog, which is to a great extent devoted to identifying ideas which are not true and therapies which don’t work. It is far more pleasant to hear proclamations of hope and optimism than to hear all the reasons why something which purports to offer hope really doesn’t. Debunking is inherently negating, and it is easy to see why this leads to the image of skeptics as sour, curmudgeonly, and willfully choosing not to believe in anything. Of course, anyone who is the least bit of a skeptic themselves knows this isn’t true, just as anyone who actively practices science knows how positive and affirming it can be. The sense of wonder and discovery and the joy of figuring things out are a big part of the rewards of doing science, but for some reason they are less often communicated to the public than the contempt many scientists feel for bad ideas supported by wishful thinking, sloppy logic, and few facts. Science communicators, one might even say science entertainers, like Carl Sagan and Neil deGrasse Tyson are notable exceptions.

So I agree with Olson that in many ways the culture of science is built on negation, on aggressive intellectual attack and defense of ideas, on a disrespect for those who make stuff up, botch their facts, and show more concern for what they wish to be true than for what really is true. This kind of negativism is not a bad thing, of course, since it is what enables the discovery of real, practical truths that benefit us all. And the negating aspects of science are not all there is to the enterprise. There is a great deal of awe and wonder, creativity, community, and true hope for meaningful progress and improvement in the world. But the negating aspects of the scientific approach do present a public relations problem. Most people seem to take a pretty quick dislike to dispassionate, cerebral, fact-based exposition and to the negation of hopeful, feel-good ideas no matter how nonsensical.

So what do we do about this? Abandoning reality for wish fulfillment doesn’t strike me as a good choice, so we are stuck having to challenge bad ideas no matter how popular. But as Olson suggests, this can at least sometimes be done with humor and humility and with frequent reminders of the elegance, wonder, and real benefit inherent in pursuing and defending the truth. While I think Olson sometimes goes too far in the degree to which he seems to suggest we simplify our messages and make them more entertaining and less instructive, nevertheless his underlying point is valid. The positions staked out by science and reason must compete in a marketplace of ideas, and some of the competitors they face are much more marketable.

The advances of science are often more complex and less obvious than the early triumphs of vaccination and antibiotics, which makes them less self-evident as proof that the approach is the right one. And the misuses of scientific knowledge and technological progress are better understood, which further tarnishes the image of science. But the fundamental nature of science as a method which relies on challenge and disproof is itself a weakness from a public relations point of view. And the cultural reverence for factual accuracy and distaste for excessive, hyperbolic, and ultimately unjustified claims also sets those of us promoting science-based medicine at something of a disadvantage. Yet all of these marketing weaknesses are strengths from the point of view of discovering real and useful truths, so we cannot give them up.

We must strive to make what we do and what we stand for as engaging and accessible as possible without cutting the heart out of it. Being open about our own joy and passion for the truth and the scientific path to reach it is an important step, as is being always clear that the truth, even when it may not be what we might wish it to be, is the only way to really better all our lives. Millennia of faith and wishful thinking have failed to accomplish what science has wrought in a few generations, and we must not allow the public to forget that. We must be humble, but at the same time not afraid to be definitive where it is justified. Homeopathy doesn’t work, vaccines don’t cause autism, and we needn’t tiptoe around those assertions to satisfy an excessive epistemological caution. And as always in life, we must make the effort to maintain our sense of humor, about ourselves as well as our ideological adversaries. This will not only make our own efforts more enjoyable to us, but it will do a lot to dispel the myth of the emotionless scientist out of touch with ordinary human feelings. We are as driven by our own feelings as anyone; we simply trust in a method of inquiry which diminishes the danger of these feelings misleading us, and hopefully we can succeed at illustrating that and thus humanizing science and scientists.


4 Responses to Don’t Be Such a Scientist or The Negating Culture of Science and its Costs

  1. Bartimaeus says:

    I’ll have to add this to my list of books to read this summer. Thanks for the review.

  2. Pingback: #42) How “Nattering Nabobs of Negativism” Can Hurt Science | The Benshi

  3. David says:

    Great article! I found myself nodding vigorously through much of it. I think that anti-intellectualism is fairly well steeped throughout Australian culture (as in many Western countries) and expressing a passion or enthusiasm for science is less socially acceptable than, say, supporting a sports team.

    Launching into a conversation about Jupiter losing its band is much more likely to fall flat (in most circles) than a “So, did you watch the football?”, but for me, is immensely more interesting. Recognising that most people don’t feel the same way is not only difficult to accept but it’s also a bit isolating. I remain optimistic, though, because when I mention I’m a scientist to a stranger, they almost always ask a question about a topical issue which is obviously on their mind (the oil spill is dominant of late) – so science communication does have a way in, it’s just not easy.

    … I could go on like this for ages, but I won’t. Great post, and I’m about to get my hands on a copy of DBSAS.

  4. skeptvet says:

    Thanks, David. Glad (in a sad sort of way) to be reminded it’s not entirely a U.S. phenomenon, though I’d bet we lead the pack! 🙂
