Without question, the most frequent comment I get in response to my articles is that people feel some of the therapies I write about must be effective regardless of the scientific evidence because they have had personal experiences that suggest this. The experience of having used a therapy and seen an improvement is very powerful psychologically, and it makes us very confident that the therapy we used caused the improvement we saw. Unfortunately, the evidence that this kind of conclusion is not reliable is overwhelming. This kind of thinking is so common, and so untrustworthy, that it constitutes its own logical fallacy: post hoc ergo propter hoc, or the “false cause” fallacy.
Reality is complex, and our minds seek simple, direct causal explanations and satisfying narratives to explain things. Sometimes, of course, these explanations are true. But they are false much more often than we realize. The dismissal of anecdotal evidence by scientists isn’t casual, and it isn’t based on the idea that people who have anecdotal experiences must be stupid. It is based on centuries of study of the human mind, including decades of controlled research in cognitive psychology that shows things simply aren’t always as they seem.
John Stuart Mill described eloquently the problem that while we all realize we are imperfect and acknowledge in general terms that we can be wrong, we are very, very reluctant to ever admit we are wrong about any specific belief:
Unfortunately for the good sense of mankind, the fact of their fallibility is far from carrying the weight in their practical judgment, which is always allowed to it in theory; for while everyone knows himself to be fallible, few think it necessary to take any precautions against their own fallibility, or admit the supposition that any opinion, of which they feel very certain, may be one of the examples of the error to which they acknowledge themselves to be liable.
This helps to explain why I so often have to repeat, to pet owners and other veterinarians alike, that just because they have seen something appear to work with their own eyes, that isn’t really a good reason to believe it actually does work without supporting controlled scientific evidence.
A recent article on the subject both explains the problem and discusses some of the possible solutions:
Your Brain is Primed to Reach False Conclusions by Christie Aschwanden
Here’s her conclusion:
With a lot of evidence that erroneous beliefs aren’t easily overturned, and when they’re tinged with emotion, forget about it. Explaining the science and helping people understand it are only the first steps. If you want someone to accept information that contradicts what they already know, you have to find a story they can buy into. That requires bridging the narrative they’ve already constructed to a new one that is both true and allows them to remain the kind of person they believe themselves to be.
The good news is that the fallibility of uncontrolled personal observation is well-documented, and has been for hundreds of years. This means it should be possible to inoculate people against excessive confidence in their own experiences by teaching critical and skeptical thinking early in school, even before formal teaching of scientific facts. The bad news, however, is that once people reach a conclusion based on anecdotal experience, facts are not very effective at challenging that conclusion. Being given evidence that we are wrong tends to strengthen our false beliefs and impel us to build more and stronger arguments to support them.
Sadly, simple presentation of facts isn’t enough. People need to be led to reconsider their opinions through arguments that speak to their emotions and their core values, not simply their intellect. This is certainly more challenging than simply presenting the facts, especially for scientists, who tend to think in terms of objective evidence and are suspicious of arguments that appeal primarily to emotions or beliefs rather than facts. And, of course, the biggest challenge in changing minds about scientific topics is that the stronger one’s belief, the less willing one is to seriously consider alternative explanations. The very people who most need to read about why anecdotes can’t be trusted won’t.
Despite all that, since I have to make the same argument over and over again, I have collected a few resources on this subject to which I refer anyone interested and open-minded enough to consider them: