By far the most common response to the posts I write questioning the claims made for medical therapies advertised on the internet is a flood of testimonials from people who believe the scientific evidence is wrong or irrelevant because the product seemed to work for them, and because so many other people report that it works. There are many reasons why anecdotes aren’t trustworthy, and I’ve written about this at length before. As psychologically compelling as our own experiences are, and as hard-wired as we are to appreciate and believe stories more than data, the truth is that our uncontrolled observations are deeply unreliable.
Cognitive biases are a particularly significant part of the problem. These are little mental quirks and shortcuts that lead us to wrong conclusions and bad decisions. I’ve written about the effect of these biases on veterinary clinical decision making, and many, many books have been written about the specific details of these biases, how they work, and what effects they have on our reasoning.
A recent study has produced some specific and compelling evidence for the degree to which internet reviews of medical products grossly overestimate the actual value of the treatments being reviewed. The bottom line is that people who have positive experiences are far more likely to write testimonials than people who don’t, which creates a false impression of how well a therapy works, if it does at all.
de Barra, M., Eriksson, K., Strimling, P. How feedback biases give ineffective medical treatments a good reputation. J Med Internet Res. 2014;16(8):e193.
The study first compared reviews of a diet plan on Amazon with results of clinical research on the same diet.
After 6 months on the diet, 93% (64/69) of online reviewers reported a weight loss of 10 kg or more while just 27% (19/71) of clinical trial participants experienced this level of weight change.
This figure shows the results of three clinical studies compared with those reported on Amazon. Clearly, the studies consistently found results far less dramatic than those indicated by the online testimonials.
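The arithmetic behind this gap is straightforward selection. As a rough sketch (the reporting probabilities here are illustrative assumptions, not numbers from the study), if satisfied dieters are far more likely to post a review than disappointed ones, a modest real success rate can masquerade as a near-universal one:

```python
def reported_success_rate(true_rate, p_report_success, p_report_failure):
    """Fraction of posted reviews that report success, when satisfied
    and unsatisfied customers post reviews at different rates."""
    reviews_from_successes = true_rate * p_report_success
    reviews_from_failures = (1 - true_rate) * p_report_failure
    return reviews_from_successes / (reviews_from_successes + reviews_from_failures)

# A 27% true success rate (the clinical trial figure), with satisfied
# dieters assumed 40x more likely to post a review than disappointed ones:
print(f"{reported_success_rate(0.27, 0.80, 0.02):.0%}")  # prints "94%"
```

With those assumed reporting rates, the review pages would show roughly the 93% success the Amazon testimonials actually reported, even though nothing about the diet itself changed.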
The same phenomenon was found when evaluating a dietary supplement/herbal product supposed to improve fertility.
And when the investigators evaluated the influence of reviews on decision-making, they found that positive reviews did affect people’s choice of a weight loss plan.
The authors’ conclusions were:
We found that the reputed benefit of weight loss diets and fertility treatments is larger than the real benefit, apparently because people with typical or poorer outcomes are less inclined to tell others about their experiences. Thus, the real-world reputation of medical treatments seems to be subject to a reporting bias akin to the publication bias toward positive results that is seen in scientific research. Moreover, we found the resultant reputation distortion to be large enough to influence people’s decisions about which diet to begin…
Researchers have pointed out that several processes make it very difficult to identify benefits and harms of medical treatments when data are not systematically collected. In particular, treatments with no direct effect will sometimes appear effective because of the statistical phenomenon known as regression to the mean and the physiological phenomenon known as the placebo effect. It has also been suggested that treatments that prolong illness may, perversely, spread better because they are “demonstrated” for a longer period than effective treatments. Here, we have explored an additional mechanism, reporting bias, and its logical consequence: when people with poor outcomes remain silent, the reputed benefit of a treatment will exceed its real effect.
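Regression to the mean, which the authors mention alongside reporting bias, is easy to demonstrate with a toy simulation (the numbers here are made up for illustration): if people tend to try a remedy when their symptoms happen to be at their worst, a later measurement will look better on average even with no treatment at all.

```python
import random

random.seed(42)

# Each "patient" has a stable true severity of 50, measured with noise.
def measure(true_severity=50.0, noise_sd=10.0):
    return true_severity + random.gauss(0, noise_sd)

first = [measure() for _ in range(10_000)]

# People seek a remedy during a flare-up: select only those whose
# first measurement exceeded 60, then remeasure with NO treatment.
flareups = [i for i, m in enumerate(first) if m > 60]
second = [measure() for _ in flareups]

mean_first = sum(first[i] for i in flareups) / len(flareups)
mean_second = sum(second) / len(second)
print(f"at enrollment: {mean_first:.1f}, at untreated follow-up: {mean_second:.1f}")
# The follow-up average drifts back toward the true severity of 50,
# so any remedy taken in between gets undeserved credit.
```

The apparent "improvement" is pure statistics: the extreme first measurements were partly noise, and the noise does not repeat on the second measurement.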
Though there are already more than enough nails in the coffin of the idea that anecdotes and testimonials can be trusted when making decisions about medical therapies, this study provides yet another solid reason not to rely on this kind of evidence. That will not, of course, stop people from responding to substantive, evidence-based critiques of specific products with pointless testimonials, but it will help remind all of us why these shouldn’t be taken very seriously.
Burton, R. (2008). On Being Certain: Believing You’re Right Even When You’re Not. New York: St. Martin’s Press
Carroll, RT. (2000) Becoming a Critical Thinker – A Guide for the New Millennium. Boston: Pearson Custom Publishing.
Gilovich, T. (1993). How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life. New York: The Free Press.
Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
Kida, T. (2006). Don’t Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. New York: Prometheus Books.
McKenzie, BA. Veterinary clinical decision-making: cognitive biases, external constraints, and strategies for improvement. J Am Vet Med Assoc. 2014;244(3):271-276.
Park, RL. (2001) Voodoo Science: The Road from Foolishness to Fraud. New York: Oxford University Press.
Sagan, C. (1995). The Demon-Haunted World: Science as a Candle in the Dark. New York: Random House.
Shermer, M. (1997). Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time. New York: Henry Holt and Company.
Tavris, C., Aronson, E. (2008). Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts. Boston: Mariner Books.
Burch, D. (2009). Taking the Medicine: A Short History of Medicine’s Beautiful Idea and Our Difficulty Swallowing It. London: Chatto & Windus.