The Problem with Randomized Controlled Trials
and Meta-Analyses

This section was compiled by Frank M. Painter, D.C.

The Evidence-based Practice Page
A Chiro.Org article collection

The Mythology Of Science-Based Medicine
The Huffington Post ~ 2-25-2011

One side, mainstream medicine, promotes the notion that it alone should be considered "real" medicine, but more and more this claim is being exposed as an officially sanctioned myth. When scientific minds turn to tackling the complex business of healing the sick, they simultaneously warn us that it's dangerous and foolish to look at integrative medicine, complementary and alternative medicine, or God forbid, indigenous medicine for answers. Because these other modalities are enormously popular, mainstream medicine has made a few grudging concessions to the placebo effect, natural herbal remedies, and acupuncture over the years. But M.D.s are still taught that other approaches are risky and inferior to their own training; they insist, year after year, that all we need are science-based procedures and the huge spectrum of drugs upon which modern medicine depends.

Fables or Foibles: Inherent Problems with RCTs
J Manipulative Physiol Ther 2003 (Sept); 26(7): 460 ~ FULL TEXT

The 7 case studies reviewed in this report, combined with an emerging concept in the medical literature, suggest that reviews of clinical research should accommodate our growing recognition of the value of cohort studies and case series. The alternative would be to assume categorically that observational studies provide inferior guidance for clinical decision-making compared with RCTs (randomized controlled trials). From this discussion, it is apparent that a well-crafted cohort study or case series may be of greater informative value than a flawed or corrupted RCT. To assume that the entire range of clinical treatment for any modality has been successfully captured by the precision of analytical methods in the scientific literature, Horwitz notes, would be tantamount to claiming that a medical librarian with access to systematic reviews, meta-analyses, Medline, and practice guidelines provides the same quality of health care as an experienced physician.

Effect of Interpretive Bias on Research Evidence
British Medical Journal 2003 (Jun 28); 326(7404): 1453–1455 ~ FULL TEXT

Doctors are being encouraged to improve their critical appraisal skills to make better use of medical research. But when using these skills, it is important to remember that interpretation of data is inevitably subjective and can itself result in bias. Facts do not accumulate on the blank slates of researchers' minds, and data simply do not speak for themselves. (1) Good science inevitably embodies a tension between the empiricism of concrete data and the rationalism of deeply held convictions. Unbiased interpretation of data is as important as performing rigorous experiments. This evaluative process is never totally objective or completely independent of scientists' convictions or theoretical apparatus. This article elaborates on an insight of Vandenbroucke, who noted that "facts and theories remain inextricably linked... At the cutting edge of scientific progress, where new ideas develop, we will never escape subjectivity." (2) Interpretation can produce sound judgments or systematic error. Only hindsight will enable us to tell which has occurred. Nevertheless, awareness of the systematic errors that can occur in evaluative processes may facilitate the self-regulating forces of science and help produce reliable knowledge sooner rather than later.

Tales from the Crypt: Fables of Foibles, or RCTs That Go Bump in the Night
Anthony Rosner, PhD, FCER Director of Research

On the eve of Thanksgiving (November 19, 1999), one would assume that another Halloween season is now safely behind us. Then again, maybe not, considering what some of the most recent medical journals have to say about that holy grail of evidence-based medicine: the randomized clinical trial. Visions of ghosts, hobgoblins and skeletons that were supposed to have retired for another year are upon us once again. This is because the revelations in these journals tell us that (I) humankind's interpretation of the results of clinical trials remains far from a complete science, and (II) outcome measures are clearly susceptible to the vagaries of human nature, not only from the perspective of the human subjects tested in clinical trials, but also, and especially, from the behavior of the trial investigators. Some of these images border on outright skullduggery.

Trash Talking in Science
Anthony Rosner, PhD, FCER Director of Research

My favorite example, which I may have quoted too many times, comes from the Cherkin low back pain study published in the New England Journal of Medicine, now some three years old. Despite whatever quality of evidence may have been present, the discussion section of this particular paper contained what can only be described as a blatantly political (if not outright false) statement: "Given the limited benefits and high costs, it seems unwise to refer patients with low back pain for chiropractic or McKenzie therapy." [2] In a peer-reviewed scientific journal that accepts only 10 percent of submitted papers for publication and has been considered by some to be the most prestigious journal of them all, a statement of this import is totally and inexcusably out of order.

Response to the “Manual Therapy for Asthma” Cochrane Review
Anthony Rosner, PhD, FCER Director of Research

Hondras' recently published systematic review of randomized clinical trials addressing manual therapy [1] represents a sincere effort to summarize those investigations in what is commonly regarded as the gold standard of clinical research. That said, however, one has to remain particularly vigilant against accepting randomized clinical trials at face value, particularly in those instances involving physical interventions, in which the complete blinding of practitioners (and most likely patients as well) in the traditional RCT design is all but impossible.

Scratching Where It Itches: Core Issues in Chiropractic Research
Anthony Rosner, PhD, FCER Director of Research

With the recent burst of media coverage of both alternative medicine and chiropractic intervention, I have felt compelled to redouble our ongoing efforts to identify some predominating elements and trends in health services research in general, and chiropractic research in particular. I was fortunate enough to find some help in doing this at the Third International Forum for Primary Care Research.

Anthony Rosner, Ph.D.'s Response to the NEJM Asthma Study
At a time when public interest in alternative medicine is rising, it is regrettable that a study with such deep flaws should have found its way to the lead position in such a prominent journal. Major deficiencies of the study are summarized as follows. Thanks to FCER for permission to reproduce this article. Dr. Rosner is the FCER Director of Research.



Since 10-03-2006

Updated 3-20-2022

© 1995–2023 ~ The Chiropractic Resource Organization ~ All Rights Reserved