J Manipulative Physiol Ther 2004 (May); 27 (4): 253–261
Fernandez CE, Delaney PM
Los Angeles College of Chiropractic,
Southern California University of Health Sciences,
PO Box 1160, 16200 E. Amber Valley Drive,
Whittier, CA 90609-1166, USA.
OBJECTIVE: To describe and measure the effectiveness of a problem-based educational strategy for teaching evidence-based health care (EBHC) to chiropractic interns, which focused on the development and appraisal of answerable clinical questions using actual musculoskeletal patients.
METHODS: A 1-group pretest-posttest design (simple panel design) with investigator-blinded survey administration was used to measure the effectiveness of educational activities based on adult learning theory, with a study population of interns (n=31) at a chiropractic college (Los Angeles College of Chiropractic, Southern California University of Health Sciences [LACC/SCUHS]) teaching clinic. Activities included 2 workshops on constructing clinical questions and critically appraising published research, plus independent patient-based EBHC assignments. A self-assessment survey was administered before and after a 6-week period of EBHC activities to measure their effectiveness. Sign tests and paired t tests were used to test the score differences for statistical significance.
RESULTS: Eighty-one percent of subjects completed the pretest-posttest surveys. All survey item responses showed an average increase in subjects' self-rating of skills and attitudes from pretest to posttest. There were statistically significant differences in interns' self-assessed ability to construct an answerable clinical question and appraise research articles and apply them to patient management, as well as their rating of importance of EBHC in patient decision making.
CONCLUSION: The results of this study suggest that having chiropractic interns apply EBHC to actual musculoskeletal patients along with attending EBHC workshops had a positive impact on interns' perceived ability to practice EBHC.
Adult learning theory applied in the clinical setting requires interns to take responsibility for their learning, exploit their experience as a resource, and take advantage of real-life situations. The practice of evidence-based health care draws on adult learning theory and principles by requiring clinical problem solving and self-directed learning. A teaching strategy for chiropractic interns on applying EBHC to musculoskeletal patients was presented here. Results of this 1-group pretest-posttest study suggest a positive impact on the self-reported EBHC attitudes and skills of chiropractic interns. This was consistent with preliminary data collected by the authors on 94 subjects from 3 separate graduating classes (3 student cohorts) who were given identical pretest surveys, EBHC workshops and assignments, and posttest surveys over an 18-month period. In this preliminary study, group-averaged data from pretest and posttest surveys were compared using nonpaired t tests. Results showed an increase in group-averaged response score on all 4 survey items, though only item 2 showed a statistically significant difference. Post hoc statistical power analysis indicated that a follow-up study should use paired t tests, and sign tests for noncontinuous data, to help determine whether improvements in group-averaged posttest responses were statistically significant rather than due to chance.
A subsequent study utilizing sign tests and paired t tests, with a smaller sample size from 1 student cohort (1 graduating class), demonstrated a statistically significant increase in group-averaged responses to 3 out of 4 survey items.
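For readers unfamiliar with the two tests named above, the pre/post comparison can be sketched in Python. This is a minimal illustration, not the authors' analysis code: the scores shown are invented, the sign test computes an exact two-sided binomial P value after dropping ties (the usual convention for Likert-style noncontinuous data), and the paired t function returns only the t statistic, which in practice would be converted to a P value using a t distribution with n − 1 degrees of freedom.

```python
from math import comb, sqrt

def sign_test_p(pre, post):
    """Exact two-sided sign test for paired pre/post scores.

    Ties (no change) are dropped; under H0 each remaining difference
    is equally likely to be positive or negative (binomial, p = 0.5).
    """
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    k = sum(d > 0 for d in diffs)
    # Probability of a result at least as extreme in the smaller tail,
    # doubled for a two-sided test and capped at 1.
    tail = sum(comb(n, i) for i in range(min(k, n - k) + 1)) / 2 ** n
    return min(1.0, 2 * tail)

def paired_t(pre, post):
    """Paired t statistic (df = n - 1) for the same pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / sqrt(var / n)

# Invented Likert scores (1-5) for 8 subjects: 6 improved, 2 unchanged.
pre = [3, 3, 4, 2, 3, 4, 3, 3]
post = [4, 4, 4, 3, 4, 5, 3, 4]
print(sign_test_p(pre, post))
```

Because the sign test uses only the direction of each change, it makes no normality assumption about the Likert differences, which is why it was paired with the t test for these noncontinuous data.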
Survey item 1 measured chiropractic interns' self-assessed skills in online computer searching. It showed a nonsignificant improvement from a mean pretest score of 3.52 (Likert scale, 1-5) to a mean posttest score of 3.72. This was consistent with the Green and Ellis study of internal medicine residents, which reported no significant change in a similar survey item, although their results might be explained by a ceiling effect. The lack of statistically significant change on this item in our study may reflect the training in computer-based literature searching and retrieval that interns received earlier in the curriculum; a greater posttest effect might have been measured had pretest measurements been taken earlier. Also, we assumed that interns needed only a review in this area and therefore did not emphasize it in the EBHC workshops we provided.
Haynes et al reported that searching from clinical settings affected clinical decisions and was feasible with brief training. They also noted that inexperienced searchers miss many relevant citations and search inefficiently. This may also partly explain subjects' responses to survey item 1 in our study.
Survey item 2 asked interns to rate their competence in clinical question development. There was a statistically significant improvement from a mean pretest score of 3.16 to a mean posttest score of 3.68, perhaps because workshops and assignments emphasized question development using interns' actual patient cases. These results contrast with those of Green and Ellis, who reported that neither the intervention nor the control group significantly improved in ability to pose a focused question. Again, this may be explained by a ceiling effect, since both the control and intervention groups in their study scored high on the pretest in this competency.
Survey item 3 required interns to rate their competency in appraising research studies and applying them to clinical patients. There was a statistically significant improvement from a mean pretest score of 3.32 to a mean posttest score of 3.72. Again, application to real patients was emphasized in both the EBHC workshops and the assignment. Also, the assignment form that interns were required to complete provided a checklist of quality filters (evidence criteria) for appraising research articles. Green and Ellis and other studies of less methodological rigor [33, 34] have reported a significant change in critical appraisal skills following various teaching interventions. However, in the only randomized controlled trial conducted, which must be viewed as the strongest evidence, Linzer et al did not report a significant change in critical appraisal skills; their study used only journal clubs, without application of the adult learning principles described earlier. Seelig reported a significant change in this competency when journal clubs were combined with adult learning theory in the teaching strategy.
Survey item 4 asked interns to rate the importance of EBHC in patient decision making. There was a statistically significant change from a mean pretest score of 2.72 to a mean posttest score of 4.08, a greater magnitude of increase than for the other 3 survey items. Results regarding attitudes toward the process of EBHC were variable in the studies reviewed above, though some of those studies focused only on specific aspects such as literature searching, critical appraisal skills, and clinical knowledge.
Haynes et al reported that most respondents acknowledged the value of online searching following training in its use. Pyne et al reported that the clinicians surveyed recognized the need to keep up-to-date with changes in their specialty and therefore frequently reviewed new research. The study by Tsafir, using survey questionnaires, found a significant correlation between respondents' preference for original research articles and their preferences for updating current professional knowledge, performing research, and writing for publication. The study by Rose and Adams showed that EBHC is not taught in the majority of chiropractic colleges, though studies by Green, by Green and Johnson, and our own study indicate a positive response from chiropractic students following teaching interventions on critical appraisal, critical thinking, and professional communication.
Overview and Implications for Students' Curriculum
Our project, though not a randomized controlled trial, demonstrated an effective strategy for teaching EBHC in both the preliminary (n=94) and subsequent (n=31) studies. This 1-group study with blinded survey administration demonstrated a positive pretest-posttest difference in all 4 survey items, with 3 of the 4 showing a statistically significant increase in group-averaged responses. Survey item 1 showed a positive but not statistically significant increase in chiropractic interns' self-assessed skills in computer-based literature searching and retrieval. Based on this, we suggest that more instruction time be devoted to this area, especially to the use of Boolean operators and of appropriate search terms, including medical subject headings (MeSH) and terms derived from well-constructed clinical questions. Various health sciences literature databases and search tools (eg, PubMed's Clinical Queries) should be explored and routinely used by students throughout the curriculum. Ongoing use of EBHC skills through participation in regular journal clubs emphasizing application to actual cases should develop students' skills, along with teaching interventions such as those described herein. Brynin and Farrar described a detailed protocol for conducting journal clubs in a chiropractic educational setting.
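As a concrete illustration of how a well-constructed clinical question can drive search-term selection, the sketch below assembles a Boolean query from PICO components (population, intervention, comparison, outcome). This is a hypothetical helper, not part of the study or of any PubMed API; the example terms and the [MeSH Terms] field tag follow common PubMed conventions, but any real search should be checked against the database's own syntax.

```python
def pico_query(population, intervention, comparison=None, outcome=None):
    """Join non-empty PICO components into a Boolean search string.

    Each component may itself contain OR'd synonyms; the components
    are AND'ed together, mirroring how a focused clinical question
    maps onto a structured literature search.
    """
    parts = [population, intervention, comparison, outcome]
    return " AND ".join(f"({p})" for p in parts if p)

# Example question: in adults with low back pain, does spinal
# manipulation improve pain or disability? (Terms are illustrative.)
query = pico_query(
    population='"low back pain"[MeSH Terms] OR lumbago',
    intervention='"manipulation, spinal"[MeSH Terms]',
    comparison=None,  # no explicit comparison arm in this question
    outcome="pain OR disability",
)
print(query)
```

Structuring the search this way keeps each search term traceable back to a component of the clinical question, which is the link between question development (survey item 2) and searching skill (survey item 1).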
Students have varied backgrounds before entering chiropractic college, which affect their self-assessed EBHC knowledge, skills, and attitudes. Many chiropractic colleges already incorporate dedicated strategies for improving students' professional and academic subject competencies, but relatively few colleges expressly teach evidence-based approaches that focus on individuals as self-directed learners throughout the entire curriculum. At SCUHS/LACC, students continue to participate in lectures and laboratories, but they spend substantial time in problem-based small-group tutorials where they learn using clinical case studies (see Table 6 for a list of SCUHS/LACC courses emphasizing EBHC). With the current study, we sought to develop basic strategies for teaching EBHC during the clinical internship, to better prepare competent chiropractic practitioners and lifelong learners who inform themselves with the best available evidence to deliver quality health care. This assignment and other similar assignments may be useful in courses at other institutions.
Ideal Study Design
The weaknesses of this 1-group pretest-posttest study design include the internal validity threats of history, maturation, and testing. History: because the posttest observations are made after the pretest, the difference between them may be the result of events intervening during the period. Maturation: over the course of the study, individuals mature through their experience in the clinical setting and may change in ways that affect the outcome. Testing: if the pretest measurement of EBHC competency led individuals to believe they should be more competent, the pretesting alone could have produced higher posttest scores.
Other limitations of this study include outcome measures restricted to self-assessed skills and attitudes; future studies could add actual skills testing alongside self-assessment. In addition, the dropout rate should ideally be lower. Sackett et al describe 80% retention as the lowest acceptable for publication; this study retained 81% of its study population, just above that threshold, in part because the increased number of weekly off-campus rotations in which interns participate made some of them unavailable to complete both surveys. Finally, although the emphasis of this study was placed on the EBHC assignment, the EBHC workshops may also be considered interventions or predictor variables. Follow-up studies should use a larger study population with control groups to account for the effects of these variables.
In summary, future studies should use more rigorous designs, including EBHC skills testing as an outcome, larger sample sizes, a control group, follow-up, and involvement of multiple clinics. A study by Taylor et al, published after our data collection and analysis, developed a questionnaire and measured its validity for evaluating the effectiveness of EBHC teaching. Comparing the knowledge and attitude scores of "novices" with those of "experts," they concluded that the questionnaire was a satisfactory tool. Future studies may also include specific skills testing in critical appraisal exercises and literature search outcomes, and ultimately valid clinical outcome measures to indicate whether patient care improves as a result of practicing EBHC.
Few studies in the literature focus on chiropractic interns' EBHC knowledge, skills, and attitudes. The results of this study suggest that having interns apply EBHC to actual musculoskeletal patients and participate in EBHC workshops has a positive impact on their perceived ability to practice EBHC. Adult learning strategies in the clinical environment stress the importance of self-directed, problem-based learning and the application of knowledge and skills to solve clinical problems. Developing an answerable clinical question about a patient is the starting point of the practice of EBHC. Previously published studies and our limited study indicate that interns' ability to practice EBHC competently is influenced by their skills in structuring clinical questions, searching the health sciences literature, critically appraising that literature for validity and clinical usefulness, and applying the results to actual patients. The evolution of an evidence-based chiropractic curriculum is, in part, contingent on developing measures of its effectiveness and utility. Teaching practical EBHC skills to chiropractic interns may better prepare them to be lifelong learners and competent practitioners in delivering quality chiropractic health care to patients with musculoskeletal conditions.