
Long-term impact of four different strategies for delivering an on-line curriculum about herbs and other dietary supplements

As previously described, we conducted a randomized controlled trial comparing four different strategies for delivering an e-curriculum about herbs and other dietary supplements (HDS) to clinicians. The four strategies were delivering the curriculum by: a) email over 10 weeks; b) email within one week; c) web-site over 10 weeks; d) web-site within one week. Participants were surveyed at baseline, immediately after the course and 6–10 months after completing the course (long-term). Long-term outcomes focused on clinicians' knowledge, confidence and communication practices.

Of the 780 clinicians who completed the course, 385 (49%) completed the long-term survey. Completers and non-completers of the long-term survey had similar demographics and professional characteristics at baseline. There were statistically significant improvements from baseline to long-term follow-up in knowledge, confidence and communication practices; these improvements did not differ by curriculum delivery strategy. Knowledge scores improved from 67.7 ± 10.3 at baseline to 78.8 ± 12.3 at long-term follow-up (P < 0.001). Confidence scores improved from 53.7 ± 17.8 at baseline to 66.9 ± 12.0 at long-term follow-up (P < 0.001); communication scores improved from 2.6 ± 1.9 at baseline to 3.6 ± 2.1 (P < 0.001) at long-term follow-up.

This e-curriculum led to significant and sustained improvements in clinicians' expertise about HDS regardless of the delivery strategy. Future studies should compare the impact of required vs. elective courses and self-reported vs. objective measures of behavior change.

Herbs and dietary supplements (HDS) are the most commonly used complementary medical therapies purchased in the United States [1], leading to concerns about HDS safety and efficacy [2]. Health care professionals have expressed a strong interest in HDS training courses [3-5]. However, face-to-face Continuing Medical Education (CME) courses often fail to result in sustained changes in physician behaviors [6,7]. On the other hand, online CME training has been shown to improve behavior and knowledge [8].

We previously reported the short-term outcomes of our randomized controlled trial (RCT) comparing four different strategies of delivering an on-line course about HDS to diverse clinicians [9]. The short-term results suggested that all four strategies of the e-curriculum similarly and significantly improved clinicians' knowledge, confidence, and communication practices.

To answer questions about the duration of these improvements and whether any differences between delivery strategies would emerge over a longer follow-up, we prospectively followed study participants from the earlier RCT six to ten months after they had completed the initial study.

We conducted a prospective 6- to 10-month follow-up of an RCT comparing four different strategies for delivering an e-curriculum about herbs and dietary supplements to diverse health professionals [9]. Baseline survey questions regarding demographics, professional characteristics, knowledge, confidence and communication scales have been reported previously [9,10]. Dieticians, nurses, pharmacists, physicians, physician assistants, and trainees in one of these health professions were eligible for the study.

The intervention and delivery strategies have been described previously [9,10]. Briefly, the curriculum consisted of 40 case-based self-instructional modules, each of which contained links to evidence-based on-line HDS resources. Enrollees were randomized to one of four different curriculum delivery groups: email delivery over ten weeks (push-drip), email delivery over four days (push-bolus), web availability over ten weeks (pull-drip), and web availability over four days (pull-bolus). The curriculum was delivered in fall 2004 (concluding in December 2004) and in spring 2005 (concluding in April 2005). Immediate outcomes were assessed 11–15 weeks after randomization.

During the second week of October 2005 (approximately ten months after the first group and six months after the second group had completed the course), all original enrollees were asked to complete a final course evaluation. The email request contained a link to a web page with the same questions as the immediate outcome survey, to assess long-term retention and maintenance of knowledge, confidence, and communication practices among course enrollees. Non-respondents received up to three email requests to complete the survey before the November 30, 2005 deadline.

The primary study outcomes have also been described previously [9,10]. Briefly, knowledge scores were the percent of the knowledge questions answered correctly (potential range 0–100%). A confidence scale score with a possible range of 19 to 95 was derived from responses to 19 Likert-type questions such as "I feel confident responding to patients' questions about HDS"; it had a Cronbach alpha reliability statistic of 0.96. Respondents who had seen patients within the past 30 days completed the communication practices scale, with a range of scores from 0 to 10; the Cronbach alpha reliability statistic was 0.84 for baseline and 0.92 for the immediate outcome assessments for this scale.
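As a point of reference for the scale construction, a possible range of 19 to 95 is what summing 19 items each scored 1 to 5 would produce. The sketch below, using NumPy and entirely synthetic responses (the simulated data and the function name are illustrative assumptions, not the study's instrument or results), shows how such a summed scale score and its Cronbach alpha reliability could be computed:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 19 Likert-type items scored 1-5,
# so summed scale scores fall in the 19-95 range described above.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(100, 19))
confidence_scores = responses.sum(axis=1)       # one scale score per respondent
print(confidence_scores.min(), confidence_scores.max())
print(round(cronbach_alpha(responses), 2))
```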

Chi-square methods were used to evaluate associations of categorical variables. For continuous outcome measures, t tests or analysis of variance (ANOVA) were used for normally distributed data, and Mann-Whitney U tests or Kruskal-Wallis tests for non-normally distributed variables. For repeated measures outcomes, paired samples t-tests or Wilcoxon signed rank tests were used, depending on data characteristics. Analyses were performed using SPSS 14.0 (SPSS Inc., Chicago, IL).
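As a rough illustration of this analytic approach, using Python's SciPy rather than SPSS and entirely synthetic data (the sample sizes, means, and counts below are placeholders, not the study data), a repeated measures comparison might select between a paired-samples t-test and a Wilcoxon signed rank test based on the distribution of the paired differences, with a chi-square test for a categorical association:

```python
import numpy as np
from scipy import stats

# Synthetic paired scores: baseline vs. long-term follow-up for the same respondents
rng = np.random.default_rng(1)
baseline = rng.normal(67.7, 10.3, size=385)
followup = baseline + rng.normal(11.0, 8.0, size=385)

diffs = followup - baseline
# Shapiro-Wilk as a rough normality check on the paired differences
if stats.shapiro(diffs).pvalue > 0.05:
    stat, p = stats.ttest_rel(followup, baseline)   # paired-samples t-test
else:
    stat, p = stats.wilcoxon(followup, baseline)    # Wilcoxon signed rank test
print(f"repeated-measures comparison: p = {p:.3g}")

# Chi-square test of association for categorical variables,
# e.g. long-term survey completion by curriculum delivery arm (made-up counts)
table = np.array([[95, 97, 96, 97],      # completers per arm
                  [100, 98, 99, 98]])    # non-completers per arm
chi2, p_chi, dof, expected = stats.chi2_contingency(table)
print(f"chi-square: p = {p_chi:.3g}")
```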

This study was approved as "exempt" as an educational research project by the Wake Forest University School of Medicine Institutional Review Board.

Of the 780 participants who completed the course, 385 (49%) completed the long-term follow-up survey six to ten months later (Table 1). Completers (n = 385) and non-completers (n = 395) of the long-term survey had similar age, gender, and practice characteristics and used a similar number of HDS in the week prior to the baseline survey (average of 5.6). There were no significant differences between the completers and non-completers by curriculum delivery strategy, baseline confidence, or communication scores. Knowledge scores were, on average, 1.5% higher among completers.

In repeated measures analyses, there were significant, sustained improvements in knowledge, confidence, and communication practices compared to baseline among those who completed questionnaires six to ten months after the course (Figure 1). Knowledge scores were highest immediately after the course, but the mean score of 78% at long-term follow-up remained significantly higher than the baseline mean of 67% (P < 0.001). Confidence and communication scores continued to increase from immediately after the course to the long-term follow-up (Figure 1).

Changes in knowledge, confidence, and communications practices at the 6 to 10-month follow-up did not differ by curriculum delivery strategy. Nor were there significant differences in improvement by age, gender, profession, baseline HDS use, or having paid for CE/CME credit (data not shown). Improvements in knowledge and confidence were affected by enrollment period (fall vs spring) and whether the participant was a trainee or was in practice (Table 2). Communication practices were affected only by practice status, with trainees demonstrating greater improvements than practitioners.

In this long-term follow-up study, the on-line curriculum resulted in significant and sustained improvements in knowledge, confidence, and communication for diverse clinicians regardless of delivery strategy. Outcomes were only related to semester of enrollment and being a trainee versus a practitioner. Those who took the course in the spring had significantly greater improvements in knowledge and confidence scores than those enrolled in the fall. The differences between fall and spring may be because fall completers had substantially more time to forget learned information than their spring counterparts.

Similarly, trainees had significantly greater improvements than practitioners in all three outcomes (knowledge, confidence, and communication). These differences may be due to two factors. First, trainees had lower baseline scores than practitioners, allowing for greater opportunity for improvement. Second, trainees presumably have fewer experiences and habits to unlearn than practitioners.

As expected, knowledge scores decreased from the initial follow-up to the long-term follow-up. However, even six to ten months after completing the course, knowledge scores were significantly higher than the baseline scores. This suggests significant retention of the curriculum material. Confidence and communication scores progressively increased from baseline to the long-term follow-up. These observations are consistent with the hypothesis that as individuals had more opportunity to practice the material they had learned, they could reinforce it, feel increasingly confident, and communicate with patients more comfortably.

The results of this study are consistent with previous research demonstrating the effectiveness of online CME courses [6,7,9]. Although changes in communication in this study were statistically significant, the actual improvements were small. This is consistent with previous research suggesting low rates of communication with patients regarding HDS use [11]. Previous research has indicated that clinicians' behavior can improve following training courses [8,12,13]; the results of this study indicate that these behavior changes persist and continue to improve over the long term. However, additional strategies still need to be developed to more effectively improve clinicians' communication practices.

This long-term follow-up study has several limitations. First, the sample consisted of self-selected enrollees who elected to learn more about HDS, which limits generalizability to elective courses; it is possible that outcomes would differ for participants in required courses. Another limitation is the low response rate to the long-term follow-up. This limits the generalizability of the outcomes to those individuals with a greater willingness to complete surveys even after the completion of the initially planned study. Those who are willing to complete such voluntary questionnaires (which were not part of the original study "contract") may have been more knowledgeable and confident about their ability to do well. This conjecture is supported by the observation that those who completed the long-term follow-up had slightly but significantly higher knowledge scores than the non-respondents. Also, the study relied on self-reported changes in confidence and communication, which may overestimate actual behavioral changes [14-16]; future studies in this field should corroborate self-report with objective measures of clinician behavior. Finally, we did not collect information on the actual costs of delivering the curriculum through each method because study personnel were engaged in both offering and studying the intervention and did not separately allocate research and education efforts. However, it is our impression that pull-bolus delivery is the least expensive to deliver for participants such as those in this study.

Despite these limitations, results from this long-term follow-up study have important implications for professional education and future research. An online, case-based curriculum with evidence-based resource links resulted in significant and sustained improvements in knowledge, confidence, and communication. These improvements are substantial and do not appear to depend on the delivery strategy, at least among motivated clinicians. Therefore, educators can choose to offer on-line Continuing Education (CE) courses with confidence. Because the delivery strategy of an online curriculum does not affect attainment of learning goals, the most convenient and lowest-cost delivery method can be used. Future studies of on-line CME should focus on whether a required curriculum would have similar outcomes to elective courses, as well as on developing interventions that would further improve clinicians' communication patterns.

The author(s) declare that they have no competing interests.

TB drafted the manuscript.

KK conceived of the project, designed the survey and revised the manuscript.

PG reviewed and edited survey questions and revised the manuscript.

CW edited survey questions, analyzed the data and revised the manuscript.

The pre-publication history for this paper can be accessed here:

http://www.biomedcentral.com/1472-6920/6/39/prepub