
Abstract

The aims of this pilot study were to evaluate the short-term impact of evidence-based dentistry (EBD) workshops on educators’ use of clinical evidence in their clinical practice and educational activities and to identify barriers they encountered in implementing evidence in their teaching and clinical practice. Between April 2012 and January 2014, a series of EBD workshops was delivered to 31 dental faculty members and postdoctoral students at three Canadian dental schools. Survey I, assessing participants’ perceptions of various aspects of the workshops, was administered immediately following the workshops. Survey II, evaluating the impact of the workshops on participants’ EBD implementation, was conducted 10 to 31 months after their completion. Survey I was completed by all 31 participants (100% response rate); their mean scores ranged from 3.94 to 4.65 on a five-point scale. Survey II, an online 20-item questionnaire, was completed by 20 participants (64.5% response rate; five postdoctoral students and 15 faculty members). Of the respondents, 19 (95%) reported implementing EBD in their professional activities at that time, and 14 (70%) stated that the workshops had helped with EBD implementation. Eight respondents (40%) reported having experienced barriers to EBD implementation, while 15 (75%) reported that their patients/students welcomed the use of EBD. The respondents believed that strategies such as increasing EBD education and dissemination and improving the quality and accessibility of evidence would facilitate the transition to EBD practice. Reported barriers to EBD implementation included resistance and criticism from colleagues, difficulty in changing the current practice model, and lack of time.

Traditionally, dental practice has been based primarily on professionals’ training and experience.1 Hence, practitioners have been accustomed to making clinical decisions based on established comparisons among their own patients.2 However, evidence-based practice (EBP)—the conscientious, explicit, and judicious use of the best current evidence—has now been defined as a more robust way to make individualized and efficient clinical decisions.3 Potential benefits from the adoption of EBP in dental settings include reduction of clinical errors, increased decision making ability, reduced treatment variability, and increased levels of patient and professional satisfaction.4,5

Although dentists may recognize the importance of evidence-based dentistry (EBD), previous studies have identified barriers that hinder its implementation, such as inaccessibility to relevant evidence sources, low-quality clinical research, lack of practitioners’ knowledge/training, lack of time, and financial restrictions.6–10 Therefore, it is not surprising that a significant gap between the available evidence and current dental practice has been found.11–13 Integrating scientific knowledge into clinical practice requires enormous effort, including not only identifying the barriers to knowledge transfer14,15 but also designing individualized and effective interventions to enable practical use of research findings.16–18

While EBD training in dental school curricula is now required in Canada,19 a significant variability in the scope and depth of EBD education has been identified among dental schools.20–23 For EBD to be successfully incorporated into curricula, faculty members must embrace it.24 Trained clinical educators would be able to demonstrate the application of EBD in patient care, helping to ensure students, as future practitioners, value, learn, and apply its concepts.25

Based on the model developed by the EBD section of the American Dental Association (ADA),26 a series of EBD workshops was conducted in Canada. The premise of the workshops was that, by training dental educators, a multiplier effect would occur in the way both faculty and students used EBD in their clinical decision making. The aims of this pilot study were to evaluate the short-term impact of the EBD workshops on educators’ use of clinical evidence in their clinical practice and educational activities and to identify barriers they encountered in implementing evidence in their teaching and clinical practice.

Methods

This observational descriptive study received ethical approval from the University of Alberta Research Ethics Board 2 (Pro00051364). Between April 2012 and January 2014, three separate two-day hands-on EBD workshops were delivered to faculty members and postdoctoral students at three Canadian dental schools. Invitations to attend were sent to academics from all ten dental schools in Canada. The workshops, funded by the ADA and the Canadian Institutes of Health Research, were designed to introduce attendees to key concepts of critical appraisal of the dental literature and to train them to prepare critical summaries of systematic reviews according to ADA guidelines.26 The itinerary and content of the workshops are available from the corresponding author.

Immediately after each workshop was concluded, the participants completed Survey I, comprised of an anonymous self-administered evaluation of the overall workshop methodology. The purpose of Survey I was to evaluate the clarity and efficacy of the content, activities, and materials used to train attendees during the workshops and to improve and refine future EBD training.

In addition, within 31 months of completion of the workshops, all attendees were invited via email to anonymously complete Survey II. This online questionnaire using SurveyMonkey assessed the impact of the workshops on attendees’ implementation of EBD in their clinical and teaching activities. To increase the response rate, three reminders were sent 11, 18, and 46 days after initial contact.27 The instrument used was an adapted version of the questionnaire created by Spallek et al.14 Our questionnaire had 20 questions: ten related to demographics and professional development; one focused on the impact of the workshops on one’s attitudes and actions towards EBD use; four addressed experience in implementing EBD in clinical practice and teaching activities; four regarded barriers experienced during implementation and suggestions for overcoming barriers; and one open-ended question asked for general comments.

Data were independently collected and analyzed by two of the authors (NCFM and KLD). Descriptive statistics were computed for all Survey II closed-ended question responses using SPSS Software for Mac OS, Version 22.0 (IBM Corp., Armonk, NY, USA). Although planned initially, inferential statistics could not be used due to the limited sample size in this pilot study. Open-ended question responses were independently coded and categorized by the same researchers, using cutting and sorting and by searching for similarities and differences across units of data.28 Disagreements were resolved through discussion.
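The descriptive-statistics step described above can be sketched in code. This is an illustrative sketch only, not the authors’ actual SPSS workflow; the item names and response values below are hypothetical stand-ins for five-point Likert-scale survey items.

```python
# Illustrative sketch: descriptive statistics for hypothetical
# five-point Likert-scale survey items (not the study's real data).
from statistics import mean, stdev

# Hypothetical responses, one list of 1-5 ratings per survey item
responses = {
    "systematic_reviews_lecture": [5, 4, 5, 4, 5, 4, 4, 5],
    "hands_on_exercises":         [5, 5, 4, 5, 5, 4, 5, 5],
    "statistics_lecture":         [4, 3, 4, 4, 5, 3, 4, 4],
}

def describe(scores):
    """Return basic descriptive statistics for one Likert item."""
    return {
        "n": len(scores),
        "mean": round(mean(scores), 2),
        "sd": round(stdev(scores), 2),  # sample standard deviation
        "min": min(scores),
        "max": max(scores),
    }

for item, scores in responses.items():
    print(item, describe(scores))
```

With only descriptive statistics reported, as in this pilot study, summaries like these (n, mean, spread, range) are the complete analysis for each closed-ended item.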

Results

A total of 31 dental faculty members and postdoctoral students attended one of the three two-day workshops. As Survey I (collecting participants’ evaluations of the overall workshop methodology) was administered immediately after delivery of each session, a 100% response rate was obtained. The online Survey II (assessing workshop impact on attendees’ implementation of EBD, sent within 31 months after the workshop) obtained a 64.5% (n=20) response rate.

Survey I

All 31 participants (100%) provided immediate feedback following the workshop, which covered their perspectives on training content, activities, materials provided, and overall workshop organization. The participants’ status as either faculty members or postgraduate students was not collected on this survey.

A five-point scale was used for the evaluations, with response options from 1=not useful to 5=very useful. All attendees (n=31; 100%) rated day 1 of the workshop as useful or very useful. On the five-point scale, respondents gave an average score of 4.53 to the lecture on systematic reviews and critical summaries and 4.65 to the hands-on exercises. Across all three workshop sessions, only five respondents (16.1%) stated they would prefer working individually rather than in pairs. The lecture on statistical methods and the effectiveness of group discussions received average scores of 3.94 and 4.55, respectively. The 28 respondents (90.3%) who evaluated the ADA EBD website portal considered it to be useful or very useful, with an average score of 4.57.

Survey I sections allowed for multiple comments from respondents. With respect to the workshop material used during the hands-on exercises and presentations, 12 participants pointed out that receiving those materials ahead of time would have made their learning more effective (“If we can get the presentations as handouts before the actual presentations, they will be helpful,” one wrote). Seven comments pointed out the value of receiving a summarized version that outlined what would be expected from attendees (for example, “A concise outline/structure with relevant points to cover in point form would have been useful”). When participants were asked if they had read the materials provided before attending the workshops, seven comments (22.6%) included justifications for not reading them related to workload and lack of time (one noted, “Short time and I got them at students’ exam time and I was so busy with evaluations”).

Survey II

Twenty of the 31 (64.5%) attendees completed the online Survey II assessing the impact of the workshop on their current daily professional activities. Five of these respondents were graduate students, and the rest were faculty members. Characteristics of the respondents are shown in Figure 1 and Figure 2.

Of the respondents, 14 (70%) reported reading at least one dental or biomedical journal on a regular basis, with an average of 3.4 journals. Among the 48 specific journals reported (multiple answers were allowed), the Journal of Dental Research was mentioned most often (n=5; 10.4%), followed by the Journal of Evidence-Based Dentistry, Journal of the American Dental Association, Journal of the Canadian Dental Association, and Journal of Prosthetic Dentistry (n=3; 6.2% each). The other journals named were reported once each. One participant reported “mostly [doing] PubMed searches and [pulling] articles of interest from [the university] library.”

With respect to continuing education, seven (35%) participants reported attending journal clubs, eight (40%) attending study groups, and seven (35%) reading journal articles for continuing education credits, either on an occasional or regular basis. On average, the respondents had attended 5.6 online and onsite courses (ranging from 0 to 15) and 4.2 conferences in the previous 12 months (ranging from 0 to 12). The respondents’ self-assessed knowledge of the difference between common EBD paired terms is shown in Figure 3.

Figure 3. Survey II respondents’ self-assessed knowledge of the distinction between common pairs of terms used in evidence-based dentistry (N=20)

Impact on EBD implementation

Of the respondents, 14 (70%) affirmed that the workshops had had a positive impact on their EBD implementation. Among the respondents who provided feedback on how the workshops had been specifically helpful in translating scientific evidence into their daily activities, six comments indicated the workshops were valuable in improving critical appraisal skills (e.g., “I have learned how to criticize clinical research papers in a more systematic manner”). One participant reported the workshop was helpful in changing his behavior and that it was a turning point in his decision making and teaching practice. One participant noted the workshop improved his access to evidence by making him aware of the availability of several EBD websites, and another commented that concepts and information about statistical analysis and meta-analysis had helped with EBD implementation.

Of the five who responded that the workshop was not helpful, only one provided feedback, stating that the workshop, although “good to a point,” had been “rigid and not very well established.” One of the survey respondents did not respond to this item.

Experience with EBD implementation

Nineteen of the 20 respondents (95%) reported currently implementing EBD in either clinical practice or teaching. One respondent noted not applying EBD at the time of the survey but being in the early planning stages of doing so.

Among those implementing EBD, 16 (84%) provided more details on what they were doing differently. These included incorporating EBD into their teaching or research practices (n=7; 36.8%). One respondent stated, “I use more SRs [systematic reviews] in my lectures [and] conduct more SRs before developing my research proposals.” Other changes reported were related to improvement of critical thinking skills (n=3; 15.8%), seeking evidence before decision making processes (n=3; 15.8%), incorporating EBD into the clinical decision making processes (n=2; 10.5%), and improvement of knowledge regarding concepts explained in the workshop (n=2; 10.5%).

Eighteen respondents (90%) stated that it is customary for them to share information with their colleagues or to guide colleagues through the use of an EBD approach. Of the 23 instances cited (multiple comments allowed), most reported relying on interactive knowledge transfer activities (n=8; 34.8%) (for example, “we discuss EBD when we are going to teach students”) or unidirectional knowledge transfer activities (n=6; 26.1%) (one noted “giving information orally”), mainly followed by either direct provision of evidence sources to colleagues (n=4; 17.4%) (one reported “providing pdf of recent Cochrane SR on a particular topic”) or guidance with locating evidence (n=3; 13.0%) (“Inform them of the availability of the EBD site and its ease of use,” one wrote).

When asked how patients and/or students reacted to their practicing EBD, five respondents (25%) gave negative reports. These included that students found EBD difficult to use (n=2; 10%), that experienced students were set in opinionated methods (n=1; 5%), and that students were solely interested in outcomes (n=1; 5%). One respondent who did not provide further details simply declared that “some [students] react negatively.” In most cases (n=15; 75%), however, the respondents reported that patients/students reacted well to EBD practice (one noted that “it makes them feel safer”).

Barriers to EBD implementation

Eight (40%) of the respondents reported having experienced barriers when using EBD tools or sources to support clinical decision making. These barriers included encountering resistance and criticism from colleagues (n=2; 10%) (one noted, “It is hard to convince other people about existing evidence on a particular topic”), difficulty in changing current practice model (n=2; 10%) (“At the level of the faculty,” one wrote, “it is difficult to implement EBD to all subjects because not everyone is willing to change their method of teaching”), and lack of time to search for evidence or practice EBD (n=2; 10%).

Six (30%) of the respondents reported having encountered concerns or objections from their students and/or colleagues about using an EBD approach to clinical decision making, with half describing difficulties in changing one’s behavior due to overreliance on personal experiences (one commented on “colleagues who firmly believe that what worked for them in the past still works now even though the evidence shows the opposite”). Table 1 shows respondents’ ratings of barriers to EBD implementation.

One attendee provided additional feedback, noting that general practitioners have limited access to evidence (journals and books) if they are not associated with an academic institution, including dental hygiene programs. Respondents’ suggestions of strategies to facilitate transition to EBD are shown in Table 2.

Discussion

Prior studies have identified strategies to implement EBP, including the use of systematic reviews and evidence-based guidelines, audiovisual resources, and electronic publications.16,17 However, studies have also favored active knowledge translation strategies, such as educational meetings and workshop training sessions,18,29 over passive ones.30 Therefore, it seems highly desirable to support efforts to encourage the conception, evaluation, and enhancement of dynamic and interactive knowledge translation interventions, such as the workshop we assessed in our study.

Faculty members’ positive overall attitudes towards EBD have been frequently reported.29,31–34 Our workshop participants expressed similar attitudes in Survey I, in which both the lectures and the hands-on exercises were found to be useful. However, other studies have found overall preference for practical activities.33,35 The participative and interactive format used in our workshops was perceived as beneficial for attendees to learn the EBD concepts. This finding supports the idea that learning environments should promote self-monitoring skills, allowing attendees to regulate their learning and actively engage themselves in learning tasks.35

According to the Survey II results, one of the most important self-reported benefits from the EBD workshops was improvement of attendees’ critical appraisal skills when assessing evidence. This ability is key to establishing an evidence-based approach and supports the development of students’ critical thinking skills.36 Our findings support previous studies in which similar educational strategies were used with educators from various health sciences disciplines.29,37 Critical appraisal is a core skill for EBP, so that professionals can make sense of evidence and judge its trustworthiness, value, and relevance in particular contexts.38 This skill becomes even more critical when dealing with the continuously increasing magnitude of published evidence, particularly when concerns are raised regarding the quality of that evidence39 or when evidence-based clinical guidelines synthesizing scientific evidence and helping professionals in their clinical decision making are still missing.40

On the other hand, a small part of the sample in our study did not benefit fully from the workshops, suggesting a need for improvement in the workshop format. However, we do not know whether the variations in perceptions of the workshops’ value resulted from disparities in participants’ pre-workshop levels of EBD knowledge and skills, since no pretest was performed. As suggested by Forrest and Miller, a more thorough assessment of the initial situation or problem can provide valuable information to help refine the goals and activities of future EBD training sessions.41

All workshop attendees in our study reported that they were currently implementing EBD or planning to do so in the near future, either during their clinical or teaching practice. They perceived that the workshops improved their knowledge and skills in EBD and facilitated application of these concepts during their routine decision making processes. Other practical effects of EBP workshops have been documented by other researchers, including the improved ability to determine clinical applicability of evidence, review one’s own practice, collaborate with other professionals, and teach EBD concepts in home institutions.29,37,41 The sharing of EBD knowledge was also reported by most of our respondents, as they stated they routinely guide their colleagues through an EBD approach, corroborating the idea that trained faculty members can mentor their peers.36

With respect to students’ and/or patients’ reactions to EBD practice among our respondents, there were only a few reports of resistance from students, mainly related to the difficulty in applying EBD concepts. Overall, attendees reported that the majority of students and patients reacted positively, including reports that EBD increased their trust in the treatment options provided. This result is consistent with another study’s finding that patients were seen as EBD facilitators.14 Therefore, making EBD courses more frequent, teaching students how to critically appraise evidence, and making the use of evidence-based treatments mandatory throughout the dental program should make EBD increasingly familiar, reducing the difficulty of embracing it later on. Moreover, teaching dental students how to be lifelong learners is key to maintaining the application of current best evidence in daily practice, thus optimizing patient outcomes.36

According to our respondents, their colleagues and students reacted well to the use of EBD in decision making, with only 30% reporting concerns or objections from their peers or students. The main barrier to implementation of EBP in these cases related to difficulties in changing one’s behavior due to overreliance on personal experiences.1,2,14 Integrating EBD in the early years of dental education may potentially reduce this issue, as students, with proper guidance, may begin to adopt EBP as a customary behavior.

Some of the barriers reported in a previous study were seen as less problematic obstacles in our study.15 These previously reported barriers included difficulties in interpreting research results due to academic language, lack of familiarity with searching for relevant information, and the high cost of academic journals. The absence of those barriers in our study was predictable, as our sample comprised academics who share an interest in research and often undergo specific training. Furthermore, individuals affiliated with an educational institution have nearly unlimited access to publications and resources at no personal cost. General practitioners, on the other hand, have reported lack of appropriate training, inaccessibility of evidence, financial restrictions, and lack of time due to heavy workload as significant barriers.6–8,10

Corroborating the findings of Spallek et al.,14 we found the most demanding barriers to be lack of up-to-date evidence for many devices and products used in the profession and continuing dental education courses that were not up-to-date with respect to evidence. Whereas the former obstacle can be partially explained by the continual launching of new products into the dental market before related evidence becomes available to users, the latter is a disturbing barrier because continuing education is usually intended to improve patients’ quality of care. It would therefore be expected to be grounded in a sound scientific basis, but it seems instead to be based largely on case reports of new techniques or products that have yet to be subjected to rigorous scientific investigation.

One strength of our study was that the time span in which participants completed the online Survey II (10 to 31 months after the workshops) allowed attendees to reflect on the concepts learned and reinforced during training sessions and to experiment with and consolidate them in practice. This study also obtained relatively high response rates, particularly when compared to standard response rates for web-based surveys.42 This higher rate may have been due to the common interest in EBP shared by our sample. It is possible that the response rate could have been further improved had a mail option also been provided.43

The main limitation of our study was that, as a pilot study, it had a small initial sample size, and therefore no inferential statistical analysis could be justified. The results, while useful in informing the design and implementation of future larger scale studies of EBD training, must be interpreted with appropriate caution until replication research can verify our findings. Given the very nature of a pilot study and its small sample size, our results should not be considered a preliminary test of hypotheses related to the efficacy of this workshop format on attendees’ EBD implementation.44 In addition, because we did not distinguish the faculty members’ responses from those of the postdoctoral students, we were unable to separately assess the workshops’ impact on the faculty members, who will have a continuing impact on dental students and dental education into the future.

Moreover, as this study was performed within a primarily academic population, our findings represent the views of a restricted group of individuals exposed to a very specific EBD workshop format. Hence, their views may not be representative of those shared by the dental community as a whole, or even of most dental academics. Nevertheless, the responses and comments can be considered a useful initial step for related research in the future. In that sense, it seems highly desirable to conduct future investigations describing academic and non-academic dentists, as well as to compare the behaviors of both groups of professionals after EBD educational interventions.

Thomas et al. found that teaching interventions tended to produce a greater impact on knowledge and skills than on sustainable EBP behaviors.35 Even though we did observe an overall positive short-term effect, future studies should thus focus on the long-term behavioral effects of EBP teaching interventions. Additionally, beyond introducing EBD concepts through hands-on exercises, expanded workshops with more regular online feedback opportunities could be developed.29 Ongoing follow-ups, rather than a single evaluation, could provide a better understanding of how attendees’ implementation of EBD into practice unfolds over time.41 Furthermore, it would be helpful to further explore specific, individualized, and effective EBD teaching methods37 and to develop and implement distance-learning strategies focused on supporting attendees during their everyday clinical or teaching practice.

Conclusion

This pilot study found that the participants rated the EBD workshops positively and reported a high level of EBD implementation in their professional activities afterwards, with 70% stating the workshops had helped with EBD implementation, mainly by improving their skills in critically appraising evidence. The attendees also reported that, while both patients and students reacted positively to EBD, some students found its implementation challenging. Reported barriers to the use of EBD included colleagues’ and students’ overreliance on previous personal experiences, as well as resistance and criticism from colleagues, difficulty in changing the current practice model, and lack of time. These findings will help in revising these workshops in the future and also contribute to the continuing search for the most effective ways to expand use of EBP in dental education and practice.

. Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. Cochrane Effective Practice and Organization of Care Review Group. BMJ 1998;317(7156):465–8.

. Exploring knowledge, attitudes, and barriers toward the use of evidence-based practice amongst academic health care practitioners in their teaching in a South African university: a pilot study. Worldviews Evid Based Nurs 2010;7(2):90–7.