Using a Simulated Infobutton Linked to an Evidence-Based Resource to Research Drug-Drug Interactions: A Pilot Study with Third-Year Dental Students

Dr. Dragan is Assistant Professor, Department of Periodontology, Tufts University School of Dental Medicine; Dr. Newman is Professor Emeritus, University of California, Los Angeles School of Dentistry; Dr. Stark is Adjunct Professor, Tufts University School of Dental Medicine; Dr. Steffensen is Professor and Chair, Department of Periodontology, Tufts University School of Dental Medicine; and Dr. Karimbux is Professor and Associate Dean for Academic Affairs, Tufts University School of Dental Medicine.

Abstract

Many health professions students and clinicians are using evidence-based databases that allow for quicker and more accurate clinical decisions. The aims of this pilot study were to compare third-year dental students’ speed and accuracy in researching questions about drug-drug interactions (DDI) when using two different methods: a simulated infobutton linked to the evidence-based clinical decision support resource UpToDate versus traditional Internet resources accessed through a computer or smart device. Students researched two simulated cases during each of two sessions. In the first session, half the students used the infobutton, while the other half used traditional electronic tools only. In the second session, ten days later, a crossover took place. The sessions were timed, and after researching each case, students answered three questions on the use of antibiotics, analgesics, and local anesthetics. Of the 50 students who volunteered for the study, two were excluded, and 44 participated in both sessions and the exam. The results showed that the students took a similar amount of time to identify DDI whether they used the infobutton (mean=286.5 seconds) or traditional tools (mean=265.2 seconds); the difference was not statistically significant (p=0.429). Their scores using the two research methods were similar in all three content areas: antibiotics (p=0.797), analgesics (p=0.850), and local anesthetics (p=0.850). In a post-intervention survey, students were generally favorable about the infobutton and UpToDate, reporting that the tool was easy to use (62.5%), provided the answer they were looking for (53.1%), and was fast (50%), and that they would use it again (68.8%). This pilot study found that these students’ speed and accuracy in conducting DDI research with the infobutton and UpToDate were about the same as with traditional Internet resources.

Electronic health records (EHRs) are digitally retained records of health care information that can improve the quality of care, education, and research. EHRs may contain information such as patients’ identifying information, medical history, clinical observations of their providers, laboratory tests, medical images, treatments, and drugs prescribed.1 The U.S. Department of Health and Human Services is establishing a network of regional extension centers to assist providers in adopting certified EHRs and making “meaningful use” of them.2–4 As part of the Affordable Care Act, the meaningful use program is designed to support health care professionals in using EHRs, thereby improving the quality and safety of the national health care system. Some of the goals of the meaningful use initiative are promotion of the privacy and security of patient information and provision of immediate access to evidence-based information in standardized systems known as context-aware knowledge retrieval applications. Studies such as those by Sackett and Kao have emphasized the need for evidence-based practice in the health professions and for graduates to become critical thinkers, be able to understand current research, and incorporate the results into their clinical decision making in an attempt to improve treatment outcomes.5,6 In dentistry, the Commission on Dental Accreditation (CODA) now requires that graduating students be competent in using evidence-based dentistry (EBD) principles as they pertain to patient care.7

In clinical settings, EBD content is accessed through various devices and computers through clinical decision-support systems (CDSS). The effects of CDSS on clinical outcomes, treatment efficiency, patient satisfaction, cost of treatment, and system implementation have been analyzed in research in medical settings, as in the study by Hayes and Haines.8 Studies by Bader et al., Forrest and Miller, and Hannes et al. are among those published on this topic in the dental setting.9–11

The tools known as “infobuttons” are context-aware knowledge retrieval applications embedded in the EHR designed to assist clinicians at the point of care by providing a list of links to resources that contain evidence-based clinical information. One of the external resources available through infobuttons is UpToDate (www.uptodate.com), a clinical decision support resource that offers the latest research-based evidence and is used mainly by physicians in their daily clinical activities. Infobuttons have been adopted by many health care organizations to assist providers in the clinical decision making process.12–14 A study by Maviglia et al. found that an infobutton provided answers to clinical questions in a timely manner, with a high level of user satisfaction.15

To the best of our knowledge, the use of infobuttons has not yet been integrated into the electronic health record in clinics in dental schools. To begin to explore the potential value of infobuttons in dental school clinics, the aims of this pilot study were to assess the effects of an infobutton linked to UpToDate on dental students’ speed and accuracy in researching drug-drug interactions (DDI) in a Simulation Clinic. The assessment was designed as a comparison of third-year students’ speed and accuracy in using the infobutton versus traditional Internet resources accessed through a computer or smart device. Students’ perceptions about the use of this technology were also assessed.

Materials and Methods

Following approval by the Tufts University Institutional Review Board, the study was conducted in the Simulation Clinic at Tufts University School of Dental Medicine. The design was a pilot randomized blinded crossover controlled trial and involved third-year dental students (Class of 2016). The entire class of 195 dental students was invited to participate in the study. Whether students volunteered to participate or not had no effect on their academic standing. Students enrolled in the international program were excluded because of their prior exposure to medically compromised patients. Based on previous studies,13,14 we determined that a convenience sample of 45 students would be needed for the pilot study.

In the first session, students who volunteered were randomly assigned to one of two cohorts: a test cohort that used an infobutton with a direct link to UpToDate to research DDI in simulated cases, and a control cohort that used general Internet access to research information for the same cases. Each cohort reviewed the cases in an EHR format on the clinic’s computers. For the test cohort, the cases presented had an “I” emblem next to the pertinent medical information that directly hyperlinked to UpToDate as a resource. By clicking on the hyperlink, these students were directed to the DDI section on UpToDate. The students were clearly instructed to use only the UpToDate resource for identifying the correct DDIs and to use no other materials. The control cohort researched the same two cases for DDI but had no infobutton links in their cases; these students used Internet resources accessed through a computer or smart device.

The participating students worked at individual work stations with similar computers to access the cases. The computers had been tested to make sure the Internet connection was working. The TUSK (Tufts University Sciences Knowledgebase) learning management system was used to house and provide access to the simulated patient cases. Specific guidelines were given to the participants via the TUSK platform.

A brief summary of each case was presented at the beginning of the session, followed by three questions regarding the DDI. The patient cases were selected from a library of cases used by the school for introducing axiUm into the preclinical setting. The criteria we used for case selection were a minimum of two medical conditions that are frequently encountered in the dental clinic patient population, such as hypertension or diabetes, and a minimum of three prescribed medications.

The time each student required for each case was recorded automatically on the computer. A timer showing a limit of ten minutes per case was set up in Qualtrics, a web-based platform. After reviewing the cases and researching information, the students were asked to answer the same three questions that had been used to guide their research. The questions focused on the indications/contraindications for prescribing analgesics and antibiotics and for administering local anesthesia with or without epinephrine. Each correct answer was scored with one point; the maximum score possible was 3 on each case. At the end of the session, those students using the infobutton (the test cohort) completed a survey regarding its use as a resource.

In the second session, ten days later, a cross-over took place. Those students who were in the test cohort in the first session were placed in the control cohort for the second session and vice versa (Figure 1). The new control cohort researched two simulated patient cases using Internet resources accessed through the computer or smart devices, while the new test cohort researched the same two cases using the infobutton and UpToDate resources. Data regarding time, correct answers, and perceptions were again collected.

Crossover design of the two study sessions with test and control conditions

A blinded investigator scored the students’ answers to the DDI questions and compiled the survey data on their perceptions of the infobutton and UpToDate resources. The students’ exposure to the test or control condition was revealed to the investigator only after data analysis. The mean time taken to answer the three questions per case, for the two cases per session, was calculated for each subject in each session. The group means were compared using an independent-samples t-test to assess differences between groups. The chi-square test was used to compare the scores on the DDI questions for the two groups. Percentages were determined for the survey responses. All analyses were performed using SAS Version 9.2 (SAS Institute, Cary, NC, USA). A p-value less than 0.05 was considered statistically significant.
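For readers interested in the mechanics of the between-group comparison described above, the pooled-variance independent-samples t statistic can be sketched with the Python standard library alone. This is an illustrative sketch, not the study's SAS analysis, and the per-student timing samples below are hypothetical values, not the study's raw data:

```python
from statistics import mean, variance

def independent_t(sample_a, sample_b):
    """Pooled-variance independent-samples t statistic, as used to compare
    mean research times between the test and control cohorts.
    Returns (t statistic, degrees of freedom)."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = mean(sample_a), mean(sample_b)
    # Pooled variance combines the two sample variances, weighted by df.
    sp2 = ((na - 1) * variance(sample_a) + (nb - 1) * variance(sample_b)) / (na + nb - 2)
    t = (ma - mb) / (sp2 * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2

# Hypothetical per-student times in seconds (NOT the study's data).
test_times = [310, 250, 295, 270, 330, 260, 300, 285]
control_times = [240, 280, 255, 270, 260, 250, 290, 275]
t, df = independent_t(test_times, control_times)
```

The resulting t statistic would then be referred to a t distribution with the given degrees of freedom to obtain a p-value, which in the study was done in SAS.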

Results

A total of 50 students volunteered and presented themselves at the first session. Of these, two were not eligible as they had not started their clinical training at the time of the study. Of the 48 students who completed the first session, 44 students returned for the second session. Therefore, results from 44 participants were available for analysis.

The dental students who used the infobutton linked to UpToDate took a similar amount of time to identify the correct DDIs (mean=286.5 seconds) as the dental students who had access only to general Internet resources (mean=265.2 seconds) (Table 1). This difference was not statistically significant (p=0.429). Figure 2 shows the comparison between the cohorts exposed to both conditions (test and control) for each case in the two sessions. The mean difference in time taken to complete case 1 versus case 2 in each session was greater when the cohort was in the test condition than in the control condition. Similar examination scores were obtained under each condition for the questions in each content area: analgesics (p=0.850), antibiotics (p=0.797), and local anesthetics (p=0.850) (Table 2).

Comparison of time required for students to research a case under test conditions (blue) and control conditions (red)

Note: Cases 1 and 2 were researched in the first session, and cases 3 and 4 were researched in the second session. Students in the test condition used the infobutton and UpToDate resource; students in the control condition researched traditional sources on the Internet.

Record of correct scores on three drug-drug interaction questions in each session and total, by number of correct scores obtained and percentage of students who correctly answered questions in test condition and control condition

In the post-intervention survey, the students provided generally positive feedback regarding use of the infobutton and UpToDate (Table 3). Responses of agree/strongly agree, very often/always, and fast/very fast were combined to yield percentages of positive responses regarding the tool’s ease of use (62.5%), provision of the answer they were looking for (53.1%), speed of use (50%), and whether they would use it again (68.8%).

Students’ perceptions regarding use of the infobutton and UpToDate resources in researching drug-drug interactions in cases, by percentage of total respondents (n=44)

Discussion

Our study found that the use of an infobutton linked to UpToDate allowed students in the Simulation Clinic to answer DDI questions at about the same speed and accuracy as using traditional Internet resources and with a moderate level of student satisfaction. A randomized controlled trial in medical education found that using topic-subject infobuttons was faster than Internet research without infobuttons.14 In that study, the intervention group (with access to specific links) spent an average of 35.5 seconds gathering pertinent information compared with 43 seconds for users in the control group. In our study, the test and control cohorts had similar overall times for determining DDI during the first session. However, during the same session, we observed a decrease in the time needed for the second case among students in the test group. It is possible the students were adapting to using the link to UpToDate. Some of the participants noted that a tutorial would help them understand the layout for the resource.

In our study, the time was measured precisely with the Qualtrics software. This approach had the advantage of reducing the risk of bias that can occur in studies in which clinicians measured their own times.13,14 However, one variable that we did not account for, and may be considered a limitation of our study, was the lag experienced by some students if they were using the clinic’s computers rather than their own devices (tablets or smartphones) to access information. Future studies should consider devices with identical access speed for test and control conditions. Future research might also compare each participant’s individual results across the two sessions to determine whether performance with infobutton access was related to performance with the traditional method.

Our study found that the cohort using the infobutton achieved examination scores similar to those of the control cohort. The type of assessment used in this study (multiple-choice) may not have truly assessed the outcomes we hoped to measure and is another potential limitation. The value of using infobuttons with direct links to the latest and strongest information available goes beyond being able to answer multiple-choice questions. It is also very difficult for third-year dental students to critically evaluate the large amount of scientific evidence available. Therefore, it is possible, and of some concern, that students may rely on the first information they encounter online, without having the background or experience to assess the reliability of the source. Interestingly, these students’ scores indicated that they were able to identify the correct DDI as a first step in the prescription process. In this study, we tried to control for any ambiguity by selecting simple cases that were linked to a drug with specific relevance to treatment of the case.

As emphasized by the World Health Organization in its Guide to Good Prescribing, identifying DDIs and writing good prescriptions are only the initial steps in prescribing medications for patients.16 Equally important are educating patients on appropriate use of the prescribed medicines and monitoring treatment, including follow-up visits. For this study, we selected third-year dental students with limited experience in DDI. However, we could not control for any students who may have had previous experiences that helped them navigate the technology or the cases presented in this study.

When Litvin et al. assessed the impact of a clinical decision support system on antibiotic prescribing, their investigation examined inappropriate prescription of antibiotics.17 They concluded that the system showed promise for promoting judicious antibiotic use in primary care. Although they did not include a control group and the number of subjects tested was limited to 39 participants, they did conduct the first study showing an improvement in prescription of antibiotics based on use of a clinical decision support system.

In our study, the determination of sample size had to be based on previous studies. Due to the paucity of similar research, we could not conduct a power calculation and therefore decided on a convenience sample for this pilot study. Another limitation is that whereas we had determined that we needed 45 students for this pilot study, we were able to get only 44 participants. This limitation further restricts the generalizability of our findings beyond the fact that the study took place in only one year at one dental school. However, our results can be used by other investigators to determine sample size. For example, if we consider time as an outcome in our study (the mean time for the test condition was 286.5 seconds per session and the mean time for the control condition was 265.2 seconds per session), a total of 126 subjects should provide 80% power to detect true differences. Future studies should consider this sample size with a fully integrated infobutton system in a working EHR.
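The sample-size estimate above can be reproduced with the standard normal-approximation formula for a two-sided comparison of two means. The sketch below is illustrative only: the 42-second standard deviation is an assumed value chosen for this example, since the paper does not report one:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided
    two-sample comparison of means. A small correction (roughly one
    subject per group) is usually added for the t distribution."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)   # about 1.96 for alpha = 0.05
    z_beta = z(power)            # about 0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Observed difference in mean times from the study: 286.5 - 265.2 seconds.
# sigma=42.0 is an ASSUMED standard deviation for illustration.
n = n_per_group(delta=286.5 - 265.2, sigma=42.0)
```

Under this assumed standard deviation, the formula yields about 62 subjects per group (roughly 124 total), which, with the t-distribution correction, is in line with the 126 subjects cited above.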

Conclusion

This study provides new information on the application of web-based resources of clinical information that may improve the quality of education for dental students and enhance patient care in a dental setting. Our study is the first to assess the impact of an infobutton linked to an evidence-based clinical decision system on students’ speed and accuracy in determining DDI in a dental school setting. Although the time required and accuracy were comparable for using this resource and traditional Internet searches, it is possible those similarities actually demonstrate a benefit of the new technologies since they were being introduced for the first time to students in this study and would likely have become easier to use and more helpful with repeated sessions and/or additional training. Future studies are needed to better define the costs and benefits of these technologies in ways that can help dental schools determine whether to integrate such systems into their simulation and patient clinics. Other studies might be designed to test and explore infobuttons that are functionally connected to EBD resources and integrated directly into a working EHR.

Acknowledgments

We would like to thank Ms. J. Murphy for her excellent support in designing and analyzing the surveys presented in the study, using the Qualtrics platform. In a similar manner, Dr. Britta Magnuson and Mr. Jacob Silberstein contributed significantly in securing IRB approval for the project. The project could not have been implemented without the professionalism, commitment, and dedication of three very special individuals: Ms. D. Vannah, Pooyan Refahi D’15, and Shawheen Saffari D’16. Finally, we wish to express our deepest gratitude and appreciation to the predoctoral dental students who kindly volunteered to be part of the project.