
Abstract

Background: The degree to which scores on the mini-Clinical Evaluation Exercise (mCEX), a feedback tool, correlate with performance on future clinical skills examinations is unknown.

Method: Medicine clerkship students in 2006 and 2007 completed eight mCEX cards and took an end-of-clerkship standardized patient (SP) exam and an end-of-year multidisciplinary SP exam. Pearson correlations estimated relationships between average mCEX scores and scores on the SP exams.

Results: A total of 244 students met inclusion requirements. Observed correlations between mCEX performance and SP checklist/notes and interpersonal skills scores ranged from 0.15 to 0.39. Disattenuated estimates were 0.21 to 0.50.

Conclusions: This is one of the first studies comparing mCEX and SP exam performance. Explanations for the generally modest correlations merit further consideration.

Formative and summative assessment of medical students' clinical skills is a critical component of medical student education.1 As such, multiple tools have been developed to assess students' clinical skills with actual patients, one of the best studied being the mini-Clinical Evaluation Exercise (mCEX). The mCEX, developed by the American Board of Internal Medicine, is used to provide formative feedback to trainees about their history taking, physical exam, counseling, and interpersonal skills.1–3 Prior research has demonstrated the reliability and the concurrent and predictive validity of scores when used with internal medicine (IM) residents4,5 and students1–3; however, research on the predictive validity of mCEX scores with medical students has largely focused on correlating mCEX performance with scores on multiple-choice exams, patient write-ups, and summative clinical evaluations of students by their residents and attendings.3 Whether mCEX performance during actual patient encounters correlates with students' short-term and longer-term clinical performance on standardized patient (SP) examinations has not been studied.

In this study, we evaluated the predictive validity of the mCEX by correlating IM core clerkship students' mCEX scores with their subsequent clinical performance on SP examinations. We hypothesized that mCEX performance would have a modest, positive correlation with student performance on two SP exams, one taken at the end of the IM core clerkship and the other at the conclusion of the core clerkship year. Additionally, we hypothesized that there would be a small correlation of competency-specific measures (e.g., history taking, physical exam, interpersonal skills) on the mCEX with the corresponding competencies on the SP exams, and that correlations would increase over the clerkship year as students became more clinically proficient.

Method

After obtaining IRB approval, we examined data for all students taking the eight-week inpatient IM core clinical clerkship at our institution in 2006 and 2007. Clerkships run from January to December and are scheduled in four 12-week blocks. The medicine block combines the eight-week IM rotation with a four-week family medicine clerkship. Students were required to complete eight mCEX cards (four by housestaff, four by attendings) and return them to the course director at the end of the IM rotation; however, card ratings did not influence clerkship grades.

Students were included in the study population if they (1) completed at least four of eight assigned mCEX cards during the IM rotation, (2) took the family medicine/IM (FIM) SP exam on the last day of the block clinical rotation, and (3) took the school of medicine's multidisciplinary SP clinical skills exam (CSE) in the spring after the completion of the core clerkship year. We chose a minimum of four mCEX cards because it is the number needed to document minimal clinical competency.1,3

SP exams

The FIM exam is a four-encounter, pass/fail exam given on the last day of the medicine block. Students are required to perform a focused history and physical exam and to counsel a patient in clinical scenarios focused on FIM. Students' grades were determined by their performance on a checklist completed by the SP, a clinical note graded by the clerkship director, and an interpersonal skills score assigned by the SP. The checklists for each case had between 15 and 18 yes/no items indicating whether key elements of the history, physical exam, or counseling were performed. Six interpersonal skills (Eliciting Information, Listening, Giving Information, Respectfulness, Empathy, and Professionalism) were rated on a four-point scale (1 = poor/almost never, 2 = fair/somewhat less, 3 = good/somewhat more, 4 = very good/almost always). An additional interpersonal skills item rated on a four-point scale (1 = not at all, 2 = somewhat, 3 = comfortable, 4 = very comfortable) asked, “If this student were a doctor, how comfortable would you feel referring a family member or friend to him/her?” Students taking the FIM received two final scores: a combined checklist/patient notes score and an interpersonal skills score.

The CSE, a multidisciplinary exam that incorporated cases and clinical skills taught in IM as well as the other core clerkships, was given after the completion of all core clerkships. In 2007 it was a six-encounter exam; in 2008, it had 12 cases. Students taking the CSE were evaluated by the SP with a checklist and interpersonal skills ratings, and by the course director, who rated a patient note. For both SP exams, the checklist/notes scores were calculated as percentages of total possible points awarded, and students were required to pass both the checklist/notes portion and the interpersonal skills portion to pass the exam.

Statistical analysis

For each student, we calculated a mean score for each mCEX competency, a mean score for the overall clinical performance rating, and a mean interpersonal skills score (a composite of mean professionalism and counseling scores). We used Pearson correlations to estimate the relationship between the mean overall mCEX score and the overall combined checklist/notes score on the FIM and CSE, as well as the mean interpersonal skills scores on these exams. Correlations were disattenuated to correct for measurement unreliability.
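The disattenuation step follows the classic correction for attenuation: the observed correlation is divided by the square root of the product of the two measures' reliabilities. A minimal sketch of the calculation (the function name is ours; the example plugs in reliabilities of the kind reported in the Results):

```python
import math

def disattenuate(r_observed, rel_x, rel_y):
    """Correct an observed Pearson correlation for measurement
    unreliability (classic correction for attenuation)."""
    return r_observed / math.sqrt(rel_x * rel_y)

# E.g., an observed r of 0.15 with reliabilities of 0.90 (mCEX)
# and 0.59 (FIM checklist/notes) disattenuates to about 0.21:
r_corrected = disattenuate(0.15, 0.90, 0.59)  # ≈ 0.21
```

Note that when both measures are perfectly reliable (reliability = 1.0), the correction leaves the observed correlation unchanged.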

For the secondary analyses, we correlated specific mCEX competencies with corresponding competencies on the CSE. Ordinary regressions estimated the R2 with CSE and FIM checklist/notes and interpersonal scores as dependent variables and mCEX as the independent variable. We assessed whether correlations among the mCEX and CSE varied depending on time of year that students took the IM clerkship (Block 1 versus 2 versus 3 versus 4).

Results

A total of 310 students took the IM clerkship in 2006 and 2007, and 244 students (79%) met inclusion criteria. Of the 66 excluded students, 8 (12%) completed fewer than four cards; the remainder were nontraditional students (e.g., MD/PhD, MD/MBA) who did not complete the CSE owing to interrupted clinical training.

Table 1 shows that mean scores were greater than seven for all mCEX competencies. Estimated reliability coefficients were 0.90 for the mCEX, 0.59 for the FIM checklist/notes, 0.67 for the CSE checklist/notes, 0.31 for the FIM interpersonal scores, and 0.68 for the CSE interpersonal scores.

Discussion

Our analysis demonstrates that performance on the mCEX has a modest, positive correlation with overall future clinical skills performance in both the short term (at the end of the IM clerkship) and the longer term (after the completion of the core clerkship year). Prior studies evaluating tools for direct observation of clinical skills with actual patients across medical specialties have largely focused on correlations with written and oral exams and faculty/resident evaluations of students.6–9 In many studies, the mCEX has been shown to detect differences in trainee performance.1,3,4,10 In one study, residents' mCEX performance was related to performance on a high-stakes exam that included an SP component,10 and faculty ratings of a medical student with an SP using the mCEX have been shown to correlate with the SP's ratings of that same encounter.11 However, few studies,8 and none in IM, have explored the relationship between scores on tools used to observe trainees with actual patients and future performance on SP/OSCE exams.

There are several possibilities to explain the generally low magnitude of the observed correlations and the lack of the competency-specific correlations that we predicted. One explanation is that attendings and residents are not formally trained on criteria that must be met for a student to receive a certain mCEX score. This lack of training likely contributes to inconsistency in how ratings are assigned. It is also reasonable to expect that rating biases, such as the halo effect, may exist. Although each mCEX is intended as an evaluation of a student during a single patient encounter, assessment of student performance by an attending or resident who has a longer-term relationship with a student may be influenced by factors (e.g., work ethic, interpersonal skills) other than the interaction he or she has with that student at the time of the mCEX. This contrasts with a true single interaction as evaluated by each SP during an SP exam. If mCEX evaluators are influenced by factors other than competency-specific factors (e.g., interpersonal skills while evaluating physical exam skills), then this may also help explain the higher observed correlation of mCEX scores with interpersonal skills scores on the SP exams. There is a relative paucity of research on the best approaches to train raters to use direct observation tools, and more research in this area is needed.

Another difference between the mCEX and the SP exams is that the former is intended as a tool for formative feedback and the latter are higher-stakes tools that impact grades. These different stakes could affect student performance and limit the observed correlations; however, our experience has been that students still try to perform well on the mCEX because attendings and residents who ultimately submit summative evaluations are assessing them.

The low correlations might also reflect the restricted range of scores on the mCEX, which places boundaries on the magnitude of correlations that can be observed, as well as the relatively short FIM and 2007 CSE exams.12

Another limitation of our study is the increase in the number of cases on the CSE from 2007 to 2008. Despite this change, we chose to combine the data for the two CSE exams because it increased the power of the study. In addition, we viewed each case as a sample of representative items from the domain of general clinical skills for students who had completed the core clerkships. In 2008 we substituted two paper-based cases (not included in the 2007 CSE analysis) with more case-based stations to improve test characteristics and test a wider range of content; however, the format of the case-based stations and all checklists remained the same.

Our finding that correlations between mCEX and CSE scores were generally higher for Block 1 and 2 students is intriguing, especially given that all students took the CSE at the same time upon completing the entire clerkship year. Why mCEX scores better discriminated among students earlier in their clinical training merits additional exploration.

Finally, although it is helpful to know that some predictive validity exists, we acknowledge that critical analysis of a feedback tool such as the mCEX should also examine whether the feedback provided affects outcomes, that is, whether all students can improve their future clinical performance based on the feedback they receive during an mCEX. If such improvement is occurring, it may explain why our observed correlations between mCEX and SP exam scores were not higher.

Conclusions

This study adds to the validity evidence for the mCEX. The mCEX can be useful for feedback; however, to maximize its utility, raters should be better trained to use it so that scores and associated feedback are accurate assessments of performance, thereby giving students the opportunity to improve their clinical skills.

Future research should address whether the observed correlations exist with a larger sample size of students from multiple institutions and whether feedback provided by the mCEX actually impacts students' clinical skills.

