The study, which included 21,284 internal medicine residents, showed that milestone-based ratings correlated with nondevelopmental ratings but differentiated residents more sharply across the training years. The study also indicated that milestone-based ratings correlated with American Board of Internal Medicine certification examination scores.1

Quality in graduate medical education (GME) is now defined by outcome measures for residents and fellows. Internal medicine residency programs in the United States rate residents using milestones; however, there has been little evidence of how effective this approach is. Researchers therefore compared developmental milestone ratings with a nondevelopmental rating scale, the pre-2015 resident annual evaluation summary (RAES), for internal medicine residents.

“More work is needed to gather additional evidence of validity of the milestones—that is to show that they are measuring what we want to show, which is residents' ability to provide high-quality, efficient, safe patient care during training and after they graduate to unsupervised practice,” said study author Karen Hauer, MD, PhD, associate dean and professor of medicine at the University of California, San Francisco (UCSF).

It is important, she noted, that residents be followed longitudinally to determine how they progress on the milestones. It is also important to determine how programs intervene with residents who are not progressing as expected and which strategies best help residents remediate.

“Ultimately, educators and the public want to know how well milestones predict performance in practice. Answering this question will require additional study,” Dr Hauer told Endocrinology Advisor.

The current study showed that RAES ratings and the corresponding milestone ratings were correlated across training years. Ratings differed more between postgraduate year 1 (PGY-1) and postgraduate year 3 (PGY-3) with milestone ratings than with the RAES, suggesting that milestones better capture progressive development.1 The study also showed that the 618 of 6260 residents who failed the certification examination had lower medical knowledge ratings in both systems than those who passed.1

Internal medicine educators, and the graduate medical education community overall, have made great strides in shifting the focus of training programs from what is taught to what is learned, according to Dr Hauer. This has been achieved by defining the expected outcomes of training and implementing competencies as a framework for measuring those outcomes. She noted that in internal medicine, 100% of program directors work with clinical competencies and report on their residents' progress using milestones.

She said it is essential to change the systems of clinical care in teaching institutions to build in time for faculty to directly observe residents with patients and give them feedback. Dr Hauer also noted that many clinical competency committees are still struggling to design efficient ways to synthesize resident performance information so that each resident's progress over time can be readily visualized. Technical solutions such as dashboards now offer promise for managing these data, according to Dr Hauer.