Residents need to learn medicine, not how to pass a test

In his bestselling book Moneyball, Michael Lewis illustrates the phenomenon of professional baseball scouts focusing on all the wrong characteristics when evaluating players. He describes how scouts used fastball velocity to compare pitchers, despite the lack of correlation between fastball speed and the quality of a pitcher. As it turns out, the most important attribute in a pitcher is deception, not a high-velocity fastball. The goal, after all, is to get batters out, not to throw the hardest pitch. Lewis theorizes that because pitch speed is easy to measure and easy to compare, it became overemphasized. Graduate medical education makes the same mistake with test scores: overvaluing that which is easily measured.

Every year, emergency medicine residents take a multiple choice test known colloquially as the in-service exam. The exam is an excellent predictor of whether a physician will eventually pass the board certification exam. The Accreditation Council for Graduate Medical Education (ACGME) monitors each residency program's scores, which are felt to reflect the quality of each program's education. The logic is that if a program has low test scores, it must not be properly educating its residents. The ACGME requires residency programs to have an 80 percent first-time pass rate on the board certification exam. Programs without adequate test success can be placed on probation, and eventually be shut down. A lot of residency programs are feeling the pressure.

The problem with using multiple choice questions as a quality marker is that they are a poor indicator of clinical competency. In a multiple choice question, you are given a quick vignette, expected to make a single correct diagnosis, then asked a factual question about that diagnosis. Good test takers learn to cue into certain buzzwords that hint toward the correct answer. Doing well on the exam requires pattern recognition across different but similar question stems. It is a skill in itself; unfortunately, it is one that is useless when practicing actual medicine.

The thought process used for a multiple choice test is antithetical to medicine at the bedside. During training, we are taught to create a differential of possible diagnoses, then to systematically rule them in or out through a series of physical examination findings, laboratory tests, and radiology studies. Clinical medicine is subtle. Patients rarely present with the classic triad you learn in textbooks. Any experienced physician has a hundred personal stories about strange presentations and patients who just don't follow the rules.

The purpose of residency training is to produce competent emergency medicine physicians, but ACGME policies force residency programs to "teach to the test." Every residency has a weekly lecture series, called didactics, that is a critical component of medical education. Increasingly, I have seen didactics time dominated by test prep sessions. We now spend time reviewing practice multiple choice questions instead of practical, patient-centered case studies. I have even heard of programs limiting their residents' clinical hours in the emergency department in favor of board prep sessions.

The ACGME is making the same mistake with test scores that baseball scouts made with pitchers. I don't mean to say that passing the board certification exam is meaningless; there is no excuse for having a poor knowledge base. But test scores have simply been overemphasized. The value placed on answering multiple choice questions is out of proportion to their actual worth. Test scores are easy to track and compare, and so they are being used as a marker of quality. This is a mistake.

I fear the current test-centered approach to residency education will produce exactly what the ACGME has incentivized: a generation of emergency physicians who are excellent at answering multiple choice questions, but aren't as good at being actual doctors.