Winny Li is a first-year resident at St. Michael's Hospital in Toronto, where senior doctors observe her work and offer coaching as part of a new strategy in teaching hospitals across the country.


At work in the emergency department of St. Michael’s Hospital in Toronto, first-year medical resident Winny Li interviews a patient with chest pain while her supervisor hides behind a curtain, within earshot, but out of view. The oversight – and the feedback she will receive later in the shift – is a hallmark of Canada’s new system for teaching and evaluating junior doctors.

The program, which launched at teaching hospitals across the country on July 1, requires senior doctors to observe resident doctors – those who have finished medical school and are doing on-the-job training in one of more than 30 specialties – and provide constant coaching, unlike the hands-off approach in the past.

“It’s the biggest change in medical education in over a century,” says Jason Frank, an emergency physician in Ottawa and the director of specialty education at the Royal College of Physicians and Surgeons of Canada who is leading the overhaul, which started with this year’s crop of 3,000 residents. While universities conduct the training, the college sets out the program they will follow.


The new system is meant to identify struggling residents early, so educators can intervene and make sure they have the skills they need when they move on to their own practices.

Under the old way, which had been in place since 1910, doctors training to be specialists entered a residency program for a predetermined length of time – three years for pediatrics, five for emergency medicine, seven for neurosurgery, for example – and wrote a qualifying exam at the end.

But they were not evaluated on skills such as communicating with patients, and residents often worked unobserved by senior doctors, receiving little feedback to help them correct bad habits or misconceptions. Some may have missed out on developing particular skills because situations they needed to experience did not occur in the time allowed.

The problems have been acknowledged for decades. The World Health Organization said in a 1978 report that some doctors were learning critical skills on the job long after being licensed. And as medicine becomes increasingly complex, with patients living longer, requiring more specialized care, and new health problems emerging, medical educators wanted to establish a list of tasks on which residents must be assessed before they can be licensed.

“We are training excellent physicians, most of the time,” says Linda Snell, the college’s senior clinician educator. But she adds that the exit-exam system does not test for everything that a doctor needs to know to treat patients well.

Work on a new model began about 20 years ago, but until recently, there was not enough evidence to support switching the mammoth medical education system away from the old approach, in which residents learned on the job and helped teach medical students at the same time. Under the new system, if a resident has not yet experienced a situation or procedure integral to their specialty, a supervising doctor would note the fact and universities would try to make it happen, sometimes using simulation when real situations are too rare.

The new model, called competency-based medical education (CBME), requires senior doctors to directly observe residents. When they have been seen to perform well at a function deemed essential to their training, they can be declared “entrustable” in that area, meaning they may no longer require supervision to do it. Once they have reached those milestones – such as being able to manage a patient in need of resuscitation or deliver a baby – they progress to more independence and responsibility. Like residents in the old system, they will also take a final exam.

Dr. Li welcomes the new system's extra supervision and guidance for doctors in training. 'It’s so beneficial to get specific advice you can act on. You want to hear more than "good job," because at the end of residency, you’re the attending [physician].'


Daily results

The new system may do more than catch doctors who might not be fully prepared. An idea that remains controversial among education scientists and front-line physicians is that it might also accelerate learning for some, possibly shaving an entire year off residency for those who are evaluated as having done all the tasks and are ready to practise. That could pump fully qualified doctors into the system sooner.

The University of Saskatchewan has already advanced one resident to the next stage early because of positive feedback, but Rob Woods, director of the emergency-medicine residency program, says this does not necessarily mean they will finish early. “Some residents will progress fast through some stages, and more slowly through others.”

Dr. Woods, who assesses the progress of emergency medicine residents, receives daily written reports from supervising physicians that evaluate residents' performance, assign it a numerical rating and offer direction on how they can improve. “CBME is going to make my job a lot easier,” Dr. Woods says. In the time-based system, reports were less detailed, and problems were often not brought to the director’s attention even if hospital staff had noticed them. He says information in multiple comments from supervisors this summer allowed him to coach a resident whose communication style was a bit too casual. If it were not for CBME “it would have been whispers in the shadows,” he says. “The old way, we only had enough data to make sure people were just meeting the bar. Now we have a development mindset.”

The College of Family Physicians of Canada, which is responsible for the residency training programs of doctors who will become general practitioners, began using a form of CBME in 2011. It had noticed that family doctors were abandoning their traditional role of caring for everyone from babies to the elderly, limiting hospital visits and referring complex patients to specialists more often. More than 7,000 family doctors have now been trained under the system, a few of whom needed more than the traditional two years, says the college’s director of education, Ivy Oandasan. Graduates of the CBME program are now more likely to practise the full spectrum of family medicine, she says, a role viewed as vital outside urban areas.

The Dutch medical training system has used a form of CBME for 15 years and, in Cincinnati, the technique has been found to identify struggling residents sooner. Canada’s changes are broader than those programs – including the data collection and the list of required competencies – and are seen as cutting edge.

A question of time and resources

Over the past few years, pilot projects for CBME at McMaster University, Queen’s University and the University of Ottawa have also found that struggling residents are identified earlier and some are advanced sooner.

But a 2009 study with 14 orthopedic surgery residents at the University of Toronto provided little proof that CBME is better than the time-based model. Two finished early, two finished later and the rest took the same amount of time. No discernible differences in competence were found compared with those trained in a time-based model.

That does not surprise Geoff Norman, professor emeritus in education and clinical epidemiology at McMaster University. He says the theory behind competency-based education is flawed.

“This has been tried [in other areas of study] before, back in the 1970s,” he says, when behaviour theory was dominant in education. A resident who has managed a situation well in the past is not guaranteed to do so each time in the future, Dr. Norman says. Similar cases might present themselves in thousands of ways, so the idea that a resident can be declared competent based on a few observations does not add up, he says.

The biggest flaw, he says, is that CBME assumes an upward trajectory of gaining competence. “It suggests a smooth curve, but what actually happens is a sawtooth up-and-down,” he says of how residents learn and perform. In other words, a resident could perform very well in a situation a few times, and move to the next level of responsibility, then perform very poorly in a similar situation a few weeks later. While some tasks such as gall bladder surgery may seem routine, complications such as bleeding or accidentally cutting the bile duct can happen long after a resident performs their first case by themselves. It gets even trickier for cognitive tasks such as diagnosing rare diseases, because lab tests, patient histories and physical exams require interpretation and “a good amount of gut feeling,” says Dr. Norman, who argues that there is no substitute for the experience gained in a time-based system.

The program also puts too much of a burden on the supervising doctors, who have their own work, he says. "Staff doctors are spending all their time watching residents instead of seeing patients.” If exceptional residents graduate early, he adds, hospitals could end up short staffed.


Provincial ministries of health say they are monitoring the impact CBME could have on budgets and hospital staffing, but the Royal College’s Dr. Frank says the impact will be marginal, and that universities funded the start-up costs. If an outstanding resident graduates a year early, Dr. Frank says, one of about 100 medical school graduates who did not get a residency spot this year can take their place.

While educators and academics argue over how best to measure successful training, the Royal College is pressing on, says Dr. Snell, who is also a professor of medicine at McGill University’s Centre for Medical Education.

“We have to dive in and do it.”

Back at St. Michael’s, Dr. Li says she appreciates the feedback and extra supervision the new system provides.

“We need this type of disruption in medical education,” she says. “It’s so beneficial to get specific advice you can act on. You want to hear more than ‘good job,’ because at the end of residency, you’re the attending [physician].”

After she interviewed the patient with chest pain, her supervisor said her assessment relied too much on test results and not enough on the patient’s personal history, and this led her to underestimate the risk for heart problems and recommend that he be discharged.


“Just because your electrocardiogram and enzymes are normal, doesn’t always mean you can go home,” she says later. “That just changed the way I think about everyone with chest pain.”
