Can quality be assured for diagnostic imaging?

News headlines from across Canada are periodically dominated by scandals and errors in diagnostic imaging. The list grows each year, with errors exposed from coast to coast.

The narrative follows the same arc – an error is discovered in an area of diagnostic imaging. A radiologist – generally the physician involved in the interpretation of diagnostic imaging results – is disciplined or dismissed, followed by calls to improve quality assurance in diagnostic imaging.

However, there are technical and practical challenges to putting in place robust systems to monitor and improve the quality of diagnostic imaging services and providers.

Healthy Debate spoke to radiologists from across Canada and decision makers who are leading the implementation of these systems to understand the challenges and opportunities in quality assurance for diagnostic imaging.

Errors uncovered at Trillium Health Partners

Earlier this month, one radiologist at Trillium Health Partners was found to have made errors in interpreting at least three computed tomography (CT) scans and mammograms.

Robert Sevick, head of the University of Calgary Department of Radiology, points to a need for systems to reduce human error. “Are radiologists making errors? Of course they are. We are human and humans make errors. It doesn’t help them to be attacked in the media and singled out.”

Sevick characterizes this as a “destructive way of dealing with the problem” and proposes that a quality assurance system for radiology needs to have a “supportive and educational structure, rather than punitive.”

Currently, British Columbia and Alberta are the only Canadian provinces putting in place province-wide quality assurance systems for diagnostic imaging. In all other provinces, quality assurance is done on a “hospital by hospital, and practice by practice, ad hoc basis” according to Sevick.

Why is quality assurance in diagnostic imaging so difficult?

The first challenge is that there is a debate within the field around the acceptable rate of error associated with reading diagnostic imaging.

Alan Moody, chair of the department of medical imaging at the University of Toronto, notes that “the figures for missed rates are not translatable across different specialties [within radiology].”

Diagnostic imaging is a massive field. Technologies like ultrasound and MRI are widely used for diagnosing or assessing diseases in many medical specialties. Radiologists receive specialist training to read and interpret diagnostic imaging tests of the body, but it is a rapidly evolving field, with new technologies being introduced continuously.

For example, the output of a CT scan, which uses x-rays to produce images of slices from areas of the body, has increased significantly. Sevick notes that “the amount of data to be processed is a quantum leap above where it used to be.” He gives the example of a CT scan of the head: in the 1980s, this test would have produced 15 images printed on two films; today, it produces hundreds of digital images.

Doug Cochrane, chair of the BC Patient Safety and Quality Council and lead investigator in a review of the quality of diagnostic imaging in BC, notes that this is both a blessing and a challenge to the field.

Cochrane raised concerns in his 2011 report around the challenges for practicing radiologists, as well as radiology training programs, “to keep up with these changes whilst ensuring competency.”

Radiologists also practice in many different settings, with more specialized radiologists based in large teaching hospitals. Generalist radiologists tend to practice in community hospitals or smaller communities. Almost half of Canadian radiologists work in a private office or clinic, with roughly the other half working in community or academic hospitals. About 70% of radiologists work in a group practice, meaning they share responsibilities, equipment and office space. However, about 14% of radiologists work in solo practice settings.

Molnar notes that the practice setting influences how radiologists can consult with colleagues when they have a problem or question. “I might see something on a chest CT I don’t understand, so I can phone or page a colleague to ask their opinion,” she says. “If you are working in a small community hospital, who do you phone for help when you have a rare or difficult case?” she asks.

Radiologists are faced with reading significant volumes of diagnostic imaging scan outputs. There are growing volumes both of diagnostic tests being ordered, and the outputs of these tests.

A recent publication by three Toronto radiologists reported that the total number of images their radiology department interpreted on a daily basis has doubled every 4.5 years. A 2011 Canadian Institute for Health Information report identified that the number of magnetic resonance imaging (MRI) and CT scans had doubled within six years across Canada.

Radiologists also work under significant time pressures. Most hospitals in Canada measure the turnaround time for radiologists’ reports. Molnar suggests part of a radiologist’s role on a clinical team is to “provide an answer quickly.” However, she reflects: “if you are measuring time but no other performance indicators, you get a distorted incentive driver.”

In this context, what are the best approaches to putting in place quality assurance programs for diagnostic imaging?

Systems for Quality Assurance in Diagnostic Imaging

There are no national quality assurance standards for diagnostic imaging in Canada.

Approaches to quality assurance vary based on factors like the practice setting, type of diagnostic imaging being used and number of radiologists in a practice.

Wade Hillier, Director of the Quality Management Division of the College of Physicians and Surgeons of Ontario, notes that there are a number of different approaches to ensuring quality in the practices of radiologists within their practice settings, which vary depending on “resources and commitment”.

The College conducts assessments of radiologists and their facilities when they practice outside hospitals, in Independent Health Facilities, but this is different from having two radiologists consult on a difficult case or image. Hillier suggests that peer to peer review, or a second read of diagnostic images, “flows up from the profession itself.”

In 2011, the Canadian Association of Radiologists published a Guide to Peer Review Systems, which sets out requirements for components of peer review processes and provides suggestions on how these processes could be integrated into daily practice. However, one radiologist informant noted that the Association lacks both “teeth and dollars” to enforce or implement such processes.

Provincial Quality Assurance Systems for Diagnostic Imaging

However, some provinces are taking a leadership role in establishing quality assurance systems.

In 2011, as a response to concerns about the quality of diagnostic imaging by four radiologists in British Columbia, the provincial government launched a review on the quality of diagnostic imaging in that province.

The findings of this review led to 35 recommendations, intended to address the concerns that prompted the review, as well as improve quality assurance systems in that province. This includes setting up a peer review system where images can be concurrently reviewed by two or more radiologists. This system is currently being piloted in one of BC’s health authorities, Vancouver Coastal Health.

Alberta is following in BC’s footsteps, after mistakes in diagnostic imaging were found at the Drumheller Hospital in 2011. Part of Alberta Health Services’ response has been to put out a request for proposals from technology companies around a province-wide, concurrent peer review system.

Mauro Chies, Acting Vice President of Clinical Supports for Alberta Health Services, notes that an advantage of a single health authority is that the province’s hospitals share the same radiology information system (RIS) and picture archiving and communication system (PACS) applications. The absence of technical roadblocks means that “we can send images from one corner of the province to another.” He says that implementing this system is a top priority for his office, and he plans for it to be operational across the province by summer 2014.

In response to the errors uncovered at Trillium Health Partners, Ontario is considering putting in place stronger systems of peer review for diagnostic imaging.

Sheamus Murphy, a spokesperson for Ontario’s Minister of Health Deb Matthews stated that the Minister has met with provincial hospital, medical and radiology representatives to discuss quality assurance in diagnostic imaging. He noted that “those around the table all agreed, including radiologists themselves, [that] we could do more to ensure quality in radiology.”

“Peer review is one of the steps we are giving serious consideration to” says Murphy.

Radiologists are well aware that it is easier to assess their performance than it is for other medical specialties. There is some tension around having province-wide systems in place to assess quality, when this was previously left up to the profession. However, given the growing complexity in diagnostic imaging, and growing need for radiologists’ services, systems to measure and monitor quality seem inevitable.

The 2011 Canadian Association of Radiologists guide states “radiology is at a crossroads of rapid technological advance and globalization; with the advent of PACS and off-site on call service provision, radiologists need to embrace quality assurance not only to safeguard patients but to safeguard their own profession.”

Should all provinces and territories put in place quality assurance programs for diagnostic imaging?

Yes, eventually. And well before full AI, we can use AI to select a sample for concurrent human review. This tightens and focuses where we need to invest effort to improve diagnosis. Again, this is probably applicable to most other image-based diagnostic processes; it is common for ECGs, for example. Not a substitute yet, but a complement.

I would like to know what evidence Dr. Falk is relying upon to conclude that AI is an effective way to “select a sample for concurrent human review”. To be honest, I am not even sure what Dr. Falk means by that. As a practicing radiologist, my own experience with AI — which we most often call “CAD” or “computer-assisted detection” — is that it functions poorly. Also, in my experience it is employed for relatively narrow questions, such as detecting ischemia on a nuclear medicine cardiac study, finding cancer on a mammogram or detecting a colonic polyp on CT colonography.

Even if these systems work well, they aid only in the detection of a very narrow range of pathologies, and failure to detect is only one source of error. I have never seen an AI system that attempts to read an abdominal CT scan, for example, where there are thousands of different types of pathologies.

Dr. Falk similarly made claims in the past regarding the applicability of Moore’s Law to radiology, claiming that technological efficiencies have increased the speed at which studies can be interpreted by radiologists. The link obliquely referenced in this article (http://www.longwoods.com/content/22892) was a rebuttal to this claim. The mistaken belief that radiologists can work faster because the machines they use work faster is surely contributing to the problem.

Error is an inevitable part of the practice of radiology, as it is in all fields of medicine and indeed all human endeavors. That does not mean that we should not strive to minimize it, but it does mean we need to avoid creating unrealistic expectations, both of humans and of technology.

Your final comment is entirely appropriate. Human error is an inevitable consequence of human action. Though we must mitigate the risk of making errors, we must be realistic about what rate of errors we should expect.

Radiology is a far more humble and realistic field than my own field, pathology. Some pathology loudmouths espouse a 0.5 to 1% error rate as being realistic! This is based on no data or evidence, and appears to be a hurried reply to government pressure on the field after multiple egregious mistakes were made (some even by the surgeons, but the pathologists were blamed).

There are problems with even defining what constitutes an “error”. Some cases are very clear-cut, but others are more ambiguous. This often gets lost in the discussion.

If we want to actually do something to improve quality — rather than just creating an appearance of improvement — then we had better develop a deeper understanding of why errors happen. They emphatically do not happen because physicians don’t care. Everyone I have ever worked with is devastated when an error adversely affects a patient. (That does not diminish the fact that it is the patient who suffers most.)

Regarding Will Falk’s comments, I and others have challenged his “insights” and have never seen any response. Yet the unsupported ideas that he espouses seem to be driving government policy right now. So much for Healthy Debate.

Pathology often gets “the last word,” which I think leads to overconfidence, at least until another pathologist comes along. Radiology virtually never enjoys that luxury. That may explain some of the differences that you have noticed.

Thanks for your comments. I believe we are now a community of two on this thread.

Great article, and timely. Applicable to any and all image-based virtualized diagnosticians. Currently that includes derm, ophth and pathology, but in the future it will likely include most of internal medicine. Virtualization of the service enables quality assurance programs.

Being on the pathologist side of things, I have seen that government interference into the practice of medicine does nothing to ensure quality, but instead produces multitudes of inefficiencies, astronomical waste, and opportunities for even more errors.

My practice has changed in scope over the years, and a good portion of that change is due to the implementation of mandatory data entry for cancer cases. This does nothing for patient care, but does everything to increase error rates, waste physician time, and slow everything down. If I refuse, I am shamed by my colleagues as not being focused on ‘quality’. If I were a patient, I would want my physician to be focused on providing the right diagnosis, not filling out data entry tables because the ministry refuses to hire people to do this kind of work. It’s not medicine, and it’s not charting; it’s data entry.

Turnaround time is a bad quality indicator for all but emergency cases. Fast does not equal good.

Radiology has been very resistant to the kind of mandatory ‘quality’ initiatives put forth by the government, seeing them for the Trojan horses they are. Hopefully they will continue to hold that stance.

I couldn’t agree more – fast does NOT equal good – in any aspect of health care. Fast leads to more errors.

I also agree with the comment that media “attacks” are not helpful – I don’t believe that any health care provider at any level goes to do their shift with the intention of harming patients or making errors – we need to focus more on why the errors occur and what we can do to prevent them – proactive always beats reactive

I have to think in this case that the people who are doing the work day in and day out (radiologists) are those most familiar with the problems and how and why errors occur – this also puts them in the position of being able to come up with solutions. I’d be happier as a patient with that approach rather than a dictate from the gov’t.


So far, the quality assurance effort for radiology in BC consists of a pilot project at the Vancouver Island Health Authority — I believe it’s far from being a province-wide initiative. At VIHA, McKesson is supplying software that will enable the work of individual radiologists to be checked by others.

In Ontario, a regional pilot project has been launched in the Hamilton area. Software from RealTime Medical is being used to check exam readings of one radiologist by another. The identity of each is protected, so there is no professional embarrassment; if mistakes are spotted, they can be corrected before the results are sent to the referring physician.

These are excellent starts, and if the solutions are successful, they may very well be expanded. The methodology could be used in pathology, as well.

It won’t happen in pathology, at least not the way you’re describing – that takes willingness to accept technological advancements.

Pathology – a profession that is quite good at being its own worst enemy – is still using 400-year-old technology (microscopes), even though a modern, superior alternative (the digital scanner) exists. This superior alternative was, however, slapped with a class III device label by the FDA, which makes any future adoption of this method expensive and inconvenient; a guarantee that “digital pathology” will not happen in this generation.

QA will be confined to “showing slides around” and crucifying a few pathologists now and then who don’t conform to the trends (see the Olive Williams case).

Is this an overall issue of establishing professional and systemic standards for –
> Methods and Tools,
> Data collection, analysis and synthesis,
> People inherent to the system,
> Ensuring coherence of work processes, and
> Alignment of purpose ?

This is experienced at the level of the individual patient, radiologist, technologist and support staff, and consolidates upward through the department/clinic, institutional, regional, provincial and national levels of organization. Each level requires specific attention, with a clear and sustained focus on vertical and horizontal integration and alignment of purpose, resources and policies/regulations.

The degree we individually and collectively are able to obtain this alignment of effort and purpose will determine the scope of success that is achieved – personally, departmentally, organizationally, regionally, provincially, nationally and internationally.

Give me a break. Douglas Cochrane didn’t lead any review; he was just the provincial puppet who signed his name to a bogus review. Other radiologists were left out, and thousands upon thousands of hard copy films were destroyed. Cochrane sold patient safety down the drain, and it is the expert patients who say the issues in BC are still not over.

This document is provided under the terms of a CreativeCommons Attribution Non-commercial Share Alike license. The terms of the license are available at: http://creativecommons.org/licenses/by-nc-sa/3.0/. Attributions are to be made to HealthyDebate.ca, a project under the direction of Dr. Andreas Laupacis, at the Keenan Research Centre, Li Ka Shing Knowledge Institute of St. Michael’s Hospital.