Will Computers Be Reading Your Chest X-Ray?

Pneumonia is a common disease, affecting 5 million Americans per year, and is the 6th leading cause of death in the United States. Patient survival improves significantly when pneumonia is diagnosed and treated within 4 hours of arrival at the hospital. And a key part of the diagnostic workup is accurate and timely interpretation of chest radiographs.

As a diagnostic radiologist, I typically review dozens of patient chest x-rays during a busy ER shift checking for possible pneumonia. Numerous diseases can cause a chest radiograph to look abnormal. A skilled radiologist has to be able to distinguish between pneumonia and other conditions such as atelectasis (the lung tissue is partially collapsed, but not due to infection), benign scar tissue, fluid accumulations within the chest cavity but outside the lung, lung injury due to trauma, and early unsuspected cancers. A Stanford team led by Pranav Rajpurkar recently tackled exactly this task with CheXNet, a 121-layer convolutional neural network, and its results stood out in four ways.

1) The AI was better than human radiologists in both sensitivity and specificity.

In other words, the AI correctly diagnosed pneumonia when the patients had the disease (true positives), while producing fewer false positives (the patient has no pneumonia, but the reader thinks they do) and fewer false negatives (the patient has pneumonia, but the reader thinks they don't).
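For readers less familiar with these terms, sensitivity and specificity fall directly out of those four counts. A minimal sketch (the counts below are made up for illustration, not taken from the Stanford study):

```python
# Sensitivity and specificity from true/false positive and negative counts.
# The numbers below are hypothetical, purely to show the arithmetic.

def sensitivity(tp, fn):
    """Fraction of patients with pneumonia that the reader catches."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of patients without pneumonia that the reader correctly clears."""
    return tn / (tn + fp)

# Hypothetical reader: 90 true positives, 10 false negatives,
# 85 true negatives, 15 false positives.
print(sensitivity(90, 10))  # 0.9  -- few missed pneumonias
print(specificity(85, 15))  # 0.85 -- few false alarms
```

A reader that beats humans on both numbers at once is the notable result: it is easy to raise one at the expense of the other simply by calling more (or fewer) studies abnormal.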

2) The AI was able to generate a “heat map” showing the region of the image it considers abnormal.

This is important because most deep learning algorithms arrive at conclusions in a fashion that isn’t necessarily transparent to humans. A human radiologist might not know what image features caught the attention of the AI. But if the AI is able to “flag” the abnormal area, a human radiologist can easily double-check the AI’s diagnosis.
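The CheXNet paper produces these heat maps with class activation mapping: the network's final convolutional feature maps are combined using the classifier weights for the pneumonia class, so regions that drove the score up light up. A minimal NumPy sketch of that idea (array shapes and values here are illustrative, not the network's real dimensions):

```python
import numpy as np

def class_activation_map(features, class_weights):
    """Weighted sum of final-layer feature maps for one class.

    features:      (C, H, W) activations from the last convolutional layer
    class_weights: (C,) weights connecting each channel to the class score
    Returns an (H, W) map; larger values mark regions that drove the score.
    """
    cam = np.tensordot(class_weights, features, axes=([0], [0]))  # -> (H, W)
    cam -= cam.min()               # shift so the minimum is zero
    if cam.max() > 0:
        cam /= cam.max()           # normalize to [0, 1] for overlay on the x-ray
    return cam

# Toy example: 4 channels over a 2x2 spatial grid (made-up values).
rng = np.random.default_rng(0)
feats = rng.random((4, 2, 2))
w = np.array([0.5, -0.2, 0.1, 0.7])
heatmap = class_activation_map(feats, w)   # 2x2 map with values in [0, 1]
```

The normalized map is then upsampled to the original image size and overlaid as a colored transparency, which is what lets the radiologist see at a glance where the network was "looking."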

3) The AI did surprisingly well in diagnosing other abnormalities.

These other conditions include emphysema, pneumothorax (collapsed lung), and lung nodules. Its accuracy exceeded that of previous chest radiology AI systems.
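Detecting several conditions at once is typically framed as multi-label classification: the network produces one independent probability per finding (via a sigmoid), rather than forcing a single softmax choice, since one x-ray can show more than one abnormality. A small sketch (the finding names and raw scores below are illustrative):

```python
import math

# A hypothetical subset of findings a chest x-ray classifier might report.
FINDINGS = ["pneumonia", "emphysema", "pneumothorax", "nodule"]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def finding_probabilities(logits):
    """Independent per-finding probabilities (multi-label, not softmax):
    the values need not sum to 1, so several findings can be flagged at once."""
    return {name: sigmoid(z) for name, z in zip(FINDINGS, logits)}

# Hypothetical raw scores from the classifier's final layer.
probs = finding_probabilities([2.1, -1.3, -0.4, 0.8])
# Each probability stands on its own; thresholding each one separately
# decides which findings get flagged for the radiologist.
```

This is a design choice, not an implementation detail: a softmax would wrongly assume the conditions are mutually exclusive.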

4) Reliable computer diagnosis of pneumonia can help with patients in underserved or remote areas.

For patients in remote or rural areas, a clinic or small hospital with basic x-ray equipment is usually within reach. The bottleneck is getting the image to a skilled, board-certified radiologist who can render an accurate interpretation. Even though telemedicine has helped tremendously, it can still be difficult for those patients to have their images interpreted within the preferred 4-hour window. An AI chest x-ray reader could provide near-instantaneous interpretations. And AIs never need lunch breaks, never get distracted by noisy working conditions, and never complain about 24-hour shifts.

The image on the left is a normal chest x-ray (public domain, by Stillwaterising, via Wikimedia Commons). The image pair on the right shows the result of the Stanford CheXNet 121-layer convolutional neural network on an abnormal chest x-ray. The network accepts as input a chest x-ray image and outputs the probability of pneumonia along with a heatmap localizing the areas of the image most indicative of pneumonia (reprinted with permission by Pranav Rajpurkar, et al).

The capabilities of AI radiology systems continue to improve dramatically. I don’t foresee computers replacing human radiologists in the next 5 years — but I wouldn’t be surprised if this occurred in the next 10-15 years. As these systems become increasingly reliable and accurate, my hope is that radiologists and government regulators embrace these systems. And that regulators don’t impede progress that could save millions of dollars — and lives.

I am a physician with long-standing interests in health policy, medical ethics and free-market economics. I am the co-founder of Freedom and Individual Rights in Medicine (FIRM). I graduated from University of Michigan Medical School and completed my residency in diagnostic ...