How AI Is Getting Even Better at Spotting Illness

Nancy Crotti

April 28, 2016

Machines equipped with deep-learning algorithms may be as good as humans in detecting cancer in ultrasound images and in identifying it in pathology reports, according to recent news out of Samsung, the Regenstrief Institute, and Indiana University.

Evidence continues to mount for the potential of artificial intelligence to help health providers spot signs of illness in patients, including such deadly maladies as cancer.

Samsung Medison, for example, has updated its RS80A ultrasound imaging machine with a feature it's calling S-Detect for Breast to analyze breast lesions. It uses big data collected from breast-exam cases and recommends whether a particular lesion is benign or malignant, according to a report in Kurzweil Accelerating Intelligence.

The affiliate of Samsung Electronics designed the update for use in lesion segmentation, characteristic analysis, and assessment processes for more accurate results, the company said in a statement.

"We saw a high level of conformity from analyzing and detecting lesion in various cases by using the S-Detect," said professor Han Boo Kyung, a radiologist at Samsung Medical Center, in the statement. "Users can reduce taking unnecessary biopsies and doctors-in-training will likely have more reliable support in accurately detecting malignant and suspicious lesions."
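S-Detect's model is proprietary, but the general task it performs is binary classification of a lesion from measured characteristics. A minimal sketch of that idea, using an invented two-feature nearest-centroid rule (lesion diameter and a margin-irregularity score, with made-up training values), might look like this:

```python
# Illustrative only -- not Samsung's algorithm. Classifies a lesion as
# benign or malignant by assigning the label of the nearest class
# centroid over two hypothetical features: diameter (mm) and a margin
# irregularity score in [0, 1]. All feature names and values are invented.
import math

# Hypothetical labeled cases: ((diameter_mm, irregularity), label)
TRAINING = [
    ((8.0, 0.1), "benign"),
    ((10.0, 0.2), "benign"),
    ((22.0, 0.8), "malignant"),
    ((18.0, 0.7), "malignant"),
]

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

def classify(features):
    """Assign the label of the nearest class centroid."""
    labels = {label for _, label in TRAINING}
    centroids = {
        label: centroid([f for f, lab in TRAINING if lab == label])
        for label in labels
    }
    return min(centroids, key=lambda lab: math.dist(features, centroids[lab]))

print(classify((20.0, 0.75)))  # large, irregular lesion -> "malignant"
```

A production system would of course learn from thousands of imaging cases with far richer features; the point here is only the shape of the decision: measured lesion characteristics in, benign/malignant recommendation out.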

Researchers at the Regenstrief Institute and Indiana University School of Informatics have also found that computers beat humans in cancer detection. These researchers determined that existing algorithms and open-source, machine-learning tools were as good as, or better than, human reviewers in detecting cancer cases using data from free-text pathology reports. The computerized approach was also faster and used fewer resources than people, according to a statement from Regenstrief.

The machine-assisted process makes for speedier reporting of cancer cases to state health departments by automatically and quickly extracting crucial meaning from plaintext, also known as free-text, pathology reports.
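The Regenstrief team used existing open-source machine-learning tools; the toy sketch below is not their pipeline, but it illustrates the core extraction problem: flagging cancer mentions in free text while ignoring negated ones. The term list and negation cues are illustrative assumptions.

```python
# Toy sketch of negation-aware cancer-term detection in a free-text
# pathology report. Not the Regenstrief method; terms and cues invented.
import re

CANCER_TERMS = r"(carcinoma|adenocarcinoma|malignan\w*|lymphoma|sarcoma)"
# A negation cue shortly before the term (within the same clause) voids it.
NEGATION_CUES = r"\b(no evidence of|negative for|no|without)\b[^.;]{0,40}$"

def report_indicates_cancer(text: str) -> bool:
    """True if any cancer term appears without a nearby preceding negation."""
    for match in re.finditer(CANCER_TERMS, text, re.IGNORECASE):
        preceding = text[: match.start()]
        if not re.search(NEGATION_CUES, preceding, re.IGNORECASE):
            return True
    return False

print(report_indicates_cancer("Invasive ductal carcinoma, grade 2."))      # True
print(report_indicates_cancer("No evidence of malignancy in the biopsy.")) # False
```

Real systems replace the hand-written term list with a trained classifier, but the payoff the statement describes is the same: the machine reads every report, so humans do not have to.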

The Regenstrief study was published in the April 2016 issue of the Journal of Biomedical Informatics.

"We think that it's no longer necessary for humans to spend time reviewing text reports to determine if cancer is present or not," said study senior author Shaun Grannis, MD, interim center director. "We have come to the point in time that technology can handle this. A human's time is better spent helping other humans by providing them with better clinical care."

Meanwhile, a machine-learning initiative called Singularity Healthcare is working to create an AI engine that will be incorporated into Imaging Advantage's (Santa Monica, CA) proprietary exam routing technology. The AI would instantly pre-read digital x-rays, identifying potential areas of injury and disease. The initiative, expected to launch this year, is a partnership between Goldman Sachs-backed Imaging Advantage and faculty members from the Massachusetts Institute of Technology and Harvard Medical School/Massachusetts General Hospital.

IBM has been working on a similar idea to help radiologists reach a diagnosis. With the Avicenna software acquired through its $1 billion acquisition of Chicago-based Merge Healthcare, it is working on enabling its supercomputer Watson to "see" and "read" images and compare them with the patient's other images and electronic health data. Merge's medical imaging management platform is used at more than 7500 U.S. healthcare sites, as well as many of the world's leading clinical research institutes and pharmaceutical firms.

With the help of deep learning, Avicenna can judge how far down a patient's chest a CT scan slice was taken, according to an article in Technology Review. Other algorithms can label organs or flag anomalies such as blood clots.

Learn more about cutting-edge medical devices at MD&M East, June 14-15, 2016 in New York City.