“I spend a lot of time in the clinic, and don’t have the time or the technical expertise to learn, configure, and maintain software. MATLAB makes it easy for physicians like me to get work done and produce meaningful results.”
— Dr. Johan Nilsson, Lund University

A heart transplant recipient’s survival depends on dozens of variables, including the weight, gender, age, and blood type of both donor and recipient, and the ischemic time, the period during the transplant when no blood flows to the organ.

To better understand transplant risk factors and improve patient outcomes, researchers at Lund University and Skåne University Hospital in Sweden use artificial neural networks (ANNs) to explore the complex nonlinear relationships among multiple variables. The ANN models are trained using donor and recipient data obtained from two global databases: the International Society for Heart and Lung Transplantation (ISHLT) registry and the Nordic Thoracic Transplantation Database (NTTD). The Lund researchers accelerated the training and simulation of their ANNs by using MATLAB®, Deep Learning Toolbox™, and MathWorks parallel computing products.

“Many of the techniques we use are computer-intensive and time-consuming,” says Dr. Johan Nilsson, Associate Professor in the Division of Cardiothoracic Surgery at Lund University. “We used Parallel Computing Toolbox with MATLAB Distributed Computing Server to distribute the work on a 56-processor cluster. This enabled us to rapidly identify an optimal neural network configuration using MATLAB and Deep Learning Toolbox, train the network using data from the transplantation databases, and then run simulations to analyze risk factors and survival rates.”

Challenge

Understanding how various risk factors affect survival rates involved hundreds of thousands of computation- and data-intensive operations—for example, the team had to test hundreds of ANN configurations to identify the best one. An analysis of six variables requires the simulation of 30,000 different combinations. Simulating all these combinations for 50,000 patients took weeks using an open-source software package.
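The combinatorial growth described above is just the product of the number of levels tested per variable. As a hedged illustration (the study does not state the per-variable grid sizes; the values below are assumptions chosen to reproduce the 30,000 figure):

```python
# Full factorial sweep: total runs = product of levels per variable.
from math import prod

def grid_size(levels_per_variable):
    """Total combinations in an exhaustive grid over the given levels."""
    return prod(levels_per_variable)

# Hypothetical example: six variables with 5, 5, 5, 5, 6, and 8 levels
# each multiply out to 30,000 simulation runs.
combos = grid_size([5, 5, 5, 5, 6, 8])
```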

Nilsson and his colleagues encountered reliability problems with the software they were using, as well. “The software was unstable, which led to crashes during long, multiday simulations,” Nilsson explains. “In addition, some of the results it produced were not quite right. When we publish our findings, we need to be very sure we can trust the results.”

Solution

To address the speed and reliability challenges, Lund University researchers developed their initial ANN model using MATLAB and Deep Learning Toolbox. To find the optimal network configuration, they wrote MATLAB scripts that varied the number of hidden nodes used in the network for a range of weight decay (or regularization) values.
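The sweep the team scripted in MATLAB can be sketched in outline: enumerate every (hidden-node count, weight-decay value) pair and keep the configuration with the lowest validation error. The Python below is a minimal stand-in, not the team's code; the search values and the toy error surface are assumptions, and `validation_error` replaces what would be a full Deep Learning Toolbox training run.

```python
# Hypothetical sketch of the configuration sweep over hidden-node counts
# and weight-decay (L2 regularization) values.
from itertools import product

hidden_nodes = [4, 8, 16, 32]               # assumed search values
weight_decays = [1e-4, 1e-3, 1e-2, 1e-1]    # assumed search values

def validation_error(nodes, decay):
    """Stand-in for training an ANN and measuring hold-out error."""
    return abs(nodes - 16) * 0.01 + abs(decay - 1e-3)  # toy error surface

# Evaluate every combination and keep the best-performing one.
best = min(product(hidden_nodes, weight_decays),
           key=lambda cfg: validation_error(*cfg))
```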

The team used Parallel Computing Toolbox™ and MATLAB Distributed Computing Server™ to accelerate the simulation of more than 200 ANN configurations. They then evaluated the results to find the best-performing configuration.
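The team distributed this work with Parallel Computing Toolbox and MATLAB Distributed Computing Server on a cluster; the pattern itself—farm independent configuration evaluations out to workers, then collect the scores—can be sketched in Python with a thread pool standing in for the cluster. The scoring function here is a placeholder, not the actual ANN training.

```python
# Illustrative only: parallel evaluation of independent ANN configurations.
from concurrent.futures import ThreadPoolExecutor

configs = [(n, d) for n in (4, 8, 16) for d in (1e-3, 1e-2)]

def evaluate(cfg):
    """Placeholder for one ANN training-and-scoring run."""
    nodes, decay = cfg
    return (cfg, nodes * decay)   # toy score

# Each configuration is independent, so they can run concurrently.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = dict(pool.map(evaluate, configs))
```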

After training the ANN using donor and recipient information from the databases, they verified the model’s accuracy by simulating outcomes for 10,000 patients who had been omitted from the training set. They then compared the results against actual survival rates.
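The essential check is simple: score only patients the model never saw during training, then compare predictions with observed outcomes. A minimal sketch, with entirely made-up data:

```python
# Hold-out validation sketch: compare predictions on unseen patients
# against the outcomes that actually occurred.
def holdout_accuracy(predictions, actual):
    """Fraction of held-out patients whose outcome was predicted correctly."""
    hits = sum(p == a for p, a in zip(predictions, actual))
    return hits / len(actual)

predicted = [1, 0, 1, 1, 0, 1, 0, 1]   # toy model outputs (1 = survived)
observed  = [1, 0, 1, 0, 0, 1, 0, 1]   # toy registry outcomes
acc = holdout_accuracy(predicted, observed)   # 7 of 8 correct
```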

In the next phase, the team conducted thousands of simulations in parallel to rank the 57 risk factors considered in the study for predicting long-term survival.
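The article does not specify how the ranking was computed; one standard technique for ranking predictors with a trained model is permutation importance—shuffle one factor across patients and measure how much the model's score drops. The sketch below is a self-contained toy example of that idea, not the study's method.

```python
# Permutation importance: a factor the model relies on causes a large
# score drop when shuffled; an irrelevant factor causes none.
import random

def permutation_importance(score_fn, rows, factor, trials=20, seed=0):
    """Average score drop when `factor` is shuffled across patients."""
    rng = random.Random(seed)
    base = score_fn(rows)
    drops = []
    for _ in range(trials):
        shuffled = [dict(r) for r in rows]
        values = [r[factor] for r in shuffled]
        rng.shuffle(values)
        for r, v in zip(shuffled, values):
            r[factor] = v
        drops.append(base - score_fn(shuffled))
    return sum(drops) / trials

# Toy "model": predicts outcome directly from factor "a"; "b" is noise.
def score_fn(rows):
    return sum(r["a"] == r["target"] for r in rows) / len(rows)

rows = [{"a": i, "b": i % 2, "target": i} for i in range(10)]
imp_a = permutation_importance(score_fn, rows, "a")
imp_b = permutation_importance(score_fn, rows, "b")
```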

Using results from Monte Carlo simulations on the computer cluster and simulated annealing techniques, the researchers identified the best and worst possible donors for any particular recipient.
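Simulated annealing searches a space by accepting worse candidates with a probability that shrinks as a "temperature" cools, which lets it escape local optima. The sketch below shows the technique on a toy donor-matching problem; the compatibility score is an assumed stand-in for the trained ANN's survival prediction, and all parameter values are illustrative.

```python
# Simulated annealing over a toy donor space (donor age 18..65).
import math
import random

def anneal(score, neighbors, start, steps=2000, t0=1.0, seed=42):
    """Maximize `score` with a linear cooling schedule."""
    rng = random.Random(seed)
    current = best = start
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9        # cool toward zero
        cand = rng.choice(neighbors(current))
        delta = score(cand) - score(current)
        # Always accept improvements; accept worse moves with shrinking odds.
        if delta >= 0 or rng.random() < math.exp(delta / t):
            current = cand
            if score(current) > score(best):
                best = current
    return best

# Toy stand-in for the ANN: this recipient does best with a donor aged 30.
def compatibility(donor_age):
    return -abs(donor_age - 30)

def nearby(age):                                   # step donor age up or down
    return [max(18, age - 1), min(65, age + 1)]

best_donor_age = anneal(compatibility, nearby, start=60)
```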

As a final step, the team developed an automated process that ranks the recipient waiting list to identify the best candidates for a prospective donor.
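Once a model can score any donor-recipient pair, ranking the waiting list for a prospective donor reduces to sorting recipients by predicted survival. A minimal sketch, with a toy scoring function standing in for the ANN and hypothetical patient records:

```python
# Rank waiting-list recipients by predicted survival for a given donor.
def rank_waiting_list(donor, recipients, predicted_survival):
    return sorted(recipients,
                  key=lambda r: predicted_survival(donor, r),
                  reverse=True)

def predicted_survival(donor, recipient):
    """Toy stand-in for the ANN: closer donor/recipient ages match better."""
    return 1.0 / (1.0 + abs(donor["age"] - recipient["age"]))

donor = {"age": 35}
recipients = [{"id": "A", "age": 60},
              {"id": "B", "age": 33},
              {"id": "C", "age": 47}]
ranked = rank_waiting_list(donor, recipients, predicted_survival)
```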

In the next major phase of the project, Lund University researchers are using the ANN to investigate the use of Human Leukocyte Antigen (HLA) genetic profiles to match donors with recipients.

Results

Prospective five-year survival rate raised by up to 10%. “In a simulated randomized trial, the preliminary results show that the ANN model we developed using MATLAB and Deep Learning Toolbox would transplant approximately 20% more patients than would have been considered using traditional selection criteria,” says Nilsson. “The prospective five-year survival rate for the ANN-selected patients was 5–10% higher than that of patients matched using the criteria physicians use today.” 1, 2

Network training time reduced by more than two-thirds. “Using Deep Learning Toolbox and MATLAB, it took us 5 to 10 minutes to train our ANNs,” says Nilsson. “Training took 30 to 60 minutes using open-source software. That is a big difference, because we were training and evaluating hundreds of network configurations.”

Simulation time cut from weeks to days. “When we switched to MATLAB and MathWorks parallel computing technologies, we completed experiments that regularly took 3 to 4 weeks in about 5 days,” says Nilsson. “More importantly, the simulations were completed reliably, with no crashes.”
