Is Customized Healthcare a Near Term Reality?

In this special guest feature, Abdul Hamid Halabi, the global business development lead for healthcare and life sciences at NVIDIA, discusses how personalized or precision medicine is becoming a reality with the help of a machine learning method called deep learning. Halabi is responsible for helping drive the company’s growth and innovation strategies across the healthcare ecosystem. With nearly 20 years of experience in advanced technologies, he partners with thought leaders and world-class organizations to transform healthcare through the application of deep learning and high performance computing to enable precision medicine initiatives and evidence-based medicine. Halabi earned a Bachelor of Science degree in computer engineering from the University of Toronto, and an MBA degree from Texas A&M University – Commerce. Additionally, he holds a Master of Science degree in management science and engineering from Stanford University, where he is currently a Ph.D. candidate.

Wouldn’t it be nice if we were able to obtain medical treatment that works for us as individuals rather than a general “one-size-fits-all” approach? Much like with clothing, people are all different, and what works for one person may not work for another. Therefore, we want medical alternatives that are customized for our bodies.

Personalized or precision medicine is not a new concept, but our ability to implement it has just been significantly upgraded with a machine learning method called deep learning. Deep learning is what sparked the most recent revolution in artificial intelligence. In some ways, it allows the computer to learn the same way we do: by listening, reading and watching!

Electronic health records (EHRs) contain a wealth of information about a person’s medical history. Some of the data, such as age and gender, is easy to parse, but most of it lives in the doctor’s clinical notes, and extracting the intent from these notes was very challenging until deep learning came along. Let me walk you through some examples of how deep learning is enabling us to extract knowledge from EHRs.
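To make the idea concrete, here is a minimal, self-contained sketch of classifying free-text notes. It uses a simple bag-of-words logistic regression trained by plain gradient descent rather than a deep network, and the notes, labels, and the “diabetes risk” framing are all invented for illustration:

```python
# Toy sketch: scoring free-text clinical notes with a bag-of-words
# logistic regression. A real system would use a deep network and
# de-identified data; everything below is fabricated.
import math

# Invented training notes: 1 = note suggests diabetes risk, 0 = not.
notes = [
    ("elevated fasting glucose and increased thirst", 1),
    ("patient reports frequent urination and fatigue", 1),
    ("routine checkup, no complaints, vitals normal", 0),
    ("minor ankle sprain from running, otherwise healthy", 0),
]

vocab = sorted({w for text, _ in notes for w in text.split()})

def featurize(text):
    words = text.split()
    return [words.count(w) for w in vocab]

weights = [0.0] * len(vocab)

def predict(x):
    # Logistic regression: w.x -> sigmoid -> probability of label 1.
    z = sum(wi * xi for wi, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))

# Plain per-example gradient descent on the log loss.
for _ in range(200):
    for text, label in notes:
        x = featurize(text)
        err = predict(x) - label
        for i, xi in enumerate(x):
            weights[i] -= 0.5 * err * xi

risk = predict(featurize("fatigue and increased thirst"))
```

The deep learning systems described in this article replace the bag-of-words features and linear model with learned representations of the full note text, but the overall shape of the pipeline — featurize, score, act on the score — is the same.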

Researchers at Oak Ridge National Laboratory have applied deep learning to extract meaningful insight from the pathology reports of cancer patients that can then be used to assess the efficacy of different cancer treatments at a population level. This historical data allows us to then determine the best treatment for a specific person or even the best healthcare policy for our nation.
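The population-level step can be pictured with a toy aggregation: once an NLP model has pulled structured (treatment, outcome) pairs out of pathology reports, response rates can be compared across treatments. The records and treatment names below are fabricated for illustration:

```python
# Hypothetical records extracted from pathology reports by an NLP
# model (fabricated data; names are placeholders).
from collections import defaultdict

extracted = [
    {"treatment": "chemo_A", "responded": True},
    {"treatment": "chemo_A", "responded": False},
    {"treatment": "chemo_A", "responded": True},
    {"treatment": "chemo_B", "responded": False},
    {"treatment": "chemo_B", "responded": True},
]

# treatment -> [number of responders, number of patients]
totals = defaultdict(lambda: [0, 0])
for rec in extracted:
    totals[rec["treatment"]][0] += rec["responded"]
    totals[rec["treatment"]][1] += 1

response_rates = {t: r / n for t, (r, n) in totals.items()}
```

In practice the hard part is the extraction itself — turning unstructured report text into reliable structured fields — which is exactly where deep learning comes in; the aggregation afterward is straightforward.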

Hospitals such as Johns Hopkins and Mount Sinai have developed, and in some cases deployed, patient monitoring systems that predict health outcomes by parsing patients’ electronic health records. Physicians and medical staff are alerted when the deep learning system suspects that an adverse event such as sepsis may occur, or that a severe disease such as diabetes may develop, based on key factors found in the EHR.
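The alerting pattern these systems use can be sketched in a few lines. Here a stand-in scoring function takes the place of the trained deep learning model; the field names, weights, and threshold are all invented for illustration:

```python
# Sketch of EHR-based alerting. sepsis_risk() is a stand-in for a
# trained model; fields, weights, and the threshold are invented.
ALERT_THRESHOLD = 0.7

def sepsis_risk(record):
    """Combine a few structured EHR signals into a score in [0, 1]."""
    score = 0.0
    if record.get("temp_c", 37.0) > 38.3:
        score += 0.35
    if record.get("heart_rate", 70) > 90:
        score += 0.25
    if record.get("wbc_count", 7.0) > 12.0:
        score += 0.25
    if "infection" in record.get("notes", "").lower():
        score += 0.15
    return min(score, 1.0)

def monitor(record):
    # Page staff only when the score crosses the threshold.
    risk = sepsis_risk(record)
    if risk >= ALERT_THRESHOLD:
        return f"ALERT: sepsis risk {risk:.2f} for patient {record['id']}"
    return None

alert = monitor({
    "id": "A-102",
    "temp_c": 38.9,
    "heart_rate": 104,
    "wbc_count": 13.5,
    "notes": "Possible urinary tract infection.",
})
```

The deployed systems differ in that the model is learned from historical records rather than hand-weighted, but the deployment pattern — score each record, alert above a threshold — is the same.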

Finally, the ultimate goal is to bring all of the information in the EHR together to build a custom model and personalize medical treatment for each individual patient. Technology companies such as CloudMedX have done a phenomenal job of offering solutions here. CloudMedX offers a clinical AI portal that improves clinical decision support by surfacing the right insights at the right time, helping physicians assess and understand their patients’ risk and informing treatment and preventive action.

It is truly a transformative time for healthcare. Deep learning applied to EHRs can enable physicians to save more lives, reduce costs across the healthcare system, and increase access to the best level of care. To make it happen, however, patients, physicians, data scientists and engineers all need to work together. The great news is that we have a super helper: deep learning!
