Machine Learning With Heart


Advances in both computational modeling and machine learning have shown the promise of a clear paradigm shift in personalized medicine. As such, a new research effort seeks to leverage machine learning to create new ways to identify and track heart disease localization and progression.

The World Health Organization tells us that cardiovascular disease claims 17.9 million lives each year, making it the leading cause of death worldwide. The numbers look bleak, but machine learning stands ready to offer hope.

Driving physics-based models of the heart with machine learning 

Researchers have spent over a decade developing high-fidelity virtual models capable of predicting how blood flows through our veins and arteries, down to the cellular level. We began our research by studying coarctation of the aorta, a heart defect that narrows the aorta and forces the heart to pump harder to get oxygenated blood to the body. This condition is usually present at birth and, if symptoms are mild, can go undetected until adulthood. Coarctation of the aorta requires lifelong follow-up, but thanks to machine learning, 3D printing, physics-based modeling, and supercomputing, we can evaluate patients’ risk factors under a wide range of lifestyles and activities.

Using image-based modeling, a 3D representation of an individual patient’s vascular anatomy is acquired and used as input to high-resolution blood flow simulations. With the computing power of large-scale supercomputers, transient blood flow can be simulated through the aorta of a patient with this congenital disease. In our work, we developed a parallel software package called HARVEY that takes a patient’s 3D vascular anatomy as input and calculates their hemodynamics under different conditions. These models were validated against measurements from flow experiments in 3D prints of CT-derived patient vascular anatomies and against invasive measurements in the clinic.
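
For readers curious what such a solver involves under the hood, here is a deliberately tiny, self-contained sketch: a 2D lattice Boltzmann simulation (one common approach to high-resolution blood flow modeling) of flow through a narrowed channel. This is not HARVEY’s code or API; the grid size, geometry, and every parameter below are arbitrary, illustrative choices in lattice units.

```python
import numpy as np

# Minimal 2D lattice Boltzmann (D2Q9, BGK) sketch of flow through a narrowed channel.
# Illustrative only: this is not HARVEY, and all parameters are arbitrary lattice units.

nx, ny = 200, 50                 # grid size
tau = 0.8                        # BGK relaxation time (sets the fluid viscosity)
g = 1e-5                         # small body force in +x that drives the flow

# D2Q9 velocity set, weights, and opposite directions (used for bounce-back walls)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9, 1/9, 1/9, 1/9, 1/9, 1/36, 1/36, 1/36, 1/36])
opp = [0, 3, 4, 1, 2, 7, 8, 5, 6]

# Solid mask: channel walls plus a bump that narrows the lumen (a crude "coarctation")
solid = np.zeros((nx, ny), dtype=bool)
solid[:, 0] = solid[:, -1] = True
solid[90:120, ny // 2:] = True

def equilibrium(rho, ux, uy):
    """Standard D2Q9 equilibrium distribution."""
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    usq = ux ** 2 + uy ** 2
    return w[:, None, None] * rho * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)

# Start from a quiescent fluid of uniform density
f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny)), np.zeros((nx, ny)))

for step in range(6000):
    # Macroscopic density and velocity
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho

    # BGK collision plus a simple forcing term that pushes fluid in +x
    f_post = f - (f - equilibrium(rho, ux, uy)) / tau
    f_post += 3 * w[:, None, None] * rho * g * c[:, 0, None, None]

    # Bounce-back: reflect populations at solid nodes to enforce no-slip walls
    for i in range(9):
        f_post[i, solid] = f[opp[i], solid]

    # Streaming: shift each population along its lattice direction
    # (periodic wrap; the solid walls bound the channel in y)
    for i in range(9):
        f[i] = np.roll(np.roll(f_post[i], c[i, 0], axis=0), c[i, 1], axis=1)

# Readout from the final iteration's velocity field: flow speeds up through the narrowing
print("peak |ux| in the narrowing:", float(np.abs(ux[90:120, 1:ny // 2]).max()))
print("peak |ux| upstream:        ", float(np.abs(ux[0:60, 1:-1]).max()))
```

Even this toy version shows the core collide-and-stream loop and the kind of hemodynamic readout (velocity through a narrowing) that full 3D solvers compute at vastly higher fidelity and scale.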

Working closely with researchers at the Department of Energy, we fine-tuned HARVEY to run efficiently on the world’s biggest supercomputers. These efforts allowed us to create one of the largest training data sets for 3D flow models of coarctation of the aorta, using over 70 million compute hours. With conventional computational fluid dynamics models alone, millions of hours of simulation were needed to calculate how different stressors, such as high elevation or pregnancy, affected flow through the narrowed aorta. These models allow clinicians and researchers to accurately assess how different lifestyle choices would affect each individual patient.
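
To give a flavor of how such a training set is organized, the sketch below sweeps a few physiological states over a synthetic cohort, with a cheap lumped pressure-drop formula standing in for the full 3D simulation. The state definitions, formula, and parameter values are assumptions made for illustration only; in the actual work, each data point came from a full HARVEY run on a supercomputer.

```python
import numpy as np

# Toy stand-in for the expensive 3D simulation: a lumped pressure-drop estimate across a
# narrowed vessel. Purely illustrative; not a validated clinical model.
def toy_pressure_drop(cardiac_output_lpm, stenosis_fraction, viscosity_cp=4.0):
    """Crude viscous + inertial pressure drop (mmHg) across a narrowing."""
    q = cardiac_output_lpm / 60.0 * 1e-3                 # L/min -> m^3/s
    area = 2e-4 * (1.0 - stenosis_fraction)              # effective lumen area (m^2), arbitrary
    v = q / area                                         # mean velocity in the narrowing (m/s)
    mu = viscosity_cp * 1e-3                             # cP -> Pa*s
    dp_pa = 8 * mu * 0.02 * v / (area / np.pi) + 0.5 * 1060 * v ** 2
    return dp_pa / 133.322                               # Pa -> mmHg

# Physiological "stressor" states swept for every patient (values are assumptions)
states = {
    "rest":           {"cardiac_output_lpm": 5.0},
    "light_exercise": {"cardiac_output_lpm": 8.0},
    "heavy_exercise": {"cardiac_output_lpm": 12.0},
    "pregnancy":      {"cardiac_output_lpm": 7.0, "viscosity_cp": 3.4},
}

rng = np.random.default_rng(0)
rows = []
for patient in range(200):                               # synthetic "cohort"
    stenosis = rng.uniform(0.3, 0.8)                     # per-patient narrowing severity
    for name, params in states.items():
        rows.append({
            "patient": patient,
            "state": name,
            "stenosis": stenosis,
            "pressure_drop_mmHg": toy_pressure_drop(stenosis_fraction=stenosis, **params),
        })

print(f"{len(rows)} training rows; example: {rows[0]}")
```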

By combining machine learning with the physics-based modeling from HARVEY, our team, led by Dr. Bradley Feiger, was able to identify the minimal set of physiological states that must be simulated for a new patient in order to accurately predict hemodynamics across other states (e.g., resting or exercise). Because the tens of millions of compute hours required by conventional methods are not feasible for every new patient, this AI-driven approach makes personalized simulation and prediction far more practical for routine care. Moreover, we expect this framework to be extensible to other types of heart disease.
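
The sketch below illustrates the general flavor of that idea on synthetic data: given a table with one hemodynamic metric per patient per physiological state, it greedily selects “anchor” states and checks how well a simple linear surrogate trained on those anchors predicts the remaining states. The synthetic data, linear model, and greedy procedure are illustrative assumptions, not the method used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the simulation database: one hemodynamic metric (think
# pressure drop) per patient per physiological state. Two hidden patient factors
# drive the values, so a small number of "anchor" states should suffice.
n_patients, n_states = 300, 8
severity = rng.uniform(0.3, 0.8, size=(n_patients, 1))        # latent narrowing severity
baseline = rng.uniform(5.0, 15.0, size=(n_patients, 1))       # latent patient offset
state_load = np.linspace(1.0, 2.5, n_states)                  # "stress" level of each state
metrics = 20 * severity * state_load + baseline + rng.normal(0, 0.5, (n_patients, n_states))

train, test = metrics[:200], metrics[200:]

def heldout_error(anchor_idx):
    """Fit a linear map from the anchor-state metrics to all states; return test RMSE."""
    A = np.hstack([train[:, anchor_idx], np.ones((len(train), 1))])    # features + bias
    coef, *_ = np.linalg.lstsq(A, train, rcond=None)
    A_test = np.hstack([test[:, anchor_idx], np.ones((len(test), 1))])
    return float(np.sqrt(np.mean((A_test @ coef - test) ** 2)))

# Greedy forward selection: how few simulated states are enough to predict the rest?
anchors, candidates = [], list(range(n_states))
while candidates:
    best = min(candidates, key=lambda s: heldout_error(anchors + [s]))
    anchors.append(best)
    candidates.remove(best)
    print(f"{len(anchors)} anchor state(s) {anchors}: held-out RMSE = "
          f"{heldout_error(anchors):.2f}")

# In practice, the smallest anchor set whose error clears a clinical tolerance would be
# chosen, and only those states would need to be simulated for each new patient.
```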


Moving from single heartbeats to longitudinal blood flow maps

Presently, we stand at the edge of vast potential. Tools like HARVEY can create high-fidelity virtual models of a patient’s vascular system to help diagnose disease or simulate how a patient will respond to the changes introduced by a surgical intervention. That insight can be incredibly helpful to doctors as they plan interventions before they ever enter the operating room.

Unfortunately, the computation needed to examine 3D cardiovascular dynamics is immense. Currently, we can only examine metrics over a few heartbeats. This timeframe gives us valuable insight into a person’s risk for heart disease or for an adverse event after a procedure, but it has clear limitations when it comes to tracking disease progression and response to future activity.

Imagine the benefits of tracking blood flow and vascular dynamics over months — even years. The world doesn’t have enough supercomputers to give us that level of computational power, but machine learning puts this goal within reach. By combining machine learning and physics-based modeling, we are developing accurate techniques to create personalized, long-term maps of blood flow dynamics across a wide variety of activities. Moreover, we are working to take advantage of the unprecedented data made available through continuous monitoring with wearable devices to enable data-driven longitudinal mapping.

In my lab, we are developing techniques to drive blood flow simulations from data obtained with wearables. Our goal is to leverage machine learning to create new ways to identify and track disease localization and progression. These methods will allow us to monitor changing risks associated with heart disease. 
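
As a rough sketch of what driving simulations with wearable data could look like, the example below converts a synthetic heart-rate stream into toy pulsatile inflow waveforms that a flow solver could use as inlet boundary conditions. The stroke-volume relationship, waveform shape, and all values are assumptions for illustration, not a validated model or our lab’s actual pipeline.

```python
import numpy as np

# Sketch of turning a wearable heart-rate stream into inflow boundary conditions for a
# flow simulation. All relationships and numbers below are illustrative assumptions.

rng = np.random.default_rng(2)

# Synthetic day of per-minute heart-rate samples (bpm): rest, a workout, then recovery
hr = np.concatenate([
    rng.normal(62, 3, 480),       # overnight and morning rest
    rng.normal(135, 8, 45),       # exercise bout
    rng.normal(75, 4, 915),       # remainder of the day
])

def inflow_waveform(heart_rate_bpm, n_samples=200):
    """Toy pulsatile inlet flow (mL/s) over one cardiac cycle at a given heart rate."""
    period = 60.0 / heart_rate_bpm                        # seconds per beat
    stroke_volume = 70.0 - 0.15 * (heart_rate_bpm - 60)   # mL; crude linear assumption
    t = np.linspace(0.0, period, n_samples)
    systole = t < 0.35 * period                           # eject only during systole here
    q = np.where(systole, np.sin(np.pi * t / (0.35 * period)) ** 2, 0.0)
    q *= stroke_volume / (q.sum() * (t[1] - t[0]))        # scale so one beat ejects SV mL
    return t, q

# Per-minute inlet waveforms that a solver could be driven with over a full day
peak_inflow = np.array([inflow_waveform(bpm)[1].max() for bpm in hr])
print(f"peak inlet flow at rest ~{np.median(peak_inflow[:480]):.0f} mL/s, "
      f"during exercise ~{np.median(peak_inflow[480:525]):.0f} mL/s")
```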

In the not-so-distant future, we expect to see a transition from single-heartbeat metrics to methods that combine AI and physics-based models to accurately and efficiently predict how a patient’s blood flow will change over time. The benefits this shift brings to diagnosing and treating heart disease will demonstrate the true potential of personalized healthcare.

Dr. Amanda E. Randles

About Dr. Amanda E. Randles

Amanda E. Randles, Ph.D., is the Alfred Winborne Mordecai and Victoria Stover Mordecai Assistant Professor of Biomedical Sciences at Duke University. She is the recipient of the NSF CAREER Award, the ACM Grace Murray Hopper Award, the NIH Pioneer Award, the IEEE-CS Technical Consortium on High-Performance Computing (TCHPC) Award, the NIH Director's Early Independence Award, and the LLNL Lawrence Fellowship. Dr. Randles was also named a World Economic Forum Young Scientist and one of MIT Technology Review's Top 35 Innovators Under 35. She holds 120 US patents and has published over 80 peer-reviewed articles.
