Mathematical modeling of the heart, like many other models in the biomedical sciences, involves a large number of parameters and simplifying approximations. Uncertainties in cardiac models are ubiquitous, affecting the anatomy, the fiber direction, and the electrical and mechanical properties of the tissue. Hence, both uncertainty quantification (UQ) and parameter sensitivity analysis naturally arise during modeling, and they will become fundamental in view of clinical applications.

For high-dimensional input uncertainties, e.g., substrate heterogeneity or cardiac fiber orientation, and high-dimensional output quantities of interest, e.g., the activation map, the method of choice for UQ is the classic Monte Carlo (MC) method. The MC convergence rate does not suffer from the curse of dimensionality, but it is notoriously slow. While sampling a random field can be done very efficiently via the pivoted Cholesky decomposition, computing the cardiac activation from the bidomain equation is a computationally demanding task: a single patient-tailored simulation can take several CPU-hours even on a large cluster. This makes UQ infeasible unless model reduction strategies are employed.
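The sampling step mentioned above can be illustrated with a short sketch: a low-rank pivoted Cholesky factorization of a covariance matrix, followed by cheap Gaussian draws. The grid, the exponential kernel, and the correlation length below are illustrative assumptions, not values from the study.

```python
import numpy as np

def pivoted_cholesky(K, rank, tol=1e-10):
    """Low-rank pivoted Cholesky factorization K ~= L @ L.T.

    The greedy pivoting touches only `rank` columns of K beyond the
    diagonal, which is what makes sampling large random fields cheap.
    """
    n = K.shape[0]
    d = np.diag(K).astype(float).copy()  # residual (Schur-complement) diagonal
    L = np.zeros((n, rank))
    for k in range(rank):
        i = int(np.argmax(d))            # pivot: largest remaining variance
        if d[i] <= tol:
            return L[:, :k]              # trace error already below tol
        L[:, k] = (K[:, i] - L[:, :k] @ L[i, :k]) / np.sqrt(d[i])
        d -= L[:, k] ** 2
    return L

# Hypothetical 1D example: exponential covariance on a grid, a stand-in
# for a spatially correlated field such as tissue conductivity.
x = np.linspace(0.0, 1.0, 200)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)

L = pivoted_cholesky(K, rank=25)
rng = np.random.default_rng(0)
# Each column of `samples` is one approximate draw of the random field.
samples = L @ rng.standard_normal((L.shape[1], 1000))
```

Since the factorization is adaptive, the rank can be chosen by the tolerance rather than fixed in advance; 25 columns already capture the exponential kernel on this grid to a small relative error.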

One such strategy is represented by multifidelity methods [1]. A key ingredient of the multifidelity approach is the choice of low-fidelity models. Typical strategies rely on projection-based or data-fit surrogates, which, however, need to be trained anew for each patient and may become inefficient when the input dimensionality is large, as in the case under consideration. A more physics-based approach is instead to take advantage of the natural hierarchy of available models, which includes different cellular models for the monodomain equation, the time-independent eikonal equation, and the 1D geodesic point activation [2,3]. By exploiting statistical correlations within this hierarchy, we observed a reduction of the computational cost by at least two orders of magnitude, making it possible to perform a full analysis within a reasonable time frame. Moreover, we incorporate Bayesian techniques, which provide credible intervals and full probability distributions at selected points, thus augmenting the information provided by standard frequentist approaches.
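The core mechanism behind exploiting such a hierarchy can be sketched as a two-level control-variate Monte Carlo estimator: a few expensive high-fidelity evaluations correct the bias of many cheap low-fidelity ones. The two analytic functions below are hypothetical stand-ins for the actual hierarchy (e.g., a monodomain solve and an eikonal surrogate), chosen only so the sketch runs instantly.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-ins for the model hierarchy: `high_fidelity` plays
# the role of the expensive PDE solve, `low_fidelity` a correlated but
# biased cheap surrogate.
def high_fidelity(x):
    return np.sin(3 * x) + 0.1 * x**2

def low_fidelity(x):
    return np.sin(3 * x)

n_hf, n_lf = 50, 5000                 # few expensive, many cheap samples
x_hf = rng.normal(size=n_hf)
x_lf = rng.normal(size=n_lf)

y_hf = high_fidelity(x_hf)
y_lf_paired = low_fidelity(x_hf)      # low-fidelity at the SAME inputs
y_lf = low_fidelity(x_lf)

# Optimal control-variate weight, estimated from the paired sample:
# alpha = Cov(HF, LF) / Var(LF).
cov = np.cov(y_hf, y_lf_paired)
alpha = cov[0, 1] / cov[1, 1]

# Multifidelity estimate of E[high_fidelity(X)]: the plain HF mean plus
# a correction built from the cheap large low-fidelity sample.
mf_estimate = y_hf.mean() + alpha * (y_lf.mean() - y_lf_paired.mean())
```

The variance reduction is governed by the correlation between the two fidelities: the stronger the correlation, the fewer high-fidelity solves are needed for a given accuracy, which is the source of the cost savings reported above.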