A Bayesian network is a compact and intuitive graphical representation of a joint probability distribution. Belief update in a Bayesian network is the task of computing the posterior probability distributions of unobserved variables given a set of evidence.
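As a concrete illustration of belief update, the following sketch computes a posterior by brute-force enumeration of the joint distribution on a small, hypothetical three-node network (Rain → Sprinkler, with both feeding GrassWet); the structure and numbers are illustrative, not from the work described here, and real algorithms such as Lazy Propagation avoid this exponential enumeration:

```python
# Hypothetical network: Rain -> Sprinkler, {Rain, Sprinkler} -> GrassWet.
# All variables are binary; tables below are assumed for illustration only.
p_rain = {True: 0.2, False: 0.8}
# P(Sprinkler | Rain), keyed as p_sprinkler[rain][sprinkler]
p_sprinkler = {True: {True: 0.01, False: 0.99},
               False: {True: 0.4, False: 0.6}}
# P(GrassWet=True | Sprinkler, Rain), keyed as (sprinkler, rain)
p_wet = {(True, True): 0.99, (True, False): 0.9,
         (False, True): 0.8, (False, False): 0.0}

def joint(r, s, w):
    """Joint probability P(Rain=r, Sprinkler=s, GrassWet=w) via the chain rule."""
    pw = p_wet[(s, r)]
    return p_rain[r] * p_sprinkler[r][s] * (pw if w else 1.0 - pw)

# Belief update: P(Rain | GrassWet=True), summing the unobserved Sprinkler out
# of the joint and normalising over the evidence.
unnorm = {r: sum(joint(r, s, True) for s in (True, False)) for r in (True, False)}
z = sum(unnorm.values())
posterior = {r: p / z for r, p in unnorm.items()}
```

With these (assumed) tables the posterior probability of rain given wet grass works out to roughly 0.36, up from the prior of 0.2, which is the qualitative behaviour one expects from conditioning on evidence.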

Even though existing algorithms for belief update in Bayesian networks (BNs) have exponential time and space complexity, belief update in many real-world BNs is feasible. However, in some cases the efficiency of belief update may be insufficient, and minor improvements in efficiency may be important or even necessary to make a task tractable. We describe two improvements to the message computation in Lazy Propagation (LP): (1) we introduce myopic methods for sorting the operations involved in a variable elimination using arc reversal, and (2) we extend LP with the any-space property. The performance impacts of the methods are assessed empirically.
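The message computations in question build on variable elimination, which can be sketched in a few lines: a factor is a potential table over some variables, and a variable is eliminated by multiplying all factors that mention it and then summing it out. The following minimal sketch uses assumed binary variables and illustrative factor tables; it shows the sum-product operations only, not the arc-reversal ordering or any-space mechanisms of the work itself:

```python
from itertools import product

class Factor:
    """A potential over a tuple of binary variables, stored as a dense table."""
    def __init__(self, variables, table):
        self.variables = variables  # tuple of variable names
        self.table = table          # {assignment tuple: potential value}

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    vars_ = tuple(dict.fromkeys(f.variables + g.variables))
    table = {}
    for assign in product((True, False), repeat=len(vars_)):
        env = dict(zip(vars_, assign))
        fv = f.table[tuple(env[v] for v in f.variables)]
        gv = g.table[tuple(env[v] for v in g.variables)]
        table[assign] = fv * gv
    return Factor(vars_, table)

def sum_out(f, var):
    """Eliminate var from f by summing over its values (marginalisation)."""
    idx = f.variables.index(var)
    vars_ = f.variables[:idx] + f.variables[idx + 1:]
    table = {}
    for assign, val in f.table.items():
        key = assign[:idx] + assign[idx + 1:]
        table[key] = table.get(key, 0.0) + val
    return Factor(vars_, table)

# Illustrative use: eliminate A from P(A) * P(B | A) to obtain P(B).
f_a = Factor(('A',), {(True,): 0.6, (False,): 0.4})
f_b_given_a = Factor(('A', 'B'), {(True, True): 0.7, (True, False): 0.3,
                                  (False, True): 0.2, (False, False): 0.8})
p_b = sum_out(multiply(f_a, f_b_given_a), 'A')
```

The order in which such eliminations (and the multiplications within each one) are performed determines the size of the intermediate factors, which is precisely the kind of decision the myopic sorting methods mentioned above address.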

The presentation will include an introduction to Bayesian networks and belief update by message passing in a junction tree representation of the Bayesian network.