Garvesh Raskutti - University of Wisconsin

Main Content

In this talk, I consider the problem of learning a Bayesian network from observational data. A number of constraint-based, score-based and hybrid algorithms have been developed for inferring DAG models. For constraint-based methods, statistical consistency guarantees typically rely on the faithfulness assumption, which has been shown to be extremely restrictive. So far there is limited work on consistency guarantees for score-based and hybrid algorithms, and it is unclear whether consistency can be proven under weaker conditions than faithfulness. In this talk I present a score-based method that is consistent under strictly weaker conditions than the faithfulness assumption. The algorithm, called the sparsest permutation (SP) algorithm, is based on finding the causal ordering of the variables that yields the sparsest DAG. I demonstrate through theory and simulations on small random network models that the SP algorithm compares favorably to the constraint-based PC and SGS algorithms as well as the score-based greedy equivalence search and hybrid max-min hill-climbing methods. In the Gaussian setting, I use existing results to show that the SP algorithm boils down to finding the permutation of the variables that yields the sparsest Cholesky decomposition of the inverse covariance matrix.
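The Gaussian characterization above can be made concrete in a few lines. The following is a minimal sketch, not the authors' implementation: for each ordering it permutes the covariance matrix, inverts it, and counts the off-diagonal nonzeros of a Cholesky-type factor of the precision matrix (which correspond to edges of the DAG consistent with that ordering), then returns the ordering with the fewest edges. The function name `sp_gaussian` and the tolerance parameter are illustrative choices; the exhaustive search over permutations is only feasible for small numbers of variables.

```python
import itertools
import numpy as np

def sp_gaussian(Sigma, tol=1e-8):
    """Brute-force SP search in the Gaussian setting (illustrative sketch).

    For each ordering `perm`, factor the permuted precision matrix and
    count off-diagonal nonzeros of the factor; return the ordering that
    minimizes this count, together with the edge count.
    """
    p = Sigma.shape[0]
    best_perm, best_edges = None, None
    for perm in itertools.permutations(range(p)):
        idx = np.array(perm)
        # Precision matrix with variables reordered according to `perm`.
        Theta = np.linalg.inv(Sigma[np.ix_(idx, idx)])
        # Reverse rows/columns so the Cholesky factor corresponds to
        # regressing each variable on its predecessors in `perm`;
        # its strictly lower-triangular nonzeros are the DAG edges.
        L = np.linalg.cholesky(Theta[::-1, ::-1])
        edges = int(np.sum(np.abs(np.tril(L, -1)) > tol))
        if best_edges is None or edges < best_edges:
            best_perm, best_edges = perm, edges
    return best_perm, best_edges
```

For example, for the covariance of a linear Gaussian chain X1 -> X2 -> X3 (unit noise variances), the search recovers a two-edge DAG, whereas orderings that place X2 last force a three-edge graph.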