Stable Directed Belief Propagation in Gaussian DAGs using the auxiliary variable trick
Authors
Abstract
We consider approximate inference in a class of switching linear Gaussian state space models which includes the switching Kalman filter and the more general case of switch transitions dependent on the continuous hidden state. The method is a novel form of Gaussian sum smoother consisting of a single forward and backward pass, and compares favourably against a range of competing techniques, including sequential Monte Carlo and Expectation Propagation.
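The smoother described above builds on the standard linear-Gaussian forward (filtering) recursion. A minimal scalar sketch of that recursion, not the paper's switching method; the function name and all parameter values are illustrative assumptions:

```python
# Hypothetical scalar linear-Gaussian state space model:
#   x_t = a * x_{t-1} + process noise (variance q)
#   y_t = x_t + observation noise (variance r)
# Forward pass: propagate the filtered Gaussian p(x_t | y_1..t)
# by alternating a predict step and a Bayesian update step.

def kalman_filter(ys, a=0.9, q=0.1, r=0.5, m0=0.0, v0=1.0):
    m, v = m0, v0
    means = []
    for y in ys:
        # predict: p(x_t | y_1..t-1) = N(a*m, a^2*v + q)
        m_pred = a * m
        v_pred = a * a * v + q
        # update with observation y_t (observation matrix = 1)
        k = v_pred / (v_pred + r)   # Kalman gain
        m = m_pred + k * (y - m_pred)
        v = (1.0 - k) * v_pred
        means.append(m)
    return means
```

In the switching case, one such Gaussian component is propagated per switch hypothesis and the components are then collapsed, which is what makes the single forward-backward Gaussian sum smoother nontrivial.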
Similar resources
Construction and comparison of approximations for switching linear gaussian state space models
We introduce a new method for approximate inference in Hybrid Dynamical Graphical models, in particular, for switching dynamical networks. For the important special case of switching linear Gaussian state space models (switching Kalman Filters), our method is a novel form of Gaussian sum smoother, consisting of a single forward and backward pass. Our method is particularly well suited to switch...
Loop corrections for message passing algorithms in continuous variable models
In this paper we derive the equations for Loop Corrected Belief Propagation on a continuous variable Gaussian model. Using the exactness of the averages for belief propagation for Gaussian models, a different way of obtaining the covariances is found, based on Belief Propagation on cavity graphs. We discuss the relation of this loop correction algorithm to Expectation Propagation algorithms for...
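The exactness of belief-propagation means on Gaussian models, which the abstract above relies on, can be checked on a small chain. A sketch with made-up precisions and potentials; the function name and message conventions are ours, not the paper's:

```python
# Gaussian belief propagation on a chain MRF p(x) ∝ exp(-x'Jx/2 + h'x).
# On tree-structured Gaussian models (a chain here), BP means and
# variances are exact after one forward and one backward sweep.

def gabp_chain(J_diag, J_off, h):
    """J_diag[i]: node precision J[i,i]; J_off[i]: coupling J[i,i+1]."""
    n = len(J_diag)
    # forward messages into node i from node i-1: (precision, potential)
    Pf = [0.0] * n
    hf = [0.0] * n
    for i in range(1, n):
        denom = J_diag[i - 1] + Pf[i - 1]
        Pf[i] = -J_off[i - 1] ** 2 / denom
        hf[i] = -J_off[i - 1] * (h[i - 1] + hf[i - 1]) / denom
    # backward messages into node i from node i+1
    Pb = [0.0] * n
    hb = [0.0] * n
    for i in range(n - 2, -1, -1):
        denom = J_diag[i + 1] + Pb[i + 1]
        Pb[i] = -J_off[i] ** 2 / denom
        hb[i] = -J_off[i] * (h[i + 1] + hb[i + 1]) / denom
    # node beliefs: local potential combined with both incoming messages
    prec = [J_diag[i] + Pf[i] + Pb[i] for i in range(n)]
    mean = [(h[i] + hf[i] + hb[i]) / prec[i] for i in range(n)]
    return mean, [1.0 / p for p in prec]
```

For the tridiagonal J = [[2,-1,0],[-1,2,-1],[0,-1,2]] with h = [1,0,1], the BP means come out to [1, 1, 1], matching the exact solution of J x = h.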
An Efficient Algorithm for Computing Interventional Distributions in Latent Variable Causal Models
Probabilistic inference in graphical models is the task of computing marginal and conditional densities of interest from a factorized representation of a joint probability distribution. Inference algorithms such as variable elimination and belief propagation take advantage of constraints embedded in this factorization to compute such densities efficiently. In this paper, we propose an algorithm...
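To make concrete how variable elimination exploits the factorization mentioned above, here is a toy marginal computation on a binary chain A → B → C; all probability tables are invented for illustration:

```python
# Variable elimination on the chain A -> B -> C with binary variables.
# The joint factorizes as p(a) p(b|a) p(c|b), so p(C) is computed by
# summing out A and B one at a time instead of enumerating the joint.

p_a = {0: 0.6, 1: 0.4}
p_b_given_a = {(0, 0): 0.7, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.8}  # key: (b, a)
p_c_given_b = {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.4, (1, 1): 0.6}  # key: (c, b)

# eliminate A: tau_b(b) = sum_a p(b|a) p(a)
tau_b = {b: sum(p_b_given_a[(b, a)] * p_a[a] for a in (0, 1)) for b in (0, 1)}
# eliminate B: p(c) = sum_b p(c|b) tau_b(b)
p_c = {c: sum(p_c_given_b[(c, b)] * tau_b[b] for b in (0, 1)) for c in (0, 1)}
```

Each elimination produces an intermediate factor over only the remaining neighbours, which is the constraint structure that both variable elimination and belief propagation exploit.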
The Hidden Life of Latent Variables: Bayesian Learning with Mixed Graph Models
Directed acyclic graphs (DAGs) have been widely used as a representation of conditional independence in machine learning and statistics. Moreover, hidden or latent variables are often an important component of graphical models. However, DAG models suffer from an important limitation: the family of DAGs is not closed under marginalization of hidden variables. This means that in general we cannot...
Interval Propagation and Search on Directed Acyclic Graphs for Numerical Constraint Solving
The fundamentals of interval analysis on directed acyclic graphs (DAGs) for global optimization and constraint propagation have recently been proposed by Schichl and Neumaier [2005]. For representing numerical problems, the authors use DAGs whose nodes are subexpressions and whose directed edges are computational flows. Compared to tree-based representations [Benhamou et al. 1999], DAGs offer t...
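A sketch of forward interval evaluation over an expression DAG in the spirit of the representation described above: each node is a subexpression, and a shared node is evaluated once even though it feeds several downstream nodes. Intervals, helper names, and values are all illustrative assumptions:

```python
# Intervals are (lo, hi) tuples; these two operations suffice for the demo.

def iadd(a, b):
    # [a0, a1] + [b0, b1] = [a0 + b0, a1 + b1]
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    # product interval is the min/max over the four endpoint products
    ps = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(ps), max(ps))

x = (1.0, 2.0)
y = (-1.0, 1.0)
s = iadd(x, y)   # shared subexpression x + y, evaluated once: [0, 3]
f = imul(s, s)   # node s feeds both f = s*s ...
g = iadd(s, x)   # ... and g = s + x, without recomputing s
```

Sharing subexpression nodes is the advantage the abstract attributes to DAGs over tree representations: a repeated subterm is propagated through once rather than once per occurrence.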