Sequential Monte Carlo Samplers
Authors
Pierre Del Moral, Arnaud Doucet, Ajay Jasra
Abstract
In this paper, we propose a methodology to sample sequentially from a sequence of probability distributions known up to a normalizing constant and defined on a common space. These probability distributions are approximated by a cloud of weighted random samples which are propagated over time using Sequential Monte Carlo methods. This methodology allows us to derive simple algorithms to make parallel Markov chain Monte Carlo algorithms interact in a principled way, to perform global optimization and sequential Bayesian estimation and to compute ratios of normalizing constants. We illustrate these algorithms for various integration tasks arising in the context of Bayesian inference.
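The core recursion behind this methodology can be illustrated with a short sketch: particles are reweighted between successive distributions, resampled when the weights degenerate, and rejuvenated with an MCMC move. The Python code below is a minimal, illustrative sketch rather than the paper's exact algorithm; the one-dimensional bimodal target, the linear tempering schedule, the random-walk Metropolis move, and all tuning constants are assumptions chosen for demonstration. The running quantity log_Z is a by-product estimate of the log ratio of normalizing constants mentioned in the abstract.

```python
# Minimal sketch (not the authors' exact algorithm) of an SMC sampler that moves
# a particle cloud from an easy distribution pi_0 to a target pi known only up
# to a normalizing constant, via tempered intermediate distributions
#   pi_t(x) proportional to pi_0(x)^(1 - beta_t) * pi(x)^(beta_t),
# with 0 = beta_0 < ... < beta_T = 1.
# The toy 1-D bimodal target and all tuning constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def log_pi0(x):            # easy initial density: standard normal
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_target(x):         # unnormalized bimodal target (mixture of two Gaussians)
    return np.logaddexp(-0.5 * (x - 3.0)**2, -0.5 * (x + 3.0)**2)

def log_gamma(x, beta):    # tempered, unnormalized intermediate density
    return (1.0 - beta) * log_pi0(x) + beta * log_target(x)

N, T = 2000, 50
betas = np.linspace(0.0, 1.0, T + 1)
x = rng.standard_normal(N)            # particles drawn from pi_0
logw = np.zeros(N)                    # log importance weights
log_Z = 0.0                           # running estimate of log(Z_target / Z_0)

for t in range(1, T + 1):
    # 1. Reweight: incremental weight is the ratio of successive tempered densities.
    logw += log_gamma(x, betas[t]) - log_gamma(x, betas[t - 1])

    # 2. Resample when the effective sample size drops below N/2.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)
    if ess < N / 2:
        log_Z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
        x = x[rng.choice(N, size=N, p=w)]
        logw = np.zeros(N)

    # 3. Move: one random-walk Metropolis step targeting the current pi_t.
    prop = x + 0.5 * rng.standard_normal(N)
    accept = np.log(rng.random(N)) < log_gamma(prop, betas[t]) - log_gamma(x, betas[t])
    x = np.where(accept, prop, x)

# Contribution of the remaining weights to the normalizing-constant estimate.
log_Z += np.log(np.mean(np.exp(logw - logw.max()))) + logw.max()
print("estimated log(Z_target / Z_0):", log_Z)
```

In this sketch the incremental weight is computed before the MCMC move, which corresponds to one standard choice of backward kernel; adaptive resampling and the final particle cloud then provide weighted samples approximating the target.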
Similar Papers
Sequential Monte Carlo samplers for Bayesian DSGE models
Bayesian estimation of DSGE models typically uses Markov chain Monte Carlo, as importance sampling (IS) algorithms have a difficult time in high-dimensional spaces. I develop improved IS algorithms for DSGE models using recent advances in Monte Carlo methods known as sequential Monte Carlo samplers. Sequential Monte Carlo samplers are a generalization of particle filtering designed for full simu...
Error Bounds and Normalizing Constants for Sequential Monte Carlo Samplers in High Dimensions
In this article we develop a collection of results associated with the analysis of the Sequential Monte Carlo (SMC) samplers algorithm, in the context of high-dimensional i.i.d. target probabilities. The SMC samplers algorithm can be designed to sample from a single probability distribution, using Monte Carlo to approximate expectations w.r.t. this law. Given a target density in d dimensions our ...
Interacting Particle Markov Chain Monte Carlo
We introduce interacting particle Markov chain Monte Carlo (iPMCMC), a PMCMC method based on an interacting pool of standard and conditional sequential Monte Carlo samplers. Like related methods, iPMCMC is a Markov chain Monte Carlo sampler on an extended space. We present empirical results that show significant improvements in mixing rates relative to both noninteracting PMCMC samplers and a s...
Measuring the non-asymptotic convergence of sequential Monte Carlo samplers using probabilistic programming
A key limitation of sampling algorithms for approximate inference is that it is difficult to quantify their approximation error. Widely used sampling schemes, such as sequential importance sampling with resampling and Metropolis-Hastings, produce output samples drawn from a distribution that may be far from the target posterior distribution. This paper shows how to upper-bound the symmetric KL d...
Static-parameter estimation in piecewise deterministic processes using particle Gibbs samplers
We develop particle Gibbs samplers for static-parameter estimation in discretely observed piecewise deterministic processes (PDPs). PDPs are stochastic processes that jump randomly at a countable number of stopping times but otherwise evolve deterministically in continuous time. A sequential Monte Carlo (SMC) sampler for filtering in PDPs has recently been proposed. We first provide new insight into...