Information bounds for Gibbs samplers
Authors
Abstract
If we wish to efficiently estimate the expectation of an arbitrary function on the basis of the output of a Gibbs sampler, which is better: deterministic or random sweep? In each case we calculate the asymptotic variance of the empirical estimator, the average of the function over the output, and determine the minimal asymptotic variance for estimators that use no information about the underlying distribution. The empirical estimator has noticeably smaller variance for deterministic sweep. The variance bound for random sweep is in general smaller than for deterministic sweep, but the two are equal if the target distribution is continuous. If the components of the target distribution are not strongly dependent, the empirical estimator is close to efficient under deterministic sweep, and its asymptotic variance approximately doubles under random sweep.
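A small simulation makes the comparison concrete. The sketch below (our illustration, not the paper's construction) runs both sweeps on a bivariate normal target with correlation rho and compares the variability of the empirical estimator across independent replications; the target, the test function f, and the convention that one random-sweep step updates a single uniformly chosen coordinate are all assumptions.

# Minimal sketch: deterministic vs. random sweep on a bivariate normal.
import numpy as np

rng = np.random.default_rng(0)
rho = 0.5  # moderate dependence between the two components (assumed)

def gibbs(n, sweep="deterministic"):
    """Gibbs-sample a bivariate N(0, [[1, rho], [rho, 1]]) target."""
    x = np.zeros(2)
    out = np.empty((n, 2))
    for t in range(n):
        if sweep == "deterministic":
            coords = (0, 1)               # fixed scan order
        else:
            coords = (rng.integers(2),)   # one uniformly chosen coordinate
        for i in coords:
            j = 1 - i
            # Conditional of x_i given x_j is N(rho * x_j, 1 - rho^2).
            x[i] = rho * x[j] + np.sqrt(1 - rho**2) * rng.standard_normal()
        out[t] = x
    return out

f = lambda xy: xy[:, 0] + xy[:, 1]        # function whose mean we estimate

reps, n = 200, 5000
for sweep in ("deterministic", "random"):
    # Empirical estimator per replication; no burn-in, for brevity.
    est = [f(gibbs(n, sweep)).mean() for _ in range(reps)]
    # n * var(estimator) approximates the asymptotic variance.
    print(sweep, "sweep: asymptotic variance ~", n * np.var(est))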
Similar papers
Sufficient Burn-in for Gibbs Samplers for a Hierarchical Random Effects Model
We consider Gibbs and block Gibbs samplers for a Bayesian hierarchical version of the one-way random effects model. Drift and minorization conditions are established for the underlying Markov chains. The drift and minorization are used in conjunction with results from J. S. Rosenthal [J. Amer. Statist. Assoc. 90 (1995) 558–566] and G. O. Roberts and R. L. Tweedie [Stochastic Process. Appl. 80 ...
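For reference, a minimal sketch of the kind of sampler analyzed there: a Gibbs sampler for a one-way random effects model y_ij ~ N(theta_i, s2e), theta_i ~ N(mu, s2t), with a flat prior on mu. The variances are held fixed for brevity (an assumption; the paper's hierarchical model also updates them), and the drift/minorization computations themselves are not reproduced.

# Minimal sketch: Gibbs for a one-way random effects model, fixed variances.
import numpy as np

rng = np.random.default_rng(1)
K, n = 10, 5                       # groups, observations per group (assumed)
s2e, s2t = 1.0, 2.0                # fixed error and group-level variances
y = rng.normal(rng.normal(0, np.sqrt(s2t), K)[:, None], np.sqrt(s2e), (K, n))

def gibbs(iters, burn_in):
    mu = 0.0
    draws = []
    for t in range(iters):
        # theta_i | mu, y : conjugate normal update, one per group
        prec = n / s2e + 1 / s2t
        mean = (n * y.mean(axis=1) / s2e + mu / s2t) / prec
        theta = rng.normal(mean, np.sqrt(1 / prec))
        # mu | theta : normal, under the flat prior on mu
        mu = rng.normal(theta.mean(), np.sqrt(s2t / K))
        if t >= burn_in:           # discard the burn-in prefix
            draws.append(mu)
    return np.array(draws)

print("posterior mean of mu ~", gibbs(5000, burn_in=500).mean())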
Improving Gibbs Sampler Scan Quality with DoGS
The pairwise influence matrix of Dobrushin has long been used as an analytical tool to bound the rate of convergence of Gibbs sampling. In this work, we use Dobrushin influence as the basis of a practical tool to certify and efficiently improve the quality of a Gibbs sampler. Our Dobrushin-optimized Gibbs samplers (DoGS) offer customized variable selection orders for a given sampling budget and...
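The flavor of such certificates can be seen in a small sketch (ours, not the paper's tool): for a pairwise Ising model with coupling matrix W, a standard bound on the Dobrushin influence of site j on site i is tanh(|W_ij|), and if the largest row sum alpha of this bound matrix is below 1, Gibbs sampling contracts geometrically. The coupling matrix, target accuracy, and sweep bound below are illustrative assumptions.

# Sketch: a Dobrushin-style contraction certificate for an Ising model.
import numpy as np

rng = np.random.default_rng(2)
d = 8
W = 0.1 * rng.standard_normal((d, d))
W = np.triu(W, 1) + np.triu(W, 1).T        # symmetric couplings, zero diagonal

C = np.tanh(np.abs(W))                     # entrywise influence bound (Ising)
alpha = C.sum(axis=1).max()                # Dobrushin condition: alpha < 1
print(f"max total influence alpha = {alpha:.3f}",
      "(contraction certified)" if alpha < 1 else "(no certificate)")

if alpha < 1:
    eps = 1e-3                             # target accuracy (assumed)
    # Rough sufficient sweep count from d * alpha**s <= eps.
    sweeps = int(np.ceil(np.log(d / eps) / np.log(1 / alpha)))
    print("sweeps sufficient for eps accuracy ~", sweeps)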
Uniform Ergodicity of the Iterated Conditional SMC and Geometric Ergodicity of Particle Gibbs samplers
We establish quantitative bounds for rates of convergence and asymptotic variances for iterated conditional sequential Monte Carlo (i-cSMC) Markov chains and associated particle Gibbs samplers [1]. Our main findings are that the essential boundedness of potential functions associated with the i-cSMC algorithm provide necessary and sufficient conditions for the uniform ergodicity of the i-cSMC M...
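A toy rendering of one i-cSMC step, under simplifying assumptions: a bootstrap particle filter on a linear Gaussian state-space model in which one particle is pinned to the reference trajectory and its lineage is preserved through resampling. The model, parameters, and particle number are illustrative; ancestor sampling and the paper's ergodicity analysis are omitted.

# Sketch: iterated conditional SMC on x_t = a*x_{t-1} + N(0,1), y_t = x_t + N(0,1).
import numpy as np

rng = np.random.default_rng(3)
a, T, N = 0.9, 25, 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.standard_normal()
y = x_true + rng.standard_normal(T)

def csmc_step(ref):
    """One cSMC sweep: returns a new trajectory given the reference one."""
    X = np.zeros((N, T))
    anc = np.zeros((N, T), dtype=int)
    X[:, 0] = rng.standard_normal(N)
    X[0, 0] = ref[0]                               # pin particle 0 to reference
    w = np.exp(-0.5 * (y[0] - X[:, 0]) ** 2)
    for t in range(1, T):
        anc[:, t] = rng.choice(N, size=N, p=w / w.sum())  # multinomial resampling
        X[:, t] = a * X[anc[:, t], t - 1] + rng.standard_normal(N)
        anc[0, t] = 0                              # keep the reference lineage
        X[0, t] = ref[t]
        w = np.exp(-0.5 * (y[t] - X[:, t]) ** 2)
    # Trace back the ancestry of one particle drawn by the final weights.
    k = rng.choice(N, p=w / w.sum())
    path = np.zeros(T)
    for t in reversed(range(T)):
        path[t] = X[k, t]
        k = anc[k, t]
    return path

ref = np.zeros(T)
for _ in range(50):                                # iterate the cSMC kernel
    ref = csmc_step(ref)
print("smoothed state estimate at T-1 ~", ref[-1], " obs:", y[-1])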
Gibbs Sampling for (Coupled) Infinite Mixture Models in the Stick Breaking Representation
Nonparametric Bayesian approaches to clustering, information retrieval, language modeling and object recognition have recently shown great promise as a new paradigm for unsupervised data analysis. Most contributions have focused on the Dirichlet process mixture models or extensions thereof for which efficient Gibbs samplers exist. In this paper we explore Gibbs samplers for infinite complexity ...
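A hedged sketch of what blocked Gibbs in a truncated stick-breaking representation looks like for a Dirichlet process mixture of unit-variance Gaussians (Ishwaran-James-style truncation). The truncation level K, concentration alpha, prior on the component means, and the synthetic data are all assumptions.

# Sketch: blocked Gibbs for a truncated stick-breaking DP mixture of Gaussians.
import numpy as np

rng = np.random.default_rng(4)
x = np.concatenate([rng.normal(-3, 1, 60), rng.normal(3, 1, 60)])
n, K, alpha, tau = len(x), 10, 1.0, 10.0

mu = rng.normal(0, np.sqrt(tau), K)
z = rng.integers(K, size=n)
for sweep in range(200):
    # v_k | z ~ Beta(1 + n_k, alpha + n_{k+1} + ... + n_K), then stick-break.
    counts = np.bincount(z, minlength=K)
    rest = counts[::-1].cumsum()[::-1] - counts     # points in later sticks
    v = rng.beta(1 + counts, alpha + rest)
    v[-1] = 1.0                                     # last stick takes remaining mass
    w = v * np.concatenate([[1.0], np.cumprod(1 - v[:-1])])
    # mu_k | z, x : conjugate normal update per component, prior N(0, tau).
    prec = counts + 1 / tau
    means = np.array([x[z == k].sum() for k in range(K)]) / prec
    mu = rng.normal(means, np.sqrt(1 / prec))
    # z_i | w, mu : categorical over the K truncated components.
    logp = np.log(w + 1e-300) - 0.5 * (x[:, None] - mu[None, :]) ** 2
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=pi) for pi in p])

print("occupied components:", np.unique(z).size,
      " means:", np.round(np.sort(mu[np.unique(z)]), 2))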
Multicore Gibbs Sampling in Dense, Unstructured Graphs
Multicore computing is on the rise, but algorithms such as Gibbs sampling are fundamentally sequential and may require close consideration to be made parallel. Existing techniques either exploit sparse problem structure or make approximations to the algorithm; in this work, we explore an alternative to these ideas. We develop a parallel Gibbs sampling algorithm for shared-memory systems that do...
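For contrast with the dense, unstructured setting above, the simplest structured baseline is chromatic (graph-coloring) parallel Gibbs, sketched below on a 2-colorable grid Ising model: all sites of one color are conditionally independent given the other color, so each half-sweep can be executed in parallel (vectorized here with NumPy). This only illustrates the baseline idea, not the paper's algorithm, which targets graphs where such colorings are unavailable; the grid size and inverse temperature beta are assumptions.

# Sketch: chromatic parallel Gibbs on a checkerboard-colored grid Ising model.
import numpy as np

rng = np.random.default_rng(5)
L, beta = 32, 0.3
s = rng.choice([-1, 1], size=(L, L))
color = np.add.outer(np.arange(L), np.arange(L)) % 2   # checkerboard coloring

for sweep in range(500):
    for c in (0, 1):
        # Sum of the four neighbours with periodic boundaries.
        nb = (np.roll(s, 1, 0) + np.roll(s, -1, 0) +
              np.roll(s, 1, 1) + np.roll(s, -1, 1))
        p_up = 1 / (1 + np.exp(-2 * beta * nb))        # P(s_ij = +1 | rest)
        upd = np.where(rng.random((L, L)) < p_up, 1, -1)
        s = np.where(color == c, upd, s)               # update one color class

print("mean magnetization ~", s.mean())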