Search results for: gibbs sampling

Number of results: 219418

Journal: Communications for Statistical Applications and Methods, 2015

2009
Philip Resnik Eric Hardisty

This document is intended for computer scientists who would like to try out a Markov Chain Monte Carlo (MCMC) technique, particularly in order to do inference with Bayesian models on problems related to text processing. We try to keep theory to the absolute minimum needed, and we work through the details much more explicitly than you usually see even in “introductory” explanations. That means w...
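To illustrate the kind of inference this tutorial introduces, here is a minimal Gibbs sampler for a bivariate standard normal with correlation rho (a standard textbook target chosen for illustration, not an example taken from the paper). Each variable is drawn in turn from its full conditional, which is exactly the mechanic the tutorial explains:

```python
import math
import random

random.seed(0)  # fixed seed so this sketch is reproducible

def gibbs_bivariate_normal(rho, n_samples, burn_in=500):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    The full conditionals are univariate normals:
      x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    """
    x, y = 0.0, 0.0
    sd = math.sqrt(1.0 - rho * rho)
    samples = []
    for i in range(burn_in + n_samples):
        x = random.gauss(rho * y, sd)  # draw x from p(x | y)
        y = random.gauss(rho * x, sd)  # draw y from p(y | x)
        if i >= burn_in:               # discard burn-in draws
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=5000)
mean_x = sum(s[0] for s in samples) / len(samples)
```

After burn-in, the empirical mean of either coordinate should be close to zero, since both marginals of the target are standard normal.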

2012
Fredrik Lindsten Michael I. Jordan Thomas B. Schön

We present a novel method in the family of particle MCMC methods that we refer to as particle Gibbs with ancestor sampling (PG-AS). Similarly to the existing PG with backward simulation (PG-BS) procedure, we use backward sampling to (considerably) improve the mixing of the PG kernel. Instead of using separate forward and backward sweeps as in PG-BS, however, we achieve the same effect in a sing...

2007
Robert E. McCulloch


2016

Scaling probabilistic inference algorithms to large datasets and parallel computing architectures is a challenge of great importance and considerable current research interest, and great strides have been made in designing parallelizable algorithms. Alongside the powerful and sometimes complex new algorithms, a very simple strategy has proven to be surprisingly useful in some situations: runn...
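The "very simple strategy" alluded to is running several independent chains, each with its own seed, and pooling their draws. A hedged sketch (using a toy random-walk Metropolis chain targeting a standard normal, chosen only for illustration) makes the point that independent chains are embarrassingly parallel:

```python
import math
import random

def run_chain(seed, n_steps):
    """One independent MCMC chain: random-walk Metropolis targeting N(0, 1),
    started from its own seed so chains can run on separate workers."""
    rng = random.Random(seed)
    x = 0.0
    draws = []
    for _ in range(n_steps):
        prop = x + rng.uniform(-1.0, 1.0)
        # accept with probability min(1, pi(prop)/pi(x)) for pi = N(0, 1)
        if rng.random() < min(1.0, math.exp(0.5 * (x * x - prop * prop))):
            x = prop
        draws.append(x)
    return draws

# Each chain is fully independent, so the loop below could be replaced by
# multiprocessing or a cluster scheduler with no change to the chains.
all_draws = [d for seed in range(4) for d in run_chain(seed, 2000)]
pooled_mean = sum(all_draws) / len(all_draws)
```

Pooling draws across chains also gives a basic convergence diagnostic: if the per-chain means disagree markedly, the chains have not yet mixed.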

2016

Proof. Let x, y ∈ Ω be two configurations. We will prove the claim for the visible conditional distributions; the proof for the hidden conditional distributions follows symmetrically. For each visible node v_i, let (X(v_i), Y(v_i)) be the maximal coupling of P^(v)(X(v_i) | x(h)) and P^(v)(Y(v_i) | y(h)) guaranteed in Lemma 1. By doing this independently for all visible nodes, we have ...

1998
Steven N. MacEachern Merlise Clyde Jun S. Liu Steve MacEachern

There are two generations of Gibbs sampling methods for semi-parametric models involving the Dirichlet process. The first generation suffered from a severe drawback; namely that the locations of the clusters, or groups of parameters, could essentially become fixed, moving only rarely. Two strategies that have been proposed to create the second generation of Gibbs samplers are integration and appendi...

2010
Daan Fierens

There is currently a large interest in probabilistic logical models. A popular algorithm for approximate probabilistic inference with such models is Gibbs sampling. From a computational perspective, Gibbs sampling boils down to repeatedly executing certain queries on a knowledge base composed of a static part and a dynamic part. The larger the static part, the more redundancy there is in these ...
