Search results for: gibbs sampling

Number of results: 219418

2008
David Mimno, Hanna M. Wallach, Andrew McCallum

Previous work on probabilistic topic models has either focused on models with relatively simple conjugate priors that support Gibbs sampling or models with non-conjugate priors that typically require variational inference. Gibbs sampling is more accurate than variational inference and better supports the construction of composite models. We present a method for Gibbs sampling in non-conjugate l...
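The conjugate setting this abstract contrasts with is one where every full conditional has a known closed form, so each Gibbs update is an exact draw. Not taken from the paper itself, the following is a minimal illustrative sketch of that situation: a Gibbs sampler for a standard bivariate normal with correlation `rho`, where each coordinate's conditional is again Gaussian.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional x | y and y | x is itself Gaussian, so every
    update is an exact draw -- the conjugate case where Gibbs sampling
    applies directly, in contrast to the non-conjugate models the
    abstract addresses.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)    # draw x | y ~ N(rho*y, 1 - rho^2)
        y = rng.gauss(rho * x, sd)    # draw y | x ~ N(rho*x, 1 - rho^2)
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8)
```

The chain's stationary distribution is the target bivariate normal, so after a burn-in the empirical correlation of the draws approaches `rho`.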

2012
Tong Zhao, Chunping Li, Mengya Li

In this paper, we propose a novel approach called Wikipedia-based Collapsed Gibbs sampling (Wikipedia-based CGS) to improve the efficiency of collapsed Gibbs sampling (CGS), which has been widely used in the latent Dirichlet allocation (LDA) model. The conventional CGS method treats every word in the documents as having equal status for topic modeling. Moreover, sampling all the words in the documents...
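For context on the baseline this work speeds up, here is a minimal illustrative sketch (not the paper's Wikipedia-based variant) of conventional collapsed Gibbs sampling for LDA: multinomial parameters are integrated out, and each token's topic is resampled from the collapsed conditional using only count statistics.

```python
import random

def lda_cgs(docs, K, V, alpha=0.1, beta=0.01, n_iter=200, seed=0):
    """Minimal collapsed Gibbs sampler for LDA (illustrative sketch).

    docs: list of documents, each a list of word ids in [0, V).
    Each topic assignment z is resampled from the collapsed conditional
    p(z = k | rest) proportional to
        (n_dk + alpha) * (n_kw + beta) / (n_k + V * beta).
    """
    rng = random.Random(seed)
    n_dk = [[0] * K for _ in docs]       # document-topic counts
    n_kw = [[0] * V for _ in range(K)]   # topic-word counts
    n_k = [0] * K                        # total tokens per topic
    z = []
    for d, doc in enumerate(docs):       # random initialization
        zd = []
        for w in doc:
            k = rng.randrange(K)
            n_dk[d][k] += 1; n_kw[k][w] += 1; n_k[k] += 1
            zd.append(k)
        z.append(zd)
    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]              # remove current assignment
                n_dk[d][k] -= 1; n_kw[k][w] -= 1; n_k[k] -= 1
                weights = [(n_dk[d][t] + alpha) * (n_kw[t][w] + beta)
                           / (n_k[t] + V * beta) for t in range(K)]
                k = rng.choices(range(K), weights=weights)[0]
                z[d][i] = k              # record the new assignment
                n_dk[d][k] += 1; n_kw[k][w] += 1; n_k[k] += 1
    return z, n_kw
```

Note that every token is visited on every sweep regardless of how informative it is, which is the per-word uniformity the abstract's proposed method revisits.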

2001
John Geweke

This article provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly portfolio returns grouped by industry ...

2007
Yasuhiro Omori

We consider Bayesian estimation of a sample selection model and propose a highly efficient Gibbs sampler using the additional scale transformation step to speed up the convergence to the posterior distribution. Numerical examples are given to show the efficiency of our proposed sampler.

2007
Yee Whye Teh, Dilan Görür, Zoubin Ghahramani

The Indian buffet process (IBP) is a Bayesian nonparametric distribution whereby objects are modelled using an unbounded number of latent features. In this paper we derive a stick-breaking representation for the IBP. Based on this new representation, we develop slice samplers for the IBP that are efficient, easy to implement and are more generally applicable than the currently available Gibbs s...
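The stick-breaking representation the abstract refers to orders the latent feature probabilities as a decreasing sequence, with each weight obtained by multiplying the previous one by a Beta(alpha, 1) variate. The sketch below is an illustrative truncated version of that generative construction, not the paper's slice sampler itself; the truncation level `K_trunc` is an assumption for the example.

```python
import random

def ibp_stick_breaking(n_objects, alpha, K_trunc=20, seed=0):
    """Truncated stick-breaking construction for the IBP (sketch).

    Feature weights satisfy mu_1 >= mu_2 >= ..., generated as
    mu_k = nu_k * mu_{k-1} with nu_k ~ Beta(alpha, 1); each object
    then has feature k independently with probability mu_k.
    """
    rng = random.Random(seed)
    mus = []
    mu = 1.0
    for _ in range(K_trunc):
        mu *= rng.betavariate(alpha, 1.0)   # nu_k ~ Beta(alpha, 1)
        mus.append(mu)                       # weights are decreasing
    # Binary feature matrix: entry (n, k) is 1 with probability mu_k.
    Z = [[1 if rng.random() < m else 0 for m in mus]
         for _ in range(n_objects)]
    return Z, mus
```

Because the weights decay geometrically in expectation, the truncation discards only features that almost no object would possess, which is what makes finite approximations (and the paper's slice-sampling alternative) workable.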

2015
Jean-Baptiste Tristan, Joseph Tassarotti, Guy L. Steele

We introduce Mean-for-Mode estimation, a variant of an uncollapsed Gibbs sampler that we use to train LDA on a GPU. The algorithm combines benefits of both uncollapsed and collapsed Gibbs samplers. Like a collapsed Gibbs sampler — and unlike an uncollapsed Gibbs sampler — it has good statistical performance, and can use sampling complexity reduction techniques such as sparsity. Meanwhile, like ...

Journal: Pakistan Journal of Statistics and Operation Research, 2019

2010
Bob Carpenter

This note shows how to integrate out the multinomial parameters for latent Dirichlet allocation (LDA) and naive Bayes (NB) models. This allows us to perform Gibbs sampling without taking multinomial parameter samples. Although the conjugacy of the Dirichlet priors makes sampling the multinomial parameters relatively straightforward, sampling on a topic-by-topic basis provides two advantages. Fi...

[Chart: number of search results per year]