Divergence measures and a general framework for local variational approximation

Authors

  • Kazuho Watanabe
  • Masato Okada
  • Kazushi Ikeda
Abstract

The local variational method is a technique for approximating an intractable posterior distribution in Bayesian learning. This article formulates a general framework for local variational approximation and shows that its objective function decomposes into the sum of the Kullback information and the expected Bregman divergence from the approximating posterior distribution to the Bayesian posterior distribution. Based on a geometrical argument in the space of approximating posteriors, we propose an efficient method to evaluate an upper bound of the marginal likelihood. Moreover, we demonstrate that the variational Bayesian approach for latent variable models can be viewed as a special case of this general framework.
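The Bregman divergence referred to in the abstract is defined, for a strictly convex function F, as D_F(p, q) = F(p) − F(q) − ⟨∇F(q), p − q⟩. As a minimal numerical illustration (not code from the paper), the sketch below computes this quantity and checks the standard fact that the negative-entropy generator recovers the Kullback-Leibler divergence between normalized distributions:

```python
import numpy as np

def bregman_divergence(F, grad_F, p, q):
    """D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(grad_F(q), p - q)

# Negative entropy as the generating convex function.
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.25, 0.25, 0.5])

d = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
kl = np.sum(p * np.log(p / q))  # KL(p || q) for normalized p, q
```

Here `d` and `kl` agree, since for normalized distributions the linear correction terms cancel and only the sum of p log(p/q) remains.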

Related articles

Variational Inference for Bayesian Mixtures of Factor Analysers

We present an algorithm that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. This procedure can automatically determine the optimal number of components and the local dimensionality of each component (i.e. the number of factors in each factor analyser). Alternatively, it...


Alpha-Divergences in Variational Dropout

We investigate the use of alternative divergences to Kullback-Leibler (KL) in variational inference (VI), based on Variational Dropout [10]. Stochastic gradient variational Bayes (SGVB) [9] is a general framework for estimating the evidence lower bound (ELBO) in variational Bayes. In this work, we extend the SGVB estimator using alpha-divergences, which are alternatives to...


Variational Particle Approximations

Monte Carlo methods provide a powerful framework for approximating probability distributions with a set of stochastically sampled particles. In this paper, we rethink particle approximations from the perspective of variational inference, where the particles play the role of variational parameters. This leads to a deterministic version of Monte Carlo in which the particles are selected to optimi...


A robust variational approach for simultaneous smoothing and estimation of DTI

Estimating diffusion tensors is an essential step in many applications, such as diffusion tensor image (DTI) registration, segmentation, and fiber tractography. Most of the methods proposed in the literature for this task are not simultaneously statistically robust and feature-preserving. In this paper, we propose a novel and robust variational framework for simultaneous smoothing an...




Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume 24, Issue 10

Pages: -

Publication date: 2011