Variational Cumulant Expansions for Intractable Distributions
Author
Pierre van de Laar, RWCP (Real World Computing Partnership) Theoretical Foundation SNN (Foundation for Neural Networks)
Abstract
Intractable distributions present a common difficulty in inference within the probabilistic knowledge representation framework, and variational methods have recently been popular in providing an approximate solution. In this article, we describe a perturbational approach in the form of a cumulant expansion which, to lowest order, recovers the standard Kullback-Leibler variational bound. Higher-order terms describe corrections to the variational approach without incurring much further computational cost. The relationship to other perturbational approaches such as TAP is also elucidated. We demonstrate the method on a particular class of undirected graphical models, Boltzmann machines, for which our simulation results confirm improved accuracy and enhanced stability during learning.
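As a rough sketch of the expansion described above (the notation here is generic and assumed, not taken verbatim from the paper): for a target distribution p(x) = exp(-E(x))/Z and a tractable approximating distribution q(x) = exp(-E_q(x))/Z_q, the intractable log partition function can be organized as a cumulant expansion of the energy difference \Delta E = E - E_q under q,

\[
\ln Z \;=\; \ln Z_q + \ln \big\langle e^{-\Delta E} \big\rangle_q
      \;=\; \ln Z_q + \sum_{n \ge 1} \frac{(-1)^n}{n!}\, c_n ,
\]

where c_n is the n-th cumulant of \Delta E under q. Keeping only the n = 1 term gives \ln Z \approx \ln Z_q - \langle \Delta E \rangle_q, which by Jensen's inequality is exactly the standard Kullback-Leibler variational lower bound, while the n = 2 term, half the variance of \Delta E under q, is the leading correction and involves only expectations under the tractable q.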
Similar Resources
An Application of Linear Response Learning
Linear response is an approximation method for Boltzmann machines based on mean-field theory. It is known that in the absence of hidden units this method can learn the network quite accurately at the cost of only a single matrix inversion. We show that adding a flat distribution to the target can decrease the classification error. We apply linear response learning to a real-world data set of digi...
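For orientation, a generic sketch of the standard mean-field-plus-linear-response equations for a Boltzmann machine (the symbols w_{ij}, \theta_i, m_i are assumed here, not quoted from the article): the mean-field magnetizations solve

\[
m_i = \tanh\Big(\theta_i + \sum_j w_{ij}\, m_j\Big),
\]

and linear response then estimates the connected correlations C_{ij} = \langle s_i s_j \rangle - m_i m_j through a single matrix inversion,

\[
\big(C^{-1}\big)_{ij} = \frac{\delta_{ij}}{1 - m_i^2} - w_{ij},
\]

which is why, in the absence of hidden units, one inversion per update supplies all the pairwise statistics the learning gradient needs.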
On Structured Variational Approximations
The problem of approximating a probability distribution occurs frequently in many areas of applied mathematics, including statistics, communication theory, machine learning, and the theoretical analysis of complex systems such as neural networks. Saul and Jordan have recently proposed a powerful method for efficiently approximating probability distributions, known as structured variational approximati...
Discrete-valued Neural Networks Using Variational Inference
The increasing demand for neural networks (NNs) to be employed on embedded devices has led to plenty of research investigating methods for training low-precision NNs. While most methods involve a quantization step, we propose a principled Bayesian approach where we first infer a distribution over a discrete weight space from which we subsequently derive hardware-friendly low-precision NNs. To t...
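Purely as a toy illustration of the general idea of learning a distribution over discrete weights (a hypothetical sketch, not the cited paper's method: it uses invented data, a single linear classifier, and omits the prior/KL term that a full variational treatment would include):

import numpy as np

# Factorized categorical distribution over ternary weights {-1, 0, +1} for a
# linear classifier; train with the expected weight in the forward pass, then
# deploy the most probable discrete weight per coordinate.
rng = np.random.default_rng(0)
VALUES = np.array([-1.0, 0.0, 1.0])                  # discrete weight space

X = rng.normal(size=(200, 10))                       # synthetic data, illustration only
w_true = rng.choice(VALUES, size=10)
y = (X @ w_true > 0).astype(float)

logits = np.zeros((10, 3))                           # variational parameters: one categorical per weight

def softmax(a):
    e = np.exp(a - a.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

lr = 0.5
for _ in range(300):
    q = softmax(logits)                              # q(w_i = v)
    w_mean = q @ VALUES                              # expected weight under q
    p = 1.0 / (1.0 + np.exp(-(X @ w_mean)))          # sigmoid predictions
    g_wmean = X.T @ (p - y) / len(y)                 # d(cross-entropy)/d(expected weight)
    # chain rule through the softmax: d w_mean_i / d logits_{i,v} = q_{i,v} (v - w_mean_i)
    g_logits = q * (VALUES[None, :] - w_mean[:, None]) * g_wmean[:, None]
    logits -= lr * g_logits

w_discrete = VALUES[softmax(logits).argmax(axis=1)]  # hardware-friendly ternary weights
print("train accuracy with discrete weights:", ((X @ w_discrete > 0).astype(float) == y).mean())

Picking the most probable value per weight at the end is one simple way to obtain a hardware-friendly network from the learned distribution; sampling weights from q would be an equally reasonable choice.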