Kullback-Leibler Simplex

Authors

Abstract


Similar articles

Kullback-Leibler Boosting

In this paper, we develop a general classification framework called Kullback-Leibler Boosting, or KLBoosting. KLBoosting has the following properties. First, classification is based on the sum of histogram divergences along corresponding global and discriminating linear features. Second, these linear features, called KL features, are iteratively learnt by maximizing the projected Kullback-Leibler divergence...
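
A minimal sketch of the quantity a single KL feature is chosen to maximize, assuming histogram-based estimation: the KL divergence between positive and negative training samples projected onto a candidate direction w. The function name, bin count, and smoothing constant are illustrative assumptions, not details from the paper.

```python
# Sketch (not the paper's code): projected KL divergence between the
# class-conditional histograms along a candidate linear feature w.
import numpy as np

def projected_kl(X_pos, X_neg, w, n_bins=32, eps=1e-10):
    """KL divergence between histograms of the two classes
    projected onto the candidate linear feature w."""
    proj_pos, proj_neg = X_pos @ w, X_neg @ w
    lo = min(proj_pos.min(), proj_neg.min())
    hi = max(proj_pos.max(), proj_neg.max())
    edges = np.linspace(lo, hi, n_bins + 1)
    p, _ = np.histogram(proj_pos, bins=edges)
    q, _ = np.histogram(proj_neg, bins=edges)
    p = (p + eps) / (p + eps).sum()   # smoothed, normalized histograms
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))
```

Each boosting iteration would then greedily keep the candidate direction with the largest such divergence.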

Gaussian Kullback-Leibler approximate inference

We investigate Gaussian Kullback-Leibler (G-KL) variational approximate inference techniques for Bayesian generalised linear models and various extensions. In particular, we make the following novel contributions: sufficient conditions under which the G-KL objective is differentiable and convex are described; constrained parameterisations of Gaussian covariance that make G-KL methods fast and scalable...
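
The abstract does not reproduce the G-KL bound itself; as a hedged building block, the sketch below evaluates the closed-form KL divergence between two multivariate Gaussians, which is the term coupling a Gaussian variational posterior N(m, S) to a Gaussian prior N(mu, Sigma) in a bound of this kind. Variable and function names are assumptions.

```python
# Sketch: closed-form KL( N(m, S) || N(mu, Sigma) ), the Gaussian-to-
# Gaussian divergence that appears inside a G-KL style variational
# bound when the prior is Gaussian.
import numpy as np

def kl_gauss_gauss(m, S, mu, Sigma):
    d = m.shape[0]
    Sigma_inv = np.linalg.inv(Sigma)
    diff = mu - m
    _, logdet_Sigma = np.linalg.slogdet(Sigma)  # stable log-determinants
    _, logdet_S = np.linalg.slogdet(S)
    return 0.5 * (np.trace(Sigma_inv @ S)       # trace term
                  + diff @ Sigma_inv @ diff     # quadratic (mean) term
                  - d                           # dimension
                  + logdet_Sigma - logdet_S)    # log-determinant ratio
```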

Symmetrizing the Kullback-Leibler Distance

We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is asymmetric in the two distributions, the resistor-average distance is not. It arises from geometric considerations similar to those used to derive the Chernoff distance. Determining its relation to we...
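
The name encodes the construction: the two directed KL distances combine the way resistors do in parallel, 1/R = 1/D(p||q) + 1/D(q||p). A minimal discrete-distribution sketch, with the smoothing constant and function names as illustrative assumptions:

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Directed Kullback-Leibler distance D(p || q), discrete case."""
    p = np.asarray(p, float) + eps; p /= p.sum()
    q = np.asarray(q, float) + eps; q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def resistor_average(p, q):
    """Symmetric resistor-average distance:
    1/R = 1/D(p||q) + 1/D(q||p), as for resistors in parallel."""
    d_pq, d_qp = kl(p, q), kl(q, p)
    if d_pq == 0.0 or d_qp == 0.0:
        return 0.0  # identical distributions: both directed distances vanish
    return 1.0 / (1.0 / d_pq + 1.0 / d_qp)
```

Because the combination is harmonic-mean-like, the result is symmetric in p and q and never exceeds either directed distance.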

Kullback-Leibler Proximal Variational Inference

We propose a new variational inference method based on a proximal framework that uses the Kullback-Leibler (KL) divergence as the proximal term. We make two contributions towards exploiting the geometry and structure of the variational bound. Firstly, we propose a KL proximal-point algorithm and show its equivalence to variational inference with natural gradients (e.g. stochastic variational inference...
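
Read literally, the abstract describes a proximal-point iteration whose proximal term is the KL divergence between successive variational distributions. One plausible form of the update, with notation assumed rather than taken from the paper (L the variational lower bound, beta_k a step size):

```latex
% One KL proximal-point step: improve the variational objective while
% staying close, in KL, to the previous iterate; \beta_k is a step size.
\lambda_{k+1} = \arg\max_{\lambda}\;
  \mathcal{L}(\lambda)
  - \frac{1}{\beta_k}\,
    \mathrm{KL}\!\left( q(\theta \mid \lambda) \,\middle\|\, q(\theta \mid \lambda_k) \right)
```

The stated equivalence to natural-gradient methods is consistent with the standard observation that a second-order expansion of the KL proximal term yields the Fisher information as the local metric.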

Min-Max Kullback-Leibler Model Selection

This paper considers an information-theoretic min-max approach to the model selection problem. The aim of this approach is to select the member of a given parameterized family of probability models so as to minimize the worst-case Kullback-Leibler divergence from an uncertain “truth” model. Uncertainty of the truth is specified by an upper bound on the KL divergence relative to a given reference...
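
The abstract translates directly into a min-max program. A hedged formalization (the direction of the constraint divergence is an assumption, since the abstract is truncated):

```latex
% Min-max KL model selection: choose the family member p_\theta whose
% worst-case divergence from the uncertain truth p is smallest, the
% truth being confined to a KL ball of radius \delta around a given
% reference model \bar{p}.
\theta^{\star} = \arg\min_{\theta}\;
  \max_{\,p \,:\, \mathrm{KL}(p \,\|\, \bar{p}) \le \delta}\;
  \mathrm{KL}\!\left( p \,\|\, p_{\theta} \right)
```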

Journal

Journal title: SSRN Electronic Journal

Year: 2011

ISSN: 1556-5068

DOI: 10.2139/ssrn.1964719