Cost-sensitive learning based on Bregman divergences
Authors
Abstract
Similar resources
Adaptive mixture methods based on Bregman divergences
Clustering with Bregman Divergences
A wide variety of distortion functions, such as squared Euclidean distance, Mahalanobis distance, Itakura-Saito distance and relative entropy, have been used for clustering. In this paper, we propose and analyze parametric hard and soft clustering algorithms based on a large class of distortion functions known as Bregman divergences. The proposed algorithms unify centroid-based parametric clust...
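As a rough illustration of the hard clustering scheme that abstract describes, the following Python sketch (a minimal illustration, not the authors' reference implementation; the function names and the use of NumPy are assumptions) alternates a nearest-centroid assignment under a pluggable Bregman divergence with an arithmetic-mean centroid update, which is the optimal cluster representative for any divergence in the family.

```python
import numpy as np

def squared_euclidean(x, c):
    # Bregman divergence generated by F(x) = ||x||^2: the squared Euclidean distance.
    return np.sum((x - c) ** 2, axis=-1)

def generalized_kl(x, c, eps=1e-12):
    # Bregman divergence generated by F(x) = sum(x log x - x); reduces to
    # relative entropy for normalized inputs. Assumes nonnegative data.
    x = np.clip(x, eps, None)
    c = np.clip(c, eps, None)
    return np.sum(x * np.log(x / c) - x + c, axis=-1)

def bregman_hard_clustering(X, k, divergence, n_iter=100, seed=0):
    """k-means-style alternation with a pluggable Bregman divergence."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    labels = np.full(len(X), -1)
    for _ in range(n_iter):
        # Assignment step: closest centroid under the chosen divergence.
        dists = np.stack([divergence(X, c) for c in centroids], axis=1)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
        # Update step: the arithmetic mean minimizes the total divergence to the
        # cluster's points (with the centroid in the second argument) for any
        # Bregman divergence, so the update matches ordinary k-means.
        for j in range(k):
            members = X[labels == j]
            if len(members) > 0:
                centroids[j] = members.mean(axis=0)
    return labels, centroids

# Example usage on two well-separated blobs:
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.5, 0.1, (50, 2)), rng.normal(3.0, 0.1, (50, 2))])
labels, centroids = bregman_hard_clustering(X, k=2, divergence=squared_euclidean)
```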
On the Centroids of Symmetrized Bregman Divergences
In this paper, we generalize the notions of centroids and barycenters to the broad class of information-theoretic distortion measures called Bregman divergences. Bregman divergences are versatile, and unify quadratic geometric distances with various statistical entropic measures. Because Bregman divergences are typically asymmetric, we consider both the left-sided and right-sided centroids and ...
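For intuition, the two sided centroids admit closed forms. Which one is called "left" or "right" varies with the paper's convention, so the sketch below (an illustration under the convention D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>, with NumPy as an assumed dependency) distinguishes them by argument position: the centroid minimizing the total divergence in its second argument is the arithmetic mean for every generator F, while the one minimizing it in the first argument is (grad F)^{-1} of the average gradient, shown here for the generalized KL divergence on positive vectors.

```python
import numpy as np

def second_argument_centroid(points):
    # argmin_c sum_i D_F(x_i, c): the arithmetic mean, independent of the
    # generator F (the "center of mass" side).
    return points.mean(axis=0)

def first_argument_centroid_generalized_kl(points):
    # argmin_c sum_i D_F(c, x_i) for F(x) = sum(x log x - x), the generalized
    # KL divergence on positive vectors: grad F(x) = log x, so the minimizer is
    # (grad F)^{-1} of the average gradient, i.e. the coordinate-wise geometric mean.
    return np.exp(np.mean(np.log(points), axis=0))

# Example usage on a few positive vectors:
P = np.array([[0.2, 0.8], [0.5, 0.5], [0.7, 0.3]])
print(second_argument_centroid(P))                # arithmetic mean
print(first_argument_centroid_generalized_kl(P))  # geometric mean
```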
Metrics Defined by Bregman Divergences †
Bregman divergences are generalizations of the well-known Kullback–Leibler divergence. They are based on convex functions and have recently received great attention. We present a class of "squared root metrics" based on Bregman divergences. They can be regarded as a natural generalization of the Euclidean distance. We provide necessary and sufficient conditions for a convex function so that the squar...
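The baseline case of that generalization is easy to verify: for the generator F(x) = ||x||^2, the Bregman divergence is the squared Euclidean distance, so its square root is exactly the Euclidean metric. The snippet below (an illustrative sketch, not the paper's construction; the helper names are assumptions) encodes that case and adds a numerical spot check of the triangle inequality, which can refute but never prove the metric property for other generators.

```python
import numpy as np

def bregman(F, grad_F, x, y):
    # D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

def worst_triangle_slack(F, grad_F, dim=3, trials=10_000, seed=0):
    # Spot-check the triangle inequality for sqrt(D_F) on random positive
    # points; a positive return value would witness a violation.
    rng = np.random.default_rng(seed)
    worst = -np.inf
    for _ in range(trials):
        x, y, z = rng.uniform(0.1, 2.0, size=(3, dim))
        d = lambda a, b: np.sqrt(bregman(F, grad_F, a, b))
        worst = max(worst, d(x, z) - (d(x, y) + d(y, z)))
    return worst

# Squared Euclidean case: D_F(x, y) = ||x - y||^2, so sqrt(D_F) is exactly the
# Euclidean metric and the slack stays nonpositive (up to rounding).
F_sq = lambda x: np.dot(x, x)
grad_sq = lambda x: 2.0 * x
print(worst_triangle_slack(F_sq, grad_sq))
```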
Journal
Journal title: Machine Learning
Year: 2009
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-009-5132-8