Generalized Exponential Concentration Inequality for Renyi Divergence Estimation
Authors
Abstract
Estimating divergences in a consistent way is of great importance in many machine learning tasks. Although this is a fundamental problem in nonparametric statistics, to the best of our knowledge no finite-sample exponential concentration bound has been derived for any divergence estimator. The main contribution of our work is to provide such a bound for an estimator of Rényi-α divergence for a smooth Hölder class of densities on the d-dimensional unit cube [0, 1]^d. We also illustrate our theoretical results with a numerical experiment.
Similar works
Stochastic Comparisons of Series and Parallel Systems with Heterogeneous Extended Generalized Exponential Components
In this paper, we discuss the usual stochastic, likelihood ratio, dispersive and convex transform order between two parallel systems with independent heterogeneous extended generalized exponential components. We also establish the usual stochastic order between series systems from two independent heterogeneous extended generalized exponential samples. Finally, we f...
Closed-Form Jensen-Renyi Divergence for Mixture of Gaussians and Applications to Group-Wise Shape Registration
In this paper, we propose a generalized group-wise non-rigid registration strategy for multiple unlabeled point-sets of unequal cardinality, with no bias toward any of the given point-sets. To quantify the divergence between the probability distributions--specifically Mixture of Gaussians--estimated from the given point sets, we use a recently developed information-theoretic measure called Jens...
Estimation in Simple Step-Stress Model for the Marshall-Olkin Generalized Exponential Distribution under Type-I Censoring
This paper considers the simple step-stress model from the Marshall-Olkin generalized exponential distribution when there is a time constraint on the duration of the experiment. The maximum likelihood equations for estimating the parameters, assuming a cumulative exposure model with lifetimes distributed as Marshall-Olkin generalized exponential, are derived. The likelihood equations do not lea...
Alpha-Divergence for Classification, Indexing and Retrieval (Revised 2)
Motivated by Chernoff's bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback-Leibler divergence and the Hellinger affinity between the probability densi...
Extended inequalities for weighted Renyi entropy involving generalized Gaussian densities
In this paper the author analyzes the weighted Rényi entropy in order to derive several inequalities in the weighted case. Furthermore, using the proposed notions of α-th generalized deviation and (α, p)-th weighted Fisher information, extended versions of the moment-entropy, Fisher information and Cramér-Rao inequalities in terms of generalized Gaussian densities are given. 1 The weighted p-Renyi ent...