Statistical Estimation of the Kullback–Leibler Divergence
Authors
Abstract
Asymptotic unbiasedness and L2-consistency are established, under mild conditions, for estimates of the Kullback–Leibler divergence between two probability measures in Rd that are absolutely continuous with respect to (w.r.t.) the Lebesgue measure. These estimates are based on certain k-nearest neighbor statistics for a pair of independent identically distributed (i.i.d.) vector samples. A novelty of the results is that they also treat mixture models; in particular, they cover mixtures of nondegenerate Gaussian measures. The mentioned asymptotic properties of related estimators for the Shannon entropy and cross-entropy are strengthened. Some applications are indicated.
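As a rough illustration of the kind of estimator the abstract refers to, the sketch below implements a standard k-nearest-neighbor KL divergence estimate in the classical Wang–Kulkarni–Verdú form. This is a generic textbook construction, not the paper's exact estimator; the function name and the brute-force distance computation are illustrative choices.

```python
import numpy as np

def knn_kl_divergence(x, y, k=5):
    """k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q in R^d.

    Classical form (illustrative, not the paper's exact construction):
        D_hat = (d/n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    where rho_k(i) is the k-th nearest-neighbor distance of x_i within x
    (self excluded) and nu_k(i) is its k-th nearest-neighbor distance in y.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    n, d = x.shape
    m = y.shape[0]
    # Brute-force pairwise Euclidean distances (fine for moderate n, m).
    dxx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    dxy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    np.fill_diagonal(dxx, np.inf)          # exclude the self-distance
    rho = np.sort(dxx, axis=1)[:, k - 1]   # k-th NN distance within x
    nu = np.sort(dxy, axis=1)[:, k - 1]    # k-th NN distance in y
    return d / n * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```

At finite sample sizes the estimate can be negative even though the true divergence is nonnegative; the asymptotic guarantees studied in the paper concern the large-sample regime.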
Similar resources
Penalized Bregman Divergence Estimation via Coordinate Descent
Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...
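The coordinate descent idea mentioned above can be sketched for the standard lasso objective, (1/2)‖y − Xβ‖² + λ‖β‖₁: cycle through the coordinates and apply a soft-thresholding update to each in turn. This is a minimal generic sketch (function and variable names are illustrative), not the cited papers' implementation.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso:
    minimize 0.5 * ||y - X b||^2 + lam * ||b||_1.

    Each coordinate update is the soft-thresholding of the partial
    correlation z_j = x_j . (residual with coordinate j removed).
    """
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = np.sum(X ** 2, axis=0)  # ||x_j||^2 for each column
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: add coordinate j's contribution back in.
            r = y - X @ beta + X[:, j] * beta[j]
            z = X[:, j] @ r
            # Soft-thresholding update for coordinate j.
            beta[j] = np.sign(z) * max(abs(z) - lam, 0.0) / col_sq[j]
    return beta
```

A large enough λ zeroes out every coefficient in a single sweep, which is the sparsity mechanism that makes the method attractive for variable selection.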
The Investigation of Research Articles in Applied Linguistics: Convergence and Divergence in Iranian ELT Context
No abstract available.
Statistical Topology Using the Nonparametric Density Estimation and Bootstrap Algorithm
This paper presents approximate confidence intervals for each function of parameters in a Banach space based on a bootstrap algorithm. We apply a kernel density approach to estimate the persistence landscape. In addition, we evaluate the quality of the distribution function estimator of random variables using the integrated mean square error (IMSE). The results of simulation studies show a significant impro...
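The bootstrap step referred to above can be sketched generically as a percentile confidence interval for an arbitrary statistic. This is not the paper's persistence-landscape pipeline, only the underlying resampling idea; all names are illustrative.

```python
import numpy as np

def bootstrap_percentile_ci(sample, stat, n_boot=2000, level=0.95, seed=0):
    """Percentile bootstrap confidence interval for stat(sample).

    Resamples the data with replacement n_boot times, recomputes the
    statistic on each resample, and returns the central `level` quantile
    range of the bootstrap replicates.
    """
    rng = np.random.default_rng(seed)
    sample = np.asarray(sample)
    n = len(sample)
    reps = np.array([
        stat(sample[rng.integers(0, n, size=n)]) for _ in range(n_boot)
    ])
    alpha = 1.0 - level
    return np.quantile(reps, alpha / 2), np.quantile(reps, 1 - alpha / 2)
```

For functionals such as the persistence landscape, `stat` would be replaced by the landscape evaluated at a fixed point, with the interval computed pointwise or uniformly.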
Robust Estimation in Linear Regression Model: The Density Power Divergence Approach
The minimum density power divergence method provides a robust estimate in situations where the dataset includes a number of outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
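To illustrate the divergence being minimized, the sketch below computes the minimum density power divergence estimate for the simpler problem of a normal location with known scale (the snippet itself concerns regression; this is only the same criterion in its easiest setting). The objective ∫f_θ^{1+α} dμ − (1 + 1/α)·(1/n)Σ f_θ(x_i)^α is minimized by a coarse grid search; names and the grid-search choice are illustrative.

```python
import numpy as np

def mdpde_location(x, alpha=0.5, sigma=1.0, grid_size=2001):
    """Minimum density power divergence estimate of a normal location
    with known scale sigma (illustrative toy version, grid search).

    Objective per theta:  I(theta) - (1 + 1/alpha) * mean(f_theta(x)^alpha),
    where for N(theta, sigma^2):
        I(theta) = integral f_theta^{1+alpha}
                 = (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha)  (theta-free).
    """
    x = np.asarray(x, float)
    thetas = np.linspace(x.min(), x.max(), grid_size)
    dens = np.exp(-0.5 * ((x[None, :] - thetas[:, None]) / sigma) ** 2)
    dens /= sigma * np.sqrt(2.0 * np.pi)
    int_term = (2.0 * np.pi * sigma ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    obj = int_term - (1.0 + 1.0 / alpha) * np.mean(dens ** alpha, axis=1)
    return thetas[np.argmin(obj)]
```

With α > 0, observations in low-density regions contribute with down-weighted influence, which is why a cluster of outliers barely moves the estimate while it drags the sample mean substantially.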
Nonparametric Divergence Estimation
A. The von Mises Expansion. Before diving into the auxiliary results of Section 5, let us first derive some properties of the von Mises expansion. It is a simple calculation to verify that the Gateaux derivative is simply the functional derivative in the event that T(F) = ∫ φ(f) dμ. Lemma 8. Let T(F) = ∫ φ(f) dμ, where f = dF/dμ is the Radon–Nikodym derivative, φ is differentiable, and let G be som...
Journal
Journal title: Mathematics
Year: 2021
ISSN: 2227-7390
DOI: https://doi.org/10.3390/math9050544