A New Estimator of Entropy

Authors

  • Hadi Alizadeh Noughabi
  • Naser Reza Arghami
Abstract:

In this paper we propose an estimator of the entropy of a continuous random variable. The estimator is obtained by modifying the estimator proposed by Vasicek (1976). Consistency of the estimator is proved, and comparisons are made with Vasicek's estimator (1976), van Es's estimator (1992), Ebrahimi et al.'s estimator (1994) and Correa's estimator (1995). The results indicate that the proposed estimator has smaller mean squared error than the above estimators.
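
Vasicek's (1976) estimator, which the proposed estimator modifies, replaces the density in the entropy integral by a difference quotient of order statistics (m-spacings). A minimal sketch is below; the boundary convention (clamping indices at the extremes) follows Vasicek's definition, while the sqrt(n) default for the window size m is only a common heuristic, not something taken from this paper:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of differential entropy:

        H_{m,n} = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),

    with the convention X_(i) = X_(1) for i < 1 and X_(i) = X_(n) for i > n.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # heuristic choice, not from the paper
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # X_(i+m), clamped at X_(n)
    lower = x[np.maximum(idx - m, 0)]      # X_(i-m), clamped at X_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))
```

The spacing estimator is known to be biased for finite samples (largely because of the clamped spacings near the extremes), which is the kind of deficiency the modified estimators compared in the paper aim to reduce.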


similar resources

A recursive Renyi's entropy estimator

Estimating the entropy of a sample set is required in solving numerous learning scenarios involving information theoretic optimization criteria. A number of entropy estimators are available in the literature; however, these require a batch of samples to operate on in order to yield an estimate. In this paper, we derive a recursive formula to estimate Renyi’s quadratic entropy on-line,...

Bias Adjustment for a Nonparametric Entropy Estimator

Zhang in 2012 introduced a nonparametric estimator of Shannon’s entropy, whose bias decays exponentially fast when the alphabet is finite. We propose a methodology to estimate the bias of this estimator. We then use it to construct a new estimator of entropy. Simulation results suggest that this bias adjusted estimator has a significantly lower bias than many other commonly used estimators. We ...

A new diversity estimator

The maximum likelihood estimator (MLE) of Gini-Simpson’s diversity index (GS) is widely used but suffers from large bias when the number of species is large or infinite. We prop...

Fast Calculation of Entropy with Zhang's Estimator

Entropy is a fundamental property of a repertoire. Here, we present an efficient algorithm to estimate the entropy of types with the help of Zhang’s estimator. The algorithm takes advantage of the fact that the number of different frequencies in a text is in general much smaller than the number of types. We justify the convenience of the algorithm by means of an analysis of the statistical prop...

On the Kozachenko-Leonenko Entropy Estimator

We study in detail the bias and variance of the entropy estimator proposed by Kozachenko and Leonenko [10] for a large class of densities on R^d. We then use the work of Bickel and Breiman [2] to prove a central limit theorem in dimensions 1 and 2. In higher dimensions, we provide a development of the bias in terms of powers of N^(-2/d). This allows us to use a Richardson extrapolation to build, i...
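
For one dimension and the first nearest neighbour, the Kozachenko-Leonenko estimator takes the familiar closed form sketched below (the bias development and Richardson extrapolation of the cited paper are not reproduced here):

```python
import numpy as np

def kl_entropy_1d(x):
    """One-dimensional nearest-neighbour (Kozachenko-Leonenko) entropy sketch:

        H_hat = (1/N) * sum_i log( 2 * (N-1) * r_i ) + gamma,

    where r_i is the distance from x_i to its nearest neighbour and gamma
    is the Euler-Mascheroni constant. Assumes a continuous sample (no ties).
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    gaps = np.diff(x)  # distances between consecutive order statistics
    # nearest-neighbour distance: min of the gap to the left and to the right
    r = np.empty(n)
    r[0] = gaps[0]
    r[-1] = gaps[-1]
    r[1:-1] = np.minimum(gaps[:-1], gaps[1:])
    return np.mean(np.log(2.0 * (n - 1) * r)) + np.euler_gamma
```

Sorting reduces the 1D nearest-neighbour search to comparing adjacent gaps; in higher dimensions a k-d tree or ball tree is the usual replacement.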

Order Statistics Based Estimator for Renyi’s Entropy

Several types of entropy estimators exist in the information theory literature. Most of these estimators explicitly involve estimating the density of the available data samples before computing the entropy. However, the entropy estimator using sample spacing avoids this intermediate step and computes the entropy directly using the order statistics. In this paper, we extend our horizon beyond Sha...
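
The spacing idea extends to Renyi entropy as a plug-in: estimate the density at each order statistic from an m-spacing and average f_hat^(alpha-1), using the identity that the integral of f^alpha equals E[f(X)^(alpha-1)]. The sketch below is an illustrative plug-in in this spirit, not necessarily the exact estimator derived in the cited paper:

```python
import numpy as np

def renyi_entropy_spacing(x, alpha, m=None):
    """Plug-in Renyi entropy sketch from m-spacings (alpha != 1):

        f_hat_i = 2m / (n * (X_(i+m) - X_(i-m)))
        H_alpha = 1/(1-alpha) * log( (1/n) * sum_i f_hat_i**(alpha-1) )

    Assumes a continuous sample (no tied observations).
    """
    assert alpha != 1.0
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # heuristic window size
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # clamp indices at the boundaries
    lower = x[np.maximum(idx - m, 0)]
    f_hat = 2.0 * m / (n * (upper - lower))
    return np.log(np.mean(f_hat ** (alpha - 1.0))) / (1.0 - alpha)
```

As alpha approaches 1 this recovers Shannon entropy, where the average of logs replaces the log of the average power.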



Journal title

volume 9

pages 53-64

publication date 2010-03
