On the Estimation of Shannon Entropy
Abstract:
Shannon entropy is increasingly used in many applications. In this article, an estimator of the entropy of a continuous random variable is proposed. Consistency and scale invariance of the variance and mean squared error of the proposed estimator are proved, and comparisons are made with the entropy estimators of Vasicek (1976), van Es (1992), Ebrahimi et al. (1994), and Correa (1995). A simulation study indicates that the proposed estimator has smaller mean squared error than the competing estimators.
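The abstract does not reproduce the proposed estimator itself, but the best known of its comparators, Vasicek's (1976) spacing-based estimator, can be sketched as follows. The window rule m = √n is a common heuristic, not a prescription from the article:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of differential entropy:

        H_V = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),

    where x_(i) are order statistics, with the boundary convention
    x_(i) = x_(1) for i < 1 and x_(i) = x_(n) for i > n.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # heuristic window, assumed here
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # x_(i+m), clipped at x_(n)
    lower = x[np.maximum(idx - m, 0)]      # x_(i-m), clipped at x_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))
```

For a large standard normal sample the estimate should approach the true differential entropy 0.5·log(2πe) ≈ 1.4189; the estimator is consistent as m → ∞ with m/n → 0, which is the property the article's comparisons build on.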
Similar resources
A note on Shannon entropy
We present a somewhat different way of looking at Shannon entropy. This leads to an axiomatisation of Shannon entropy that is essentially equivalent to that of Fadeev.
Divergence measures based on the Shannon entropy
A new class of information-theoretic divergence measures based on the Shannon entropy is introduced. Unlike the well-known Kullback divergences, the new measures do not require the condition of absolute continuity to be satisfied by the probability distributions involved. More importantly, their close relationship with the variational distance and the probability of misclassification error are ...
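The best-known member of this class is the Jensen-Shannon divergence. A minimal sketch of why no absolute-continuity condition is needed: the divergence is defined through the entropy of the mixture, so it stays finite even when the two distributions have disjoint supports.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log p_i, with the convention 0 * log 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def jensen_shannon(p, q):
    """JSD(p, q) = H((p + q)/2) - (H(p) + H(q))/2.

    Unlike the Kullback divergence, this is finite even when p and q
    are not absolutely continuous with respect to each other.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))
```

With fully disjoint supports, e.g. p = (1, 0) and q = (0, 1), the Kullback divergence is infinite, while JSD attains its maximum value log 2.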
Shannon Entropy Estimation in $\infty$-Alphabets from Convergence Results
The problem of Shannon entropy estimation in countably infinite alphabets is revisited by adopting convergence results for the entropy functional. Sufficient conditions for the convergence of the entropy are used, covering scenarios with both finitely and infinitely supported distributions. From this angle, four plug-in histogram-based estimators are studied, showing strong consistency ...
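The plug-in (histogram-based) estimators discussed here all share the same core idea: apply the entropy functional to the empirical distribution of the sample. A minimal sketch for a discrete alphabet:

```python
from collections import Counter
import math

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate: the entropy
    functional applied to the empirical histogram of the sample."""
    n = len(samples)
    counts = Counter(samples)  # empirical frequencies per symbol
    return -sum((c / n) * math.log(c / n) for c in counts.values())
```

For a countably infinite alphabet the empirical histogram is always finitely supported, which is why the convergence conditions studied in the article are what turn this simple recipe into a strongly consistent estimator.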
On convergence properties of Shannon entropy
Convergence properties of Shannon entropy are studied. In the differential setting, it is shown that weak convergence of probability measures, or convergence in distribution, is not enough for convergence of the associated differential entropies. A general result for the desired differential entropy convergence is provided, taking into account both compactly and non-compactly supported densities ...
Notes on the Shannon Entropy of the Neural Response
In these notes we focus on the concept of Shannon entropy in an attempt to provide a systematic way of assessing the discrimination properties of the neural response, and quantifying the role played by the number of layers and the number of templates.
Rényi Extrapolation of Shannon Entropy
Relations between Shannon entropy and Rényi entropies of integer order are discussed. For any N-point discrete probability distribution for which the Rényi entropies of order two and three are known, we provide a lower and an upper bound for the Shannon entropy. The average of both bounds provides an explicit extrapolation for this quantity. These results imply relations between the von Neumann...
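The explicit bounds are given in the article itself; what can be stated here with certainty is the definition of the Rényi entropies and their monotonicity in the order, which already yields H₃ ≤ H₂ ≤ H₁ (Shannon). A minimal sketch:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)
    for alpha != 1; the alpha -> 1 limit is the Shannon entropy."""
    p = np.asarray(p, dtype=float)
    if alpha == 1:
        nz = p > 0
        return -np.sum(p[nz] * np.log(p[nz]))  # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)
```

Since H_alpha is nonincreasing in alpha, H₂ is always a lower bound on the Shannon entropy; the article's contribution is a tighter two-sided bound built from H₂ and H₃ together.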
Journal title
Volume 12, Issue 1
Pages 57-70
Publication date: 2015-09