Search results for: information entropy

Number of results: 1,203,337

2004
Erwin Lutwak, Deane Yang, Gaoyong Zhang

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
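For context (this statement is not part of the abstract, just the standard form of the inequalities it names, with h(X) the Shannon entropy, J(X) the Fisher information, N(X) the entropy power, and σ² the second moment):

```latex
h(X) \le \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^{2}\right),
\qquad
N(X)\,J(X) \ge 1,
\quad \text{where } N(X) = \frac{e^{2h(X)}}{2\pi e},
```

with equality in both cases if and only if X is Gaussian. Chaining them gives σ² ≥ N(X) ≥ 1/J(X), which is the Cramér-Rao bound.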

1997
E. C. van der Meulen

We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon's original paper ([55]). Since then, entropy has been of great theoretical and applied interest. The basic properties ...
*This research was supported by the Scientific Exchange Program between the Belgian Academy of Sciences and the Hungarian Academy of Sciences in the field of Mathemat...

Journal: CoRR, 2015
Hélio Magalhães de Oliveira

This paper reports a new reading for wavelets, based on the classical De Broglie principle. The wave-particle duality principle is adapted to wavelets. Every continuous basic wavelet is associated with a proper probability density, allowing the Shannon entropy of a wavelet to be defined. Further entropy definitions are considered, such as the Jumarie or Rényi entropy of wavelets. We proved tha...

2010
E. Bouhova-Thacker

This memo contains proofs that the Shannon entropy is the limiting case of both the Rényi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.
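The limiting behaviour the memo proves is easy to check numerically. The sketch below (not the memo's own experiment; the distribution is an arbitrary example) uses the standard definitions of the Rényi and Tsallis entropies, both of which approach the Shannon entropy as α → 1:

```python
import math

def shannon(p):
    """Shannon entropy -sum p ln p, in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Renyi entropy: (1/(1-alpha)) * ln(sum p^alpha), alpha != 1."""
    return math.log(sum(x ** alpha for x in p)) / (1 - alpha)

def tsallis(p, alpha):
    """Tsallis entropy: (1 - sum p^alpha) / (alpha - 1), alpha != 1."""
    return (1 - sum(x ** alpha for x in p)) / (alpha - 1)

p = [0.5, 0.25, 0.125, 0.125]   # example distribution
h = shannon(p)
# Both generalized entropies converge to h as alpha -> 1
for alpha in (1.5, 1.1, 1.01, 1.001):
    print(f"alpha={alpha}: Renyi={renyi(p, alpha):.4f}, "
          f"Tsallis={tsallis(p, alpha):.4f}, Shannon={h:.4f}")
```

Running it shows both columns approaching the Shannon value as α shrinks toward 1.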

Journal: IEEE Trans. Information Theory, 2017
Masahito Hayashi, Vincent Yan Fu Tan

In this paper, we evaluate the asymptotics of equivocations, their exponents as well as their second-order coding rates under various Rényi information measures. Specifically, we consider the effect of applying a hash function on a source and we quantify the level of non-uniformity and dependence of the compressed source from another correlated source when the number of copies of the sources is...

2013
Charles Marsh

Classically, Shannon entropy was formalized over discrete probability distributions. However, the concept of entropy can be extended to continuous distributions through a quantity known as continuous (or differential) entropy. The most common definition for continuous entropy is seemingly straightforward; however, further analysis reveals a number of shortcomings that render it far less useful ...
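One shortcoming of the kind this abstract alludes to can be shown directly: unlike discrete Shannon entropy, differential entropy h(f) = -∫ f(x) ln f(x) dx can be negative. The sketch below (my illustration, not the paper's) approximates it with a midpoint Riemann sum for a uniform density on [0, a], where the exact value is ln(a):

```python
import math

def diff_entropy(f, lo, hi, n=100_000):
    """Approximate h(f) = -integral of f(x) ln f(x) dx over [lo, hi]
    by a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        fx = f(x)
        if fx > 0:
            total -= fx * math.log(fx) * dx
    return total

# Uniform(0, 0.5) has density 2 on its support; h = ln(0.5) < 0.
uniform = lambda x: 2.0
print(diff_entropy(uniform, 0.0, 0.5))  # ≈ -0.693
```

A discrete entropy is always nonnegative, so a negative value here already signals that differential entropy cannot be read as an absolute measure of uncertainty.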

Journal: Axioms, 2017
Sonja Jäckle, Karsten Keller

The Tsallis entropy given for a positive parameter α can be considered as a generalization of the classical Shannon entropy. For the latter, corresponding to α = 1, there exist many axiomatic characterizations. One of them based on the well-known Khinchin-Shannon axioms has been simplified several times and adapted to Tsallis entropy, where the axiom of (generalized) Shannon additivity is playi...

Building on the studies of Rao et al. (2004) and Di Crescenzo and Longobardi (2009), Misagh et al. (2011) proposed a weighted information measure based on the cumulative entropy, called the Weighted Cumulative Entropy (WCE). The above-mentioned model is a shift-dependent uncertainty measure. In this paper, we examine some of the properties of the WCE and obtain some bounds for it. In order to ...

Business cost is acknowledged as one of the priorities in SME research. In this study, the business cost of SMEs in Shanghai was primarily measured using the Factor-Entropy analysis method. The purpose of this study is to effectively resolve the issue of simplification and assignment of the evaluation index system for the business costs of SMEs in Shanghai. However, this study uses factor analysis to interpret t...

Chart of the number of search results per year

Click on the chart to filter the results by publication year