Search results for: and 054 disregarding shannon entropy

Number of results: 16,840,017

Journal: International Journal of Nonlinear Analysis and Applications 2013
B. Afhami, M. Madadi

In this paper, we derive the exact analytical expressions for the Shannon entropy of generalized order statistics from Pareto-type and related distributions.
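The abstract above treats order statistics; as a simpler reference point, the plain Pareto(x_m, α) distribution already has a closed-form differential Shannon entropy, h = ln(x_m/α) + 1 + 1/α, which a Monte Carlo estimate can confirm. A minimal sketch using standard definitions (the function names are illustrative, not from the paper):

```python
import math
import random

def pareto_entropy(xm, alpha):
    """Closed-form differential Shannon entropy of Pareto(xm, alpha), in nats."""
    return math.log(xm / alpha) + 1.0 + 1.0 / alpha

def pareto_entropy_mc(xm, alpha, n=200_000, seed=0):
    """Monte Carlo estimate of -E[log f(X)] via inverse-CDF sampling."""
    rng = random.Random(seed)
    log_c = math.log(alpha) + alpha * math.log(xm)  # log of the pdf's constant factor
    total = 0.0
    for _ in range(n):
        x = xm / rng.random() ** (1.0 / alpha)      # X = xm * U^(-1/alpha), U uniform
        total += (alpha + 1.0) * math.log(x) - log_c  # -log f(x)
    return total / n

exact = pareto_entropy(2.0, 3.0)
estimate = pareto_entropy_mc(2.0, 3.0)
# the estimate agrees with the closed form to within Monte Carlo error
```

The pdf is f(x) = α·x_m^α / x^(α+1) for x ≥ x_m, so -log f(x) = (α+1)·log x - log α - α·log x_m, which is what the loop averages.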

Journal: Entropy 2015
Qiaoning Yang, Jianlin Wang

In practical applications, sensors are prone to failure because of harsh environments, battery drain, and aging. Locating sensor faults is an important step for follow-up fault detection. In this paper, two new multi-level wavelet Shannon entropies (multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of sen...

Journal: Int. J. Math. Mathematical Sciences 2005
C. G. Chakrabarti, Indranil Chakrabarty

We have presented a new axiomatic derivation of Shannon entropy for a discrete probability distribution on the basis of the postulates of additivity and concavity of the entropy function. We have then modified Shannon entropy to take account of observational uncertainty. The modified entropy reduces, in the limiting case, to the form of Shannon differential entropy. As an application, we have de...

Journal: IEEE Trans. Information Theory 2002
Dan A. Simovici, Szymon Jaroszewicz

The aim of this paper is to present an axiomatization of a generalization of Shannon's entropy starting from partitions of finite sets. The proposed axiomatization yields as special cases the Havrda-Charvát entropy, and thus provides axiomatizations for the Shannon entropy, the Gini index, and for other types of entropy used in classification and data mining. Keywords: Shannon entropy, Gini ind...
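The specializations mentioned in this abstract are easy to check numerically: under the standard definition H_q(p) = (1 - Σ p_i^q)/(q - 1), the Havrda-Charvát (Tsallis) entropy reduces to the Gini index 1 - Σ p_i² at q = 2 and tends to the Shannon entropy (in nats) as q → 1. A small sketch, assuming these textbook definitions rather than the paper's partition-based axiomatization:

```python
import math

def havrda_charvat(p, q):
    """Havrda-Charvat (Tsallis) entropy of order q (q != 1)."""
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def gini(p):
    """Gini impurity index."""
    return 1.0 - sum(x * x for x in p)

p = [0.5, 0.3, 0.2]
# q = 2 recovers the Gini index exactly
assert math.isclose(havrda_charvat(p, 2.0), gini(p))
# as q -> 1 the Havrda-Charvat entropy approaches the Shannon value
assert abs(havrda_charvat(p, 1.0001) - shannon(p)) < 1e-3
```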

Journal: IACR Cryptology ePrint Archive 2014
Maciej Skorski

We provide a new result that links two crucial entropy notions: Shannon entropy H1 and collision entropy H2. Our formula gives the worst possible amount of collision entropy in a probability distribution, when its Shannon entropy is fixed. Our results and techniques used in the proof immediately imply many quantitatively tight separations between Shannon and smooth Rényi entropy, which were pre...
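As a concrete illustration of the two notions this abstract relates, a quick numerical check with the standard definitions (not the paper's worst-case bound itself) shows that collision entropy never exceeds Shannon entropy:

```python
import math

def shannon_entropy(p):
    """H1: Shannon entropy of a probability distribution p, in bits."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def collision_entropy(p):
    """H2: collision (Renyi order-2) entropy, -log2 of the collision probability."""
    return -math.log2(sum(x * x for x in p))

p = [0.5, 0.25, 0.125, 0.125]
h1 = shannon_entropy(p)    # 1.75 bits exactly for this dyadic distribution
h2 = collision_entropy(p)
assert h2 <= h1            # H2 <= H1 holds for every distribution
```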

2016
John Preskill

Contents:
10 Quantum Shannon Theory
10.1 Shannon for Dummies
10.1.1 Shannon entropy and data compression
10.1.2 Joint typicality, conditional entropy, and mutual information
10.1.3 Distributed source coding
10.1.4 The noisy channel coding theorem
10.2 Von Neumann Entropy
10.2.1 Mathematical properties of H(ρ)
10.2.2 Mixing, measurement, and entropy
10.2.3 Strong subadditivit...

Journal: :IEEE Transactions on Information Theory 2013

Journal: :Information and Control 1978

2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axiom. One-parameter extensions of Shannon entropy have been studied by many researchers; the Rényi entropy [2] and the Tsallis entropy [3] are famous examples. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...
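The one-parameter extension mentioned here can be observed directly: the Rényi entropy H_a = log(Σ p_i^a)/(1 - a) recovers the Shannon entropy in the limit a → 1. A small numerical sketch under the standard definitions:

```python
import math

def shannon(p):
    """Shannon entropy in nats."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, a):
    """Renyi entropy of order a (a != 1), in nats."""
    return math.log(sum(x ** a for x in p)) / (1.0 - a)

p = [0.4, 0.35, 0.25]
# as the order a tends to 1, the Renyi entropy approaches the Shannon value
assert abs(renyi(p, 1.0001) - shannon(p)) < 1e-3
```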

[Chart: number of search results per year]