Search results for: modified shannon entropy

Number of results: 321667

2016
John Preskill

Contents: 10 Quantum Shannon Theory; 10.1 Shannon for Dummies; 10.1.1 Shannon entropy and data compression; 10.1.2 Joint typicality, conditional entropy, and mutual information; 10.1.3 Distributed source coding; 10.1.4 The noisy channel coding theorem; 10.2 Von Neumann Entropy; 10.2.1 Mathematical properties of H(ρ); 10.2.2 Mixing, measurement, and entropy; 10.2.3 Strong subadditivit...

The Kolmogorov-Sinai entropy is a far-reaching dynamical generalization of the Shannon entropy of information systems. This entropy works perfectly for probability-measure-preserving (p.m.p.) transformations. However, it is not useful when there is no finite invariant measure. There are certain successful extensions of the notion of entropy to infinite measure spaces, or transformations with ...
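For reference, the standard definition behind the abstract above (not quoted from the paper): for a p.m.p. transformation T and a finite measurable partition \(\mathcal{P}\), the Kolmogorov-Sinai entropy is

    h_{\mathrm{KS}}(T) = \sup_{\mathcal{P}} \lim_{n \to \infty} \frac{1}{n}\, H\!\left( \bigvee_{i=0}^{n-1} T^{-i}\mathcal{P} \right),

where H denotes the Shannon entropy of a partition and the supremum runs over all finite measurable partitions.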

2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axioms. One-parameter extensions of Shannon entropy have been studied by many researchers. The Rényi entropy [2] and the Tsallis entropy [3] are famous examples. In the paper [4], the uniqueness theorem for the Tsallis entropy was proved. Also, in our...

Journal: CoRR, 2008
Shigeru Furuichi

Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axioms. One-parameter extensions of Shannon entropy have been studied by many researchers [2]. The Rényi entropy [3] and the Tsallis entropy [4] are famous examples. In the paper [5], the uniqueness theorem for the Tsallis entropy was proved. See also ...
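The two abstracts above concern the Rényi and Tsallis one-parameter extensions of Shannon entropy; as a minimal sketch using the standard textbook definitions (natural logarithm; function names are mine):

    import numpy as np

    def shannon_entropy(p):
        """H(p) = -sum_i p_i log p_i, with 0 log 0 := 0."""
        p = np.asarray(p, dtype=float)
        nz = p[p > 0]
        return -np.sum(nz * np.log(nz))

    def renyi_entropy(p, alpha):
        """H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha > 0, alpha != 1."""
        p = np.asarray(p, dtype=float)
        return np.log(np.sum(p[p > 0] ** alpha)) / (1.0 - alpha)

    def tsallis_entropy(p, q):
        """S_q(p) = (1 - sum_i p_i^q) / (q - 1), q != 1."""
        p = np.asarray(p, dtype=float)
        return (1.0 - np.sum(p[p > 0] ** q)) / (q - 1.0)

    p = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(p))          # ~1.213 nats
    # Both one-parameter families recover Shannon entropy as the parameter -> 1:
    print(renyi_entropy(p, 1.0001))    # ~1.213
    print(tsallis_entropy(p, 1.0001))  # ~1.213

Rényi entropy is additive on independent systems while Tsallis entropy is not, which is the kind of property the uniqueness theorems cited above distinguish.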

2000
A. Zeilinger Michael J. W. Hall

It is pointed out that the case for Shannon entropy and von Neumann entropy, as measures of uncertainty in quantum mechanics, is not as bleak as suggested in quant-ph/0006087. The main argument of the latter is based on one particular interpretation of Shannon’s H-function (related to consecutive measurements), and is shown explicitly to fail for other physical interpretations. Further, it is s...
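For concreteness, the von Neumann entropy contrasted with Shannon entropy above is S(ρ) = -Tr ρ log ρ; a small sketch computing it from the eigenvalues of a density matrix (the example states are mine, not from the paper):

    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho log rho), via the eigenvalues of rho (natural log)."""
        evals = np.linalg.eigvalsh(rho)    # rho is Hermitian
        evals = evals[evals > 1e-12]       # 0 log 0 := 0
        return -np.sum(evals * np.log(evals))

    # A pure state has zero entropy; the maximally mixed qubit has entropy ln 2.
    pure = np.array([[1.0, 0.0], [0.0, 0.0]])
    mixed = np.eye(2) / 2.0
    print(von_neumann_entropy(pure))   # 0.0
    print(von_neumann_entropy(mixed))  # ~0.6931 = ln 2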

In many life-testing and reliability studies, the experimenter may not obtain complete information on failure times for all experimental units. One of the most common censoring schemes is progressive type-II censoring. The aim of this paper is to characterize parent distributions based on the Shannon entropy of progressive type-II censored order statistics. It is shown that the equality...
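For readers unfamiliar with the scheme: n units go on test, and at the i-th observed failure a further R_i surviving units are withdrawn at random, until m failures have been observed (so n = m + R_1 + ... + R_m). A direct simulation sketch of this mechanism, with the exponential lifetimes and all names chosen for illustration:

    import numpy as np

    rng = np.random.default_rng(0)

    def progressive_type2_failures(lifetimes, removals):
        """Observed failure times under progressive type-II censoring.

        lifetimes: latent failure times of the n units on test.
        removals:  R_1..R_m; requires len(lifetimes) == len(removals) + sum(removals).
        """
        pool = sorted(lifetimes)            # surviving units, in increasing order
        observed = []
        for r in removals:
            observed.append(pool.pop(0))    # next failure among the survivors
            for _ in range(r):              # withdraw r survivors at random
                pool.pop(rng.integers(len(pool)))
        return observed

    # n = 10 units, m = 4 observed failures, censoring scheme R = (2, 1, 0, 3):
    lifetimes = rng.exponential(scale=1.0, size=10)
    print(progressive_type2_failures(lifetimes, [2, 1, 0, 3]))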

In this paper, we obtain the Rényi entropy rate for irreducible, aperiodic Markov chains with countable state space, using the theory of countable nonnegative matrices. We also obtain a bound on the Rényi entropy rate of an irreducible Markov chain. Finally, we show that this bound is the Shannon entropy rate.
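In the finite-state irreducible case, the two rates in the abstract have standard closed forms: the Shannon rate averages the row entropies of the transition matrix under its stationary distribution, and the Rényi rate is log of the largest eigenvalue of the entrywise power P^(α) divided by (1 - α). A sketch assuming those standard formulas (the example chain is mine):

    import numpy as np

    def shannon_rate(P):
        """Shannon entropy rate -sum_i pi_i sum_j P_ij log P_ij
        for a row-stochastic matrix P with stationary distribution pi."""
        evals, evecs = np.linalg.eig(P.T)
        pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
        pi = pi / pi.sum()
        logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
        return -np.sum(pi[:, None] * P * logP)

    def renyi_rate(P, alpha):
        """Renyi entropy rate log(lambda_max(P**alpha)) / (1 - alpha),
        where P**alpha is the entrywise power of P."""
        lam = np.max(np.real(np.linalg.eigvals(P ** alpha)))
        return np.log(lam) / (1.0 - alpha)

    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])
    print(shannon_rate(P))         # ~0.3947 nats per step
    print(renyi_rate(P, 1.0001))   # -> the Shannon rate as alpha -> 1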

2008

5 Information; 5.1 Information As Surprise; 5.2 Average Shannon Information; 5.3 How Surprised Should You Be?; 5.4 Entropy As Average Shannon Information; 5.5 Dicing...
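The chapter titles above point at one identity worth making explicit: entropy is the probability-weighted average of the surprisal -log2 p. A tiny sketch (the example distribution is mine):

    import numpy as np

    p = np.array([0.5, 0.25, 0.25])
    surprisal = -np.log2(p)        # bits of surprise per outcome
    H = np.sum(p * surprisal)      # entropy = average Shannon information
    print(surprisal, H)            # [1. 2. 2.] and 1.5 bits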

Journal: CoRR, 2005
Ambedkar Dukkipati M. Narasimha Murty Shalabh Bhatnagar

By replacing the linear averaging in Shannon entropy with the Kolmogorov-Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi's, one can prepare only two information measures: Shannon entropy and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
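The recipe described above can be checked directly: take the KN quasilinear mean of the surprisals -log2 p_i with phi(x) = 2^((1-alpha)x), and the Rényi entropy comes out (a standard identity; function names and the example are mine):

    import numpy as np

    def kn_mean(values, weights, phi, phi_inv):
        """Kolmogorov-Nagumo quasilinear mean: phi_inv(sum_i w_i * phi(x_i))."""
        return phi_inv(np.sum(weights * phi(values)))

    def renyi_via_kn(p, alpha):
        """Renyi entropy as the KN-average of the surprisals -log2 p_i,
        taken with phi(x) = 2**((1 - alpha) * x)."""
        p = np.asarray(p, dtype=float)
        phi = lambda x: 2.0 ** ((1.0 - alpha) * x)
        phi_inv = lambda y: np.log2(y) / (1.0 - alpha)
        return kn_mean(-np.log2(p), p, phi, phi_inv)

    p = np.array([0.5, 0.25, 0.125, 0.125])
    alpha = 2.0
    direct = np.log2(np.sum(p ** alpha)) / (1.0 - alpha)
    print(renyi_via_kn(p, alpha), direct)   # both ~1.5406 bits
    # With the identity map phi(x) = x, the KN mean is the ordinary average
    # and the same recipe returns the Shannon entropy.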
