Some More Noiseless Coding Theorem on Generalized R-Norm Entropy
Authors
Abstract
A parametric mean length is defined as the quantity $L_R = \frac{R}{R-1}\left[1 - \frac{\sum_{i=1}^{N} p_i^{\beta} D^{-n_i \frac{R-1}{R}}}{\sum_{j=1}^{N} p_j^{\beta}}\right]$, where $R > 0$ ($R \neq 1$), $\beta > 0$, $p_i > 0$, $\sum_{i=1}^{N} p_i = 1$, $i = 1, 2, \ldots, N$. This is the mean length of the code words. Lower and upper bounds for $L_R$ are derived in terms of the R-norm information measure for the incomplete power distribution. AMS Subject Classification: 94A15, 94A17, 94A24, 26D15.
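The mean length defined above is straightforward to evaluate numerically. The sketch below computes $L_R$ alongside the ($\beta = 1$) R-norm entropy of Boekee and van der Lubbe for a small binary-code example; the function names and the sample distribution are illustrative assumptions, not taken from the paper.

```python
def mean_length_LR(p, n, R, beta=1.0, D=2):
    """Parametric mean codeword length L_R from the abstract.
    Assumes R > 0, R != 1, beta > 0, p a probability vector and
    n the codeword lengths over a D-ary code alphabet."""
    num = sum(pi**beta * D ** (-ni * (R - 1) / R) for pi, ni in zip(p, n))
    den = sum(pi**beta for pi in p)
    return R / (R - 1) * (1 - num / den)

def r_norm_entropy(p, R):
    """R-norm entropy H_R(P) = R/(R-1) * (1 - (sum p_i^R)^(1/R))."""
    return R / (R - 1) * (1 - sum(pi**R for pi in p) ** (1.0 / R))

# Illustrative example: a dyadic distribution with binary codeword
# lengths satisfying Kraft's inequality.
p = [0.5, 0.25, 0.25]
n = [1, 2, 2]
print(r_norm_entropy(p, R=2.0), mean_length_LR(p, n, R=2.0))
```

For this example the R-norm entropy does not exceed $L_R$, in the spirit of the lower bound the paper derives.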
Similar Articles
Some Noiseless Coding Theorem Connected with Havrda and Charvat and Tsallis’s Entropy
A new measure Lα, called the average codeword length of order α and type β, has been defined and its relationship with a result on generalized Havrda-Charvat and Tsallis entropy has been discussed. Using Lα, some coding theorems for the discrete noiseless channel have been proved.
A Coding Theorem Connected on R-Norm Entropy
A relation between Shannon entropy and Kerridge inaccuracy, known as the Shannon inequality, is well known in information theory. In this communication, we first generalize the Shannon inequality and then give its application in coding theory.
The capacity of hybrid quantum memory
The general stable quantum memory unit is a hybrid consisting of a classical digit with a quantum digit (qudit) assigned to each classical state. The shape of the memory is the vector of sizes of these qudits, which may differ. We determine when N copies of a quantum memory A embed in N(1 + o(1)) copies of another quantum memory B. This relationship captures the notion that B is at least as ...
The role of the asymptotic equipartition property in noiseless source coding
The (noiseless) fixed-length source coding theorem states that, except for outcomes in a set of vanishing probability, a source can be encoded at its entropy but not more efficiently. It is well known that the Asymptotic Equipartition Property (AEP) is a sufficient condition for a source to be encodable at its entropy. This paper shows that the AEP is necessary for the source coding theorem to ...
The Rényi redundancy of generalized Huffman codes
If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...