Search results for: information entropy theory

Number of results: 1,867,273

2008

5 Information
5.1 Information As Surprise
5.2 Average Shannon Information
5.3 How Surprised Should You Be?
5.4 Entropy As Average Shannon Information
5.5 Dicing...
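
These chapter titles track the standard definitions. For reference (standard facts, not quoted from the book), the Shannon information, or "surprise", of an outcome $x$ and the entropy of a discrete random variable $X$ as its average are

$$h(x) = \log_2 \frac{1}{p(x)}, \qquad H(X) = \sum_x p(x)\, h(x) = -\sum_x p(x) \log_2 p(x).$$

A fair die, for example, has $H = \log_2 6 \approx 2.585$ bits.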

Journal: IEEE Trans. Information Theory, 2017
Masahito Hayashi Vincent Yan Fu Tan

In this paper, we evaluate the asymptotics of equivocations and their exponents, as well as their second-order coding rates, under various Rényi information measures. Specifically, we consider the effect of applying a hash function to a source, and we quantify the level of non-uniformity and the dependence of the compressed source on another correlated source when the number of copies of the sources is...
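
The Rényi measures used here have a standard closed form on finite alphabets. As a minimal sketch (the paper's hash-function construction and source model are not reproduced), the Rényi entropy of order $\alpha$ can be computed as:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha, in bits, of a discrete distribution p.

    Reduces to Shannon entropy in the limit alpha -> 1 (handled separately);
    alpha = 2 gives the collision entropy often used in randomness extraction.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # zero-probability outcomes contribute nothing
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))  # Shannon limit
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# Example: a biased bit is far from uniform by the alpha = 2 measure.
print(renyi_entropy([0.9, 0.1], alpha=2))  # ≈ 0.286 bits, vs 1 bit when uniform
```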

2010
Tim van Erven Peter Harremoës

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including it...
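
For reference, the finite-alphabet definition the abstract refers to (a standard formula, not quoted from the paper) is

$$D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_x p(x)^{\alpha}\, q(x)^{1-\alpha},$$

which recovers the Kullback-Leibler divergence $D(P \,\|\, Q) = \sum_x p(x) \log \frac{p(x)}{q(x)}$ in the limit $\alpha \to 1$, just as Rényi entropy recovers Shannon entropy.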

Journal: Entropy, 2004
Michael Devereux

Using an isolated measurement process, I've calculated the effect measurement has on entropy for the multi-cylinder Szilard engine. This calculation shows that the system of cylinders possesses an entropy associated with cylinder total-energy states, and that it records information transferred at measurement. Contrary to others' results, I've found that the apparatus loses entropy due to measur...
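
For context on the entropy bookkeeping in Szilard-engine arguments (a standard result, not a claim from this paper): recording or erasing one bit of information corresponds to an entropy change of

$$\Delta S = k_B \ln 2 \approx 9.57 \times 10^{-24}\ \mathrm{J/K},$$

so a one-bit measurement lets the engine extract at most $k_B T \ln 2$ of work per cycle; disagreements like the one this abstract raises concern where in the cycle that entropy is accounted for.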

Journal: CoRR, 2015
Oliver Johnson

We describe five types of results concerning information and concentration of discrete random variables, and relationships between them. These results are motivated by their counterparts in the continuous case. The results we consider are information theoretic approaches to Poisson approximation, the maximum entropy property of the Poisson distribution, discrete concentration (Poincaré and loga...
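
The maximum entropy property mentioned here can be illustrated numerically: among binomials with a fixed mean, entropy increases toward that of the Poisson as the trial count grows. A minimal sketch, assuming SciPy is available (this illustrates the property; it is not the paper's proof technique):

```python
import numpy as np
from scipy.stats import binom, poisson

def entropy_bits(pmf):
    """Shannon entropy (bits) of a pmf given as an array of probabilities."""
    pmf = pmf[pmf > 0]
    return float(-np.sum(pmf * np.log2(pmf)))

lam = 4.0
ks = np.arange(0, 60)  # support truncated where the tail is negligible

# Binomial(n, lam/n) has mean lam for every n; its entropy rises with n
# toward the Poisson(lam) entropy, the maximum in this family.
for n in (8, 32, 128, 512):
    print(f"n={n:4d}  H={entropy_bits(binom.pmf(ks, n, lam / n)):.4f} bits")
print(f"Poisson H={entropy_bits(poisson.pmf(ks, lam)):.4f} bits")
```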

Journal: Entropy, 2011
Shinto Eguchi Osamu Komori Shogo Kato

We discuss a one-parameter family of generalized cross entropy between two distributions with a power index, called the projective power entropy. The cross entropy essentially reduces to the Tsallis entropy if the two distributions are taken to be equal. Statistical and probabilistic properties associated with the projective power entropy are extensively investigated, including a characterizati...
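
The Tsallis entropy that this family reduces to has a simple closed form; a minimal sketch (the projective power entropy itself is not reproduced here):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1).

    Recovers the Shannon entropy (in nats) in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return float(-np.sum(p * np.log(p)))  # Shannon limit, natural log
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

# Example: for q = 2 this is one minus the collision probability.
print(tsallis_entropy([0.5, 0.25, 0.25], q=2))  # 1 - 0.375 = 0.625
```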

Journal: Entropy, 2015
Luca Faes A. Porta Giandomenico Nollo

In the framework of information dynamics, the temporal evolution of coupled systems can be studied by decomposing the predictive information about an assigned target system into amounts quantifying the information stored inside the system and the information transferred to it. While information storage and transfer are computed through the well-known self-entropy (SE) and transfer entropy (TE), an a...
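
As a concrete reference point for the transfer entropy (TE) mentioned here, a plug-in estimator for discrete series can be written in a few lines. This is a generic lag-1 sketch, not the decomposition proposed in the paper:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{X -> Y} in bits, lag 1, discrete series.

    TE = sum p(y_next, y_now, x_now) * log2[ p(y_next|y_now, x_now) / p(y_next|y_now) ].
    """
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y_next, y_now, x_now)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(y) - 1
    te = 0.0
    for (yn, yc, xc), c in triples.items():
        p_full = c / pairs_yx[(yc, xc)]            # p(y_next | y_now, x_now)
        p_self = pairs_yy[(yn, yc)] / singles[yc]  # p(y_next | y_now)
        te += (c / n) * np.log2(p_full / p_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1) ^ (rng.random(5000) < 0.1).astype(int)  # y copies x, 10% noise
print(transfer_entropy(x.tolist(), y.tolist()))  # ≈ 1 - H(0.1) ≈ 0.53 bits
```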

2001
Benjamin Schumacher Michael D. Westmoreland

We review the properties of the quantum relative entropy function and discuss its application to problems of classical and quantum information transfer and to quantum data compression. We then outline further uses of relative entropy to quantify quantum entanglement and analyze its manipulation.

1 Quantum relative entropy

In this paper we discuss several uses of the quantum relative entropy fun...
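
The central quantity, $S(\rho \,\|\, \sigma) = \mathrm{Tr}\,[\rho(\log\rho - \log\sigma)]$, is straightforward to evaluate for small density matrices. A minimal numerical sketch of the standard definition (not code from the paper):

```python
import numpy as np

def log_on_support(rho):
    """Matrix logarithm on the support of rho, with the 0 log 0 := 0 convention."""
    w, v = np.linalg.eigh(rho)
    logw = np.where(w > 1e-12, np.log(np.maximum(w, 1e-300)), 0.0)
    return (v * logw) @ v.conj().T

def quantum_relative_entropy(rho, sigma):
    """S(rho || sigma) = Tr[rho (log rho - log sigma)] in nats.

    Finite only when the support of rho lies inside the support of sigma.
    """
    return float(np.real(np.trace(rho @ (log_on_support(rho) - log_on_support(sigma)))))

# Example: a slightly mixed qubit versus the maximally mixed state.
rho = np.diag([0.9, 0.1])
sigma = np.eye(2) / 2
print(quantum_relative_entropy(rho, sigma))  # ln 2 - S(rho) ≈ 0.368 nats
```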

2012

What does this mean in terms of chess? A common characteristic of every piece is that it can move to certain squares, including by capture. In any given position, therefore, the pieces possess, under the rules of the game, certain states, only one of which will be realized on the next move. The difference between the logarithms of the numbers of such states for Black and White respectively is the "entro...
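
A rough numerical sketch of this mobility-based quantity, assuming the third-party python-chess library (the text's exact definition of piece "states" may count differently, so this is only a proxy):

```python
import math
import chess  # third-party: pip install python-chess

def mobility_entropy_difference(fen):
    """log2(# legal Black moves) - log2(# legal White moves) for a position.

    A crude proxy for the "entropy" difference described above.
    """
    board = chess.Board(fen)
    board.turn = chess.WHITE
    white_moves = board.legal_moves.count()
    board.turn = chess.BLACK
    black_moves = board.legal_moves.count()
    return math.log2(black_moves) - math.log2(white_moves)

# In the symmetric starting position both sides have 20 moves: difference 0.
print(mobility_entropy_difference(chess.STARTING_FEN))  # 0.0
```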

2015
William Fedus

Entropy is a central concept in both classical and quantum information theory, measuring the uncertainty and the information content in the state of a physical system. This paper reviews classical information theory and then proceeds to generalizations into quantum information theory. Both Shannon and von Neumann entropy are discussed, making the connection to the compressibility of a message strea...
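
The connection between entropy and compressibility is easy to check empirically with nothing beyond the Python standard library: for a memoryless source, the empirical Shannon entropy lower-bounds the rate any lossless compressor can achieve, and a general-purpose compressor lands somewhere above it.

```python
import math
import random
import zlib
from collections import Counter

random.seed(0)
# A memoryless biased source: byte 'a' with probability 0.9, 'b' with 0.1.
data = bytes(random.choices(b"ab", weights=[9, 1], k=100_000))

counts = Counter(data)
n = len(data)
h = -sum(c / n * math.log2(c / n) for c in counts.values())  # ≈ 0.469 bits/symbol

rate = 8 * len(zlib.compress(data, 9)) / n  # achieved compression rate
print(f"source entropy ≈ {h:.3f} bits/symbol; zlib rate ≈ {rate:.3f} bits/symbol")
```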
