Search results for: discrete entropy
Number of results: 223106
Let (G, μ) be a discrete group with a generating probability measure. Nevo shows that if G has property (T) then there exists an ε > 0 such that the Furstenberg entropy of any (G, μ)-stationary space is either zero or larger than ε. Virtually free groups, such as SL2(Z), do not have property (T). For these groups, we construct stationary actions with arbitrarily small, positive entropy. This co...
We will show that
• the entropy of a random variable gives a lower bound on the number of bits needed per character for a binary coding,
• Huffman codes are optimal among binary codes in the average number of bits used per character,
• the average number of bits per character used by Huffman codes is close to the entropy of the underlying random variable,
• one can get arbitrarily close to the entropy of a...
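As a concrete illustration of these claims (not part of the notes themselves), the following sketch builds an optimal binary Huffman code with Python's heapq and compares its average codeword length to the Shannon entropy; the distribution `probs` is an arbitrary example. The classic bounds guarantee H ≤ L < H + 1.

```python
import heapq
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary Huffman code.

    Each heap entry is (probability, tie_breaker, symbol_indices).
    Merging two entries pushes every symbol they contain one level
    deeper in the code tree, i.e. adds one bit to its codeword.
    """
    lengths = [0] * len(probs)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]
H = shannon_entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"entropy H = {H:.3f} bits, Huffman average length L = {L:.3f} bits")
```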
In this paper, non-stationary processes that tend to maximize the Tsallis entropy are considered. Systems with a discrete probability distribution for the Tsallis entropy have already been investigated on the basis of the Speed-Gradient principle. The evolution of the probability density function and the continuous form of the Tsallis entropy are considered. A set of equations describing the dynamics of a s...
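The Speed-Gradient dynamics of the paper are not reproduced here, but the discrete Tsallis functional it refers to is standard; a minimal sketch with an arbitrary example distribution:

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a discrete
    distribution p; S_q tends to the Shannon entropy (in nats) as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))   # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
for q in (0.5, 1.0, 2.0):
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.4f}")
```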
In this work, we extend a variable-length source coding theorem for discrete memoryless sources to ergodic time-invariant Markov sources of arbitrary order. To accomplish this extension, we establish a formula for the Rényi entropy rate lim_{n→∞} H_α(n)/n. The main tool used to obtain the Rényi entropy rate result is Perron-Frobenius theory. We also examine the expression of the Rényi entropy r...
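For a first-order chain, the Perron-Frobenius formula alluded to here is commonly stated as h_α = log(λ_α)/(1 − α), where λ_α is the largest eigenvalue of the nonnegative matrix with entries p_ij^α; higher-order sources reduce to this case by expanding the state. A sketch under that assumption, with an arbitrary two-state chain:

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate (nats/symbol) of an ergodic first-order Markov
    chain with transition matrix P. For alpha != 1 it uses the
    Perron-Frobenius eigenvalue lambda_alpha of [p_ij^alpha]:
    h_alpha = log(lambda_alpha) / (1 - alpha)."""
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: Shannon rate -sum_i pi_i sum_j p_ij log p_ij
        evals, evecs = np.linalg.eig(P.T)
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        pi = pi / pi.sum()                       # stationary distribution
        logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
        return -np.sum(pi[:, None] * P * logP)
    R = P ** alpha                               # elementwise power
    lam = np.max(np.abs(np.linalg.eigvals(R)))   # Perron root
    return np.log(lam) / (1.0 - alpha)

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
for a in (0.5, 1.0, 2.0):
    print(f"alpha = {a}: h = {renyi_entropy_rate(P, a):.4f} nats/symbol")
```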
How low can the joint entropy of n d-wise independent (for d ≥ 2) discrete random variables be? This question was posed and partially answered in a recent work of Babai [Bab13]. In this paper we improve some of his bounds, prove new bounds in a wider range of parameters, and show matching upper bounds in some special cases. In particular, we prove tight lower bounds for the min-entropy (as ...
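A textbook example of the phenomenon (not Babai's construction): three pairwise-independent uniform bits built by XOR have joint entropy 2 bits rather than 3. A quick check:

```python
from itertools import product
from math import log2

# X1, X2 uniform independent bits; X3 = X1 XOR X2. Every pair of the
# three bits is independent, yet the joint law has only 4 equiprobable
# outcomes, so H(X1, X2, X3) = 2 bits instead of 3.
joint = {(x1, x2, x1 ^ x2): 0.25 for x1, x2 in product((0, 1), repeat=2)}
H = -sum(p * log2(p) for p in joint.values())
print(f"joint entropy = {H} bits")  # -> 2.0
```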
The paper examines relationships between the conditional Shannon entropy and the expectation of the ℓα-norm for joint probability distributions. More precisely, we investigate the tight bounds of the expectation of the ℓα-norm with a fixed conditional Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the conditional Shannon entropy and several informati...
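The tight bounds themselves are not reproduced here, but the two quantities traded off against each other are easy to compute; a sketch for a toy joint distribution, with `alpha` chosen arbitrarily:

```python
import numpy as np

def cond_entropy_and_norm(Pxy, alpha):
    """For a joint pmf Pxy[x, y], return the conditional Shannon entropy
    H(Y|X) in bits and the expected l_alpha-norm
    E_X[(sum_y P(y|x)^alpha)^(1/alpha)]."""
    Px = Pxy.sum(axis=1)
    H, norm = 0.0, 0.0
    for x, px in enumerate(Px):
        if px == 0:
            continue
        cond = Pxy[x] / px                    # P(Y | X = x)
        nz = cond[cond > 0]
        H += px * -np.sum(nz * np.log2(nz))
        norm += px * np.sum(nz ** alpha) ** (1.0 / alpha)
    return H, norm

Pxy = np.array([[0.30, 0.10],
                [0.15, 0.45]])
H, n2 = cond_entropy_and_norm(Pxy, alpha=2.0)
print(f"H(Y|X) = {H:.4f} bits, E[||P(Y|X)||_2] = {n2:.4f}")
```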
We introduce a new technique for proving a priori error estimates between the entropy weak solution of a scalar conservation law and a finite-difference approximation calculated with the Engquist-Osher, Lax-Friedrichs, or Godunov scheme. This technique is a discrete counterpart of the duality technique introduced by Tadmor [SIAM J. Numer. Anal., 1991]. The error is related to the consistency error ...
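The error analysis is not reproduced here; as a point of reference, a minimal sketch of one of the three schemes named (Lax-Friedrichs) applied to Burgers' equation u_t + (u²/2)_x = 0 with periodic boundary conditions:

```python
import numpy as np

# Lax-Friedrichs scheme for the scalar conservation law u_t + f(u)_x = 0
# with f(u) = u^2 / 2 (Burgers), periodic boundary, CFL-limited time step:
# u_j^{n+1} = (u_{j+1} + u_{j-1})/2 - (dt / 2dx) * (f(u_{j+1}) - f(u_{j-1}))
def lax_friedrichs(u0, dx, t_end, cfl=0.45):
    f = lambda u: 0.5 * u ** 2
    u, t = u0.copy(), 0.0
    while t < t_end:
        dt = cfl * dx / max(np.max(np.abs(u)), 1e-12)
        dt = min(dt, t_end - t)
        up = np.roll(u, -1)       # u_{j+1}
        um = np.roll(u, 1)        # u_{j-1}
        u = 0.5 * (up + um) - 0.5 * dt / dx * (f(up) - f(um))
        t += dt
    return u

x = np.linspace(0, 2 * np.pi, 400, endpoint=False)
u = lax_friedrichs(np.sin(x), x[1] - x[0], t_end=1.0)
print(f"approximate solution range: [{u.min():.3f}, {u.max():.3f}]")
```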
Statistical entropy was introduced by Shannon as a basic concept in information theory, measuring the average missing information about a random source. Extended into an entropy rate, it gives bounds in coding and compression theorems. Here I present how statistical entropy and the entropy rate relate to other notions of entropy, relevant either to probability theory (entropy of a discrete probability...
Quantum uncertainty relations are formulated in terms of the relative entropy between distributions of measurement outcomes and suitable reference distributions with maximum entropy. This type of entropic relation can be applied directly to observables with either discrete or continuous spectra. We find that a sum of relative entropies is bounded from above in a nontrivial way, which we illustrate with some examples.
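The paper's exact relation is not shown in this excerpt; for orientation, the identity D(p‖u) = log₂ d − H(p) turns any entropic lower bound into an upper bound on a sum of relative entropies to the maximum-entropy (uniform) reference. A qubit sketch assuming the Maassen-Uffink bound H(Z) + H(X) ≥ 1 bit for mutually unbiased bases:

```python
import numpy as np

def kl_to_uniform(p):
    """D(p || u) in bits against the uniform reference u on d outcomes;
    equals log2(d) - H(p)."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return np.log2(p.size) + np.sum(nz * np.log2(nz))

# Qubit eigenstate |0>, measured in the Z basis and in the mutually
# unbiased X basis (illustrative outcome statistics).
p_z = np.array([1.0, 0.0])
p_x = np.array([0.5, 0.5])

total = kl_to_uniform(p_z) + kl_to_uniform(p_x)
print(f"D(p_z||u) + D(p_x||u) = {total:.1f} bits")
# Maassen-Uffink gives H(Z) + H(X) >= 1 bit here, which is equivalent
# to this sum of relative entropies being <= 1 bit; |0> saturates it.
```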
This document describes a package of Python code for implementing various non-parametric continuous entropy estimators (and some discrete ones for convenience). After describing installation, Sec. 4 provides a wide-ranging discussion of technical, theoretical, and numerical issues surrounding entropy estimation. Sec. 5 provides references to the relevant literature for each estimator implemente...
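The package's own API is not shown in this excerpt; as an example of the kind of non-parametric estimator such packages implement, a sketch of the Kozachenko-Leonenko k-nearest-neighbor estimator (scipy's cKDTree and digamma assumed available):

```python
import numpy as np
from math import lgamma
from scipy.spatial import cKDTree
from scipy.special import digamma

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko kNN estimate of differential entropy (nats)
    for samples x of shape (n, d):
    H_hat = psi(n) - psi(k) + log(V_d) + (d/n) * sum_i log(r_i),
    with r_i the distance from x_i to its k-th neighbor and V_d the
    volume of the d-dimensional unit Euclidean ball."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # query k+1 neighbors because each point's nearest "neighbor" is itself
    r = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    log_V_d = (d / 2) * np.log(np.pi) - lgamma(d / 2 + 1)
    return digamma(n) - digamma(k) + log_V_d + d * np.mean(np.log(r))

rng = np.random.default_rng(0)
sample = rng.standard_normal((5000, 1))
true_H = 0.5 * np.log(2 * np.pi * np.e)   # differential entropy of N(0, 1)
print(f"estimate = {kl_entropy(sample):.3f}, true = {true_H:.3f}")
```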