Search results for: shannon
Number of results: 9904
We say that two free p.m.p. actions of countable groups are Shannon orbit equivalent if there is an orbit equivalence between them whose associated cocycle partitions have finite Shannon entropy. We show that if the acting groups are sofic and each has a w-normal amenable subgroup which is neither locally finite nor virtually cyclic, then Shannon orbit equivalence implies the same maximum sofic entropy. This extends a result of Austin beyond the finitely generated setting, with the consequence that Bernoulli...
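The finiteness condition above concerns the Shannon entropy of a countable partition, H(P) = -Σ_A μ(A) log μ(A). A minimal sketch of that quantity (illustrative only; none of this code is from the paper):

```python
import math

def partition_entropy(cell_measures, base=math.e):
    """Shannon entropy H(P) = -sum mu(A) log mu(A) of a countable
    measurable partition, given the measures of its cells."""
    return -sum(m * math.log(m, base) for m in cell_measures if m > 0)

# A geometric partition mu(A_k) = 2**-k has finite entropy (= 2 bits),
# even though it has infinitely many cells (truncated here at k = 59):
cells = [2.0 ** -k for k in range(1, 60)]
print(partition_entropy(cells, base=2))  # ~2.0
```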
Hidden Markov chains are widely applied statistical models of stochastic processes, from fundamental physics and chemistry to finance, health, and artificial intelligence. The hidden processes they generate are notoriously complicated, however, even if the chain is finite state: no expression for their Shannon entropy rate exists, as the set of predictive features is generically infinite. As such, to date ...
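Since no closed-form expression exists, the entropy rate is typically estimated. A minimal sketch, assuming a hypothetical two-state chain with binary observations (all parameters here are illustrative, not from the paper): simulate a long output sequence and apply the Shannon-McMillan-Breiman theorem, h ≈ -(1/n) log2 P(x_1 ... x_n), with P computed by the scaled forward algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical HMM: transition matrix T, emission matrix E (rows sum to 1).
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])        # T[i, j] = P(next state j | state i)
E = np.array([[0.95, 0.05],
              [0.30, 0.70]])      # E[i, k] = P(symbol k | state i)

def sample_observations(n):
    """Simulate n symbols from the hidden chain (started, for simplicity,
    in state 0 rather than stationarity; the bias vanishes as n grows)."""
    s, out = 0, np.empty(n, dtype=int)
    for t in range(n):
        out[t] = rng.choice(2, p=E[s])
        s = rng.choice(2, p=T[s])
    return out

def entropy_rate_estimate(obs):
    """-(1/n) log2 P(obs) via the scaled forward algorithm; by the
    Shannon-McMillan-Breiman theorem this converges to the entropy rate."""
    alpha = np.array([0.5, 0.5]) * E[:, obs[0]]
    log_p = np.log2(alpha.sum()); alpha /= alpha.sum()
    for x in obs[1:]:
        alpha = (alpha @ T) * E[:, x]
        log_p += np.log2(alpha.sum()); alpha /= alpha.sum()
    return -log_p / len(obs)

print(entropy_rate_estimate(sample_observations(100_000)))  # bits/symbol
```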
Direct evaluation of the rate-distortion function has rarely been achieved when it is strictly greater than its Shannon lower bound. In this paper, we consider the rate-distortion function for the distortion measure defined by an ε-insensitive loss function. We first present the Shannon lower bound applicable to any source distribution with finite differential entropy. Then, focusing on the Lap...
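For intuition about the Shannon lower bound in the familiar squared-error case (an assumption here; the paper works with the ε-insensitive loss instead, for which the Gaussian maximizer is replaced by a different maximum-entropy distribution): R(D) ≥ h(X) - (1/2) log(2πeD), with equality for a Gaussian source. A small numerical check:

```python
import math

def gaussian_rd(sigma2, D):
    """Exact rate-distortion function of a Gaussian source under
    squared-error distortion: R(D) = max(0, 0.5*log(sigma^2 / D))."""
    return max(0.0, 0.5 * math.log(sigma2 / D))

def shannon_lower_bound(h_X, D):
    """SLB for squared error: R(D) >= h(X) - 0.5*log(2*pi*e*D), since
    the Gaussian maximizes entropy under a variance constraint."""
    return h_X - 0.5 * math.log(2 * math.pi * math.e * D)

sigma2 = 1.0
h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma2)  # diff. entropy
for D in (0.01, 0.1, 0.5):
    print(D, gaussian_rd(sigma2, D), shannon_lower_bound(h_gauss, D))
# For the Gaussian source the two columns agree: the SLB is tight.
```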
Shannon entropy [1] is one of the fundamental quantities in classical information theory and is uniquely determined by the Shannon-Khinchin axioms or the Faddeev axioms. One-parameter extensions of the Shannon entropy have been studied by many researchers [2]; the Rényi entropy [3] and the Tsallis entropy [4] are famous examples. In the paper [5], a uniqueness theorem for the Tsallis entropy was proven. See also ...
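A minimal sketch of these one-parameter families (illustrative, not from [5]): both the Rényi and the Tsallis entropy recover the Shannon entropy in the limit q → 1.

```python
import numpy as np

def shannon(p):
    p = np.asarray(p); p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, q):
    """Rényi entropy: (1/(1-q)) * log(sum p_i^q), q != 1."""
    p = np.asarray(p)
    return np.log(np.sum(p ** q)) / (1 - q)

def tsallis(p, q):
    """Tsallis entropy: (1 - sum p_i^q) / (q - 1), q != 1."""
    p = np.asarray(p)
    return (1 - np.sum(p ** q)) / (q - 1)

p = np.array([0.5, 0.25, 0.125, 0.125])
print(shannon(p))                # 1.2130... nats
for q in (0.999, 1.001):         # both families -> Shannon as q -> 1
    print(q, renyi(p, q), tsallis(p, q))
```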
The paper examines relationships between the Shannon entropy and the ℓα-norm for n-ary probability vectors, n ≥ 2. More precisely, we investigate the tight bounds of the ℓα-norm with a fixed Shannon entropy, and vice versa. As applications of the results, we derive the tight bounds between the Shannon entropy and several information measures which are determined by the ℓα-norm. Moreover, we app...
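One can trace the joint range of (H(p), ‖p‖α) empirically by sampling the probability simplex; the envelopes of the resulting cloud are the tight bounds the paper characterizes. A rough sketch with n = 4 and α = 2 (parameters chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def shannon(p):
    q = p[p > 0]
    return -np.sum(q * np.log2(q))

def l_alpha_norm(p, alpha):
    """l_alpha-norm of a probability vector: (sum p_i^alpha)^(1/alpha)."""
    return np.sum(p ** alpha) ** (1.0 / alpha)

# Sample uniformly from the simplex and record (H(p), ||p||_2) pairs;
# the extremes are attained by the uniform vector and by vectors with
# one large mass and the rest equal.
n, alpha = 4, 2.0
pts = [(shannon(p), l_alpha_norm(p, alpha))
       for p in rng.dirichlet(np.ones(n), size=5000)]
H, N = map(np.array, zip(*pts))
print("H range:", H.min(), H.max())      # up to log2(4) = 2 bits
print("norm range:", N.min(), N.max())   # from 1/sqrt(4) = 0.5 up to 1
```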
1.2 REMARK ON INTERPRETATIONS OF THE SHANNON ENTROPY There are standard ways to interpret the Shannon entropy. For instance, the quantity H(p) can be viewed as a measure of the amount of uncertainty in a random experiment described by the probability mass function p, or as a measure of the amount of information one gains by learning the value of such an experiment. Indeed, it is possible to sta...
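Concretely, H(p) = -Σ_x p(x) log2 p(x); a fair coin maximizes uncertainty among binary experiments, while a deterministic one carries no information. A minimal computation:

```python
import math

def H(p):
    """Shannon entropy in bits: H(p) = -sum_x p(x) log2 p(x)."""
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin carries 1 bit of uncertainty; a biased one carries less,
# and a deterministic "experiment" carries none:
print(H([0.5, 0.5]))   # 1.0
print(H([0.9, 0.1]))   # ~0.469
print(H([1.0, 0.0]))   # 0.0
```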
For an undirected graph G = (V, E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are adjacent iff for all i between 1 and n either u_i = v_i or u_iv_i ∈ E. The Shannon capacity c(G) of G is the limit lim_{n→∞} (α(G^n))^{1/n}, where α(G^n) is the maximum size of an independent set of vertices in G^n. We show that there are graphs G and H ...
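Both quantities are directly computable for small graphs. A brute-force sketch (illustrative only) reproducing the classical 5-cycle example, where α(C5) = 2 but α(C5 ⊠ C5) = 5, so c(C5) ≥ √5:

```python
from itertools import combinations, product

def strong_product(adj1, adj2):
    """Strong product: vertices are pairs; distinct pairs are adjacent
    iff each coordinate is equal or adjacent."""
    verts = list(product(range(len(adj1)), range(len(adj2))))
    idx = {v: i for i, v in enumerate(verts)}
    n = len(verts)
    adj = [[False] * n for _ in range(n)]
    for u1, u2 in verts:
        for v1, v2 in verts:
            if (u1, u2) != (v1, v2) \
               and (u1 == v1 or adj1[u1][v1]) \
               and (u2 == v2 or adj2[u2][v2]):
                adj[idx[(u1, u2)]][idx[(v1, v2)]] = True
    return adj

def alpha(adj):
    """Independence number by brute force (fine only for tiny graphs):
    grow k until no independent set of size k exists."""
    n, best = len(adj), 0
    for k in range(1, n + 1):
        if any(all(not adj[u][v] for u, v in combinations(S, 2))
               for S in combinations(range(n), k)):
            best = k
        else:
            return best
    return best

# 5-cycle C5: alpha(C5) = 2, alpha(C5 x C5) = 5, so c(C5) >= sqrt(5).
c5 = [[abs(i - j) % 5 in (1, 4) for j in range(5)] for i in range(5)]
print(alpha(c5))                      # 2
print(alpha(strong_product(c5, c5)))  # 5
```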
Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
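A minimal numerical sketch of the two notions side by side (assumptions: a Gaussian density and simple Riemann-sum quadrature, not the paper's setup): the differential entropy h = -∫ f ln f and the location Fisher information I = ∫ (f')²/f satisfy Stam's inequality I · e^{2h}/(2πe) ≥ 1, with equality exactly in the Gaussian case.

```python
import numpy as np

sigma = 1.5
f  = lambda x: np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
df = lambda x: -x / sigma**2 * f(x)        # f'(x) for the Gaussian density

x = np.linspace(-12, 12, 20001)
dx = x[1] - x[0]
fx, dfx = f(x), df(x)

h = -np.sum(fx * np.log(fx)) * dx          # differential Shannon entropy
I = np.sum(dfx**2 / fx) * dx               # location Fisher information

print(h, 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # closed form: match
print(I, 1 / sigma**2)                                # closed form: match
print(I * np.exp(2 * h) / (2 * np.pi * np.e))         # Stam: = 1 (Gaussian)
```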
Compressed sensing is a signal processing technique to encode analog sources by real numbers rather than bits, dealing with efficient recovery of a real vector from the information provided by linear measurements. By leveraging the prior knowledge of the signal structure (e.g., sparsity) and designing efficient non-linear reconstruction algorithms, effective compression is achieved by taking a ...
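A minimal end-to-end sketch (illustrative; orthogonal matching pursuit stands in here for whatever reconstruction algorithm a given paper studies): recover a k-sparse vector exactly from m ≪ n random linear measurements.

```python
import numpy as np

rng = np.random.default_rng(42)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column most
    correlated with the residual, then re-fit by least squares."""
    support, residual = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

n, m, k = 256, 64, 5                  # ambient dim, measurements, sparsity
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x                                      # m << n linear measurements

x_hat = omp(A, y, k)
print(np.linalg.norm(x - x_hat))      # ~0: exact recovery w.h.p.
```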