Search results for: the shannon
Number of results: 16,053,382
The aim of this thesis is to formulate and prove quantum extensions of the famous Shannon-McMillan theorem and its stronger version due to Breiman. In ergodic theory the Shannon-McMillan-Breiman theorem is one of the fundamental limit theorems for classical discrete dynamical systems. It can be interpreted as a special case of the individual ergodic theorem. In this work, we consider spin latti...
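For reference, the classical theorem that this thesis extends to the quantum (spin lattice) setting can be stated as follows; this is the standard formulation, not a quote from the thesis itself:

```latex
% Classical Shannon-McMillan-Breiman theorem (standard formulation):
% for a stationary ergodic source \mu over a finite alphabet,
% the per-symbol log-probability of the observed block converges
% almost surely to the entropy rate h(\mu).
\[
  \lim_{n \to \infty} -\frac{1}{n} \log \mu(x_1, x_2, \dots, x_n) = h(\mu)
  \qquad \mu\text{-almost surely.}
\]
```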
In the search for robust, accurate, and highly efficient financial option valuation techniques, we here present the SWIFT method (Shannon wavelets inverse Fourier technique), based on Shannon wavelets. SWIFT offers control over the approximation error by means of sharp quantitative error bounds. The nature of the local Shannon wavelets basis enables us to adaptively determine the proper s...
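As a rough illustration of the Shannon (sinc) wavelet basis that SWIFT builds on, and not of the SWIFT pricing algorithm itself, the sketch below approximates an essentially band-limited test density by a sinc expansion at a single scale; the scale m, the index range, and the Gaussian test function are arbitrary choices made here for illustration:

```python
import numpy as np

def shannon_scaling(m, k, x):
    """Shannon scaling function phi_{m,k}(x) = 2^{m/2} sinc(2^m x - k).

    numpy's sinc is the normalized sinc, sin(pi t) / (pi t), which is the
    convention used for the Shannon (sinc) basis.
    """
    return 2 ** (m / 2) * np.sinc(2 ** m * x - k)

def sinc_expansion(f, m, k_range, x):
    """Approximate f at scale m on the points x.

    For the Shannon scaling function the coefficients reduce to scaled
    samples of f: c_{m,k} ~= 2^{-m/2} f(k / 2^m).
    """
    approx = np.zeros_like(x)
    for k in k_range:
        c_mk = 2 ** (-m / 2) * f(k / 2 ** m)
        approx += c_mk * shannon_scaling(m, k, x)
    return approx

if __name__ == "__main__":
    # Test density: a standard normal pdf, an arbitrary stand-in for the
    # density recovered by Fourier inversion in an option-pricing setting.
    f = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
    m = 3                                # scale: finer resolution as m grows
    ks = range(-40, 41)                  # translation indices covering [-5, 5]
    xs = np.linspace(-4, 4, 201)
    err = np.max(np.abs(sinc_expansion(f, m, ks, xs) - f(xs)))
    print(f"max abs error at scale m={m}: {err:.2e}")
```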
We introduce a novel parametric family of symmetric information-theoretic distances based on Jensen’s inequality for a convex functional generator. In particular, this family unifies the celebrated Jeffreys divergence with the Jensen-Shannon divergence when the Shannon entropy generator is chosen. We then design a generic algorithm to compute the unique centroid defined as the minimum average d...
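A minimal sketch of the Shannon-entropy member of that family, the Jensen-Shannon divergence between two discrete distributions; the distributions p and q below are made-up toy inputs:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in nats; 0 * log 0 is treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def jensen_shannon(p, q):
    """Jensen-Shannon divergence: H((p + q) / 2) - (H(p) + H(q)) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

if __name__ == "__main__":
    p = [0.5, 0.3, 0.2]
    q = [0.1, 0.4, 0.5]
    # Symmetric, finite, and bounded above by log 2 (in nats).
    print(jensen_shannon(p, q), jensen_shannon(q, p))
```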
For an undirected graph G = (V, E), let G^n denote the graph whose vertex set is V^n, in which two distinct vertices (u_1, u_2, ..., u_n) and (v_1, v_2, ..., v_n) are adjacent iff for all i between 1 and n either u_i = v_i or u_iv_i ∈ E. The Shannon capacity c(G) of G is the limit lim_{n→∞} α(G^n)^{1/n}, where α(G^n) is the maximum size of an independent set of vertices in G^n. We show that there are graphs G and H ...
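A brute-force sketch of the quantities in that definition on the classic small example, the 5-cycle, using networkx for the strong product; the exhaustive independence-number search below is only feasible for very small graphs:

```python
import networkx as nx

def independence_number(G):
    """alpha(G): size of a largest independent set.

    An independent set in G is a clique in the complement, so we take the
    largest maximal clique of the complement (exponential time; tiny graphs only).
    """
    return max(len(c) for c in nx.find_cliques(nx.complement(G)))

if __name__ == "__main__":
    C5 = nx.cycle_graph(5)
    a1 = independence_number(C5)              # alpha(C5) = 2
    C5_2 = nx.strong_product(C5, C5)          # second strong power of C5
    a2 = independence_number(C5_2)            # alpha(C5^2) = 5
    # alpha is supermultiplicative under the strong product, so each
    # alpha(G^n)^(1/n) is a lower bound for c(G); already at n = 2
    # this gives the lower bound sqrt(5) for c(C5).
    print(a1, a2, a2 ** 0.5)
```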
Source coding, also known as data compression, is an area of information theory that deals with the design and performance evaluation of optimal codes for data compression. In 1952 Huffman constructed his optimal code that minimizes the average code length among all prefix codes for known sources. Actually, Huffman codes minimize the average redundancy, defined as the difference between the code len...
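A minimal sketch of Huffman's construction and of the redundancy the abstract refers to (average code length minus source entropy), for a made-up source distribution:

```python
import heapq
import math

def huffman_code(probs):
    """Build a binary Huffman code for {symbol: probability}.

    Returns {symbol: codeword}. A counter breaks ties so heapq never
    has to compare the dictionary payloads themselves.
    """
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # toy known source
    code = huffman_code(probs)
    avg_len = sum(p * len(code[s]) for s, p in probs.items())
    entropy = -sum(p * math.log2(p) for p in probs.values())
    redundancy = avg_len - entropy   # average code length minus source entropy
    print(code, avg_len, entropy, redundancy)
```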
The Shannon capacity of a graph G is c(G) = sup_{d≥1} α(G^d)^{1/d}, where α(G) is the independence number of G. The Shannon capacity of the Kneser graph KG_{n,r} was determined by Lovász in 1979, but little is known about the Shannon capacity of the complement of that graph when r does not divide n. The complement of the Kneser graph KG_{n,2} is also called the triangular graph T_n. The graph T_n has th...
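The sup in this definition and the limit in the previous entry coincide, since independence numbers are supermultiplicative under the strong product and Fekete's lemma applies; a standard sketch of that step:

```latex
% Independent sets multiply under the strong product:
% if S is independent in G^d and T is independent in G^e,
% then S x T is independent in G^{d+e}, so
\[
  \alpha\bigl(G^{d+e}\bigr) \ge \alpha\bigl(G^{d}\bigr)\,\alpha\bigl(G^{e}\bigr).
\]
% Hence a_d = \log \alpha(G^d) is superadditive, and Fekete's lemma gives
\[
  c(G) = \lim_{d \to \infty} \alpha\bigl(G^{d}\bigr)^{1/d}
       = \sup_{d \ge 1} \alpha\bigl(G^{d}\bigr)^{1/d}.
\]
```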
A symmetric variant of the Shannon capacity of graphs is defined and computed.