Search results for: shannon entropy
Number of results: 72,543
In this paper, we introduce a goodness-of-fit test for exponentiality based on the Lin-Wong divergence measure. To estimate the divergence, we use a method similar to Vasicek's method for estimating the Shannon entropy. The critical values and the powers of the test are computed by Monte Carlo simulation. It is shown that the proposed test is competitive with other tests of exponentia...
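Vasicek's spacing estimator, which this abstract builds on, estimates differential entropy from order statistics. A minimal Python sketch (the window size m and the boundary-clamping convention are the standard choices, not details taken from this paper):

```python
import math

def vasicek_entropy(sample, m):
    """Vasicek (1976) spacing estimator of Shannon differential entropy.

    sample: real-valued observations; m: spacing window, 0 < m < n/2.
    Order statistics outside [1, n] are clamped to the extremes.
    """
    n = len(sample)
    x = sorted(sample)
    total = 0.0
    for i in range(n):
        hi = x[min(i + m, n - 1)]  # X_(i+m), clamped at X_(n)
        lo = x[max(i - m, 0)]      # X_(i-m), clamped at X_(1)
        total += math.log(n * (hi - lo) / (2 * m))
    return total / n
```

For a sample that is close to Uniform(0, 1), whose true differential entropy is 0, the estimate should be near zero (the estimator carries a small negative bias at the boundaries).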
We study how the Shannon entropy of sequences produced by an information source converges to the source's entropy rate. We synthesize several phenomenological approaches to applying information theoretic measures of randomness and memory to stochastic and deterministic processes by using successive derivatives of the Shannon entropy growth curve. This leads, in turn, to natural measures of appa...
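The "entropy growth curve" in this abstract is the block entropy H(L) as a function of block length L, whose discrete derivative converges to the source's entropy rate. A small illustrative sketch (the plug-in block-entropy estimator below is a common empirical choice, not the paper's own method):

```python
import math
from collections import Counter

def block_entropy(seq, L):
    """Empirical Shannon entropy (bits) of length-L blocks of a sequence."""
    blocks = Counter(tuple(seq[i:i + L]) for i in range(len(seq) - L + 1))
    n = sum(blocks.values())
    return -sum(c / n * math.log2(c / n) for c in blocks.values())

def entropy_gain(seq, L):
    """Discrete derivative h(L) = H(L) - H(L-1) of the entropy growth
    curve; for a well-sampled stationary source it approaches the
    entropy rate as L grows."""
    prev = block_entropy(seq, L - 1) if L > 1 else 0.0
    return block_entropy(seq, L) - prev
```

For a strictly periodic sequence such as 0101..., H(1) is 1 bit but the gain at L = 2 is already near zero: once one symbol is known, the next is determined.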
On the standard microscopic model of friction we confirm the common belief that the irreversible entropy production originates from the increase of Shannon information. We reveal that the reversible microscopic dynamics would continuously violate the Gibbsian interchangeability of molecules. The spontaneous restoration of interchangeability constitutes the mechanism of irreversibility. This is ...
Shannon, Kullback-Leibler, and Klimontovich's renormalized entropies are applied as three different complexity measures on gait data of patients with Parkinson's disease (PD) and healthy control group. We show that the renormalized entropy of variability of total reaction force of gait is a very efficient tool to compare patients with respect to disease severity. Moreover, it is a good risk pre...
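Of the three complexity measures this abstract compares, the Kullback-Leibler divergence is the simplest to state. A minimal sketch of the plug-in form for discrete distributions (the gait-specific preprocessing and the renormalized entropy of Klimontovich are not reproduced here):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats, for discrete
    distributions given as aligned probability lists.

    Assumes q_i > 0 wherever p_i > 0; terms with p_i = 0 contribute 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

D(p || q) is zero iff p = q and positive otherwise, which is what makes it usable as a distance-like complexity measure between a patient's distribution and a reference.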
We consider the concept of generalized Kolmogorov–Sinai entropy, where instead of the Shannon entropy function, we consider an arbitrary concave function defined on the unit interval, vanishing in the origin. Under mild assumptions on this function, we show that this isomorphism invariant is linearly dependent on the Kolmogorov–Sinai entropy.
Tsallis relative operator entropy is defined and then its properties are given. Shannon inequality and its reverse one in Hilbert space operators derived by T.Furuta [4] are extended in terms of the parameter of the Tsallis relative operator entropy. Moreover the generalized Tsallis relative operator entropy is introduced and then several operator inequalities are derived.
This article resolves a longstanding question in the axiomatisation of entropy as proposed by Shannon and highlighted in renewed concerns expressed by Jaynes. We introduce a companion measure of a probability distribution that we suggest be called the extropy of the distribution. The entropy and the extropy of an event distribution are identical. However, this identical measure bifurcates into ...
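The bifurcation this abstract describes can be checked numerically: for a two-point (event) distribution entropy and extropy coincide, while for three or more outcomes they differ. A sketch assuming the definition J(p) = -Σ (1 - p_i) log(1 - p_i) from the extropy literature (Lad, Sanfilippo and Agro):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log p_i (nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum (1 - p_i) log(1 - p_i) (nats)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)
```

For p = (0.3, 0.7), H and J agree; for p = (0.5, 0.3, 0.2) they do not, which is the "bifurcation" beyond binary distributions.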
We revisit the classical problem: given a memoryless source having a certain amount of Shannon Entropy, how many random bits can be extracted? This question appears in works studying random number generators built from physical entropy sources. Some authors use a heuristic estimate obtained from the Asymptotic Equipartition Property, which yields roughly n extractable bits, where n is the total...
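The gap this abstract studies is easy to see for a biased coin: the AEP-style heuristic counts Shannon entropy per symbol, while randomness extraction is governed by the smaller min-entropy. A minimal sketch (a standard comparison, not the paper's own bound):

```python
import math

def shannon_bits(p):
    """Shannon entropy (bits/symbol) of a Bernoulli(p) source."""
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def min_entropy_bits(p):
    """Min-entropy (bits/symbol), -log2 of the most likely outcome;
    the relevant measure for randomness extraction."""
    return -math.log2(max(p, 1 - p))
```

At p = 0.9 the Shannon entropy is about 0.47 bits/symbol but the min-entropy is only about 0.15, so the AEP heuristic substantially overestimates the extractable randomness of a skewed source; the two coincide only for a fair coin.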
Recently, entropy measures have shown significant promise in detecting a diverse set of network anomalies. While many different forms of entropy exist, only a few have been studied in the context of network anomaly detection. In this paper, we present the results of our case study on entropy-based IP traffic anomaly detection. Besides the well-known Shannon approach and counter-based methods, va...
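The Shannon approach mentioned in this abstract typically means computing the entropy of the empirical distribution of a traffic feature per time window and flagging sharp changes. A minimal sketch (the choice of feature, e.g. destination IPs, is illustrative, not taken from the paper):

```python
import math
from collections import Counter

def traffic_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of a
    traffic feature observed in one time window, e.g. the list of
    destination IPs seen in the window."""
    counts = Counter(values)
    n = sum(counts.values())
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

A window where all flows target one host has entropy 0; a window spread evenly over k hosts has entropy log2(k). A sudden drop can indicate traffic concentrating on a victim (e.g. DDoS), while a sudden rise can indicate dispersion such as scanning.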