Search results for: shannon entropy numerical simulation
Number of results: 876,050
This paper deals with the estimation of transfer entropy based on the k-nearest neighbors (k-NN) method. To this end, we first investigate the estimation of Shannon entropy involving a rectangular neighboring region, as suggested in the existing literature, and develop two kinds of entropy estimators. Then, applying the widely used error-cancellation approach to these entropy estimators, we...
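As a point of reference for the rectangular-neighborhood construction, the classical Kozachenko–Leonenko k-NN estimator (the standard Euclidean-ball version, not the rectangular variant the paper develops) can be sketched in a few lines; the function name and the test distribution below are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_shannon_entropy(x, k=3):
    """Kozachenko-Leonenko k-NN estimator of differential Shannon entropy (nats)."""
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbour; query returns the point itself first.
    rho = tree.query(x, k=k + 1)[0][:, k]
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)  # log volume of the unit d-ball
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))

rng = np.random.default_rng(0)
sample = rng.normal(size=(5000, 1))
print(knn_shannon_entropy(sample))  # should approach 0.5*ln(2*pi*e) ~ 1.4189 for N(0,1)
```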
The uniqueness theorem for the Tsallis entropy is proven by introducing the generalized Faddeev axiom. Our result improves the recent uniqueness theorem for the Tsallis entropy based on the generalized Shannon–Khinchin axioms in [7], in the sense that our axiom is simpler, just as Faddeev's axiom is simpler than Shannon–Khinchin's.
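For orientation, the quantity whose axiomatic characterization is at issue is the Tsallis entropy S_q(p) = (1 − Σ_i p_i^q)/(q − 1), which recovers the Shannon entropy in the limit q → 1. A minimal sketch (the function name is ours):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1); Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))  # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
for q in (0.5, 0.999, 1.0, 2.0):
    print(q, tsallis_entropy(p, q))    # values near q = 1 approach the Shannon entropy
```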
Rapid technological and economic growth over the last several decades has changed human lives and made modern society face complex decision-making problems. These kinds of problems are characterized by incommensurate and conflicting criteria or objectives such as cost, reliability, performance, safety, and productivity. Multi-criteria decision making is an approach that can be used to deal with compl...
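The abstract is cut off before naming a specific method, but one standard way Shannon entropy enters multi-criteria decision making is the entropy weight method, in which criteria whose scores vary more across alternatives receive larger weights. A minimal sketch under that assumption (the decision matrix below is hypothetical, not from the paper):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: criteria with more dispersion get larger weights.
    X: (m alternatives x n criteria) matrix of positive scores."""
    X = np.asarray(X, dtype=float)
    m, _ = X.shape
    P = X / X.sum(axis=0)               # column-wise probability profiles
    e = -np.sum(P * np.log(P), axis=0) / np.log(m)  # per-criterion entropy in [0, 1]
    d = 1.0 - e                         # degree of diversification
    return d / d.sum()

scores = np.array([[250.0, 16.0, 12.0],  # hypothetical alternatives x criteria
                   [200.0, 16.0,  8.0],
                   [300.0, 32.0, 16.0]])
print(entropy_weights(scores))           # weights sum to 1
```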
Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related notion of Fisher information, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...
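Both quantities are easy to check numerically for a concrete density. A sketch for a Gaussian, where the closed forms S = ½ ln(2πeσ²) and I = 1/σ² are known (the grid limits and σ are arbitrary choices):

```python
import numpy as np

# Differential Shannon entropy S = -integral(rho ln rho) and Fisher information
# I = integral((rho')^2 / rho) on a uniform grid, checked against the exact values.
sigma = 1.5
x = np.linspace(-12, 12, 200001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

S = -np.sum(rho * np.log(rho)) * dx
drho = np.gradient(rho, x)
I = np.sum(drho**2 / rho) * dx

print(S, 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # entropy: numeric vs exact
print(I, 1 / sigma**2)                               # Fisher information: numeric vs exact
```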
We examine the entropy of stationary nonequilibrium measures of boundary driven symmetric simple exclusion processes. In contrast with the Gibbs–Shannon entropy [1, 10], the entropy of nonequilibrium stationary states differs from the entropy of local equilibrium states.
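For reference, the Gibbs–Shannon entropy of a stationary measure μ_N on the configuration space is the usual functional (the notation here is ours; the abstract only names the quantity):

```latex
S(\mu_N) \;=\; -\sum_{\eta} \mu_N(\eta)\,\log \mu_N(\eta)
```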
Kullback–Leibler relative entropy, or KL-entropy, of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, M), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the nondiscrete case. Further, entropy and oth...
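In the discrete case the Radon–Nikodym derivative dP/dR reduces to the ratio of point masses p_i/r_i, so the integral above becomes a finite sum. A minimal sketch (the function name is ours):

```python
import numpy as np

def kl_divergence(p, r):
    """Discrete Kullback-Leibler divergence KL(P || R) = sum_i p_i ln(p_i / r_i), in nats.
    Requires P absolutely continuous w.r.t. R: r_i = 0 must imply p_i = 0."""
    p = np.asarray(p, dtype=float)
    r = np.asarray(r, dtype=float)
    mask = p > 0
    if np.any(r[mask] == 0):
        return np.inf                   # P not absolutely continuous w.r.t. R
    return np.sum(p[mask] * np.log(p[mask] / r[mask]))

print(kl_divergence([0.5, 0.5], [0.9, 0.1]))  # positive; zero iff P == R
```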
Chaos is often explained in terms of random behaviour, and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE makes no reference to any notion connected to randomness. A common way of justifying this use of the KSE...
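One concrete bridge between the KSE and observable dynamics is Pesin's identity: for smooth one-dimensional maps the KSE equals the positive Lyapunov exponent. A sketch for the fully chaotic logistic map, whose exponent is exactly ln 2 (the seed and iteration counts are arbitrary):

```python
import numpy as np

def lyapunov_logistic(x0=0.3, n_steps=200_000, n_burn=1_000):
    """Estimate the Lyapunov exponent of x -> 4x(1-x) by averaging log|f'(x)|."""
    x = x0
    for _ in range(n_burn):                # discard the transient
        x = 4.0 * x * (1.0 - x)
    acc = 0.0
    for _ in range(n_steps):
        acc += np.log(abs(4.0 - 8.0 * x))  # f'(x) = 4 - 8x
        x = 4.0 * x * (1.0 - x)
    return acc / n_steps

print(lyapunov_logistic(), np.log(2))      # both ~ 0.6931; KSE = exponent by Pesin
```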
If optimality is measured by average codeword length, Huffman's algorithm gives optimal codes, and the redundancy can be measured as the difference between the average codeword length and Shannon's entropy. If the objective function is replaced by an exponentially weighted average, then a simple modification of Huffman's algorithm gives optimal codes. The redundancy can now be measured as the d...
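The first notion of redundancy is straightforward to compute: build a Huffman code, take the probability-weighted average codeword length, and subtract the Shannon entropy. A minimal sketch with a toy distribution of our choosing (the exponentially weighted variant the abstract goes on to describe is not reproduced here):

```python
import heapq
import math

def huffman_code_lengths(probs):
    """Return the codeword length of each symbol in a Huffman code."""
    heap = [(w, i, (s,)) for i, (s, w) in enumerate(probs.items())]
    heapq.heapify(heap)
    depth = {s: 0 for s in probs}
    tie = len(heap)                       # unique tiebreaker for equal weights
    while len(heap) > 1:
        w1, _, g1 = heapq.heappop(heap)
        w2, _, g2 = heapq.heappop(heap)
        for s in g1 + g2:
            depth[s] += 1                 # each merge pushes members one level down
        heapq.heappush(heap, (w1 + w2, tie, g1 + g2))
        tie += 1
    return depth

probs = {"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}
lengths = huffman_code_lengths(probs)
avg_len = sum(probs[s] * lengths[s] for s in probs)
H = -sum(p * math.log2(p) for p in probs.values())
print(avg_len, H, avg_len - H)            # redundancy = average length - entropy >= 0
```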
Based on the local Shannon entropy concept in information theory, a new measure of aromaticity is introduced. This index, which describes the probability of electronic charge distribution between atoms in a given ring, is called Shannon aromaticity (SA). Using the B3LYP method and different basis sets (6-31G**, 6-31+G**, and 6-311++G**), the SA values of some five-membered heterocycles, C(4)H(4)X, a...
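Going by the abstract's description, SA quantifies how evenly electronic charge is distributed around the ring. One common formulation is SA = ln N − S(p), where p is the distribution obtained by normalizing the electron densities at the ring's N bond critical points and S is its Shannon entropy; the paper's exact working definition may differ, and the input densities below are hypothetical.

```python
import numpy as np

def shannon_aromaticity(bcp_densities):
    """SA = ln(N) - S(p), with p_i the electron densities at the ring's N bond
    critical points normalized to a distribution. SA = 0 for perfectly uniform
    densities; larger SA suggests less aromatic character."""
    rho = np.asarray(bcp_densities, dtype=float)
    p = rho / rho.sum()
    S = -np.sum(p * np.log(p))
    return np.log(len(p)) - S

# Hypothetical BCP densities (a.u.) for a five-membered ring; not from the paper.
print(shannon_aromaticity([0.31, 0.30, 0.29, 0.31, 0.30]))  # near-uniform, so SA is small
```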
Random and pseudorandom number generators (RNGs and PRNGs) are used for many purposes, including cryptographic, modeling, and simulation applications. For such applications a generated bit sequence should mimic a truly random one, i.e., by definition, it could be interpreted as the result of flips of a fair coin with sides labeled 0 and 1. It is known that the Shannon entropy of ...
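The property the abstract invokes, that fair-coin flips maximize Shannon entropy at one bit per bit, suggests a simple empirical check: estimate the entropy rate of a generated sequence from block frequencies. A minimal sketch (the block size and sample sizes are arbitrary choices):

```python
import math
import random
from collections import Counter

def block_entropy_rate(bits, block=8):
    """Empirical Shannon entropy per bit from non-overlapping blocks.
    A sequence indistinguishable from fair-coin flips should approach 1 bit/bit."""
    blocks = [tuple(bits[i:i + block]) for i in range(0, len(bits) - block + 1, block)]
    total = len(blocks)
    counts = Counter(blocks)
    H = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return H / block

random.seed(0)
fair = [random.getrandbits(1) for _ in range(2**16)]
biased = [1 if random.random() < 0.9 else 0 for _ in range(2**16)]
print(block_entropy_rate(fair))    # close to 1.0 bit per bit
print(block_entropy_rate(biased))  # close to H(0.9) ~ 0.469 bits per bit
```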