Search results for: and 054 disregarding shannon entropy
Number of results: 16,840,017. Filter results by year:
We provide a full behavioral characterization of the standard Shannon model of rational inattention. The key axiom is Invariance under Compression, which identifies this model as capturing an ideal form of attention-constrained choice. We introduce tractable generalizations that allow for many known violations from this ideal, including asymmetries and complementarities in learning, context effects, low responsive...
In this paper, we reconsider the problem of deciding whether one probability distribution is more informative (in the sense of representing a less indeterminate situation) than another one. Instead of using well-established information measures such as the Shannon entropy, however, we take up the idea of comparing probability distributions in a qualitative way. More specifically, we focus on a ...
The uniqueness theorem for the Tsallis entropy is proven by introducing the generalized Faddeev axiom. Our result improves the recent uniqueness theorem for the Tsallis entropy via the generalized Shannon-Khinchin axiom in [7], in the sense that our axiom is simpler, just as Faddeev's axiom is simpler than the Shannon-Khinchin one.
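The Tsallis entropy discussed in this abstract has a simple closed form, S_q(p) = (1 - Σ p_i^q)/(q - 1), which recovers Shannon entropy as q → 1. A minimal sketch (the function name and the q = 1 special case are illustrative, not taken from the paper):

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    As q -> 1 this converges to the Shannon entropy -sum p_i ln p_i,
    which we return directly at q == 1 to avoid division by zero.
    """
    if q == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** q for pi in p)) / (q - 1)
```

For a fair coin, `tsallis_entropy([0.5, 0.5], 1)` gives ln 2 ≈ 0.693, while `tsallis_entropy([0.5, 0.5], 2)` gives 0.5.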
Reviewing the damage caused by landslides demonstrates the need to examine the factors influencing the occurrence of this phenomenon and to predict it. Therefore, the purpose of this study was to improve the prediction of landslide occurrence in the Taleghan watershed using Shannon entropy theory. Among the factors influencing landslide occurrence, ten factors of elevation, s...
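Studies of this kind typically weight conditioning factors with the generic entropy weight method, in which factors whose values vary more across samples (lower normalized entropy) receive larger weights. A hedged sketch of that generic method, not necessarily the study's exact procedure:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method for a decision matrix.

    matrix[i][j] is the (non-negative) value of factor j for sample i.
    Each column is normalized into a probability vector, its Shannon
    entropy is divided by ln(m) so it lies in [0, 1], and the weight of
    factor j is proportional to its divergence 1 - e_j.
    """
    m = len(matrix)                      # number of samples
    n = len(matrix[0])                   # number of factors
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(m)
        divergences.append(1 - e)
    s = sum(divergences)
    return [d / s for d in divergences]
```

A constant factor (identical across samples) has maximal entropy and so contributes weight 0, while a highly uneven factor dominates the weighting.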
Kullback-Leibler relative entropy, or KL-entropy, of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, M), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the nondiscrete case. Further, entropy and oth...
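In the discrete case the integral ∫_X ln(dP/dR) dP reduces to the familiar sum Σ p_i ln(p_i/r_i). A minimal sketch (function name assumed; terms with p_i = 0 contribute zero by convention):

```python
import math

def kl_divergence(p, r):
    """Discrete KL-entropy of P relative to R: sum_i p_i * ln(p_i / r_i).

    Assumes r_i > 0 wherever p_i > 0 (absolute continuity, the discrete
    analogue of dP/dR existing).
    """
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
```

For identical distributions the divergence is 0; concentrating P on one outcome of a fair coin gives ln 2.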
Chaos is often explained in terms of random behaviour, and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. A common way of justifying this use of the KSE...
A new concept named nonsymmetric entropy, which generalizes the concepts of Boltzmann's entropy and Shannon's entropy, is introduced. The maximal nonsymmetric entropy principle is proven, and some important distribution laws are derived naturally from it.
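The classical maximum entropy principle that this generalizes is easy to check numerically: among distributions on n outcomes, Shannon entropy is maximized by the uniform one. A quick sketch of that baseline fact (the nonsymmetric variant itself is not reproduced here):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * ln(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# The uniform distribution on 4 outcomes attains the maximum ln(4);
# any skewed distribution on the same support has strictly lower entropy.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
```

Here `shannon_entropy(uniform)` equals ln 4 ≈ 1.386, exceeding the skewed value.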