Search results for: information entropy theory
Number of results: 1,867,273
In information theory, mutual information (MI) and entropy are usually regarded as distinct concepts.[1] In this paper, we prove, using copulas [2], that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this insightful result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association.[2] Skla...
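The link between MI and entropy that this abstract builds on can be seen in the standard identity MI(X;Y) = H(X) + H(Y) − H(X,Y). The sketch below illustrates that identity for a discrete joint distribution; it is not the copula-entropy estimator the paper proposes, and the function names are illustrative.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution given as an array."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def mutual_information(pxy):
    """MI(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D array."""
    px = pxy.sum(axis=1)              # marginal of X
    py = pxy.sum(axis=0)              # marginal of Y
    return entropy(px) + entropy(py) - entropy(pxy)

# A dependent joint distribution: MI is strictly positive
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])
print(mutual_information(pxy))        # ≈ 0.193 nats
```

For an independent joint distribution (the outer product of its marginals), the same function returns zero, as the identity requires.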
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, …, N} representable by joint entropies of components of an N-dimensional rand...
We assume that H(f) is well-defined and finite. The concept of differential entropy was introduced in Shannon's original paper ([55]). Since then, entropy has been of great theoretical and applied interest. The basic properties ...
∗ This research was supported by the Scientific Exchange Program between the Belgian Academy of Sciences and the Hungarian Academy of Sciences in the field of Mathemat...
The limitations of Shannon information theory are pointed out from new perspectives. They lie mainly in the neglect of information reliability and completeness. The significance of information reliability to information measurement is further illustrated through example analysis. It is pointed out that these limitations originate from the neglect of multilevel informati...
Information on water quality and pollution sources is important for the implementation of sustainable water resource management strategies. In this research, to evaluate spatial variation and interpret a large, complex water quality dataset, samples taken at seven different sites along the Gorganrood River were subjected to multivariate statistical analysis. The factor analysis generated two signi...
Classically, Shannon entropy was formalized over discrete probability distributions. However, the concept of entropy can be extended to continuous distributions through a quantity known as continuous (or differential) entropy. The most common definition for continuous entropy is seemingly straightforward; however, further analysis reveals a number of shortcomings that render it far less useful ...
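One of the shortcomings alluded to above can be shown directly: differential entropy, h(f) = −∫ f(x) log f(x) dx, can be negative, unlike its discrete counterpart. The sketch below (function names are illustrative) compares the closed form for a Gaussian, h = ½ log(2πeσ²), against a numerical approximation.

```python
import numpy as np

def gaussian_diff_entropy(sigma):
    """Closed form: h(N(mu, sigma^2)) = 0.5 * log(2*pi*e*sigma^2), in nats."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def numerical_diff_entropy(pdf, grid):
    """Approximate h(f) = -integral of f(x) log f(x) dx with a Riemann sum."""
    dx = grid[1] - grid[0]
    vals = pdf(grid)
    mask = vals > 0                      # 0 * log 0 taken as 0
    return -np.sum(vals[mask] * np.log(vals[mask])) * dx

sigma = 2.0
grid = np.linspace(-40.0, 40.0, 200001)
pdf = lambda x: np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

print(gaussian_diff_entropy(sigma))      # ≈ 2.1121
print(numerical_diff_entropy(pdf, grid)) # ≈ 2.1121 (matches the closed form)
# Unlike discrete entropy, h(f) can be negative, e.g. for a narrow Gaussian:
print(gaussian_diff_entropy(0.1))        # ≈ -0.8836
```

The negative value for a sharply peaked density is one reason differential entropy cannot be read as an absolute measure of uncertainty the way discrete Shannon entropy can.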
By replacing the linear averaging in Shannon entropy with the Kolmogorov–Nagumo average (KN-average), or quasilinear mean, and further imposing the additivity constraint, Rényi proposed the first formal generalization of Shannon entropy. Using this recipe of Rényi's, one can obtain only two information measures: Shannon and Rényi entropy. Indeed, using this formalism Rényi characterized these additive en...
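The Rényi family described above has the standard closed form H_α(p) = log(Σᵢ pᵢ^α) / (1 − α), which recovers Shannon entropy in the limit α → 1. A minimal sketch of that formula (the function name is illustrative):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i**alpha) / (1 - alpha), in nats.
    As alpha -> 1 this reduces to Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))            # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
print(renyi_entropy(p, 1.0))    # Shannon entropy ≈ 1.0397
print(renyi_entropy(p, 2.0))    # collision entropy ≈ 0.9808
print(renyi_entropy(p, 1.001))  # close to the Shannon value, as expected
```

Additivity holds for each fixed α: for independent distributions, the Rényi entropy of the product distribution is the sum of the individual entropies, which is the constraint Rényi imposed on the quasilinear mean.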
Evaluation criteria for different versions of the same database. The concept of information, during its development, became connected to the concept of entropy created by 19th-century thermodynamics scholars. Information means, in this viewpoint, order, or negentropy. Entropy, on the other hand, is connected to concepts such as chaos and noise, which in turn cause disorder. In the present paper, ...