Search results for: information entropy theory

Number of results: 1867273

Journal: Inf. Comput. 2015
Mladen Kovacevic, Ivan Stanojevic, Vojin Senk

This paper studies bivariate distributions with fixed marginals from an information-theoretic perspective. In particular, continuity and related properties of various information measures (Shannon entropy, conditional entropy, mutual information, Rényi entropy) on the set of all such distributions are investigated. The notion of minimum entropy coupling is introduced, and it is shown that it de...
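As a point of reference for the minimum entropy coupling mentioned above, here is a minimal Python sketch (illustrative names throughout, not the paper's construction): it builds two couplings of the same pair of marginals, the independent one and one produced by a simple greedy heuristic, and compares their joint Shannon entropies.

    import numpy as np

    def joint_entropy(P):
        # Shannon entropy (in bits) of a joint distribution matrix P.
        p = P[P > 0]
        return float(-(p * np.log2(p)).sum())

    def greedy_coupling(p, q):
        # Greedy heuristic: repeatedly place mass min(p[i], q[j]) at the
        # cell of the two largest remaining marginal masses. Preserves
        # both marginals; not guaranteed to reach the minimum entropy.
        p, q = p.astype(float).copy(), q.astype(float).copy()
        P = np.zeros((len(p), len(q)))
        for _ in range(len(p) + len(q)):
            i, j = int(np.argmax(p)), int(np.argmax(q))
            m = min(p[i], q[j])
            if m <= 1e-12:
                break
            P[i, j] += m
            p[i] -= m
            q[j] -= m
        return P

    p = np.array([0.5, 0.3, 0.2])
    q = np.array([0.4, 0.4, 0.2])
    print(joint_entropy(np.outer(p, q)))          # independent coupling: ~3.01 bits
    print(joint_entropy(greedy_coupling(p, q)))   # greedy coupling: ~1.85 bits

Both couplings have the same marginals p and q, but the greedy one concentrates mass on far fewer cells, which is why its joint entropy is much lower.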

Journal: CoRR 2008
Zesheng Chen, Chuanyi Ji

This work investigates three aspects: (a) a network vulnerability, namely the non-uniform distribution of vulnerable hosts; (b) threats, i.e., intelligent malware that exploits such a vulnerability; and (c) defense, i.e., the challenges of fighting these threats. We first study five large data sets and observe consistently clustered vulnerable-host distributions. We then present a new metric, referred to as th...
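Since the metric's name is truncated in this abstract, the following Python sketch only illustrates the general idea of quantifying how non-uniform a vulnerable-host distribution is, using normalized Shannon entropy; it should not be read as the paper's actual metric.

    import math

    def nonuniformity(hosts_per_group):
        # Normalized Shannon entropy of a host distribution over groups:
        # 1.0 means perfectly uniform; values near 0 mean heavy clustering.
        # Illustrative only -- not the metric defined in the paper.
        total = sum(hosts_per_group)
        probs = [n / total for n in hosts_per_group if n > 0]
        h = -sum(p * math.log2(p) for p in probs)
        return h / math.log2(len(hosts_per_group))

    # Hypothetical vulnerable-host counts per address block:
    print(nonuniformity([900, 50, 30, 20]))     # clustered: ~0.31
    print(nonuniformity([250, 250, 250, 250]))  # uniform: 1.0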

2012
Om Parkash

The measure of entropy introduced by Shannon [12] is a key concept in the information theory literature and has found tremendous application across disciplines of science and technology. Various researchers have generalized this entropy using different approaches. The object of the present manuscript is to develop a generalized measure of entropy by using the property of concavit...
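For reference, here are the two standard facts this abstract builds on, stated in LaTeX (the manuscript's own generalized measure is truncated above and is not reproduced here): the Shannon entropy and its concavity.

    % Shannon entropy of a discrete distribution P = (p_1, ..., p_n):
    \[
      H(P) = -\sum_{i=1}^{n} p_i \log p_i ,
      \qquad p_i \ge 0, \quad \sum_{i=1}^{n} p_i = 1 .
    \]
    % Concavity, the property the generalization builds on: for any
    % distributions P, Q and any 0 <= t <= 1,
    \[
      H\bigl(tP + (1-t)Q\bigr) \ \ge\ t\,H(P) + (1-t)\,H(Q) .
    \]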

2004
Dongming Xu

Methods of feature evaluation are developed and discussed based on information-theoretic learning (ITL). Mutual information has been shown in the literature to be a more robust and precise criterion for evaluating a feature set. In this paper, we propose to use quadratic mutual information (QMI) for feature evaluation. The concept of information potential lends a clearer physical meaning to the evaluatio...
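A sketch of the standard ITL quantities behind QMI, in Python: the Parzen-window information potential V(X) and the quadratic Renyi entropy H2(X) = -log V(X). The function name, kernel width, and data here are illustrative; the paper's exact QMI estimator may differ.

    import numpy as np

    def information_potential(x, sigma=1.0):
        # Parzen-window information potential:
        #   V(X) = (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2)
        # with a Gaussian kernel G. The quadratic Renyi entropy is then
        #   H2(X) = -log V(X).
        x = np.asarray(x, dtype=float).reshape(len(x), -1)
        d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
        dim = x.shape[1]
        g = np.exp(-d2 / (4 * sigma**2)) / (4 * np.pi * sigma**2) ** (dim / 2)
        return g.mean()

    rng = np.random.default_rng(0)
    v = information_potential(rng.normal(size=200))
    print("V(X) =", v, " H2(X) =", -np.log(v))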

Journal: Entropy 1999
Shu-Kun Lin

Entropy has been launched as a scientific journal to provide an advanced forum for the community of entropy and information researchers. There are many types of entropy reported in the scientific literature [1]. The great diversity in the concept and definition may cause tremendous problems. My own humble suggestion is the following regarding the two main kinds of entropy: 1. Any information-th...

Journal: Int. J. General Systems 2010
Vladik Kreinovich, Gang Xiang

Sometimes we know the probability of different values of the estimation error ∆x, defined as x̃ − x (the estimate minus the actual value); sometimes we only know the interval of possible values of ∆x; and sometimes we have interval bounds on the cdf of ∆x. To compare different measuring instruments, it is desirable to know which of them brings more information, i.e., it is desirable to gauge the amount of information. For probabilistic u...
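For orientation, the two standard gauges the abstract contrasts can be written in LaTeX as follows (a reference sketch, not the paper's derivation): Shannon differential entropy for the probabilistic case, and the maximum-entropy value log(b − a) for an interval [a, b].

    % Probabilistic uncertainty: gauge by the Shannon differential
    % entropy of the error density f,
    \[
      h(\Delta x) = -\int f(t)\,\log f(t)\,dt .
    \]
    % Interval uncertainty: among all densities supported on [a, b],
    % the uniform one maximizes h, so the interval's entropy gauge is
    \[
      \max_{\operatorname{supp} f \subseteq [a,b]} h(f) = \log(b - a) .
    \]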

2004
Erwin Lutwak, Deane Yang, Gaoyong Zhang

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
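For reference, here are the three named inequalities in their classical one-dimensional form, in LaTeX (standard statements; the paper's generalizations are not reproduced here).

    % Entropy power and Fisher information of a density f:
    \[
      N(X) = \frac{1}{2\pi e}\, e^{2h(X)} ,
      \qquad
      J(X) = \int \frac{\bigl(f'(x)\bigr)^{2}}{f(x)}\, dx .
    \]
    % Moment-entropy:  \sigma^2(X) \ge N(X)   (equality iff Gaussian).
    % Stam:            N(X)\,J(X) \ge 1       (equality iff Gaussian).
    % Chaining the two gives Cramér-Rao:
    \[
      \sigma^{2}(X) \ \ge\ N(X) \ \ge\ \frac{1}{J(X)}
      \quad\Longrightarrow\quad
      \sigma^{2}(X)\, J(X) \ \ge\ 1 .
    \]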

The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then, several extensions and reformulations have been developed in various disciplines, with motivations and applications in different subjects such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...

2015
Philippe Elbaz-Vincent, Herbert Gangl

We show that the entropy function, and hence the finite 1-logarithm, behaves a lot like certain derivations. We recall its cohomological interpretation as a 2-cocycle and also deduce 2n-cocycles for any n. Finally, we give some identities for finite multiple polylogarithms, together with number-theoretic applications. 1. Information theory, Entropy and Polylogarithms: It is well known that the notio...
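Two standard identities behind these claims, stated in LaTeX (the paper itself works with finite polylogarithms over finite fields, so this is only an orienting sketch): the Leibniz-rule behavior of φ(p) = −p log p, and the 2-cocycle relation satisfied by the binary entropy.

    % (i) phi(p) = -p log p satisfies a Leibniz rule, like a derivation:
    \[
      \varphi(xy) = x\,\varphi(y) + y\,\varphi(x) .
    \]
    % (ii) the binary entropy H(x) = phi(x) + phi(1-x) satisfies the
    % fundamental equation of information theory, a 2-cocycle relation:
    \[
      H(x) + (1-x)\,H\!\left(\tfrac{y}{1-x}\right)
        = H(y) + (1-y)\,H\!\left(\tfrac{x}{1-y}\right),
      \qquad x, y \ge 0, \; x + y \le 1 .
    \]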

Journal: Int. J. General Systems 2002
Jiye Liang, Kwai-Sang Chin, Chuangyin Dang, Richard C. M. Yam

Based on the complement behavior of information gain, a new definition of information entropy is proposed along with its justification in rough set theory. Some properties of this definition imply those of Shannon’s entropy. Based on the new information entropy, conditional entropy and mutual information are then introduced and applied to knowledge bases. The new information entropy is proved t...
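The truncated abstract does not show the paper's exact formula, so the Python sketch below implements one commonly cited complement-based form of information entropy for a partition; treat both the formula and the names as illustrative.

    from collections import Counter

    def complement_entropy(labels):
        # Complement-based entropy of the partition that `labels` induces
        # on the universe U: sum over classes X_i of
        #   (|X_i|/|U|) * (1 - |X_i|/|U|).
        # Illustrative form only; the paper's exact definition is not
        # visible in the truncated abstract.
        n = len(labels)
        return sum((c / n) * (1 - c / n) for c in Counter(labels).values())

    # Partition of an 8-element universe into classes a, b, c:
    print(complement_entropy(list("aaabbbcc")))  # 0.65625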
