Search results for: mutual information theory (MI)

Number of results: 1876105

2011
Yan Xu Gareth Jones JinTao Li Bin Wang ChunMing Sun

Feature selection plays an important role in text categorization. Automatic feature selection methods such as document frequency thresholding (DF), information gain (IG), and mutual information (MI) are commonly applied in text categorization. Many existing experiments show that IG is one of the most effective methods; by contrast, MI has been demonstrated to have relatively poor performance....
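As an illustration of the MI criterion this abstract refers to, here is a minimal sketch (not the authors' code) of the expected mutual information between a binary term-occurrence indicator and a binary class label, computed from a 2x2 contingency table; the counts in the examples are invented:

```python
import math

def mutual_information(n11, n10, n01, n00):
    """I(term; class) in bits from a 2x2 contingency table.

    n11: documents containing the term and belonging to the class
    n10: documents containing the term, outside the class
    n01: documents without the term, inside the class
    n00: documents without the term, outside the class
    """
    n = n11 + n10 + n01 + n00
    cells = [
        (n11, n11 + n10, n11 + n01),  # (joint count, term marginal, class marginal)
        (n10, n11 + n10, n10 + n00),
        (n01, n01 + n00, n11 + n01),
        (n00, n01 + n00, n10 + n00),
    ]
    mi = 0.0
    for n_tc, n_t, n_c in cells:
        if n_tc > 0:
            mi += (n_tc / n) * math.log2(n_tc * n / (n_t * n_c))
    return mi

# A term independent of the class carries no information ...
print(mutual_information(25, 25, 25, 25))  # 0.0
# ... while a perfectly class-predictive term carries 1 bit.
print(mutual_information(50, 0, 0, 50))    # 1.0
```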

Journal: :CoRR 2003
Alexander Kraskov Harald Stögbauer Ralph G. Andrzejak Peter Grassberger

We present a method for hierarchical clustering of data called mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: The MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, plus the MI between Z and the combined object (XY). We use this both in the Shannon (probabilistic) versi...
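The grouping property quoted in this abstract can be checked numerically on a small discrete joint distribution (a sketch only; the distribution is arbitrary, and "MI between three objects" is taken to be the multi-information H(X)+H(Y)+H(Z)-H(X,Y,Z)):

```python
import itertools
import math
import random

def entropy(dist):
    """Shannon entropy in bits of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

random.seed(0)
# An arbitrary joint distribution over (x, y, z) in {0, 1}^3.
weights = {k: random.random() for k in itertools.product([0, 1], repeat=3)}
total = sum(weights.values())
pxyz = {k: w / total for k, w in weights.items()}

def marginal(indices):
    """Marginal distribution over the given coordinate indices."""
    out = {}
    for k, p in pxyz.items():
        key = tuple(k[i] for i in indices)
        out[key] = out.get(key, 0.0) + p
    return out

hx, hy, hz = (entropy(marginal([i])) for i in (0, 1, 2))
hxy = entropy(marginal([0, 1]))
hxyz = entropy(pxyz)

i_xy = hx + hy - hxy            # I(X; Y)
i_xy_z = hxy + hz - hxyz        # I((X, Y); Z), treating (X, Y) as one object
i_xyz = hx + hy + hz - hxyz     # multi-information I(X, Y, Z)

# Grouping property: I(X, Y, Z) = I(X; Y) + I((X, Y); Z)
assert abs(i_xyz - (i_xy + i_xy_z)) < 1e-12
```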

2006
Peter Grassberger

We present a method for hierarchical clustering of data called mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: The MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, plus the MI between Z and the combined object (XY). We use this both in the Shannon (probabilistic) versi...

Journal: :CoRR 2008
Jian Ma Zengqi Sun

In information theory, mutual information (MI) is a concept distinct from entropy.[1] In this paper, we prove via copulas [2] that they are essentially the same: mutual information is also a kind of entropy, called copula entropy. Based on this insightful result, we propose a simple method for estimating mutual information. Copula theory concerns dependence and the measurement of association.[2] Skla...
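The claimed identity can be spelled out (a sketch under the usual absolute-continuity assumptions; here $c$ is the copula density from Sklar's theorem, with $u = F_X(x)$ and $v = F_Y(y)$):

```latex
% Sklar: F_{XY}(x,y) = C(F_X(x), F_Y(y)),
% hence f_{XY}(x,y) = c(F_X(x), F_Y(y))\, f_X(x)\, f_Y(y).
I(X;Y) = \iint f_{XY}(x,y)\,\log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy
       = \int_0^1\!\!\int_0^1 c(u,v)\,\log c(u,v)\,du\,dv
       = -H_c(U,V)
```

i.e., the mutual information equals the negative entropy of the copula density, so estimating MI reduces to estimating one entropy on the unit square.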

2010
Erik Schaffernicht Robert Kaltenhaeuser Saurabh Shekhar Verma Horst-Michael Groß

Mutual Information (MI) is a powerful concept from information theory used in many application fields. For practical tasks it is often necessary to estimate the Mutual Information from available data. We compare state-of-the-art methods for estimating MI from continuous data, focusing on their usefulness for the feature selection task. Our results suggest that many methods are practically relevan...
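For concreteness, the simplest member of this family of estimators, a naive histogram (binning) estimator, can be sketched as follows (an illustration only, not one of the specific estimators compared in the paper; the data are synthetic):

```python
import math
import random

def binned_mi(xs, ys, bins=10):
    """Naive histogram estimate of I(X; Y) in bits (biased; for illustration only)."""
    n = len(xs)
    def to_bin(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1)
    lox, hix = min(xs), max(xs)
    loy, hiy = min(ys), max(ys)
    joint, px, py = {}, {}, {}
    for x, y in zip(xs, ys):
        bx, by = to_bin(x, lox, hix), to_bin(y, loy, hiy)
        joint[(bx, by)] = joint.get((bx, by), 0) + 1
        px[bx] = px.get(bx, 0) + 1
        py[by] = py.get(by, 0) + 1
    return sum((c / n) * math.log2(c * n / (px[bx] * py[by]))
               for (bx, by), c in joint.items())

random.seed(1)
xs = [random.gauss(0, 1) for _ in range(5000)]
noise = [random.gauss(0, 1) for _ in range(5000)]
dependent = [x + 0.1 * e for x, e in zip(xs, noise)]  # nearly a copy of xs

# The estimate is large for the strongly dependent pair and near zero
# for the independent one (up to the estimator's positive bias).
print(binned_mi(xs, dependent), binned_mi(xs, noise))
```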

Journal: :Pattern Recognition 2008
Hongxia Luan Feihu Qi Zhong Xue Liya Chen Dinggang Shen

This paper presents a novel image similarity measure, referred to as quantitative–qualitative measure of mutual information (Q-MI), for multimodality image registration. Conventional information measures, e.g., Shannon’s entropy and mutual information (MI), reflect quantitative aspects of information because they only consider probabilities of events. In fact, each event has its own utility to ...

Mohamed Habibullah Mohammad Ahsanullah

Among all measures of independence between random variables, mutual information is the only one that is based on information theory. Mutual information takes into account all kinds of dependencies between variables, i.e., both linear and non-linear dependencies. In this paper we have classified some well-known bivariate distributions into two classes of distributions based on their mutua...
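A standard toy example of the point about non-linear dependencies (not from the paper): for X uniform on {-1, 0, 1} and Y = X^2, the covariance is zero, yet the mutual information is positive because Y is a deterministic function of X:

```python
import math
from collections import Counter

# X uniform on {-1, 0, 1}; Y = X^2 is a deterministic, non-linear function of X.
pairs = [(x, x * x) for x in (-1, 0, 1)]
n = len(pairs)

mean_x = sum(x for x, _ in pairs) / n
mean_y = sum(y for _, y in pairs) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs) / n

def mi(pairs):
    """Mutual information in bits of the empirical joint distribution."""
    n = len(pairs)
    pj = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * math.log2(c * n / (px[x] * py[y]))
               for (x, y), c in pj.items())

print(cov)        # 0.0: no linear dependence
print(mi(pairs))  # about 0.918 bits: clear dependence
```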

2001
K. Kang H. Sompolinsky

We studied the mutual information between a stimulus and a large system consisting of stochastic, statistically independent elements that respond to the stimulus. The mutual information (MI) of the system saturates exponentially with system size. A theory of the rate of saturation of the MI is developed. We show that this rate is controlled by a distance function between the response probabilitie...

2008
Alexander Kraskov Peter Grassberger

Clustering is a concept used in a huge variety of applications. We review a conceptually very simple algorithm for hierarchical clustering called in the following the mutual information clustering (MIC) algorithm. It uses mutual information (MI) as a similarity measure and exploits its grouping property: The MI between three objects X, Y, and Z is equal to the sum of the MI between X and Y, pl...

Journal: :Hearing research 2007
Israel Nelken Gal Chechik

Mutual information (MI) is in increasing use as a way of quantifying neural responses. However, many researchers still regard it with some doubt, because it is not always clear what MI really measures, and because MI is hard to calculate in practice. This paper aims to clarify these issues. First, it provides an interpretation of mutual information as variability decomposition, simil...
