Non-Parametric Estimation of Mutual Information through the Entropy of the Linkage

Authors

  • Maria Teresa Giraudo
  • Laura Sacerdote
  • Roberta Sirovich
Abstract

A new, non-parametric and binless estimator for the mutual information of a d-dimensional random vector is proposed. First, an equation is deduced that links the mutual information to the entropy of a suitable random vector with uniformly distributed components. When d = 2 this equation reduces to the well-known connection between the mutual information and the entropy of the copula function associated with the original random variables. Hence, the problem of estimating the mutual information of the original random vector is reduced to estimating the entropy of a random vector obtained through a multidimensional transformation. The proposed estimator is a two-step method: first estimate the transformation and obtain the transformed sample, then estimate its entropy. The properties of the new estimator are discussed through simulation examples, and its performance is compared with that of the best estimators in the literature. The precision of the estimator converges to values of the same order of magnitude as that of the best estimator tested. However, the new estimator remains unbiased even for larger dimensions and smaller sample sizes, whereas the other tested estimators show a bias in these cases.
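For the d = 2 case, where the linkage reduces to the copula, the two-step idea can be sketched as follows: transform each coordinate to approximately uniform margins through its empirical distribution function, then estimate the differential entropy of the transformed sample and change its sign. The sketch below assumes the Kozachenko-Leonenko k-nearest-neighbour entropy estimator for the second step; all function names are illustrative and the transformation and entropy estimators actually used by the authors may differ.

# Minimal sketch of the two-step idea for d = 2, where the linkage reduces to
# the copula: (1) rank-transform each margin to (approximately) uniform,
# (2) estimate the differential entropy of the transformed sample and negate it.
# The Kozachenko-Leonenko kNN entropy estimator below is an assumption made
# for illustration, not necessarily the estimator chosen by the authors.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def rank_transform(x):
    """Empirical-CDF (rank) transform of each column to the unit interval."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1
    return ranks / (n + 1.0)

def knn_entropy(z, k=4):
    """Kozachenko-Leonenko kNN estimate of the differential entropy of z."""
    n, d = z.shape
    tree = cKDTree(z)
    # distance to the k-th nearest neighbour (excluding the point itself)
    r = tree.query(z, k=k + 1)[0][:, -1]
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)  # log unit-ball volume
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(r + 1e-15))

def mutual_information_copula(x, y, k=4):
    """I(X;Y) estimated as minus the entropy of the (empirical) copula sample."""
    u = rank_transform(np.column_stack([x, y]))
    return -knn_entropy(u, k=k)

# Example: bivariate Gaussian with correlation rho.
rng = np.random.default_rng(0)
rho = 0.6
xy = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=5000)
# compare with the closed form -0.5*np.log(1 - rho**2) ≈ 0.223
print(mutual_information_copula(xy[:, 0], xy[:, 1]))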

Similar articles

Determination of height of urban buildings based on non-parametric estimation of signal spectrum in SAR data tomography

Nowadays, the TomoSAR technique has been able to overcome the limitations of radar interferometry in separating multiple scatterers within a single pixel. By extending the principle of a virtual aperture in the elevation direction, these techniques have received much attention for the analysis of challenging urban areas. Despite the expectation of interference of the distribution of buildings with different...


ICA Using Kernel Entropy Estimation with NlogN Complexity

Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDF). There are scenarios in which assuming a parametric form for the PDF leads to poor performance. Therefore, the need arises for non-parametric PDF and MI estimation. Existing nonparametric algorithms suffer from high complexity, particularly i...
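The sketch below is not the NlogN kernel algorithm of that paper; it is a naive, roughly O(N^2), Gaussian-KDE estimate of I(X;Y), included only to make concrete what a non-parametric MI criterion looks like and why its computational cost matters in ICA. All names are illustrative.

# Naive kernel-density estimate of I(X;Y): average over samples of
# log p(x,y) - log p(x) - log p(y), with densities estimated by Gaussian KDE.
import numpy as np
from scipy.stats import gaussian_kde

def kde_mutual_information(x, y):
    xy = np.vstack([x, y])
    p_xy = gaussian_kde(xy)(xy)   # joint density evaluated at each sample point
    p_x = gaussian_kde(x)(x)      # marginal densities
    p_y = gaussian_kde(y)(y)
    return np.mean(np.log(p_xy) - np.log(p_x) - np.log(p_y))

rng = np.random.default_rng(1)
x = rng.standard_normal(2000)
y = 0.6 * x + 0.8 * rng.standard_normal(2000)   # correlation 0.6
# compare with the closed form -0.5*np.log(1 - 0.6**2) ≈ 0.223
print(kde_mutual_information(x, y))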


Generative and Discriminative Face Modelling for Detection

This paper reports a new image model combining self-mutual-information-based generative modelling and Fisher-discriminant-based discriminative modelling. Past work on face modelling has focused heavily on either generative modelling or boundary modelling that considers negative examples. The motivation of this work is to examine the combined treatment and study its effect. To effectively lear...


Some properties of the parametric relative operator entropy

The notion of entropy was introduced by Clausius in 1850, and some of the main steps towards the consolidation of the concept were taken by Boltzmann and Gibbs. Since then several extensions and reformulations have been developed in various disciplines with motivations and applications in different subjects, such as statistical mechanics, information theory, and dynamical systems. Fujii and Kam...


Estimating Mixture Entropy with Pairwise Distances

Mixture distributions arise in many parametric and non-parametric settings—for example, in Gaussian mixture models and in non-parametric estimation. It is often necessary to compute the entropy of a mixture, but, in most cases, this quantity has no closed-form expression, making some form of approximation necessary. We propose a family of estimators based on a pairwise distance function between...
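As a rough illustration of how a pairwise-distance construction can work, the sketch below evaluates one commonly cited form of such an estimator for a one-dimensional Gaussian mixture, using the closed-form KL divergence between components as the pairwise distance. Both the functional form and the choice of distance are assumptions made here for illustration and need not match the family actually proposed in that paper.

# Assumed pairwise-distance estimate of the entropy of a 1-D Gaussian mixture:
# per-component entropies plus a log-sum-exp correction driven by pairwise
# divergences between components.
import numpy as np

def gaussian_entropy(sigma):
    return 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)

def kl_gauss(mu1, s1, mu2, s2):
    """KL(N(mu1, s1^2) || N(mu2, s2^2)), used here as the pairwise distance."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2.0 * s2**2) - 0.5

def pairwise_mixture_entropy(weights, mus, sigmas):
    w = np.asarray(weights, dtype=float)
    comp_h = np.array([gaussian_entropy(s) for s in sigmas])
    est = 0.0
    for i in range(len(w)):
        d = np.array([kl_gauss(mus[i], sigmas[i], mus[j], sigmas[j])
                      for j in range(len(w))])
        est += w[i] * (comp_h[i] - np.log(np.sum(w * np.exp(-d))))
    return est

# Two well-separated components: the estimate approaches
# sum_i w_i H(p_i) + H(w), the usual upper bound on mixture entropy.
print(pairwise_mixture_entropy([0.5, 0.5], [0.0, 10.0], [1.0, 1.0]))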



Journal:
  • Entropy

Volume 15, Issue

Pages -

Publication year: 2013