Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures

Author

  • Inder Jeet Taneja
Abstract

The Jensen-Shannon divergence, the J-divergence, and the arithmetic-geometric mean divergence are three classical divergence measures known in the information theory and statistics literature. These three measures bear an interesting inequality relationship with the three non-logarithmic measures known as triangular discrimination, Hellinger's divergence, and the symmetric chi-square divergence. In 2003, Eve studied seven means from a geometrical point of view: the Harmonic, Geometric, Arithmetic, Heronian, Contra-harmonic, Root-mean-square, and Centroidal means. In this paper, we obtain new inequalities among the non-negative differences arising from these seven means. Relationships with generalized triangular discrimination and some new generating measures, together with their exponential representations, are also presented.
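For quick reference, the seven means of two positive numbers a and b, and the three non-logarithmic measures named above, are usually written as follows; this is a sketch using the standard notation of the surrounding literature, assumed here rather than quoted verbatim from the paper.

% Seven means of a, b > 0: Harmonic, Geometric, Arithmetic, Heronian,
% Contra-harmonic, Root-mean-square and Centroidal, which satisfy
% H <= G <= N <= A <= R <= S <= C.
\begin{align*}
H(a,b) &= \frac{2ab}{a+b}, &
G(a,b) &= \sqrt{ab}, &
A(a,b) &= \frac{a+b}{2}, \\
N(a,b) &= \frac{a+\sqrt{ab}+b}{3}, &
C(a,b) &= \frac{a^{2}+b^{2}}{a+b}, &
S(a,b) &= \sqrt{\frac{a^{2}+b^{2}}{2}}, \\
R(a,b) &= \frac{2\,(a^{2}+ab+b^{2})}{3\,(a+b)}.
\end{align*}

% Non-logarithmic symmetric measures for probability distributions
% P = (p_1, ..., p_n) and Q = (q_1, ..., q_n):
\begin{align*}
\Delta(P\|Q) &= \sum_{i=1}^{n} \frac{(p_i - q_i)^{2}}{p_i + q_i}
  && \text{(triangular discrimination)}, \\
h(P\|Q) &= \frac{1}{2}\sum_{i=1}^{n} \bigl(\sqrt{p_i} - \sqrt{q_i}\bigr)^{2}
  && \text{(Hellinger's divergence)}, \\
\Psi(P\|Q) &= \sum_{i=1}^{n} \frac{(p_i - q_i)^{2}(p_i + q_i)}{p_i\,q_i}
  && \text{(symmetric $\chi^{2}$-divergence)}.
\end{align*}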


Similar Articles

Seven Means, Generalized Triangular Discrimination, and Generating Divergence Measures

Abstract: From a geometrical point of view, Eve [2] studied seven means: the Harmonic, Geometric, Arithmetic, Heronian, Contra-harmonic, Root-mean-square, and Centroidal means. We consider for the first time a new measure called generalized triangular discrimination. Inequalities among the non-negative differences arising from the seven means and particular cases of the generalized triangu...


Generalized Symmetric Divergence Measures and Metric Spaces

Abstract: Recently, Taneja [7] studied two one-parameter generalizations of the J-divergence, the Jensen-Shannon divergence, and the arithmetic-geometric divergence. These two generalizations contain in particular measures such as the Hellinger discrimination, the symmetric chi-square divergence, and triangular discrimination. These measures are well known in the literature of statistics and information theory. In thi...


Generalized Symmetric Divergence Measures and the Probability of Error

Abstract: Three classical divergence measures exist in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler [6], [8] J-divergence, the Sibson-Burbea-Rao [9], [3] Jensen-Shannon divergence, and the Taneja [11] arithmetic-geometric divergence. These three measures bear an interesting relationship among each other. Divergence measures like the Hellinger [5...
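For orientation, these three logarithmic measures are commonly given in the following standard forms; the notation is the usual one from the literature and is assumed here rather than taken from the excerpt above.

% Classical symmetric logarithmic divergence measures for probability
% distributions P = (p_1, ..., p_n) and Q = (q_1, ..., q_n):
\begin{align*}
J(P\|Q) &= \sum_{i=1}^{n} (p_i - q_i)\ln\frac{p_i}{q_i}
  && \text{(J-divergence)}, \\
I(P\|Q) &= \frac{1}{2}\Biggl[\sum_{i=1}^{n} p_i \ln\frac{2p_i}{p_i+q_i}
        + \sum_{i=1}^{n} q_i \ln\frac{2q_i}{p_i+q_i}\Biggr]
  && \text{(Jensen-Shannon divergence)}, \\
T(P\|Q) &= \sum_{i=1}^{n} \frac{p_i+q_i}{2}\,\ln\frac{p_i+q_i}{2\sqrt{p_i q_i}}
  && \text{(arithmetic-geometric mean divergence)}.
\end{align*}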


Generalized Symmetric Divergence Measures and Inequalities

The first measure generalizes the well-known J-divergence due to Jeffreys [16] and Kullback and Leibler [17]. The second measure gives a unified generalization of the Jensen-Shannon divergence due to Sibson [22] and Burbea and Rao [2, 3], and of the arithmetic-geometric mean divergence due to Taneja [27]. These two measures contain in particular some well-known divergences such as Hellinger's discr...


A Sequence of Inequalities among Difference of Symmetric Divergence Measures

In this paper we consider two one-parameter generalizations. These two generalizations contain in particular the well-known J-divergence, Jensen-Shannon divergence, and arithmetic-geometric mean divergence, all three of which have logarithmic expressions. As particular cases we also obtain measures such as the Hellinger discrimination, the symmetric χ²-divergence, and trian...




Journal:
  • Information

Volume 4, Issue -

Pages -

Publication date: 2013