Nonextensive Generalizations of the Jensen-Shannon Divergence

Authors

  • André F. T. Martins
  • Pedro M. Q. Aguiar
  • Mário A. T. Figueiredo
Abstract

Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of the Jensen-Shannon divergence (JSD), a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences by extending its two building blocks: convexity and Shannon’s entropy. In particular, a new concept of q-convexity is introduced and shown to satisfy a Jensen q-inequality. Based on this Jensen q-inequality, the Jensen-Tsallis q-difference is built, a nonextensive generalization of the JSD based on Tsallis entropies. Finally, the Jensen-Tsallis q-difference is characterized in terms of convexity and extrema.
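The abstract does not spell out the underlying definitions; the following is a hedged sketch of the quantities involved, written as they commonly appear in this line of work (in particular, the q-powered weights in the q-difference are an assumption, not taken from the abstract):

S_q(p) = \frac{1}{q-1}\Big(1 - \sum_x p(x)^q\Big), \qquad \lim_{q \to 1} S_q(p) = H(p) \ \text{(Shannon entropy)},

T_q^{\pi}(p_1, \dots, p_m) = S_q\!\Big(\sum_{t=1}^{m} \pi_t\, p_t\Big) - \sum_{t=1}^{m} \pi_t^{q}\, S_q(p_t).

At q = 1 with weights \pi = (1/2, 1/2), the second expression reduces to the Jensen-Shannon divergence, JSD(p_1, p_2) = H\big(\tfrac{p_1 + p_2}{2}\big) - \tfrac{1}{2}\big(H(p_1) + H(p_2)\big).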

Related articles

On Unified Generalizations of Relative Jensen–Shannon and Arithmetic–Geometric Divergence Measures, and Their Properties (Pranesh Kumar and Inder Jeet Taneja)

Abstract. In this paper we consider a one-parameter generalization of some nonsymmetric divergence measures. The nonsymmetric divergence measures considered include the Kullback-Leibler relative information, the χ²-divergence, the relative J-divergence, the relative Jensen-Shannon divergence, and the relative arithmetic-geometric divergence. All the generalizations considered can be written as particular case...


Nonextensive Information Theoretic Kernels on Measures

Positive definite kernels on probability measures have been recently applied to classification problems involving text, images, and other types of structured data. Some of these kernels are related to classic information-theoretic quantities, such as (Shannon’s) mutual information and the Jensen-Shannon (JS) divergence. Meanwhile, there have been recent advances in nonextensive generalizations o...


Generalized Symmetric Divergence Measures and Metric Spaces

Abstract. Recently, Taneja [7] studied two one-parameter generalizations of the J-divergence, the Jensen-Shannon divergence, and the arithmetic-geometric divergence. These two generalizations contain, in particular, measures such as the Hellinger discrimination, the symmetric chi-square divergence, and the triangular discrimination. These measures are well known in the statistics and information theory literature. In thi...


A Sequence of Inequalities among Difference of Symmetric Divergence Measures

In this paper we consider two one-parameter generalizations. These contain, in particular, the well-known J-divergence, Jensen-Shannon divergence, and arithmetic-geometric mean divergence, all three of which involve logarithmic expressions. Particular cases also include measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the trian...


Nonextensive information-theoretic measure for image edge detection

We propose a nonextensive information-theoretic measure called the Jensen-Tsallis divergence, which may be defined between any arbitrary number of probability distributions, and we analyze its main theoretical properties. Using the theory of majorization, we also derive its upper bounds. To gain further insight into the robustness and the application of the Jensen-Tsallis divergence mea...
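The truncated abstract above does not reproduce the formula. A minimal numerical sketch, assuming the standard Jensen-type construction with Tsallis entropy over an arbitrary number of weighted distributions (the function names below are illustrative and not taken from the paper):

    import numpy as np

    def tsallis_entropy(p, q):
        # Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1); Shannon entropy in the limit q -> 1.
        p = np.asarray(p, dtype=float)
        if np.isclose(q, 1.0):
            nz = p[p > 0]
            return -np.sum(nz * np.log(nz))
        return (1.0 - np.sum(p ** q)) / (q - 1.0)

    def jensen_tsallis_divergence(dists, weights, q):
        # Jensen-type divergence built on Tsallis entropy:
        # S_q(sum_i w_i p_i) - sum_i w_i S_q(p_i), for any number of distributions.
        dists = [np.asarray(p, dtype=float) for p in dists]
        mixture = sum(w * p for w, p in zip(weights, dists))
        return tsallis_entropy(mixture, q) - sum(
            w * tsallis_entropy(p, q) for w, p in zip(weights, dists)
        )

    # Example: three distributions on a 4-symbol alphabet, uniform weights, q = 2.
    p1 = [0.7, 0.1, 0.1, 0.1]
    p2 = [0.1, 0.7, 0.1, 0.1]
    p3 = [0.25, 0.25, 0.25, 0.25]
    print(jensen_tsallis_divergence([p1, p2, p3], [1/3, 1/3, 1/3], q=2.0))

For q > 0, where the Tsallis entropy is strictly concave, the value is nonnegative and vanishes only when all distributions coincide, which is what makes it usable as a dissimilarity measure between histograms (e.g., of neighboring image regions in edge detection).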



Journal: CoRR
Volume: abs/0804.1653
Publication date: 2008