Generalized Relative Information and Information Inequalities
Author
Abstract
In this paper, we obtain bounds on Csiszár's f-divergence in terms of relative information of type s, following Dragomir's approach [9]. In particular, the results yield bounds in terms of the χ-divergence, Kullback-Leibler relative information, and Hellinger discrimination.
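For orientation, the divergences named above have standard discrete-case definitions; the sketch below uses the usual normalizations for distributions P = (p_1, …, p_n) and Q = (q_1, …, q_n), which may differ by constants from the paper's conventions:

```latex
\begin{align*}
C_f(P,Q)    &= \sum_{i=1}^{n} q_i \, f\!\left(\frac{p_i}{q_i}\right)
             && \text{(Csisz\'ar's $f$-divergence, $f$ convex, $f(1)=0$)} \\
K_s(P,Q)    &= \frac{1}{s(s-1)}\left[\sum_{i=1}^{n} p_i^{s} q_i^{1-s} - 1\right]
             && \text{(relative information of type $s$, $s \neq 0,1$)} \\
\chi^2(P,Q) &= \sum_{i=1}^{n} \frac{(p_i - q_i)^2}{q_i}
             && \text{($\chi^2$-divergence)} \\
D(P \,\|\, Q) &= \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}
             && \text{(Kullback-Leibler relative information)} \\
h^2(P,Q)    &= \frac{1}{2} \sum_{i=1}^{n} \left(\sqrt{p_i} - \sqrt{q_i}\right)^{2}
             && \text{(Hellinger discrimination)}
\end{align*}
```

Each of the last three is the f-divergence obtained from a particular choice of f, which is what makes bounds on C_f(P,Q) specialize to them.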
Similar References
Displacement convexity of generalized relative entropies. II
We introduce a class of generalized relative entropies (inspired by the Bregman divergence in information theory) on the Wasserstein space over a weighted Riemannian or Finsler manifold. We prove that the convexity of all the entropies in this class is equivalent to the combination of the non-negative weighted Ricci curvature and the convexity of another weight function used in the definition o...
Generalized Interlacing Inequalities
We discuss some applications of generalized interlacing inequalities of Ky Fan to the study of (a) some classical matrix inequalities and (b) matrix problems in quantum information science. AMS Classification 15A18, 15A57, 15A60, 15A90.
On Hadamard and Fejér-Hadamard inequalities for Caputo k-fractional derivatives
In this paper we prove certain Hadamard and Fejér-Hadamard inequalities for functions whose n-th derivatives are convex, using Caputo k-fractional derivatives. These results are related to inequalities for Caputo fractional derivatives.
New bounds for the generalized Marcum Q-function
In this paper, we study the generalized Marcum Q-function. Our aim is to extend the results of Corazza and Ferrari (IEEE Trans. Inf. Theory, vol. 48, pp. 3003–3008, 2002) to the generalized Marcum Q-function in order to deduce some new tight lower and upper bounds. The key tools in our proofs are some monotonicity properties of certain functions involving the modified Bessel function o...
Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...