Relations between Rényi Distance and Fisher Information
Abstract:
In this paper, we first show that the Rényi distance between any member of a parametric family and its perturbations is proportional to its Fisher information. We then prove some relations between the Rényi distance of two distributions and the Fisher information of their exponentially twisted family of densities. Finally, we show that the partial ordering of families induced by the Rényi distance is the same as that induced by Fisher information.
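To make the first claim concrete: for densities f and g, the Rényi divergence of order α is

D_\alpha(f \,\|\, g) = \frac{1}{\alpha - 1} \log \int f^{\alpha}(x)\, g^{1-\alpha}(x)\, dx, \qquad \alpha > 0, \ \alpha \neq 1,

and for a smooth scalar family f_\theta, a Taylor expansion in the perturbation δ gives the Fisher-information link:

D_\alpha\big(f_\theta \,\|\, f_{\theta+\delta}\big) = \frac{\alpha}{2}\, \delta^{2}\, I(\theta) + o(\delta^{2}), \qquad I(\theta) = E_\theta\big[(\partial_\theta \log f_\theta(X))^{2}\big].

This is a sketch of the standard expansion in our own notation, not necessarily the paper's statement. For the N(θ, 1) location family the relation is exact (I(θ) = 1, so D_α = αδ²/2), which a quick numerical check confirms; the snippet below is illustrative only:

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    alpha, delta = 0.7, 0.05
    # Renyi integral of order alpha between N(0, 1) and N(delta, 1)
    integrand = lambda x: norm.pdf(x)**alpha * norm.pdf(x, loc=delta)**(1 - alpha)
    d_alpha = np.log(quad(integrand, -np.inf, np.inf)[0]) / (alpha - 1)
    print(d_alpha, alpha * delta**2 / 2)  # both approximately 8.75e-4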
Similar resources
Fisher information distance: a geometrical reading?
This paper takes a strongly geometrical approach to the Fisher distance, which is a measure of dissimilarity between two probability distribution functions. The Fisher distance, as well as other divergence measures, is also used in many applications to establish a proper data average. The main purpose is to widen the range of possible interpretations and relations of the Fisher distance and its a...
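For orientation, the canonical worked example in this geometric setting (our notation, not a quotation of the paper) is the univariate normal family N(μ, σ²), whose Fisher information matrix and induced line element are

I(\mu, \sigma) = \begin{pmatrix} 1/\sigma^{2} & 0 \\ 0 & 2/\sigma^{2} \end{pmatrix}, \qquad ds^{2} = \frac{d\mu^{2} + 2\, d\sigma^{2}}{\sigma^{2}},

which, after rescaling μ, is a constant multiple of the hyperbolic half-plane metric (dx² + dy²)/y²; the Fisher distance between two normal models is the geodesic distance in this metric.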
Cramér-Rao and moment-entropy inequalities for Rényi entropy and generalized Fisher information
The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam's inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
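In the classical Shannon case the chain is short (a scalar sketch; the paper's contribution is the Rényi/generalized-Fisher analogue). Writing N(X) = e^{2h(X)}/(2\pi e) for the entropy power and J(X) for the Fisher information,

N(X) \le \operatorname{Var}(X) \quad \text{(moment-entropy)}, \qquad N(X)\, J(X) \ge 1 \quad \text{(Stam)},

and multiplying the two gives \operatorname{Var}(X)\, J(X) \ge 1, the Cramér-Rao inequality.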
Extreme Fisher Information, Non-Equilibrium Thermodynamics and Reciprocity Relations
In employing MaxEnt, a crucial role is assigned to the reciprocity relations that relate the quantifier to be extremized (Shannon’s entropy S), the Lagrange multipliers that arise during the variational process, and the expectation values that constitute the a priori input information. We review here just how these ingredients relate to each other when the information quantifier S is replaced b...
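As a sketch of the Shannon-case ingredients referred to (standard MaxEnt, not specific to this paper): maximizing S = -\int p \ln p subject to \int p\, A_i = a_i gives

p(x) = Z^{-1} \exp\!\Big(-\sum_i \lambda_i A_i(x)\Big), \qquad S = \ln Z + \sum_i \lambda_i a_i,

with the reciprocity relations \partial S / \partial a_i = \lambda_i tying together the extremized quantifier, the Lagrange multipliers, and the input expectation values.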
Potential Statistical Evidence in Experiments and Rényi Information
Recently, Habibi et al. (2006) defined a pre-experimental criterion for the potential strength of evidence provided by an experiment, based on the Kullback-Leibler distance. In this paper, we investigate the potential statistical evidence in an experiment in terms of the Rényi distance and compare the potential statistical evidence in lower (upper) record values with that in the same number of ii...
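The link between the two criteria is the usual limit: the Rényi divergence reduces to the Kullback-Leibler distance as α → 1,

\lim_{\alpha \to 1} D_\alpha(f \,\|\, g) = \int f(x) \log \frac{f(x)}{g(x)}\, dx,

so the KL-based criterion of Habibi et al. is recovered as a special case.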
Baryonic Tully-Fisher Relations
I describe the disk mass–rotation velocity relation, which underpins the familiar luminosity–linewidth relation. Continuity of this relation favors nearly maximal stellar mass-to-light ratios. This contradicts the low mass-to-light ratios implied by the lack of surface brightness dependence in the same relation.
1. Searching for the Physical Basis of the Tully-Fisher Relation
The Tully-Fisher (T...
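For reference, the relations in question are power laws of the schematic form L \propto V^{a} (luminosity–linewidth) and, in baryonic form, M_b \propto V_c^{4}; the slopes quoted here are the commonly cited values, not figures taken from this abstract.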
full textShannon Entropy , Renyi Entropy , and Information
This memo contains proofs that the Shannon entropy is the limiting case of both the Rényi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy is provided in an appendix.
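In the discrete case, the objects involved are (standard definitions, our notation)

H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha}, \qquad S_q(p) = \frac{1}{q-1} \Big(1 - \sum_i p_i^{q}\Big),

both of which tend to the Shannon entropy H(p) = -\sum_i p_i \log p_i as \alpha \to 1 and q \to 1 (e.g., by L'Hôpital's rule).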
Journal title
volume 5
pages 25–37
publication date 2006-11