Fisher information distance: a geometrical reading?
Authors
Abstract
This paper presents a strongly geometrical approach to the Fisher distance, a measure of dissimilarity between two probability distribution functions. The Fisher distance, like other divergence measures, is also used in many applications to establish a proper data average. The main purpose is to widen the range of possible interpretations and relations of the Fisher distance and its associated geometry for prospective applications. The paper focuses on statistical models of normal probability distribution functions and takes advantage of the connection with classical hyperbolic geometry to derive closed forms for the Fisher distance in several cases. Connections with the well-known Kullback-Leibler divergence measure are also derived.
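For the univariate normal family, the hyperbolic connection mentioned in the abstract yields a classical closed form: in coordinates (μ, σ) the Fisher metric is ds² = (dμ² + 2dσ²)/σ², which becomes twice the Poincaré half-plane metric after the rescaling μ ↦ μ/√2, so the Fisher distance is √2 times the hyperbolic distance between the rescaled points. Below is a minimal Python sketch of this closed form alongside the standard Kullback-Leibler formula for normals; the function names are illustrative, not from the paper.

```python
import math

def fisher_distance_normal(mu1, sigma1, mu2, sigma2):
    """Fisher-Rao distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Uses the classical identification of the normal family, with Fisher
    metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2, with the Poincare
    half-plane after the rescaling mu -> mu / sqrt(2).
    """
    u1, u2 = mu1 / math.sqrt(2), mu2 / math.sqrt(2)
    # Hyperbolic half-plane distance between the rescaled points
    cosh_d = 1 + ((u2 - u1) ** 2 + (sigma2 - sigma1) ** 2) / (2 * sigma1 * sigma2)
    return math.sqrt(2) * math.acosh(cosh_d)

def kl_divergence_normal(mu1, sigma1, mu2, sigma2):
    """KL(N(mu1, sigma1^2) || N(mu2, sigma2^2)), the standard closed form."""
    return (math.log(sigma2 / sigma1)
            + (sigma1 ** 2 + (mu1 - mu2) ** 2) / (2 * sigma2 ** 2)
            - 0.5)

# The Fisher distance is symmetric in its arguments; KL is not.
print(fisher_distance_normal(0.0, 1.0, 1.0, 2.0))
print(kl_divergence_normal(0.0, 1.0, 1.0, 2.0),
      kl_divergence_normal(1.0, 2.0, 0.0, 1.0))
```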
Similar resources
Relations between Renyi Distance and Fisher Information
In this paper, we first show that the Renyi distance between any member of a parametric family and its perturbations is proportional to its Fisher information. We then prove some relations between the Renyi distance of two distributions and the Fisher information of their exponentially twisted family of densities. Finally, we show that the partial ordering of families induced by the Renyi dis...
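The proportionality described here is usually stated as the second-order expansion D_α(p_θ ‖ p_{θ+ε}) ≈ (α/2) ε² I(θ) for small ε. A quick numerical check for the Gaussian location family, where both sides can be computed exactly; this sketch assumes the standard expansion, not the paper's precise statement.

```python
# Renyi divergence of order alpha between N(mu1, sigma^2) and N(mu2, sigma^2):
# the well-known equal-variance closed form alpha * (mu1 - mu2)^2 / (2 sigma^2).
def renyi_gaussian_same_var(alpha, mu1, mu2, sigma):
    return alpha * (mu1 - mu2) ** 2 / (2 * sigma ** 2)

# Fisher information of the location family N(theta, sigma^2) is 1 / sigma^2,
# so the quadratic approximation (alpha/2) * eps^2 * I(theta) should match.
alpha, sigma, eps = 0.7, 1.5, 1e-2
exact = renyi_gaussian_same_var(alpha, 0.0, eps, sigma)
approx = 0.5 * alpha * eps ** 2 / sigma ** 2
print(exact, approx)  # identical here; approximate for general families
```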
Information Submanifold Based on SPD Matrices and Its Applications to Sensor Networks
Abstract: In this paper, first, the manifold PD(n) consisting of all n × n symmetric positive-definite matrices is introduced based on matrix information geometry; second, the geometrical structures of the information submanifold of PD(n) are presented, including the metric, geodesics, and geodesic distance; third, the information resolution with sensor networks is presented by three classical measuremen...
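Under the affine-invariant metric commonly used in matrix information geometry, the geodesic distance on PD(n) has the closed form d(A, B) = ‖log(A^{-1/2} B A^{-1/2})‖_F. A short sketch, assuming this is the metric the abstract refers to:

```python
import numpy as np
from scipy.linalg import sqrtm, logm

def spd_geodesic_distance(A, B):
    """Affine-invariant geodesic distance on PD(n):
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F.
    """
    A_inv_sqrt = np.linalg.inv(np.real(sqrtm(A)))  # A^{-1/2}, real for SPD input
    M = A_inv_sqrt @ B @ A_inv_sqrt
    return np.linalg.norm(np.real(logm(M)), "fro")

A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.eye(2)
print(spd_geodesic_distance(A, B))
```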
Sparse signal processing on estimation grid with constant information distance applied in radar
Radar obtains its parameters on a grid whose design supports the resolution of the underlying radar processing. Existing radars exploit a regular grid, although the resolution changes with stronger echoes at shorter ranges. We compute the radar resolution from the intrinsic geometrical structure of the data models, characterized in terms of the Fisher information metric. Based on the information-base...
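One way to read "constant information distance" is to place grid points at equal increments of the Fisher-Rao arclength s(θ) = ∫ √I(θ) dθ. The sketch below uses a hypothetical one-dimensional scale model with I(θ) = c/θ², for which the arclength is logarithmic and the grid comes out geometrically spaced; this model is an illustration only, not the paper's radar setting.

```python
import numpy as np

# For a scale parameter theta with Fisher information I(theta) = c / theta^2,
# the Fisher-Rao arclength is s(theta) = sqrt(c) * log(theta): equal steps in
# information distance mean a geometrically spaced grid in theta.
def constant_information_grid(theta_min, theta_max, n_points, c=2.0):
    s_min, s_max = np.sqrt(c) * np.log([theta_min, theta_max])
    s = np.linspace(s_min, s_max, n_points)  # uniform in arclength
    return np.exp(s / np.sqrt(c))            # map back to parameter space

print(constant_information_grid(1.0, 16.0, 5))  # -> [1. 2. 4. 8. 16.]
```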
Information Geometry Connecting Wasserstein Distance and Kullback-Leibler Divergence via the Entropy-Relaxed Transportation Problem
Two geometrical structures have been extensively studied for a manifold of probability distributions. One is based on the Fisher information metric, which is invariant under reversible transformations of random variables, while the other is based on the Wasserstein distance of optimal transportation, which reflects the structure of the distance between random variables. Here, we propose a new i...
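The entropy-relaxed transportation problem referred to here is commonly solved by Sinkhorn's iteration, which alternately rescales a Gibbs kernel to match the two marginals. A generic sketch of that solver, not the paper's specific geometric construction:

```python
import numpy as np

def sinkhorn(p, q, cost, epsilon=0.1, n_iter=500):
    """Entropy-relaxed optimal transport between distributions p and q.

    Minimizes <P, cost> - epsilon * H(P) subject to the marginal
    constraints P 1 = p and P^T 1 = q, via Sinkhorn's fixed-point iteration.
    """
    K = np.exp(-cost / epsilon)          # Gibbs kernel
    u = np.ones_like(p)
    for _ in range(n_iter):
        v = q / (K.T @ u)                # match column marginal
        u = p / (K @ v)                  # match row marginal
    return u[:, None] * K * v[None, :]   # optimal coupling

p = np.array([0.5, 0.5])
q = np.array([0.25, 0.75])
cost = np.array([[0.0, 1.0], [1.0, 0.0]])
P = sinkhorn(p, q, cost)
print(P, P.sum(axis=1), P.sum(axis=0))   # coupling and its marginals
```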
Information geometry of divergence functions
Measures of divergence between two points play a key role in many engineering problems. One such measure is a distance function, but there are many important measures which do not satisfy the properties of a distance. The Bregman divergence, Kullback-Leibler divergence, and f-divergence are such measures. In the present article, we study the differential-geometrical structure of a manifold ind...
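As a concrete instance of the relation among these measures, the Kullback-Leibler divergence is the Bregman divergence generated by the negative Shannon entropy F(x) = Σᵢ xᵢ log xᵢ; a small sketch verifying this identity numerically:

```python
import numpy as np

def bregman(F, grad_F, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <grad F(y), x - y>."""
    return F(x) - F(y) - np.dot(grad_F(y), x - y)

# Negative Shannon entropy as the generating convex function:
neg_entropy = lambda x: np.sum(x * np.log(x))
grad_neg_entropy = lambda x: np.log(x) + 1.0

p = np.array([0.2, 0.3, 0.5])
q = np.array([0.4, 0.4, 0.2])
kl = np.sum(p * np.log(p / q))  # Kullback-Leibler divergence
print(bregman(neg_entropy, grad_neg_entropy, p, q), kl)  # equal
```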
Journal: Discrete Applied Mathematics
Volume: 197
Issue: -
Pages: -
Published: 2015