Hellinger Versus Kullback-Leibler Multivariable Spectrum Approximation
Authors
Abstract
In this paper, we study a matricial version of a generalized moment problem with degree constraint. We introduce a new metric on multivariable spectral densities induced by the family of their spectral factors, which, in the scalar case, reduces to the Hellinger distance. We solve the corresponding constrained optimization problem via duality theory. A highly nontrivial existence theorem for the dual problem is established in the Byrnes–Lindquist spirit. A matricial Newton-type algorithm is finally provided for the numerical solution of the dual problem. Simulation indicates that the algorithm performs effectively and reliably.
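The scalar Hellinger distance that the proposed metric reduces to can be illustrated numerically. A minimal sketch, assuming densities sampled on a uniform frequency grid (the helper names and the toy densities are hypothetical, not the paper's construction):

```python
import numpy as np

def hellinger(f, g, dx):
    """Hellinger distance between two nonnegative densities on a uniform grid."""
    return np.sqrt(0.5 * np.sum((np.sqrt(f) - np.sqrt(g)) ** 2) * dx)

def kullback_leibler(f, g, dx):
    """Kullback-Leibler divergence D(f || g) for strictly positive densities."""
    return np.sum(f * np.log(f / g)) * dx

# Two normalized spectral-density-like functions on [-pi, pi].
theta = np.linspace(-np.pi, np.pi, 2001)
dx = theta[1] - theta[0]
f = 1.0 + 0.5 * np.cos(theta)
g = 1.0 + 0.5 * np.cos(2 * theta)
f /= np.sum(f) * dx
g /= np.sum(g) * dx

print(hellinger(f, g, dx))         # symmetric: same value with f and g swapped
print(kullback_leibler(f, g, dx))  # asymmetric in f and g in general
```

Unlike the Kullback-Leibler divergence, the Hellinger distance is a bona fide metric (symmetric, and satisfying the triangle inequality), which is part of its appeal for spectrum approximation.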
Similar resources
Comparison of Kullback-Leibler, Hellinger and LINEX with Quadratic Loss Function in Bayesian Dynamic Linear Models: Forecasting of Real Price of Oil
In this paper we examine the application of the Kullback-Leibler, Hellinger and LINEX loss functions in a Dynamic Linear Model, using 106 years of real oil-price data (1913 to 2018), with attention to the asymmetry problem in filtering and forecasting. We use a DLM form of the basic Hotelling model under the Quadratic, Kullback-Leibler, Hellinger and LINEX loss functions, trying to address the ...
Maximum likelihood eigenfunctions of the Fokker Planck equation and Hellinger projection
We apply the L2-based Fisher-Rao vector-field projection by Brigo, Hanzon and LeGland (1999) to finite-dimensional approximations of the Fokker Planck equation on exponential families. We show that if the sufficient statistics are chosen among the diffusion eigenfunctions, the finite-dimensional projection or the equivalent assumed-density approximation provides the exact maximum likelihood densi...
Approximated moment-matching dynamics for basket-options simulation
The aim of this paper is to present two moment-matching procedures for basket-options pricing and to test their distributional approximations via distances on the space of probability densities, the Kullback-Leibler information (KLI) and the Hellinger distance (HD). We are interested in measuring the KLI and the HD between the real simulated basket terminal distribution and the distributions used ...
Information Measures via Copula Functions
In applications of differential geometry to problems of parametric inference, the notion of divergence is often used to measure the separation between two parametric densities. Among them, in this paper, we will review measures such as Kullback-Leibler information, J-divergence, Hellinger distance, -divergence, and so on. Properties and results related to distance between probability d...
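The divergence measures enumerated above can be compared on a toy example. A minimal sketch for two discrete distributions (this is standard textbook material, not the copula-based construction of that paper):

```python
import numpy as np

p = np.array([0.2, 0.5, 0.3])
q = np.array([0.3, 0.4, 0.3])

kl = np.sum(p * np.log(p / q))           # Kullback-Leibler information D(p || q)
j_div = kl + np.sum(q * np.log(q / p))   # J-divergence: symmetrized KL
hell = np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))  # Hellinger distance

print(kl, j_div, hell)
```

Note that the J-divergence is symmetric by construction, and the Hellinger distance is bounded between 0 and 1, whereas the Kullback-Leibler information is neither symmetric nor bounded.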
Expansion of the Kullback-Leibler Divergence, and a new class of information metrics
Inferring and comparing complex, multivariable probability density functions is fundamental to problems in several fields, including probabilistic learning, network theory, and data analysis. Classification and prediction are the two faces of this class of problem. This study takes an approach that simplifies many aspects of these problems by presenting a structured, series expansion of the Kul...
Journal: IEEE Trans. Automat. Contr.
Volume 53, Issue -
Pages: -
Publication year: 2008