Distances and Riemannian Metrics for Spectral Density Functions
Authors
Abstract
Similar Resources
ON THE LIFTS OF SEMI-RIEMANNIAN METRICS
In this paper, we extend the Sasaki metric for the tangent bundle of a Riemannian manifold and the Sasaki-Mok metric for the frame bundle of a Riemannian manifold [I] to the case of a semi-Riemannian vector bundle over a semi-Riemannian manifold. In fact, if E is a semi-Riemannian vector bundle over a semi-Riemannian manifold M, then by using an arbitrary (linear) connection on E, we can make E, as a...
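For orientation, the classical Sasaki metric on the tangent bundle TM of a Riemannian manifold (M, g), which is the Riemannian case this paper generalizes, is usually written with the connection map K of the Levi-Civita connection; this is the standard textbook form, not the semi-Riemannian extension developed in the paper:

```latex
% Sasaki metric on TM: A, B are tangent vectors at a point (p, u) of TM,
% d\pi is the differential of the bundle projection and K the connection map.
g^{S}_{(p,u)}(A, B) \;=\; g_{p}\bigl(d\pi(A),\, d\pi(B)\bigr) \;+\; g_{p}\bigl(K(A),\, K(B)\bigr)
```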
Unsupervised Riemannian Clustering of Probability Density Functions
We present an algorithm for grouping families of probability density functions (pdfs). We exploit the fact that under the square-root re-parametrization, the space of pdfs forms a Riemannian manifold, namely the unit Hilbert sphere. An immediate consequence of this re-parametrization is that different families of pdfs form different submanifolds of the unit Hilbert sphere. Therefore, the problem...
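As a concrete illustration of the square-root device mentioned above, here is a minimal sketch of the resulting distance between two discretized pdfs (this is not the paper's clustering algorithm; the grid, test densities, and function names are illustrative):

```python
import numpy as np

def sqrt_representation(p, dx):
    """Map a discretized pdf p (on a uniform grid with spacing dx) to its
    square-root representation, a point on the unit Hilbert sphere."""
    psi = np.sqrt(p)
    # Renormalize so the L2 norm of psi is exactly 1 (i.e. p integrates to 1).
    return psi / np.sqrt(np.sum(psi**2) * dx)

def sphere_distance(p, q, dx):
    """Great-circle distance between two pdfs under the square-root
    re-parametrization: arccos of the L2 inner product of their roots."""
    psi_p, psi_q = sqrt_representation(p, dx), sqrt_representation(q, dx)
    inner = np.clip(np.sum(psi_p * psi_q) * dx, -1.0, 1.0)
    return np.arccos(inner)

# Example: distance between two Gaussians discretized on [-10, 10].
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
gauss = lambda mu, s: np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
print(sphere_distance(gauss(0.0, 1.0), gauss(1.0, 1.0), dx))
```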
On the distances between probability density functions
We give estimates of the distance between the densities of the laws of two functionals F and G on the Wiener space in terms of the Malliavin-Sobolev norm of F − G. We actually consider a more general framework which allows one to treat, with similar (Malliavin-type) methods, functionals of a Poisson point measure (solutions of jump-type stochastic equations). We use the above estimates in order to...
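Schematically, estimates of this type bound a distance between the two laws by a Malliavin-Sobolev norm of the difference of the functionals; the exact choice of distance, the norm indices, and the constant depend on nondegeneracy assumptions that the excerpt does not state, so the following is only an indicative shape:

```latex
% Schematic form only; the indices k, p and the constant C are placeholders.
d\bigl(\mathrm{law}(F),\, \mathrm{law}(G)\bigr) \;\le\; C\,\|F - G\|_{k,p}
```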
Sobolev Metrics on the Riemannian Manifold of All Riemannian Metrics
On the manifold M(M) of all Riemannian metrics on a compact manifold M one can consider the natural L-metric as described first by [10]. In this paper we consider variants of this metric which in general are of higher order. We derive the geodesic equations, we show that they are well-posed under some conditions and induce a locally diffeomorphic geodesic exponential mapping. We give a condition...
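For reference, the natural L²-metric on the space of Riemannian metrics (the order-zero case that such Sobolev metrics generalize) is commonly written as follows; here h and k are tangent vectors to M(M), i.e. symmetric two-tensor fields, and this is offered as the standard expression rather than a quotation from the paper:

```latex
% L^2 metric on the space of metrics at the point g; h, k are symmetric 2-tensors.
G_{g}(h, k) \;=\; \int_{M} \operatorname{tr}\!\bigl(g^{-1} h \, g^{-1} k\bigr)\, \mathrm{vol}(g)
```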
Riemannian metrics for neural networks
We describe four algorithms for neural network training, each adapted to different scalability constraints. These algorithms are mathematically principled and invariant under a number of transformations in data and network representation, so that performance does not depend on these choices. These algorithms are obtained from the setting of differential geometry, and are based on either the natural gradient...
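A minimal sketch of a natural-gradient step, which conveys the general idea behind such representation-invariant updates (this is not one of the paper's four algorithms; the damping, learning rate, and toy matrices are illustrative):

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, lr=0.1, damping=1e-4):
    """One natural-gradient update: precondition the ordinary gradient by the
    inverse Fisher information matrix, making the step invariant (to first
    order) under smooth re-parametrizations of theta."""
    # Damping keeps a possibly ill-conditioned Fisher estimate invertible.
    precond = np.linalg.solve(fisher + damping * np.eye(len(theta)), grad)
    return theta - lr * precond

# Toy usage with a hand-made gradient and Fisher information estimate.
theta = np.array([1.0, -2.0])
grad = np.array([0.5, -1.5])          # gradient of the loss at theta (illustrative)
fisher = np.array([[2.0, 0.3],        # Fisher information estimate (illustrative)
                   [0.3, 1.0]])
print(natural_gradient_step(theta, grad, fisher))
```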
Journal
Journal title: IEEE Transactions on Signal Processing
Year: 2007
ISSN: 1053-587X
DOI: 10.1109/tsp.2007.896119