Learning gradients on manifolds
Authors
Abstract
A common belief in high dimensional data analysis is that data is concentrated on a low dimensional manifold. This motivates simultaneous dimension reduction and regression on manifolds. We provide an algorithm for learning gradients on manifolds for dimension reduction for high dimensional data with few observations. We obtain generalization error bounds for the gradient estimates and show that the convergence rate depends on the intrinsic dimension of the manifold and not on the dimension of the ambient space. We illustrate the efficacy of this approach empirically on simulated and real data and compare the method to other dimension reduction procedures.
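One common way to use estimated gradients for dimension reduction is through the empirical gradient outer product matrix, whose top eigenvectors span an estimated effective dimension reduction subspace. The sketch below is a minimal, hypothetical illustration of that idea using weighted local linear regression for the gradient estimates; it is not the paper's algorithm (which is formulated in a reproducing kernel Hilbert space), and the function names, bandwidth, and ridge parameter are all assumptions made for the example.

```python
import numpy as np

def estimate_gradients(X, y, bandwidth=0.5, ridge=1e-3):
    """Estimate the gradient of the regression function at each sample point
    via weighted local linear regression (a simple stand-in for the paper's
    RKHS-based gradient learning)."""
    n, p = X.shape
    grads = np.zeros((n, p))
    for i in range(n):
        d = X - X[i]                                  # displacements to point i
        w = np.exp(-np.sum(d**2, axis=1) / (2 * bandwidth**2))
        # Weighted least squares for g in: y_j - y_i ~ d_j . g
        A = (d * w[:, None]).T @ d + ridge * np.eye(p)
        b = (d * w[:, None]).T @ (y - y[i])
        grads[i] = np.linalg.solve(A, b)
    return grads

def edr_directions(grads, k):
    """Top-k eigenvectors of the empirical gradient outer product matrix,
    spanning an estimated effective dimension reduction subspace."""
    G = grads.T @ grads / grads.shape[0]
    vals, vecs = np.linalg.eigh(G)                    # ascending eigenvalues
    return vecs[:, ::-1][:, :k]                       # reorder to descending
```

For data generated as y = f(Xb) with a single relevant direction b, the leading eigenvector of the gradient outer product should align with b, even though X lives in a higher-dimensional ambient space.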
Similar resources
A Geometry Preserving Kernel over Riemannian Manifolds
The kernel trick and projection to tangent spaces are two approaches for linearizing data points lying on Riemannian manifolds. These approaches provide the prerequisites for applying standard machine learning methods to Riemannian manifolds. Classical kernels implicitly project data to a high dimensional feature space without considering the intrinsic geometry of the data points. ...
Improving the Nonlinear Manifold Discriminant Model for Face Recognition with a Single Image per Person
Manifold learning is a dimension reduction method for extracting the nonlinear structure of high-dimensional data. Many methods have been introduced for this purpose, and most of them extract a single global manifold for the data. However, in many real-world problems there is not just one global manifold; additional information about the objects is shared by a large number of manifolds...
Natural Gradient Approach to Blind Separation of Over- and Under-Complete Mixtures
In this paper we study natural gradient approaches to blind separation of over- and under-complete mixtures. First we introduce Lie group structures on the manifolds of the under- and over-complete mixture matrices, respectively, and endow the manifolds with Riemannian metrics based on the properties of Lie groups. Then we derive the natural gradients on the manifolds using the isometry of the Rieman...
Gradients on Matrix Manifolds and their Chain Rule
Optimization on matrix manifolds is an important tool in machine learning and neural networks, and for local algorithms it is often necessary to compute the gradient of a function defined on a matrix space. Recent advances in information geometry have shown that by adapting to the inherent geometry of the search space, convergence speed can be greatly increased. Here, we give an overview of var...
Visual Tracking using Learning Histogram of Oriented Gradients by SVM on Mobile Robot
The intelligence of a mobile robot is highly dependent on its vision. The main objective of an intelligent mobile robot is its ability to perform online image processing, object detection, and especially visual tracking, which is a complex task in stochastic environments. Tracking algorithms suffer from sequence challenges such as illumination variation, occlusion, and background clutter, so an a...