Spectral Regression for Dimensionality Reduction, by Deng Cai, Xiaofei He, and Jiawei Han, May 2007
Abstract
Spectral methods have recently emerged as a powerful tool for dimensionality reduction and manifold learning. These methods use the information contained in the eigenvectors of a data affinity (i.e., item-item similarity) matrix to reveal low-dimensional structure in high-dimensional data. The most popular manifold learning algorithms include Locally Linear Embedding, Isomap, and Laplacian Eigenmap. However, these algorithms only provide embedding results for the training samples. Many extensions of these approaches try to solve the out-of-sample extension problem by seeking an embedding function in a reproducing kernel Hilbert space. A disadvantage of all these approaches, however, is that their computations usually involve the eigen-decomposition of dense matrices, which is expensive in both time and memory. In this paper, we propose a novel dimensionality reduction method, called Spectral Regression (SR). SR casts the problem of learning an embedding function into a regression framework, which avoids the eigen-decomposition of dense matrices. Moreover, within the regression-based framework, different kinds of regularizers can be naturally incorporated into our algorithm, which makes it more flexible. SR can be performed in supervised, unsupervised, and semi-supervised settings, and it can make efficient use of both labeled and unlabeled points to discover the intrinsic discriminant structure in the data. Experimental results on classification and semi-supervised classification demonstrate the effectiveness and efficiency of our algorithm.
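The two-step idea in the abstract — obtain embedding responses from the spectrum of the affinity graph, then learn the embedding function by regularized regression rather than a dense eigen-decomposition — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`knn_affinity`, `spectral_regression`), the 0/1 k-NN affinity, and the ridge regularizer are assumptions chosen for brevity.

```python
import numpy as np

def knn_affinity(X, k=5):
    """Symmetric 0/1 k-nearest-neighbor affinity matrix (illustrative choice)."""
    n = X.shape[0]
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # k nearest neighbors, skipping the point itself
        W[i, idx] = 1.0
    return np.maximum(W, W.T)  # symmetrize

def spectral_regression(X, k=5, n_dims=2, alpha=0.1):
    W = knn_affinity(X, k)
    # Step 1: embedding responses y solve the (sparse-friendly) eigenproblem
    # W y = lambda D y, solved here via the symmetric form D^{-1/2} W D^{-1/2}.
    Dih = np.diag(1.0 / np.sqrt(W.sum(1)))
    vals, vecs = np.linalg.eigh(Dih @ W @ Dih)            # ascending eigenvalues
    Y = Dih @ vecs[:, -(n_dims + 1):-1][:, ::-1]          # skip the trivial top eigenvector
    # Step 2: regression instead of a dense eigenproblem — ridge-regularized
    # least squares fits a linear map A so that new points embed as X_new @ A.
    A = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ Y)
    return A

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 5))
A = spectral_regression(X)
print(A.shape)  # (5, 2): one projection vector per embedding dimension
```

Because step 2 is an ordinary regularized least-squares problem, swapping the ridge penalty for a lasso or graph-based regularizer is straightforward, which is the flexibility the abstract refers to.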
Similar resources
Isometric Projection
Recently the problem of dimensionality reduction has received a lot of interest in many fields of information processing. We consider the case where data is sampled from a low-dimensional manifold which is embedded in high-dimensional Euclidean space. The most popular manifold learning algorithms include Locally Linear Embedding, ISOMAP, and Laplacian Eigenmap. However, these algorithms are no...
Isometric Projection, by Deng Cai, Xiaofei He, and
Recently the problem of dimensionality reduction has received a lot of interest in many fields of information processing, including data mining, information retrieval, and pattern recognition. We consider the case where data is sampled from a low-dimensional manifold which is embedded in high-dimensional Euclidean space. The most popular manifold learning algorithms include Locally Linear Embe...
Sparse Projections over Graph
Recent studies have shown that canonical algorithms such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) can be obtained from a graph-based dimensionality reduction framework. However, these algorithms yield projective maps which are linear combinations of all the original features. The results are difficult to interpret psychologically and physiologically. This pape...
Semi-Supervised Regression using Spectral Techniques∗
Graph-based approaches for semi-supervised learning have received an increasing amount of interest in recent years. Despite their good performance, many purely graph-based algorithms do not have explicit functions and cannot predict the labels of unseen data. Graph regularization is a recently proposed framework which incorporates the intrinsic geometrical structure as a regularization term. It can ...
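The graph-regularization idea in the last snippet — fit an explicit function on the labeled points while penalizing roughness over the affinity graph of all points, so unseen data can be predicted — can be sketched with a Laplacian-regularized least-squares model. This is only an assumed illustration in the spirit of that framework; the function name `lap_rls` and the dense RBF affinity are not from the paper.

```python
import numpy as np

def lap_rls(X, y, labeled, W, alpha=0.1, beta=0.1):
    """Fit a linear f(x) = a @ x on labeled points, with a graph smoothness
    penalty beta * a^T X^T L X a over labeled AND unlabeled points."""
    L = np.diag(W.sum(1)) - W                  # unnormalized graph Laplacian
    Xl, yl = X[labeled], y[labeled]
    d = X.shape[1]
    # Normal equations: (Xl^T Xl + alpha I + beta X^T L X) a = Xl^T yl
    M = Xl.T @ Xl + alpha * np.eye(d) + beta * X.T @ L @ X
    return np.linalg.solve(M, Xl.T @ yl)

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + 0.01 * rng.normal(size=30)
labeled = np.arange(10)                        # only 10 of 30 labels are used
W = np.exp(-((X[:, None] - X[None, :]) ** 2).sum(-1))  # dense RBF affinity
a = lap_rls(X, y, labeled, W)
print(a.shape)  # (4,): one coefficient per feature, usable on unseen points
```

Because the learned function is an explicit linear map, predicting a new point is just `x_new @ a`, which is precisely what purely graph-based (transductive) methods cannot do.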