Sparse Orthonormalized Partial Least Squares

Authors

  • Marcel van Gerven
  • Tom Heskes
Abstract

Orthonormalized partial least squares (OPLS) is often used to find a low-rank mapping between inputs X and outputs Y by estimating loading matrices A and B. In this paper, we introduce sparse orthonormalized PLS as an extension of conventional PLS that finds sparse estimates of A through the use of the elastic net algorithm. We apply sparse OPLS to the reconstruction of presented images from the BOLD response in primary visual cortex. Sparse OPLS finds solutions with low reconstruction error that are easy to interpret due to the sparseness of the loading matrix A. Moreover, the elastic net algorithm is generalized to allow for coupling constraints that induce a spatial regularization.
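The abstract does not spell out the optimization, but the combination of an orthonormality constraint with elastic-net sparsity can be sketched roughly as follows. This is a minimal alternating scheme inspired by the abstract, not the authors' actual algorithm; the function name sparse_opls, the Procrustes-style update of B, and all parameter values are illustrative assumptions.

```python
# Minimal sketch: alternate between (1) an orthonormal loading matrix B
# obtained from an SVD (orthogonal Procrustes step) and (2) column-wise
# elastic-net regressions of the projected targets Y @ B onto X, which
# yields a sparse loading matrix A.
import numpy as np
from sklearn.linear_model import ElasticNet

def sparse_opls(X, Y, n_components=2, alpha=0.1, l1_ratio=0.5, n_iter=20):
    p = X.shape[1]
    A = np.random.default_rng(0).normal(size=(p, n_components))  # sparse loadings
    for _ in range(n_iter):
        # (1) orthonormal B from the SVD of Y^T X A
        U, _, Vt = np.linalg.svd(Y.T @ X @ A, full_matrices=False)
        B = U @ Vt
        # (2) sparse A: elastic-net regression of each projected target on X
        T = Y @ B
        for k in range(n_components):
            enet = ElasticNet(alpha=alpha, l1_ratio=l1_ratio, max_iter=5000)
            A[:, k] = enet.fit(X, T[:, k]).coef_
    return A, B

# Toy usage on synthetic data
X = np.random.default_rng(1).normal(size=(100, 30))
Y = X[:, :5] @ np.random.default_rng(2).normal(size=(5, 4))
A, B = sparse_opls(X, Y)
print("nonzero rows of A:", np.count_nonzero(np.abs(A).sum(axis=1) > 1e-8))
```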


Similar articles

Sparse Kernel Orthonormalized PLS for feature extraction in large data sets

In this paper we present a novel multivariate analysis method for large-scale problems. Our scheme is based on a novel kernel orthonormalized partial least squares (PLS) variant for feature extraction, imposing sparsity constraints in the solution to improve scalability. The algorithm is tested on a benchmark of UCI data sets, and on the analysis of integrated short-time music features fo...


Kernel PLS-SVC for Linear and Nonlinear Classification

A new method for classification is proposed. This is based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by a support vector classifier. Unlike principal component analysis (PCA), which has previously served as a dimension reduction step for discrimination problems, orthonormalized PLS is closely related to Fisher’s approach t...
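As a rough illustration of this two-stage design, the sketch below uses scikit-learn's linear PLSRegression on one-hot class labels followed by an SVM, rather than the kernel OPLS variant described above; the toy data and all parameter values are made up.

```python
# Minimal sketch: PLS dimension reduction on one-hot class labels,
# followed by an SVM classifier trained on the latent scores.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))            # toy inputs
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # toy binary labels
Y = np.eye(2)[y]                          # one-hot targets for PLS

pls = PLSRegression(n_components=2).fit(X, Y)
T = pls.transform(X)                      # latent scores used as features

clf = SVC(kernel="rbf").fit(T, y)
print("training accuracy:", clf.score(T, y))
```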


Multi-label Classification Using Hypergraph Orthonormalized Partial Least Squares

In many real-world applications, human-generated data such as images are often associated with several semantic topics simultaneously, called multi-label data, which poses a great challenge for classification in such scenarios. Since the topics are generally not independent, it is very useful to respect the correlations among different topics for performing better classification on multi-label data. H...


On the Equivalence between Canonical Correlation Analysis and Orthonormalized Partial Least Squares

Canonical correlation analysis (CCA) and partial least squares (PLS) are well-known techniques for feature extraction from two sets of multidimensional variables. The fundamental difference between CCA and PLS is that CCA maximizes the correlation while PLS maximizes the covariance. Although both CCA and PLS have been applied successfully in various applications, the intrinsic relationship betw...
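For orientation, the contrast between the two criteria can be written in the usual population form; this is a standard textbook formulation, not taken from the paper itself.

```latex
% CCA maximizes the correlation between projections of X and Y:
\max_{w_x, w_y}\;
  \frac{w_x^\top \Sigma_{xy} w_y}
       {\sqrt{w_x^\top \Sigma_{xx} w_x}\,\sqrt{w_y^\top \Sigma_{yy} w_y}}

% PLS maximizes the covariance between unit-norm projections:
\max_{\|w_x\| = \|w_y\| = 1}\; w_x^\top \Sigma_{xy} w_y

% Orthonormalized PLS (OPLS) normalizes only on the X side:
\max_{W}\; \operatorname{tr}\!\left( W^\top \Sigma_{xy} \Sigma_{yx} W \right)
\quad \text{s.t.} \quad W^\top \Sigma_{xx} W = I
```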


Sparse partial least squares regression for simultaneous dimension reduction and variable selection

Partial least squares regression has been an alternative to ordinary least squares for handling multicollinearity in several areas of scientific research since the 1960s. It has recently gained much attention in the analysis of high-dimensional genomic data. We show that the known asymptotic consistency of the partial least squares estimator for a univariate response does not hold with the very lar...



Journal:

Volume   Issue

Pages  -

Publication date: 2010