Lanczos Approximations for the Speedup of Kernel Partial Least Squares Regression
Authors
Abstract
The Degrees of Freedom of Kernel Partial Least Squares (KPLS) require all eigenvalues of the kernel matrix K, hence their computation is cubic in the number of observations n.
• We use Kernel PLS itself to approximate the eigenvalues of the kernel matrix.
• As a result, we can compute approximate Degrees of Freedom of KPLS in O(n²)!
• We can also compute approximate confidence intervals for KPLS in O(n²)!
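The eigenvalue approximation behind this speedup rests on the Lanczos iteration, which the paper links to KPLS. As a minimal sketch (standard Lanczos tridiagonalization, not the paper's own KPLS-based variant), the following shows how m matrix–vector products with K, each O(n²), yield Ritz values that approximate the extremal eigenvalues of K:

```python
import numpy as np

def lanczos_eigenvalues(K, m, seed=0):
    """Ritz values (eigenvalue approximations) of a symmetric matrix K
    after m Lanczos steps; each step costs one O(n^2) product K @ q."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    q_prev, beta = np.zeros(n), 0.0
    alphas, betas = [], []
    for _ in range(m):
        v = K @ q - beta * q_prev          # the O(n^2) step
        alpha = q @ v
        v -= alpha * q
        beta = np.linalg.norm(v)
        alphas.append(alpha)
        betas.append(beta)
        if beta < 1e-12:                   # Krylov space exhausted
            break
        q_prev, q = q, v / beta
    # Eigenvalues of the small tridiagonal matrix built from (alphas, betas)
    k = len(alphas)
    T = np.diag(alphas) + np.diag(betas[:k - 1], 1) + np.diag(betas[:k - 1], -1)
    return np.linalg.eigvalsh(T)

# The largest Ritz value converges quickly to the largest eigenvalue of K.
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 300))
K = X @ X.T                                # a dense PSD "kernel" matrix
ritz = lanczos_eigenvalues(K, m=40)
exact = np.linalg.eigvalsh(K)
rel_err = abs(ritz[-1] - exact[-1]) / exact[-1]
```

Running m ≪ n steps replaces the full O(n³) eigendecomposition with an O(mn²) approximation, which is exactly the kind of trade-off the abstract describes.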
Similar papers
arXiv:0902.3347v1 [stat.ML] 19 Feb 2009
Lanczos Approximations for the Speedup of Kernel Partial Least Squares Regression
The runtime for Kernel Partial Least Squares (KPLS) to compute the fit is quadratic in the number of examples. However, the necessity of obtaining sensitivity measures as degrees of freedom for model selection or confidence intervals for more detailed analysis requires cubic runtime, and thus constitutes a computational bottleneck in real-world data analysis. We propose a novel algorithm for KP...
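A standard equivalence (the basis of the PLS–Lanczos connection) is that, ignoring centering and multiple responses, the KPLS fit with m components is the orthogonal projection of y onto the Krylov subspace span{Ky, K²y, …, Kᵐy}. The sketch below, an illustrative implementation rather than the paper's algorithm, makes the quadratic cost visible: each component needs only one O(n²) product with K:

```python
import numpy as np

def kpls_fit(K, y, m):
    """Fitted values of m-component kernel PLS via the Krylov-space view:
    project y onto span{K y, K^2 y, ..., K^m y}. One O(n^2) product per step."""
    basis = []
    v = K @ y
    for _ in range(m):
        for u in basis:                    # Gram-Schmidt against earlier directions
            v = v - (u @ v) * u
        v = v / np.linalg.norm(v)
        basis.append(v)
        v = K @ v                          # next Krylov direction, O(n^2)
    V = np.column_stack(basis)
    return V @ (V.T @ y)                   # orthogonal projection of y

rng = np.random.default_rng(0)
n = 100
X = rng.standard_normal((n, n))
K = X @ X.T / n + np.eye(n)                # well-conditioned PSD kernel matrix
y = rng.standard_normal(n)
yhat = kpls_fit(K, y, m=5)

# Cross-check against the same projection from an explicit Krylov basis.
Kr = np.column_stack([np.linalg.matrix_power(K, j + 1) @ y for j in range(5)])
Q, _ = np.linalg.qr(Kr)
ok = np.allclose(yhat, Q @ (Q.T @ y))
```

The bottleneck the abstract refers to is everything beyond the fit: degrees of freedom and confidence intervals need eigenvalue information about K, which is cubic if computed exactly.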
Full text
A robust least squares fuzzy regression model based on kernel function
In this paper, a new approach is presented to fit a robust fuzzy regression model based on some fuzzy quantities. In this approach, we first introduce a new distance between two fuzzy numbers using the kernel function, and then, based on the least squares method, the parameters of the fuzzy regression model are estimated. The proposed approach has a suitable performance to...
Full text
Least-squares Probabilistic Classifier: a Computationally Efficient Alternative to Kernel Logistic Regression
The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression (KLR). A key idea for the speedup is that, unlike KLR that uses maximum likelihood estimation for a log-linear model, LSPC uses least-squares estimation for a linear model. This allows us to obtain a global solution analytically in a classwise manner. In exchange for the sp...
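The classwise analytic solution described here can be sketched as one ridge-regularized least-squares solve per class against that class's 0/1 indicator. This is an illustrative reconstruction of the idea, not the authors' reference code; the kernel choice and regularization parameter are assumptions:

```python
import numpy as np

def lspc_fit(K, y, n_classes, lam=0.1):
    """One analytic ridge solve per class (least squares to the 0/1 class
    indicator), instead of KLR's iterative likelihood maximization."""
    A = K.T @ K + lam * np.eye(K.shape[1])
    return np.column_stack(
        [np.linalg.solve(A, K.T @ (y == c).astype(float)) for c in range(n_classes)]
    )

def lspc_predict_proba(K, coefs):
    p = np.maximum(K @ coefs, 0.0)         # clip negative model outputs
    return p / (p.sum(axis=1, keepdims=True) + 1e-12)

# Toy two-class problem with an RBF kernel on the training points themselves.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.repeat([0, 1], 50)
K = np.exp(-((X[:, None, :] - X[None, :, :]) ** 2).sum(-1) / 2.0)
coefs = lspc_fit(K, y, n_classes=2)
proba = lspc_predict_proba(K, coefs)
acc = (proba.argmax(axis=1) == y).mean()
```

The speedup comes from replacing an iterative optimization with a single linear solve per class; the clipping-and-renormalizing step is what turns the unconstrained least-squares outputs back into probability estimates.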
Full text
Partial least-squares vs. Lanczos bidiagonalization - I: analysis of a projection method for multiple regression
Full text
Kernel Partial Least Squares for Stationary Data
We consider the kernel partial least squares algorithm for non-parametric regression with stationary dependent data. Probabilistic convergence rates of the kernel partial least squares estimator to the true regression function are established under a source and an effective dimensionality condition. It is shown both theoretically and in simulations that long range dependence results in slower c...
Full text