Kernel Partial Least Squares is Universally Consistent
Authors
Abstract
We prove the statistical consistency of kernel Partial Least Squares Regression applied to a bounded regression learning problem on a reproducing kernel Hilbert space. Partial Least Squares stands apart from well-known classical approaches such as Ridge Regression or Principal Components Regression: it is neither defined as the solution of a global cost minimization procedure over a fixed model, nor is it a linear estimator. Instead, approximate solutions are constructed by projections onto a nested set of data-dependent subspaces. To prove consistency, we exploit the known fact that Partial Least Squares is equivalent to the conjugate gradient algorithm combined with early stopping. The choice of the stopping rule (the number of iterations) is crucial. We study two empirical stopping rules: the first monitors the estimation error in each iteration step of Partial Least Squares, and the second estimates the empirical complexity in terms of a condition number. Both stopping rules lead to universally consistent estimators, provided the kernel is universal.
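To make the PLS/CG connection concrete, here is a minimal sketch, assuming a Gaussian kernel and plain conjugate gradient on the kernel system Kα = y. The function names, the kernel choice, and the fixed iteration budget are illustrative assumptions, and neither of the paper's two empirical stopping rules is implemented; the sketch only shows how early stopping of CG takes the place of an explicit regularization parameter.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def kernel_cg(K, y, n_iter):
    # Run n_iter conjugate-gradient steps on K alpha = y, starting from
    # alpha = 0. Stopping early (small n_iter) acts as the regularizer,
    # mirroring the PLS <-> CG equivalence the abstract refers to.
    alpha = np.zeros_like(y, dtype=float)
    r = y - K @ alpha            # current residual
    p = r.copy()                 # current search direction
    for _ in range(n_iter):
        Kp = K @ p
        curvature = p @ Kp
        if curvature <= 1e-12:   # search directions exhausted
            break
        step = (r @ r) / curvature
        alpha = alpha + step * p
        r_new = r - step * Kp
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return alpha

# Toy usage: fit y = sin(x) from noisy samples with a budget of 5 CG steps.
rng = np.random.default_rng(0)
X_train = rng.uniform(-3, 3, size=(40, 1))
y_train = np.sin(X_train[:, 0]) + 0.1 * rng.standard_normal(40)
alpha = kernel_cg(rbf_kernel(X_train, X_train), y_train, n_iter=5)
X_test = np.linspace(-3, 3, 200)[:, None]
y_pred = rbf_kernel(X_test, X_train) @ alpha
```

Here the iteration count n_iter plays the role that the penalty parameter plays in Ridge Regression; the paper's contribution is showing that data-driven choices of this count still yield a universally consistent estimator.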
Similar Papers
Kernel Conjugate Gradient is Universally Consistent
We study the statistical consistency of conjugate gradient applied to a bounded regression learning problem seen as an inverse problem defined in a reproducing kernel Hilbert space. This approach leads to an estimator that stands apart from the well-known classical approaches, as it is neither defined as the solution of a global cost minimization procedure over a fixed model nor is it a linear estimator...
Strong Universal Consistency of Smooth Kernel Regression Estimates
The paper deals with kernel estimates of Nadaraya-Watson type for a regression function with a square integrable response variable. For usual bandwidth sequences and smooth nonnegative kernels, e.g., Gaussian and quartic kernels, strong L2-consistency is shown without any further condition on the underlying distribution. The proof uses a Tauberian theorem for Cesàro summability. Let X be a d-dimensional...
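For readers unfamiliar with this estimator class, the following is a minimal sketch of a Nadaraya-Watson estimate with a Gaussian kernel. The function name, the hand-picked bandwidth h, and the kernel choice are illustrative assumptions; the cited paper's quartic kernel and its bandwidth sequences are not reproduced here.

```python
import numpy as np

def nadaraya_watson(X_train, y_train, X_query, h=0.5):
    # Nadaraya-Watson estimate: a locally weighted average of the responses,
    #   f_hat(x) = sum_i K((x - X_i)/h) * y_i / sum_i K((x - X_i)/h),
    # here with a Gaussian kernel K and bandwidth h.
    d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-0.5 * d2 / h**2)   # kernel weight per (query, sample) pair
    denom = w.sum(axis=1)
    denom[denom == 0.0] = 1.0      # avoid 0/0 far away from all samples
    return (w @ y_train) / denom
```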
Kernel Partial Least Squares for Stationary Data
We consider the kernel partial least squares algorithm for non-parametric regression with stationary dependent data. Probabilistic convergence rates of the kernel partial least squares estimator to the true regression function are established under a source condition and an effective dimensionality condition. It is shown both theoretically and in simulations that long range dependence results in slower convergence...
A robust least squares fuzzy regression model based on kernel function
In this paper, a new approach is presented to fit a robust fuzzy regression model based on some fuzzy quantities. In this approach, we first introduce a new distance between two fuzzy numbers using the kernel function, and then, based on the least squares method, the parameters of the fuzzy regression model are estimated. The proposed approach has a suitable performance to...