Comparison of sparse least squares support vector regressors trained in primal and dual

Author

  • Shigeo Abe
Abstract

In our previous work, we developed sparse least squares support vector regressors (sparse LS SVRs) trained in the primal form in the reduced empirical feature space. In this paper we develop sparse LS SVRs trained in the dual form in the empirical feature space. Namely, the support vectors that span the reduced empirical feature space are first selected by Cholesky factorization, and the LS SVR is then trained in the dual form by solving a set of linear equations. We compare the computational cost of the LS SVRs in the primal and dual forms and show that the dual form is faster when the dimension of the reduced empirical feature space is close to the number of training data. The primal form, however, is numerically more stable: for a large value of the margin parameter, the coefficient matrix of the dual form becomes near-singular. We verify these results through computer experiments on several benchmark data sets.
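The dual-form training described in the abstract reduces to solving a single linear system over the selected support vectors. The sketch below illustrates the idea in Python, assuming an RBF kernel, a simplified incomplete-Cholesky selection of independent kernel columns, and a dual system restricted to the selected vectors; these choices are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z (an assumed kernel choice)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def select_sv_cholesky(K, tol=1e-6):
    """Incomplete Cholesky factorization: keep training points whose kernel
    columns are linearly independent (they span the reduced empirical feature
    space); columns with near-zero pivots are skipped as dependent."""
    n = K.shape[0]
    diag = K.diagonal().copy()
    L = np.zeros((n, 0))
    idx = []
    for j in range(n):
        pivot = diag[j]
        if pivot < tol:
            continue  # (near-)dependent column: not selected as a support vector
        l = (K[:, j] - L @ L[j]) / np.sqrt(pivot)
        L = np.hstack([L, l[:, None]])
        diag = diag - l ** 2
        idx.append(j)
    return np.array(idx)

def train_ls_svr_dual(X, y, C=10.0, gamma=1.0, tol=1e-6):
    """Sparse LS SVR in the dual form (a sketch): solve the bordered KKT system
    [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y] over the selected vectors."""
    K = rbf_kernel(X, X, gamma)
    sv = select_sv_cholesky(K, tol)
    Ks = rbf_kernel(X[sv], X[sv], gamma)
    m = len(sv)
    A = np.zeros((m + 1, m + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Ks + np.eye(m) / C   # large C shrinks I/C: matrix may get near-singular
    rhs = np.concatenate([[0.0], y[sv]])
    sol = np.linalg.solve(A, rhs)
    return sv, sol[1:], sol[0]

def predict(Xtr, sv, alpha, b, Xnew, gamma=1.0):
    # f(x) = sum_i alpha_i k(x_i, x) + b over the support vectors only
    return rbf_kernel(Xnew, Xtr[sv], gamma) @ alpha + b
```

Note how the abstract's stability remark appears directly in the code: as C grows, the ridge term I/C vanishes and the coefficient matrix A approaches the (possibly ill-conditioned) kernel matrix.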


Similar articles

Least Squares Support Vector Machines and Primal Space Estimation

In this paper a methodology for estimation in kernel-induced feature spaces is presented, making a link between the primal-dual formulation of Least Squares Support Vector Machines (LS-SVM) and classical statistical inference techniques in order to perform linear regression in primal space. This is done by computing a finite dimensional approximation of the kernel-induced feature space mapping ...

Full text

Kobe University Repository : Kernel

In this paper we discuss sparse least squares support vector regressors (sparse LS SVRs) defined in the reduced empirical feature space, which is a subspace of mapped training data. Namely, we define an LS SVR in the primal form in the empirical feature space, which results in solving a set of linear equations. The independent components in the empirical feature space are obtained by deleting d...

Full text
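The primal formulation summarized in the preceding blurb also reduces to a set of linear equations. Below is a hedged sketch, assuming an RBF kernel and an already-selected set of support vectors: each training point is mapped to its kernel values against the support vectors (the empirical feature map), and the regularized normal equations are solved; the kernel and regularization constant are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Z (assumed kernel)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_ls_svr_primal(X, y, sv_idx, C=10.0, gamma=1.0):
    """Primal LS SVR in the (reduced) empirical feature space (a sketch):
    minimize ||w||^2/2 + (C/2) * sum_i (y_i - w.z_i - b)^2,
    where z_i is the kernel column of x_i against the selected vectors."""
    Z = rbf_kernel(X, X[sv_idx], gamma)        # empirical feature map
    z_mean, y_mean = Z.mean(0), y.mean()
    Zc, yc = Z - z_mean, y - y_mean            # centering absorbs the bias b
    m = Z.shape[1]
    # regularized normal equations: (Zc^T Zc + I/C) w = Zc^T yc
    w = np.linalg.solve(Zc.T @ Zc + np.eye(m) / C, Zc.T @ yc)
    b = y_mean - z_mean @ w
    return w, b

def predict_primal(Xtr, sv_idx, w, b, Xnew, gamma=1.0):
    return rbf_kernel(Xnew, Xtr[sv_idx], gamma) @ w + b
```

Here the system is m-by-m with m the dimension of the reduced empirical feature space, which is why the primal form wins computationally when m is much smaller than the number of training data.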

Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model

A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the determination of the model structure, namely the selection of an appropriate numb...

Full text
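The orthogonal forward selection procedure mentioned in the blurb above can be sketched as a greedy Gram-Schmidt loop: at each step, every remaining candidate regressor is orthogonalized against the already-chosen ones, and the one that removes the most residual energy is appended. This is a simplified sketch with fixed candidate regressors; the paper additionally tunes each regressor's centre and covariance, which is omitted here.

```python
import numpy as np

def ols_forward_selection(P, y, n_select):
    """Orthogonal least squares forward selection (simplified sketch):
    greedily append the column of P whose orthogonalized component
    best reduces the squared residual of y."""
    n, m = P.shape
    selected, Q = [], []
    residual = y.astype(float).copy()
    for _ in range(n_select):
        best_err, best_j, best_q = None, None, None
        for j in range(m):
            if j in selected:
                continue
            q = P[:, j].astype(float).copy()
            for qk in Q:                      # Gram-Schmidt against chosen regressors
                q -= (qk @ q) / (qk @ qk) * qk
            nq = q @ q
            if nq < 1e-12:
                continue                      # linearly dependent candidate: skip
            g = (q @ residual) / nq           # least squares weight on this component
            err = ((residual - g * q) ** 2).sum()
            if best_err is None or err < best_err:
                best_err, best_j, best_q = err, j, q
        selected.append(best_j)
        Q.append(best_q)
        g = (best_q @ residual) / (best_q @ best_q)
        residual = residual - g * best_q      # deflate the residual
    return selected
```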

Parsimonious Support Vector Regression using Orthogonal Forward Selection with the Generalized Kernel Model

Sparse regression modeling is addressed using a generalized kernel model in which each kernel regressor has its individually tuned position (center) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append regressors one by one. After the determination of the model structure, namely the selection of a certain number of regressors, the model weig...

Full text

Componentwise Least Squares Support Vector Machines

This chapter describes componentwise Least Squares Support Vector Machines (LS-SVMs) for the estimation of additive models consisting of a sum of nonlinear components. The primal-dual derivations characterizing LS-SVMs for the estimation of the additive model result in a single set of linear equations with size growing in the number of data-points. The derivation is elaborated for the classific...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2008