io-port.net Database Summary: An Instrumental Least Squares Support Vector Machine for Nonlinear System Identification

Authors

  • Vincent Laurain
  • Roland Tóth
  • Dario Piga
  • Wei Xing Zheng
Abstract

io-port 06474690 Laurain, Vincent; Tóth, Roland; Piga, Dario; Zheng, Wei Xing: An instrumental least squares support vector machine for nonlinear system identification. Automatica 54, Article ID 6308, 340-347 (2015).

Summary: Least-Squares Support Vector Machines (LS-SVMs), originating from Statistical Learning and Reproducing Kernel Hilbert Space (RKHS) theories, represent a promising approach to identifying nonlinear systems via nonparametric estimation of the involved nonlinearities in a computationally and stochastically attractive way. However, the application of LS-SVMs and other RKHS variants in the identification context is formulated as a regularized linear regression aimed at minimizing the ℓ2 loss of the prediction error. This formulation corresponds to the assumption of an auto-regressive noise structure, which is often found to be too restrictive in practical applications. In this paper, Instrumental Variable (IV) based estimation is integrated into the LS-SVM approach, providing, under minor conditions, consistent identification of nonlinear systems with respect to the noise modeling error. It is shown how the cost function of the LS-SVM is modified to achieve an IV-based solution. Although a practically well-applicable choice of the instrumental variable is proposed for the derived approach, the optimal choice of this instrument, in terms of the variance of the associated estimates, remains an open problem. The effectiveness of the proposed IV-based LS-SVM scheme is demonstrated by a simulation example based on a Monte Carlo study.
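
The summary describes LS-SVM identification as a regularized linear regression minimizing the ℓ2 loss of the prediction error, solved in practice through a small linear system in the dual variables, and the IV modification as a replacement of the least-squares normal equations by instrument-weighted ones. The following sketch, in Python with NumPy, is only illustrative: it assumes a Gaussian RBF kernel, and the function names (rbf_kernel, lssvm_fit, lssvm_predict, iv_estimate) are hypothetical. The IV part is shown in a simplified parametric form rather than the kernelized IV cost function derived in the paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix between two sample sets (assumed kernel choice).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # Standard LS-SVM regression dual: solve the linear system
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y],
    # i.e. regularized regression with an l2 loss on the prediction error.
    N = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # dual weights alpha, bias b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    # Predicted output: sum_i alpha_i * k(x, x_i) + b.
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def iv_estimate(Phi, Z, y):
    # Simplified parametric illustration of the IV idea: replace the
    # least-squares normal equations (Phi^T Phi) theta = Phi^T y with
    # (Z^T Phi) theta = Z^T y, where the instrument matrix Z is correlated
    # with the regressors Phi but uncorrelated with the noise.
    return np.linalg.solve(Z.T @ Phi, Z.T @ y)
```

In system identification, instruments are commonly built from delayed inputs or from the simulated output of an auxiliary noise-free model; as the summary notes, the variance-optimal choice of instrument remains an open problem.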


Similar Articles

Least squares support vector machines with tuning based on chaotic differential evolution approach applied to the identification of a thermal process

In the past decade, support vector machines (SVMs) have gained the attention of many researchers. SVMs are non-parametric supervised learning schemes that rely on statistical learning theory, which enables learning machines to generalize well to unseen data. SVMs refer to kernel-based methods that have been introduced as a robust approach to classification and regression problems, lately has han...

From Zero to Reproducing Kernel Hilbert Spaces in Twelve Pages or Less

Reproducing Kernel Hilbert Spaces (RKHS) have been found incredibly useful in the machine learning community. Their theory has been around for quite some time and has been used in the statistics literature for at least twenty years. More recently, their application to perceptron-style algorithms, as well as new classes of learning algorithms (especially large-margin or other regularization machi...

α and β Stability for Additively Regularized LS-SVMs via Convex Optimization

This paper considers the design of an algorithm that explicitly maximizes its own stability. The stability criterion, as often used for the construction of bounds on the generalization error of a learning algorithm, is proposed to compensate for overfitting. The primal-dual formulation characterizing Least Squares Support Vector Machines (LS-SVMs) and the additive regularization framework [13] ar...

Sparseness of Support Vector Machines

Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with non-vanishing coefficients are called support vectors. In this work we establish lower (asymptotic) bounds on the number of support vectors. Along the way we prove several results which are of great importance for the understanding of SVMs. In parti...

Least Squares Support Vector Machines: an Overview

Support Vector Machines (SVMs) constitute a powerful methodology for solving problems in nonlinear classification, function estimation and density estimation, which has also recently led to many new developments in kernel-based learning in general. In these methods one solves convex optimization problems, typically quadratic programs. We focus on Least Squares Support Vector Machines, which are reformulations t...


Journal title: Automatica

Volume: 54   Issue: -

Pages: 340-347

Publication date: 2015