Least Squares Support Vector Machines and Primal Space Estimation
Authors
Abstract
This paper presents a methodology for estimation in kernel-induced feature spaces, linking the primal-dual formulation of Least Squares Support Vector Machines (LS-SVMs) with classical statistical inference techniques in order to perform linear regression in primal space. This is done by computing a finite-dimensional approximation of the kernel-induced feature map using the Nyström technique. The methodology also admits a fixed-size formulation in which the support vectors are actively selected by entropy maximization, yielding a sparse approximation. Examples for different cases show good results.
Related Works
Least Squares Support Vector Machines: an Overview
Support Vector Machines are a powerful methodology for solving problems in nonlinear classification, function estimation and density estimation, and have recently led to many new developments in kernel-based learning in general. In these methods one solves convex optimization problems, typically quadratic programs. We focus on Least Squares Support Vector Machines which are reformulations t...
Componentwise Least Squares Support Vector Machines
This chapter describes componentwise Least Squares Support Vector Machines (LS-SVMs) for the estimation of additive models consisting of a sum of nonlinear components. The primal-dual derivations characterizing LS-SVMs for the estimation of the additive model result in a single set of linear equations with size growing in the number of data-points. The derivation is elaborated for the classific...
Optimal control by least squares support vector machines
Support vector machines have been very successful in pattern recognition and function estimation problems. In this paper we introduce the use of least squares support vector machines (LS-SVMs) for the optimal control of nonlinear systems. Linear and neural full static state feedback controllers are considered. The problem is formulated in such a way that it incorporates the N-stage optimal con...
Fixed-size kernel logistic regression for phoneme classification
Kernel logistic regression (KLR) is a popular nonlinear classification technique. Unlike empirical risk minimization approaches such as those employed by Support Vector Machines (SVMs), KLR yields probabilistic outcomes based on a maximum likelihood argument, which are particularly important in speech recognition. Different from other KLR implementations, we use a Nyström approximation to solve large...
Learning to Rank with Pairwise Regularized Least-Squares
Learning preference relations between objects of interest is one of the key problems in machine learning. Our approach for addressing this task is based on pairwise comparisons for estimation of overall ranking. In this paper, we propose a simple preference learning algorithm based on regularized least squares and describe it within the kernel methods framework. Our algorithm, that we call Rank...