Search results for: support vector machines
Number of results: 859,628
Support Vector Machines (SVMs) and Adaptive Boosting (AdaBoost) are two successful classification methods. They are essentially the same in that both try to maximize the minimal margin on a training set. In this work, we present a level platform on which to compare these two learning algorithms in terms of their test error, margin distribution, and generalization power. Two basic models of polynomials an...
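The "margin distribution" this abstract compares is the set of signed functional margins y_i * f(x_i) over the training set. A minimal sketch of inspecting that distribution for an SVM, using scikit-learn (the dataset and parameters are illustrative assumptions, not from the paper):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic two-class problem; labels come out in {0, 1}.
X, y = make_classification(n_samples=200, random_state=0)
clf = SVC(kernel="linear", C=1.0).fit(X, y)

# Signed functional margins y_i * f(x_i), mapping labels {0, 1} -> {-1, +1}.
# Positive margin = correctly classified; the minimal margin is the quantity
# both SVMs and AdaBoost (in the abstract's framing) try to maximize.
margins = (2 * y - 1) * clf.decision_function(X)
print("minimal margin:", margins.min())
print("fraction of points with margin >= 1:", np.mean(margins >= 1))
```

With a soft-margin C-SVM, a few training points may have negative margins; the distribution, not just its minimum, is what the comparison with AdaBoost examines.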
The standard 2-norm SVM is known for its good performance in two-class classification. In this paper, we consider the 1-norm SVM. We argue that the 1-norm SVM may have some advantage over the standard 2-norm SVM, especially when there are redundant noise features. We also propose an efficient algorithm that computes the whole solution path of the 1-norm SVM, hence facilitating adaptive selection of...
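The claimed advantage under redundant noise features can be illustrated (not reproduced: this is scikit-learn's linear SVM, not the paper's solution-path algorithm) by comparing 1-norm and 2-norm penalties on data where most features are pure noise:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

# 5 informative features plus 45 pure-noise features.
X, y = make_classification(n_samples=200, n_features=50, n_informative=5,
                           n_redundant=0, random_state=0)

# penalty="l1" requires the primal formulation (dual=False) in scikit-learn.
l1_svm = LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=10000).fit(X, y)
l2_svm = LinearSVC(penalty="l2", dual=False, C=0.1, max_iter=10000).fit(X, y)

# The 1-norm penalty drives many weights exactly to zero, performing implicit
# feature selection; the 2-norm penalty only shrinks weights toward zero.
print("non-zero weights, 1-norm SVM:", int(np.sum(l1_svm.coef_ != 0)))
print("non-zero weights, 2-norm SVM:", int(np.sum(l2_svm.coef_ != 0)))
```

The sparsity of the 1-norm solution is exactly what makes it attractive when many features are irrelevant.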
Is there anything worthwhile to learn about the new SVM algorithm, or does it fall into the category of "yet another algorithm," in which case readers should stop here and save their time for something more useful? In this short overview, I will try to argue that studying support-vector learning is very useful in two respects. First, it is quite satisfying from a theoretical point of view. SV...
Support vector machines are among the most popular machine learning methods for classification. Despite its great success, the SVM was originally designed for binary classification. Extensions to the multicategory case are important for general classification problems. In this article, we propose a new class of multicategory hinge loss functions, namely reinforced hinge loss functions. Both th...
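For context on the multicategory setting this abstract addresses: the most common extension of the binary SVM is a one-vs-rest decomposition. A minimal sketch with scikit-learn (this shows the standard decomposition approach, not the paper's reinforced hinge loss):

```python
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # 3 classes

# One binary SVM per class; prediction picks the class whose
# decision function scores highest.
clf = OneVsRestClassifier(SVC(kernel="linear", C=1.0)).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Loss functions defined directly on the multicategory problem, like those the paper proposes, avoid decomposing it into separate binary subproblems.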
We compare L1 and L2 soft-margin support vector machines from the standpoint of positive definiteness, the number of support vectors, and uniqueness and degeneracy of solutions. Since the Hessian matrix of L2 SVMs is positive definite, the number of support vectors for L2 SVMs is larger than or equal to that for L1 SVMs. For L1 SVMs, if there are multiple irreducible sets of support vectors,...
In Support Vector Machines (SVMs), a non-linear model is estimated by solving a Quadratic Programming (QP) problem. The quadratic cost function consists of a maximum-likelihood cost term with constant variance and a regularization term. By specifying a difference inclusion on the noise variance model, the maximum-likelihood term is adapted to the case of heteroskedastic noise, which aris...
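The heteroskedastic-noise idea can be loosely illustrated with per-sample weighting: down-weight observations with higher noise variance so they influence the fit less. This is a generic inverse-variance weighting sketch using scikit-learn's `SVR` (which accepts `sample_weight`), not the paper's modified QP formulation; the data and weighting scheme are assumptions for illustration:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.linspace(0, 4, 200)[:, None]
noise_sd = 0.1 + 0.5 * X.ravel()          # noise variance grows with x
y = np.sin(X).ravel() + rng.normal(0, noise_sd)

# Inverse-variance weights: noisier samples get less say in the fit.
w = 1.0 / noise_sd ** 2
weighted = SVR(kernel="rbf", C=1.0).fit(X, y, sample_weight=w)
plain = SVR(kernel="rbf", C=1.0).fit(X, y)
print("prediction at x=0.5:", weighted.predict([[0.5]])[0])
```

A constant-variance (unweighted) fit treats the noisy right-hand region as equally informative, which is exactly the assumption the abstract relaxes.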
Support vector machines (SVMs) construct decision functions that are linear combinations of kernel evaluations on the training set. The samples with non-vanishing coefficients are called support vectors. In this work we establish lower (asymptotic) bounds on the number of support vectors. Along the way we prove several results that are of great importance for the understanding of SVMs. In parti...
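The kernel-expansion structure the abstract describes, f(x) = Σ_i α_i y_i k(x_i, x) + b with the sum running only over support vectors, can be verified directly against scikit-learn's fitted attributes (the dataset and kernel parameters here are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=100, random_state=0)
clf = SVC(kernel="rbf", gamma=0.1, C=1.0).fit(X, y)

def rbf(A, B, gamma=0.1):
    # Pairwise RBF kernel k(a, b) = exp(-gamma * ||a - b||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Kernel evaluations against the support vectors only; dual_coef_ holds
# the products alpha_i * y_i, so the decision function is a weighted sum
# of these kernel columns plus the bias.
K = rbf(X, clf.support_vectors_)
manual = K @ clf.dual_coef_.ravel() + clf.intercept_[0]
assert np.allclose(manual, clf.decision_function(X))
print("samples:", len(X), "support vectors:", len(clf.support_vectors_))
```

Only the support vectors enter the sum; samples with vanishing dual coefficients contribute nothing, which is why bounding the number of support vectors bounds the complexity of the learned function.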
In this paper we propose a modified framework of support vector machines, called Oblique Support Vector Machines (OSVMs), to improve classification capability. The principle of OSVMs is to join an orthogonal vector into the weight vector in order to rotate the supporting hyperplanes. In this way, not only is the regularized risk function revised, but the constraint functions are also modified. ...
In this chapter we present a new learning algorithm, Leave-One-Out (LOO-) SVMs, and its generalization, Adaptive Margin (AM-) SVMs, inspired by a recent upper bound on the leave-one-out error proved for kernel classifiers by Jaakkola and Haussler. The new approach minimizes the expression given by the bound in an attempt to minimize the leave-one-out error. This gives a convex optimization problem...
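The leave-one-out error these algorithms bound can also be computed exactly by refitting n times, once per held-out sample. A minimal sketch of that brute-force estimate with scikit-learn (it computes the LOO error itself, not the Jaakkola–Haussler bound the chapter minimizes):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# One fit per sample: each score is 1 if the held-out point is
# classified correctly by the model trained on the other n-1 points.
scores = cross_val_score(SVC(kernel="rbf", gamma="scale"), X, y,
                         cv=LeaveOneOut())
print("exact leave-one-out error:", 1.0 - scores.mean())
```

The n refits make this estimate expensive, which is precisely why a cheap upper bound on the LOO error is worth minimizing directly.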