Algorithms for Sparse Support Vector Machines


Abstract

Many problems in classification involve huge numbers of irrelevant features. Variable selection reveals the crucial features, reduces the dimensionality of the feature space, and improves model interpretation. In the support vector machine literature, variable selection is achieved by ℓ1 penalties. These convex relaxations seriously bias parameter estimates toward 0 and tend to admit too many irrelevant features. The current article presents an alternative that replaces penalties by sparse-set constraints. Penalties still appear, but serve a different purpose. The proximal distance principle takes a loss function L(β) and adds the penalty (ρ/2) dist(β, S_k)^2 capturing the squared Euclidean distance of β from the sparsity set S_k, where at most k components of β are nonzero. If β_ρ represents the minimum of the objective f_ρ(β) = L(β) + (ρ/2) dist(β, S_k)^2, then β_ρ tends to the constrained minimum of L(β) over S_k as ρ tends to ∞. We derive two closely related algorithms to carry out this strategy. Our simulated and real examples vividly demonstrate how the algorithms achieve better sparsity without loss of classification power. Supplementary materials for this article are available online.
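The annealing strategy in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it pairs the distance penalty (ρ/2) dist(β, S_k)^2 with a squared hinge loss (an assumption; the paper's loss may differ), runs gradient descent at each penalty level, and increases ρ geometrically. The projection onto S_k simply keeps the k largest-magnitude coefficients. All function names and hyperparameters here are illustrative choices.

```python
import numpy as np

def project_sparse(beta, k):
    """Project beta onto S_k: zero out all but the k largest-magnitude entries."""
    out = np.zeros_like(beta)
    idx = np.argsort(np.abs(beta))[-k:]
    out[idx] = beta[idx]
    return out

def prox_dist_svm(X, y, k, rho_max=1e4, steps=200):
    """Sketch of the proximal distance strategy for a sparse linear classifier.

    Approximately minimizes f_rho(beta) = L(beta) + (rho/2) * dist(beta, S_k)^2
    with L the mean squared hinge loss, annealing rho upward so that the
    iterate is driven toward the sparsity set S_k.
    """
    n, p = X.shape
    beta = np.zeros(p)
    # Smoothness bound for the loss gradient; used to pick a safe step size.
    lip = 2.0 * np.linalg.norm(X, ord=2) ** 2 / n
    rho = 1.0
    while rho <= rho_max:
        lr = 1.0 / (lip + rho)  # shrink the step as the penalty stiffens
        for _ in range(steps):
            margins = np.maximum(1.0 - y * (X @ beta), 0.0)
            # Gradient of (1/n) * sum max(0, 1 - y_i x_i.beta)^2
            grad_loss = -(2.0 / n) * X.T @ (y * margins)
            # Gradient of (rho/2) * ||beta - P_k(beta)||^2
            grad_pen = rho * (beta - project_sparse(beta, k))
            beta -= lr * (grad_loss + grad_pen)
        rho *= 10.0
    # Final hard projection enforces the sparsity constraint exactly.
    return project_sparse(beta, k)
```

On a toy problem where only the first two of ten features are informative, the returned coefficient vector has at most k nonzero entries while retaining classification accuracy, illustrating the "sparsity without loss of power" claim.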


Similar Articles

Sparse Support Vector Machines

Support Vector Machines (SVMs) are state-of-the-art algorithms for classification in machine learning. However, the SVM formulation does not directly seek to find sparse solutions. In this work, we propose an alternate formulation that explicitly imposes sparsity. We show that the proposed technique is related to the standard SVM formulation and therefore shares similar theoretical guarantees. ...


Sparse Deconvolution Using Support Vector Machines

Sparse deconvolution is a classical subject in digital signal processing, having many practical applications. Support vector machine (SVM) algorithms show a series of characteristics, such as sparse solutions and implicit regularization, which make them attractive for solving sparse deconvolution problems. Here, a sparse deconvolution algorithm based on the SVM framework for signal processing i...


Learning Optimally Sparse Support Vector Machines

We show how to train SVMs with an optimal guarantee on the number of support vectors (up to constants), and with sample complexity and training runtime bounds matching the best known for kernel SVM optimization (i.e. without any additional asymptotic cost beyond standard SVM training). Our method is simple to implement and works well in practice.


Dimensionality Reduction via Sparse Support Vector Machines

We describe a methodology for performing variable ranking and selection using support vector machines (SVMs). The method constructs a series of sparse linear SVMs to generate linear models that can generalize well, and uses a subset of nonzero weighted variables found by the linear models to produce a final nonlinear model. The method exploits the fact that a linear SVM (no kernels) with ℓ1-nor...


STAGE-DISCHARGE MODELING USING SUPPORT VECTOR MACHINES

Establishment of rating curves is often required by hydrologists for flow estimates in streams, rivers, etc. Measurement of discharge in a river is a time-consuming, expensive, and difficult process, and the conventional approach of regression analysis of the stage-discharge relation does not provide encouraging results, especially during floods. ...



Journal

Journal title: Journal of Computational and Graphical Statistics

Year: 2022

ISSN: 1061-8600, 1537-2715

DOI: https://doi.org/10.1080/10618600.2022.2146697