Efficient Top-K Feature Selection Using Coordinate Descent Method


Abstract

Sparse learning based feature selection has been widely investigated in recent years. In this study, we focus on the l2,0-norm based feature selection, which is effective for exact top-k feature selection but challenging to optimize. To solve the general l2,0-norm constrained problems, we novelly develop a parameter-free optimization framework based on the coordinate descent (CD) method, termed CD-LSR. Specifically, we devise a skillful conversion from the original problem to solving one continuous matrix and one discrete selection matrix. Then the nontrivial l2,0-norm constraint can be solved efficiently by optimizing the selection matrix with the CD method. We impose the l2,0-norm constraint on the vanilla least squares regression (LSR) model and optimize it with CD-LSR. Extensive experiments exhibit the efficiency of CD-LSR, as well as its discrimination ability to identify informative features. More importantly, the versatility of CD-LSR facilitates its application to more sophisticated models. Based on the competitive performance of CD-LSR on the baseline LSR model, its satisfactory performance on more sophisticated models can be reasonably expected. The source MATLAB code is available at: https://github.com/solerxl/Code_For_AAAI_2023.
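
As a concrete illustration of the alternating scheme the abstract describes (a continuous least-squares subproblem paired with coordinate-wise updates of a discrete selection variable), here is a minimal Python sketch. It is an illustrative stand-in, not the authors' CD-LSR: the function name, the random initialization, and the single-swap acceptance rule are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def cd_topk_lsr(X, Y, k, n_swaps=50, seed=0):
    # Illustrative sketch only, NOT the authors' CD-LSR: alternate between
    # (a) a continuous subproblem, least squares on the selected columns,
    # and (b) coordinate-wise updates of a discrete 0/1 selection vector,
    # accepting a single-feature swap whenever it lowers the residual.
    n, d = X.shape
    rng = np.random.default_rng(seed)
    mask = np.zeros(d, dtype=bool)
    mask[rng.choice(d, size=k, replace=False)] = True  # random initial top-k

    def residual(m):
        W, *_ = np.linalg.lstsq(X[:, m], Y, rcond=None)  # continuous part
        return np.linalg.norm(X[:, m] @ W - Y) ** 2

    best = residual(mask)
    for _ in range(n_swaps):
        improved = False
        for j_out in np.flatnonzero(mask):        # discrete part: CD sweep
            for j_in in np.flatnonzero(~mask):
                trial = mask.copy()
                trial[j_out], trial[j_in] = False, True
                val = residual(trial)
                if val < best - 1e-12:
                    mask, best, improved = trial, val, True
                    break
            if improved:
                break
        if not improved:                          # no swap helps: local optimum
            break
    return np.flatnonzero(mask), best
```

Calling idx, err = cd_topk_lsr(X, Y, k=10) returns the indices of the k retained features and the final residual; the actual CD-LSR is reported to be parameter-free and far more efficient than this naive swap search.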

Similar Articles

Parallel Coordinate Descent Newton Method for Efficient $\ell_1$-Regularized Minimization

Recent years have witnessed advances in parallel algorithms for large-scale optimization problems. Notwithstanding their demonstrated success, existing algorithms that parallelize over features are usually limited by divergence issues under high parallelism or require data preprocessing to alleviate these problems. In this work, we propose a Parallel Coordinate Descent Newton algorithm using mult...

Feature Clustering for Accelerating Parallel Coordinate Descent

Large-scale ℓ1-regularized loss minimization problems arise in high-dimensional applications such as compressed sensing and high-dimensional supervised learning, including classification and regression problems. High-performance algorithms and implementations are critical to efficiently solving these problems. Building upon previous work on coordinate descent algorithms for ℓ1-regularized probl...
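
The serial building block that such parallel and clustered variants accelerate is cyclic coordinate descent with soft-thresholding for the lasso. The self-contained sketch below shows the generic textbook update; the function name and convergence test are my own choices rather than anything from the paper above.

```python
import numpy as np

def lasso_cd(X, y, lam, n_sweeps=100, tol=1e-8):
    # Cyclic coordinate descent for min_w 0.5*||X w - y||^2 + lam*||w||_1.
    # Names here (lasso_cd, tol) are illustrative, not from the paper.
    n, d = X.shape
    w = np.zeros(d)
    r = y.astype(float).copy()        # residual y - X @ w (w starts at 0)
    col_sq = (X ** 2).sum(axis=0)     # per-coordinate curvature ||x_j||^2
    for _ in range(n_sweeps):
        max_delta = 0.0
        for j in range(d):
            if col_sq[j] == 0.0:
                continue              # all-zero column, nothing to update
            # Correlation of column j with the partial residual.
            rho = X[:, j] @ r + col_sq[j] * w[j]
            # Closed-form one-dimensional minimizer: soft-thresholding.
            w_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            delta = w_new - w[j]
            if delta != 0.0:
                r -= delta * X[:, j]  # keep the cached residual in sync
                w[j] = w_new
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:           # a full sweep changed almost nothing
            break
    return w
```

Keeping the residual r in sync after each coordinate update is what makes a full pass cost O(nd), and it is also the shared state that parallel-over-features variants must guard against corrupting, which is where the divergence issues mentioned above come from.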

Top-k Supervise Feature Selection via ADMM for Integer Programming

Recently, structured sparsity-inducing feature selection has become a hot topic in machine learning and pattern recognition. Most sparsity-inducing feature selection methods are designed to rank all features by a certain criterion and then select the k top-ranked features, where k is an integer. However, the k top-ranked features are usually not the top k features and therefore may be a subo...
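
One standard way an ADMM splitting copes with the integer constraint is to isolate a binary selection vector whose update is a Euclidean projection onto {s in {0,1}^d : sum(s) = k}, which has a closed form. The sketch below shows only that projection step under my own naming; it is not code from the paper.

```python
import numpy as np

def project_binary_topk(v, k):
    # Euclidean projection of a real vector v onto {s in {0,1}^d : sum(s)=k}.
    # Setting s_j = 1 instead of 0 changes the squared distance by
    # (1 - v_j)^2 - v_j^2 = 1 - 2*v_j, so the projection sets the k
    # coordinates with the largest v_j to 1 and the rest to 0.
    s = np.zeros(len(v))
    s[np.argsort(v)[-k:]] = 1.0
    return s
```

Inside ADMM the input v would be the continuous variable plus the scaled dual, so the selected set can change across iterations instead of being frozen by a one-shot ranking of individual feature scores.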

Exact Top-k Feature Selection via l2,0-Norm Constraint

In this paper, we propose a novel robust and pragmatic feature selection approach. Unlike those sparse learning based feature selection methods which tackle the approximate problem by imposing sparsity regularization in the objective function, the proposed method only has one ℓ2,1-norm loss term with an explicit ℓ2,0-norm equality constraint. An efficient algorithm based on augmented Lagrangian...
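
Since the ℓ2,0 norm counts the nonzero rows of a weight matrix, the constraint set {W : ||W||_{2,0} <= k} admits an exact Euclidean projection: keep the k rows with the largest ℓ2 norms and zero the rest. The sketch below illustrates that projection as one plausible subproblem inside an augmented Lagrangian scheme; it is not the paper's actual algorithm.

```python
import numpy as np

def project_row_sparse(W, k):
    # Euclidean projection onto {W : ||W||_{2,0} <= k}, i.e. at most k
    # nonzero rows: rank rows by their l2 norms and keep the k largest.
    row_norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(row_norms)[-k:]   # indices of the k strongest rows
    P = np.zeros_like(W, dtype=float)
    P[keep] = W[keep]
    return P
```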

Penalized Bregman Divergence Estimation via Coordinate Descent

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. Recently, the coordinate descent (CD) algorithm was developed by Friedman et al. (2007) for penalized linear regression and penalized logistic regression and was shown to gain computational superiority. This paper explores...


Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2023

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v37i9.26258