A gradient-based optimization algorithm for LASSO
Authors
Abstract
LASSO is a useful method for achieving both shrinkage and variable selection simultaneously. The main idea of LASSO is to use an L1 constraint in the regularization step, which has been applied to various models such as wavelets, kernel machines, smoothing splines, and multiclass logistic models. We call such models with the L1 constraint generalized LASSO models. In this paper, we propose a new algorithm, the gradient LASSO algorithm, for generalized LASSO. The gradient LASSO algorithm is computationally more stable than QP-based algorithms because it does not require matrix inversions, and thus it can be applied more easily to high-dimensional data. Simulation results show that the proposed algorithm is fast enough for practical purposes and provides reliable results. To illustrate its computing power with high-dimensional data, we analyze multiclass microarray data using the proposed algorithm.
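To make the idea of a gradient-only, inversion-free method for the L1-constrained problem concrete, the following is a minimal sketch of a conditional-gradient (Frank-Wolfe) solver for the LASSO constraint set. It is an illustrative stand-in, not the authors' exact gradient LASSO algorithm; the function name, step-size rule, and stopping criterion are all assumptions.

```python
import numpy as np

def l1_constrained_lasso(X, y, s, n_iter=500):
    """Conditional-gradient sketch for
        min_b  ||y - X b||^2 / (2n)   subject to   ||b||_1 <= s.
    Each step uses only a gradient and a single coordinate update,
    so no matrix inversion is ever required."""
    n, p = X.shape
    b = np.zeros(p)
    for t in range(n_iter):
        grad = X.T @ (X @ b - y) / n       # gradient of the squared loss
        j = np.argmax(np.abs(grad))        # coordinate of steepest descent
        vertex = np.zeros(p)
        vertex[j] = -s * np.sign(grad[j])  # best vertex of the L1 ball
        step = 2.0 / (t + 2.0)             # standard Frank-Wolfe step size
        b = (1.0 - step) * b + step * vertex
    return b
```

Because every iterate is a convex combination of vertices of the L1 ball of radius s, the constraint ||b||_1 <= s holds throughout, which is the same feasibility property the abstract highlights for gradient-based methods.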
Similar resources
Sparse learning with duality gap guarantee
We propose a general regularized empirical risk minimization framework for sparse learning which accommodates popular regularizers such as lasso, group lasso, and the trace norm. Within this framework, we develop two optimization algorithms. The first method is based on squared penalties added to the empirical risk and is solved using a subgradient-based L-BFGS quasi-Newton method. The second m...
Gradient-based Ant Colony Optimization for Continuous Spaces
A novel version of Ant Colony Optimization (ACO) algorithms for solving continuous space problems is presented in this paper. The basic structure and concepts of the originally reported ACO are preserved and adaptation of the algorithm to the case of continuous space is implemented within the general framework. The stigmergic communication is simulated through considering certain direction vect...
Fast Overlapping Group Lasso
The group Lasso is an extension of the Lasso for feature selection on (predefined) non-overlapping groups of features. The non-overlapping group structure limits its applicability in practice. There have been several recent attempts to study a more general formulation, where groups of features are given, potentially with overlaps between the groups. The resulting optimization is, however, much ...
Accelerated Stochastic Gradient Method for Composite Regularization
Regularized risk minimization often involves nonsmooth optimization. This can be particularly challenging when the regularizer is a sum of simpler regularizers, as in the overlapping group lasso. Very recently, this is alleviated by using the proximal average, in which an implicitly nonsmooth function is employed to approximate the composite regularizer. In this paper, we propose a novel extens...