Penalized Bregman Divergence Estimation via Coordinate Descent

Authors

  • Chunming Zhang
  • Yi Chai
  • Zhengjun Zhang
Abstract

Variable selection via penalized estimation is appealing for dimension reduction. For penalized linear regression, Efron et al. (2004) introduced the LARS algorithm. More recently, Friedman et al. (2007) developed the coordinate descent (CD) algorithm for penalized linear regression and penalized logistic regression and showed it to be computationally superior. This paper extends the CD algorithm to penalized Bregman divergence (BD) estimation for a broader class of models, including not only the generalized linear model, which has been well studied in the penalization literature, but also the quasi-likelihood model, which has received less attention. A simulation study and a real data application illustrate the performance of the CD and LARS algorithms in regression estimation, variable selection, and classification when the number of explanatory variables is large relative to the sample size.
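To make the coordinate-wise update concrete, the following is a minimal sketch (not the authors' implementation) of cyclic coordinate descent for the simplest member of this family, lasso-penalized linear regression under the quadratic loss. Each coordinate is updated in closed form by soft-thresholding while the others are held fixed; the function name `lasso_cd` and all parameter names are illustrative.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for the lasso:
    minimizes (1/2n) * ||y - X b||^2 + lam * ||b||_1.
    A didactic sketch, not an optimized implementation."""
    n, p = X.shape
    b = np.zeros(p)
    sq = (X ** 2).sum(axis=0) / n   # per-coordinate curvature X_j'X_j / n
    r = y - X @ b                   # residual, maintained incrementally
    for _ in range(n_iter):
        for j in range(p):
            # partial residual correlation for coordinate j
            rho = X[:, j] @ r / n + sq[j] * b[j]
            # closed-form soft-thresholding update
            b_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / sq[j]
            r += X[:, j] * (b[j] - b_new)   # keep residual in sync
            b[j] = b_new
    return b
```

Maintaining the residual incrementally is what makes CD fast in practice: each coordinate update costs O(n) rather than recomputing the full fit.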


Similar Resources

Coordinate Descent Algorithms for Lasso Penalized Regression

Imposition of a lasso penalty shrinks parameter estimates toward zero and performs continuous model selection. Lasso penalized regression is capable of handling linear regression problems where the number of predictors far exceeds the number of cases. This paper tests two exceptionally fast algorithms for estimating regression coefficients with a lasso penalty. The previously known ℓ2 algorithm...


Penalized Bregman divergence for large-dimensional regression and classification.

Regularization methods are characterized by loss functions measuring data fits and penalty terms constraining model parameters. The commonly used quadratic loss is not suitable for classification with binary responses, whereas the log-likelihood function is not readily applicable to models where the exact distribution of observations is unknown or not fully specified. We introduce the penalized ...


Shape Constrained Density Estimation via Penalized Rényi Divergence

Shape constraints play an increasingly prominent role in nonparametric function estimation. While considerable recent attention has been focused on log concavity as a regularizing device in nonparametric density estimation, weaker forms of concavity constraints encompassing larger classes of densities have received less attention but offer some additional flexibility. Heavier tail beh...


A Coordinate Majorization Descent Algorithm for l1 Penalized Learning

The glmnet package by [1] is an extremely fast implementation of the standard coordinate descent algorithm for solving l1 penalized learning problems. In this paper, we consider a family of coordinate majorization descent algorithms for solving the l1 penalized learning problems by replacing each coordinate descent step with a coordinate-wise majorization descent operation. Numerical experiment...
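The majorization idea described above can be sketched for l1-penalized logistic regression: in place of an exact coordinate minimization, each step majorizes the logistic loss along coordinate j using the uniform Hessian bound 1/4, so the surrogate has a closed-form soft-thresholding solution. This is a hedged illustration of the general technique, not code from the cited paper; the function name `logistic_cmd` and its parameters are assumptions.

```python
import numpy as np

def logistic_cmd(X, y, lam, n_iter=300):
    """Coordinate majorization descent for l1-penalized logistic
    regression. The logistic loss curvature along coordinate j is
    bounded by M_j = X_j'X_j / (4n), giving a quadratic surrogate
    whose penalized minimizer is a soft-threshold. Didactic sketch."""
    n, p = X.shape
    b = np.zeros(p)
    M = (X ** 2).sum(axis=0) / (4.0 * n)  # majorizer curvature per coordinate
    eta = X @ b                           # linear predictor, kept in sync
    for _ in range(n_iter):
        for j in range(p):
            mu = 1.0 / (1.0 + np.exp(-eta))     # fitted probabilities
            g = X[:, j] @ (mu - y) / n          # partial gradient at b
            z = M[j] * b[j] - g
            # minimize the majorizing surrogate plus lam * |t|
            b_new = np.sign(z) * max(abs(z) - lam, 0.0) / M[j]
            eta += X[:, j] * (b_new - b[j])
            b[j] = b_new
    return b
```

The trade-off mirrors the one discussed in the excerpt: each majorization step is cheaper and simpler than an exact coordinate minimization, at the cost of more conservative (hence possibly more numerous) updates.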


Faster Coordinate Descent via Adaptive Importance Sampling

Coordinate descent methods employ random partial updates of decision variables in order to solve huge-scale convex optimization problems. In this work, we introduce new adaptive rules for the random selection of their updates. By adaptive, we mean that our selection rules are based on the dual residual or the primal-dual gap estimates and can change at each iteration. We theoretically character...


When Cyclic Coordinate Descent Outperforms Randomized Coordinate Descent

The coordinate descent (CD) method is a classical optimization algorithm that has seen a revival of interest because of its competitive performance in machine learning applications. A number of recent papers provided convergence rate estimates for its deterministic (cyclic) and randomized variants, which differ in the selection of update coordinates. These estimates suggest randomized coordinate de...


Journal title

volume 10
pages 125–140
publication date 2011-11
