The Group Dantzig Selector
Authors
Abstract
We introduce a new method, the group Dantzig selector, for high-dimensional sparse regression with group structure, together with a convincing theory of why exploiting the group structure can be beneficial. Under a group restricted isometry condition, we obtain a significantly improved nonasymptotic ℓ2-norm bound over the basis pursuit and the Dantzig selector, which ignore the group structure. To gain more insight, we also introduce a surprisingly simple and intuitive sparsity oracle condition to obtain a block ℓ1-norm bound, which is easily accessible to a broad audience in the machine learning community. Encouraging numerical results are also provided to support our theory.
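For context, the classical Dantzig selector solves an ℓ1-constrained program; a natural group analogue replaces the ℓ1 norm by a block ℓ2,1 norm and the ℓ∞ correlation constraint by a groupwise ℓ2 constraint. This is a sketch of the standard formulations; the paper's exact formulation may differ:

```latex
% Classical Dantzig selector (Candes and Tao, 2007):
\hat\beta = \arg\min_{\beta} \|\beta\|_1
\quad \text{s.t.} \quad \|X^\top (y - X\beta)\|_\infty \le \lambda .

% A natural group analogue with groups G_1, \dots, G_m
% (a sketch; the paper's formulation may differ):
\hat\beta = \arg\min_{\beta} \sum_{g=1}^{m} \|\beta_{G_g}\|_2
\quad \text{s.t.} \quad \max_{g} \|X_{G_g}^\top (y - X\beta)\|_2 \le \lambda .
```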
Similar references
DASSO: Connections Between the Dantzig Selector and Lasso
We propose a new algorithm, DASSO, for fitting the entire coefficient path of the Dantzig selector with a similar computational cost to the LARS algorithm that is used to compute the Lasso. DASSO efficiently constructs a piecewise linear path through a sequential simplex-like algorithm, which is remarkably similar to LARS. Comparison of the two algorithms sheds new light on the question of how ...
Dantzig selector homotopy with dynamic measurements
The Dantzig selector is a near ideal estimator for recovery of sparse signals from linear measurements in the presence of noise. It is a convex optimization problem that can be recast as a linear program (LP) for real data and solved with any LP solver. In this paper we present an alternative approach to solve the Dantzig selector which we call “Primal Dual pursuit” or “PD pursuit”. It is...
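The LP recast mentioned above can be sketched as follows. This is a minimal illustration using SciPy's generic LP solver, not the Primal Dual pursuit homotopy described in that paper; the function name `dantzig_selector_lp` and the toy data are ours:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector_lp(X, y, lam):
    """Solve  min ||beta||_1  s.t.  ||X^T (y - X beta)||_inf <= lam
    by splitting beta = u - v with u, v >= 0, giving a standard LP."""
    n, p = X.shape
    A = X.T @ X
    b = X.T @ y
    # Objective: sum(u) + sum(v) equals ||beta||_1 at the optimum.
    c = np.ones(2 * p)
    # |b - A(u - v)| <= lam  becomes two one-sided inequalities:
    #   [ A, -A] [u; v] <= b + lam
    #   [-A,  A] [u; v] <= lam - b
    A_ub = np.vstack([np.hstack([A, -A]), np.hstack([-A, A])])
    b_ub = np.concatenate([b + lam, lam - b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None))
    u, v = res.x[:p], res.x[p:]
    return u - v

# Toy example: noiseless recovery of a 1-sparse vector.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[0] = 2.0
y = X @ beta_true
beta_hat = dantzig_selector_lp(X, y, lam=0.1)
```

Since the noiseless truth is feasible, the solution's ℓ1 norm is at most that of `beta_true`, and for this well-conditioned design the estimate lands close to the true coefficient vector.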
The Double Dantzig
The Dantzig selector (Candes and Tao, 2007) is a new approach that has been proposed for performing variable selection and model fitting on linear regression models. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and Dantzig selector potentially do a good job of selecting the correct variables, several researcher...
Parallelism, uniqueness, and large-sample asymptotics for the Dantzig selector.
The Dantzig selector (Candès and Tao, 2007) is a popular ℓ1-regularization method for variable selection and estimation in linear regression. We present a very weak geometric condition on the observed predictors which is related to parallelism and, when satisfied, ensures the uniqueness of Dantzig selector estimators. The condition holds with probability 1, if the predictors are drawn from a co...
A Generalized Dantzig Selector with Shrinkage Tuning
The Dantzig selector performs variable selection and model fitting in linear regression. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and Dantzig selector potentially do a good job of selecting the correct variables, they tend to over-shrink the final coefficients. This results in an unfortunate trade-off. One ...