Search results for: lasso regressions
Number of results: 4601
Parameterized quadratic programming (Lasso) is a powerful tool for the recovery of sparse signals based on underdetermined observations contaminated by noise. In this paper, we study the problem of simultaneous sparsity pattern recovery and approximation recovery based on the Lasso. An extended Lasso method is proposed with the following main contributions: (1) we analyze the recovery accuracy ...
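As a concrete illustration of that setting, here is a minimal sketch of sparsity-pattern and approximation recovery with an off-the-shelf lasso solver; the design sizes and penalty level are illustrative assumptions, not values from the paper.

# Minimal sketch: recover a sparse coefficient vector from underdetermined,
# noisy observations with the Lasso. All sizes and the penalty level are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 50, 200, 5            # n < p: underdetermined design
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 3.0                  # k-sparse ground truth
y = X @ beta + 0.1 * rng.standard_normal(n)

fit = Lasso(alpha=0.1).fit(X, y)
support = np.flatnonzero(fit.coef_)
print("recovered support:", support)                       # sparsity pattern recovery
print("max coefficient error:", np.abs(fit.coef_ - beta).max())  # approximation recovery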
This paper concerns a class of group-lasso learning problems where the objective function is the sum of an empirical loss and the group-lasso penalty. For a class of loss functions satisfying a quadratic majorization condition, we derive a unified algorithm called groupwise-majorization-descent (GMD) for efficiently computing the solution paths of the corresponding group-lasso penalized learning ...
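A rough numpy sketch of one GMD cycle for the squared-error case, where the quadratic majorization constant for each group is the largest eigenvalue of X_g'X_g/n; the group layout, weights, and penalty level are illustrative assumptions, not the paper's implementation.

# One cycle of groupwise-majorization-descent for the group-lasso with
# squared-error loss. Each group update is a closed-form group soft-threshold.
import numpy as np

def gmd_pass(X, y, beta, groups, lam):
    """One sweep over groups; `groups` is a list of integer index arrays."""
    n = len(y)
    for g in groups:
        Xg = X[:, g]
        gamma = np.linalg.eigvalsh(Xg.T @ Xg / n).max()  # majorization constant
        grad = -Xg.T @ (y - X @ beta) / n                # gradient block for group g
        u = gamma * beta[g] - grad
        norm_u = np.linalg.norm(u)
        w = np.sqrt(len(g))                              # usual sqrt(group size) weight
        if norm_u > 0:
            # scaled group soft-threshold: minimizer of the majorized objective
            beta[g] = (u / gamma) * max(0.0, 1.0 - lam * w / norm_u)
        else:
            beta[g] = 0.0
    return beta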
Contents excerpt: 5 The LASSO; 5.1 Performance of Lasso estimates; 5.2 “Normal equations” for the LASSO solutions; 5.3 Facts about Lasso solutions. This short document presents the Dantzig Selector, first introd...
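Written out, the two objects that snippet names are standard: the lasso subgradient ("normal") equations and the Dantzig Selector program. This is the textbook form, with lambda and delta the respective tuning levels, not a quote from the document.

% Lasso ``normal equations'' (KKT conditions) and the Dantzig Selector.
\[
  \hat\beta_{\mathrm{lasso}} = \arg\min_\beta \tfrac{1}{2}\|y - X\beta\|_2^2 + \lambda \|\beta\|_1,
  \qquad
  X^\top\big(y - X\hat\beta\big) = \lambda\, \hat s, \quad \hat s \in \partial\|\hat\beta\|_1,
\]
\[
  \hat\beta_{\mathrm{DS}} = \arg\min_\beta \|\beta\|_1
  \quad\text{subject to}\quad \big\|X^\top (y - X\beta)\big\|_\infty \le \delta.
\]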
The application of the lasso is espoused in high-dimensional settings where only a small number of the regression coefficients are believed to be nonzero (i.e., the solution is sparse). Moreover, statistical properties of high-dimensional lasso estimators are often proved under the assumption that the correlation between the predictors is bounded. In this vein, coordinatewise methods, the most ...
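The coordinatewise methods mentioned above reduce, for the lasso, to cycling a one-dimensional soft-threshold update over the predictors. A minimal sketch follows, assuming each column of X is standardized so that x_j'x_j = n; the function names and sweep count are mine.

# Coordinate descent for the lasso under standardized columns (x_j'x_j = n).
# Each one-dimensional update is a soft-threshold of the univariate fit.
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=100):
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                      # running residual y - X @ beta
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * beta[j]    # remove coordinate j from the fit
            z = X[:, j] @ r / n       # univariate least-squares coefficient
            beta[j] = soft_threshold(z, lam)
            r -= X[:, j] * beta[j]    # put the updated coordinate back
    return beta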
We re-examine the original Group Lasso paper of Yuan and Lin (2007). The form of penalty in that paper seems to be designed for problems with uncorrelated features, but the statistical community has adopted it for general problems with correlated features. We show that for this general situation, a Group Lasso with a different choice of penalty matrix is generally more effective. We give insigh...
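For orientation, the penalty contrast at issue can be sketched as below: P_YL is the Yuan–Lin form with the usual sqrt(p_g) weights, and P_alt is a design-dependent penalty-matrix variant of the kind the abstract alludes to. This is my reading of the comparison, not the paper's notation.

% Original Yuan--Lin group-lasso penalty versus a design-dependent variant;
% p_g is the size of group g and X_g its column block.
\[
  P_{\mathrm{YL}}(\beta) = \lambda \sum_{g} \sqrt{p_g}\,\|\beta_g\|_2
  \qquad\text{vs.}\qquad
  P_{\mathrm{alt}}(\beta) = \lambda \sum_{g} \|X_g \beta_g\|_2.
\]

The second form depends on the within-group correlation structure through X_g, which is why it behaves differently for correlated features.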
The least absolute deviation (LAD) regression is a useful method for robust regression, and the least absolute shrinkage and selection operator (lasso) is a popular choice for shrinkage estimation and variable selection. In this article we combine these two classical ideas to produce the LAD-lasso. Compared with the LAD regression, LAD-lasso can do parameter estimation and variable selecti...
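Because LAD regression is median (0.5-quantile) regression, an L1-penalized median fit has exactly the LAD-lasso form of objective. A minimal sketch using scikit-learn's QuantileRegressor, whose alpha is the L1 penalty level; all data and values here are illustrative.

# LAD-lasso sketch: least absolute deviations plus an L1 penalty, fit as
# L1-penalized median regression. alpha is an illustrative penalty level.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(1)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.array([2.0, -1.5] + [0.0] * (p - 2))
y = X @ beta + rng.standard_t(df=2, size=n)   # heavy-tailed noise

lad_lasso = QuantileRegressor(quantile=0.5, alpha=0.05).fit(X, y)
print("selected variables:", np.flatnonzero(lad_lasso.coef_))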
Abstract: We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: Lasso selecting predictors and then modified Least Squares (mLS) or Ridge estimating their coefficients. First, we propose a valid inference procedure for parameter estimation based on parametric residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we der...
In this paper we study post-model selection estimators which apply ordinary least squares (ols) to the model selected by first-step penalized estimators, typically lasso. It is well known that lasso can estimate the nonparametric regression function at nearly the oracle rate, and is thus hard to improve upon. We show that the ols post-lasso estimator performs at least as well as lasso in terms of t...
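The two preceding abstracts share the same two-step construction: select a support with the lasso, then re-estimate the coefficients on that support without shrinkage. A minimal sketch of that construction; the penalty level and function name are illustrative.

# Two-step post-selection sketch shared by Lasso+mLS and OLS post-lasso:
# step 1 selects a support with the lasso, step 2 refits by ordinary least
# squares on the selected columns, removing the lasso's shrinkage bias.
import numpy as np
from sklearn.linear_model import Lasso

def ols_post_lasso(X, y, alpha=0.1):
    support = np.flatnonzero(Lasso(alpha=alpha).fit(X, y).coef_)
    if support.size == 0:
        return support, np.array([])
    coef, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
    return support, coef   # unshrunken coefficients on the selected support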
The least absolute shrinkage and selection operator (lasso) has been widely used in regression shrinkage and selection. In this article, we extend its application to the REGression model with AutoRegressive errors (REGAR). Two types of lasso estimators are carefully studied. The first is similar to the traditional lasso estimator with only two tuning parameters (one for regression coefficients ...
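For reference, the REGAR setup with a lasso penalty on both coefficient blocks can be written as below; this is my transcription of the standard form, with q error lags and separate tuning parameters for the regression and autoregressive coefficients.

% Regression with autoregressive errors (REGAR), lasso-penalized in both
% the regression coefficients beta and the AR error coefficients phi.
\[
  y_t = x_t^\top \beta + e_t, \qquad
  e_t = \sum_{j=1}^{q} \phi_j\, e_{t-j} + \varepsilon_t,
\]
\[
  \min_{\beta,\,\phi}\;
  \sum_{t=q+1}^{n}\Big( y_t - x_t^\top\beta - \sum_{j=1}^{q}\phi_j\big(y_{t-j} - x_{t-j}^\top\beta\big) \Big)^{2}
  + \lambda_1 \sum_{i}|\beta_i| + \lambda_2 \sum_{j}|\phi_j|.
\]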
We consider the problem of selecting functional variables using L1 regularization in a functional linear regression model with a scalar response and functional predictors in the presence of outliers. Since the LASSO is a special case of penalized least squares regression with an L1 penalty, it suffers from heavy-tailed errors and/or outliers in the data. Recently, the LAD regressio...
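A common way to make such functional problems computable is to expand each functional predictor on a finite basis and apply the robust L1-penalized fit to the basis scores. A rough sketch of that reduction; the basis, grid, and penalty level are assumptions of mine, not the article's method.

# Reduction for functional predictors: project each curve x_i(t), observed
# on a common grid, onto a small basis, then run a robust L1 fit (median
# regression with an L1 penalty) on the basis scores.
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(2)
n, m, K = 100, 50, 4                         # curves, grid points, basis size
t = np.linspace(0.0, 1.0, m)
basis = np.stack([t**k for k in range(K)])   # (K, m) monomial basis
curves = rng.standard_normal((n, m)).cumsum(axis=1) / np.sqrt(m)

scores = curves @ basis.T * (t[1] - t[0])    # (n, K) integral approximations
y = 2.0 * scores[:, 1] + rng.standard_t(df=2, size=n)   # heavy-tailed noise

fit = QuantileRegressor(quantile=0.5, alpha=0.05).fit(scores, y)
print("basis-score coefficients:", fit.coef_)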