FIRST: Combining forward iterative selection and shrinkage in high dimensional sparse linear regression
Abstract
We propose a new class of variable selection techniques for regression in high dimensional linear models, based on forward selection versions of the LASSO, adaptive LASSO and elastic net, respectively called the forward iterative regression and shrinkage technique (FIRST), adaptive FIRST and elastic FIRST. These methods seem to work effectively for extremely sparse high dimensional linear models. We exploit the fact that the LASSO, adaptive LASSO and elastic net have closed-form solutions when the predictor is one-dimensional. This explicit formula is then applied repeatedly, in an iterative fashion, to build the model until convergence. By carefully considering the relationship between estimators at successive stages, we develop fast algorithms to compute our estimators. The performance of the new estimators is compared with that of commonly used estimators in terms of predictive accuracy and variable selection errors.
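The abstract does not spell out the FIRST algorithm itself, so the following is only a minimal sketch of the general idea it describes: the closed-form (soft-thresholding) solution of the one-dimensional LASSO, applied greedily one predictor at a time against the current residual until no update improves the penalized objective. The function names, the standardization assumption, and the greedy update rule below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np


def univariate_lasso(x, r, lam):
    """Closed-form LASSO solution for a single predictor x and response r:
    argmin_b (1/(2n))*||r - b*x||^2 + lam*|b|  (soft-thresholding)."""
    n = len(r)
    z = x @ r / n
    scale = x @ x / n
    return np.sign(z) * max(abs(z) - lam, 0.0) / scale


def forward_iterative_fit(X, y, lam, max_iter=1000, tol=1e-8):
    """Illustrative greedy forward scheme: on each pass, solve the univariate
    LASSO of the partial residual on every predictor and keep only the single
    coordinate update that most reduces the penalized objective."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y - y.mean()          # centered response; columns of X assumed standardized
    for _ in range(max_iter):
        base_loss = resid @ resid / (2 * n)
        best_j, best_drop, best_b = None, 0.0, 0.0
        for j in range(p):
            partial = resid + X[:, j] * beta[j]      # residual with predictor j removed
            b_new = univariate_lasso(X[:, j], partial, lam)
            new_resid = partial - X[:, j] * b_new
            new_loss = new_resid @ new_resid / (2 * n)
            drop = (base_loss + lam * abs(beta[j])) - (new_loss + lam * abs(b_new))
            if drop > best_drop:
                best_j, best_drop, best_b = j, drop, b_new
        if best_j is None or best_drop < tol:
            break                 # no single-coordinate update helps: stop
        resid += X[:, best_j] * (beta[best_j] - best_b)
        beta[best_j] = best_b
    return beta
```

Under the same reading of the abstract, the adaptive and elastic variants would replace the soft-thresholding step with the corresponding one-dimensional adaptive LASSO or elastic net closed-form solution.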
Similar Articles
Robust Estimation in Linear Regression with Multicollinearity and Sparse Models
One of the factors affecting the statistical analysis of data is the presence of outliers. Methods that are not affected by outliers are called robust methods. Robust regression methods estimate the parameters of a regression model in the presence of outliers. Besides outliers, the linear dependency of regressor variables, which is called multicollinearity...
Improved Variable Selection with Forward-Lasso Adaptive Shrinkage
Recently, considerable interest has focused on variable selection methods in regression situations where the number of predictors, p, is large relative to the number of observations, n. Two commonly applied variable selection approaches are the Lasso, which computes highly shrunk regression coefficients, and Forward Selection, which uses no shrinkage. We propose a new approach, “Forward-Lasso Adaptive Shrinkage”...
Lazy lasso for local regression
Locally weighted regression is a technique that predicts the response for new data items from their neighbors in the training data set, where closer data items are assigned higher weights in the prediction. However, the original method may suffer from overfitting and fail to select the relevant variables. In this paper we propose combining a regularization approach with locally weighted regression...
Nonparametric Greedy Algorithms for the Sparse Learning Problem
This paper studies the forward greedy strategy in sparse nonparametric regression. For additive models, we propose an algorithm called additive forward regression; for general multivariate models, we propose an algorithm called generalized forward regression. Both algorithms simultaneously conduct estimation and variable selection in nonparametric settings for the high dimensional sparse learning...
Interpretable sparse SIR for functional data
This work focuses on the issue of variable selection in functional regression. Unlike most work in this framework, our approach does not select isolated points in the definition domain of the predictors, nor does it rely on the expansion of the predictors in a given functional basis. It provides an approach to select full intervals made of consecutive points. This feature improves the interpretability...