Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
Authors
Abstract
Contemporary statistical research frequently deals with problems involving a diverging number of parameters. For such problems, various shrinkage methods (e.g., the LASSO and the SCAD) have proved particularly useful for variable selection (Fan and Peng, 2004; Huang et al., 2007b). Nevertheless, the desirable performance of these shrinkage methods hinges heavily on an appropriate choice of the tuning parameters. With a fixed predictor dimension, Wang et al. (2007b) and Wang and Leng (2007) demonstrated that tuning parameters selected by a BIC-type criterion identify the true model consistently. In this work, similar results are extended to the situation with a diverging number of parameters, for both unpenalized and penalized estimators (Fan and Peng, 2004; Huang et al., 2007b). Consequently, our theoretical results enlarge not only the applicable scope of the traditional BIC-type criteria but also that of the shrinkage estimation methods themselves (Tibshirani, 1996; Fan and Li, 2001; Fan and Peng, 2004; Huang et al., 2007b).
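To make the idea concrete, the sketch below selects the LASSO tuning parameter by minimizing a BIC-type criterion of the form BIC(lambda) = n*log(RSS_lambda/n) + df_lambda*log(n), where df_lambda counts the nonzero coefficients, in the spirit of the Wang et al. (2007b) criterion. The coordinate-descent solver, the candidate grid, and the toy data are illustrative assumptions, not the paper's own implementation.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO for (1/2n)||y - Xb||^2 + lam*||b||_1.
    Assumes the columns of X are standardized (mean 0, variance 1)."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # equals 1 after standardization
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding predictor j
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            # soft-threshold update for coordinate j
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

def bic_select(X, y, lambdas):
    """Return (lambda, coefficients) minimizing the BIC-type criterion
    n*log(RSS/n) + df*log(n), with df = number of nonzero coefficients."""
    n = len(y)
    best = None
    for lam in lambdas:
        b = lasso_cd(X, y, lam)
        rss = ((y - X @ b) ** 2).sum()
        df = int((b != 0).sum())
        bic = n * np.log(rss / n) + df * np.log(n)
        if best is None or bic < best[0]:
            best = (bic, lam, b)
    return best[1], best[2]

# Toy example (hypothetical data): 3 true signals among 10 predictors.
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
X = (X - X.mean(axis=0)) / X.std(axis=0)
beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]
y = X @ beta + rng.standard_normal(n)
y = y - y.mean()

lam_hat, b_hat = bic_select(X, y, np.logspace(-3, 0, 20))
print("selected lambda:", lam_hat)
print("selected support:", np.nonzero(b_hat)[0])
```

Minimizing the BIC over the grid drives the small noise coefficients to zero while retaining the strong signals, which is the model-selection consistency property the abstract refers to.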
Similar References
Regression Coefficient and Autoregressive Order Shrinkage and Selection via Lasso
The least absolute shrinkage and selection operator (lasso) has been widely used in regression shrinkage and selection. In this article, we extend its application to the REGression model with AutoRegressive errors (REGAR). Two types of lasso estimators are carefully studied. The first is similar to the traditional lasso estimator with only two tuning parameters (one for regression coefficients ...
Model Selection for Correlated Data with Diverging Number of Parameters
High-dimensional longitudinal data arise frequently in biomedical and genomic research. It is important to select relevant covariates when the dimension of the parameters diverges as the sample size increases. We propose the penalized quadratic inference function to perform model selection and estimation simultaneously in the framework of a diverging number of regression parameters. The penalize...
Shrinkage Tuning Parameter Selection in Precision Matrices Estimation
Recent literature provides many computational and modeling approaches for covariance matrix estimation in penalized Gaussian graphical models, but relatively little study has been carried out on the choice of the tuning parameter. This paper tries to fill this gap by focusing on the problem of shrinkage parameter selection when estimating sparse precision matrices using the penalized likelih...
Consistent selection of tuning parameters via variable selection stability
Penalized regression models are popularly used in high-dimensional data analysis to conduct variable selection and model fitting simultaneously. While their success has been widely reported in the literature, their performance largely depends on the tuning parameters that balance the trade-off between model fitting and model sparsity. Existing tuning criteria mainly follow the route of minimizing the e...
Classic and Bayes Shrinkage Estimation in Rayleigh Distribution Using a Point Guess Based on Censored Data
Introduction: In classical methods of statistics, the parameter of interest is estimated based on a random sample using natural estimators such as maximum likelihood or unbiased estimators (sample information). In practice, the researcher has prior information about the parameter in the form of a point guess value. Information in the guess value is called nonsample information. Thomp...