Search results for: stein type shrinkage lasso
Number of results: 1,360,847
In this dissertation we studied asymptotic properties of shrinkage estimators, and compared their performance with absolute penalty estimators (APE) in linear and partially linear models (PLM). A robust shrinkage M-estimator is proposed for PLM, and asymptotic properties are investigated, both analytically and through simulation studies. In Chapter 2, we compared the performance of shrinkage an...
A new class of minimax Stein-type shrinkage estimators of a multivariate normal mean is studied where the shrinkage factor is based on an ℓp norm. The proposed estimators allow some but not all coordinates to be estimated by 0, thereby allowing sparsity as well as minimaxity. AMS 2000 subject classifications: Primary 62C20; secondary 62J07.
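The classic ℓ2-based James–Stein estimator that this family generalizes fits in a few lines of NumPy. A minimal sketch of the standard positive-part form (the abstract's ℓp variant would replace the squared norm in the shrinkage factor; function name and interface are illustrative, not the paper's):

```python
import numpy as np

def james_stein(x, sigma2=1.0):
    """Positive-part James-Stein shrinkage of one observation x ~ N(theta, sigma2 * I).

    Classic l2-based version: shrink x toward the origin by a data-dependent
    factor. The lp-norm variants described in the abstract modify this factor.
    """
    p = x.size
    if p < 3:
        return x.copy()  # shrinkage dominates least squares only for p >= 3
    factor = max(0.0, 1.0 - (p - 2) * sigma2 / np.sum(x ** 2))
    return factor * x
```

Note that this classic factor is strictly positive whenever the data norm is large, so no coordinate is ever estimated exactly by 0; obtaining sparsity is precisely what the ℓp-based shrinkage factor in the abstract adds.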
Adaptive ridge is a special form of ridge regression, balancing the quadratic penalization on each parameter of the model. This paper shows the equivalence between adaptive ridge and lasso (least absolute shrinkage and selection operator). This equivalence states that both procedures produce the same estimate. Least absolute shrinkage can thus be viewed as a particular quadratic penalization. F...
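The equivalence can be checked numerically in the simplest setting, an orthonormal design, where the lasso solution is coordinate-wise soft-thresholding: iteratively reweighting the ridge penalty by 1/|β_k| converges to the same estimate. A one-coordinate sketch under these assumptions (function names are hypothetical):

```python
import numpy as np

def soft_threshold(z, lam):
    # closed-form lasso solution for one coordinate under an orthonormal design
    return np.sign(z) * max(abs(z) - lam, 0.0)

def adaptive_ridge(z, lam, iters=200, eps=1e-12):
    """Iteratively reweighted ridge: quadratic penalty lam * beta^2 / |beta_k|.

    Each step solves a ridge problem whose weight is adapted to the previous
    iterate; the fixed point satisfies the lasso optimality condition.
    """
    beta = z
    for _ in range(iters):
        beta = z / (1.0 + lam / (abs(beta) + eps))  # ridge update with adapted weight
    return beta
```

The fixed point of the update satisfies β + λ·sign(β) = z, which is exactly the soft-thresholding solution, illustrating the paper's claim that least absolute shrinkage is a particular (adaptively weighted) quadratic penalization.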
The Lasso is a popular and computationally efficient procedure for automatically performing both variable selection and coefficient shrinkage in linear regression models. One limitation of the Lasso is that the same tuning parameter is used for both variable selection and shrinkage. As a result, it typically ends up selecting a model with too many variables to prevent over-shrinkage of the regr...
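The coupling described above is easiest to see in the orthonormal-design case, where the lasso solution soft-thresholds each OLS coefficient: the same λ that zeroes small coefficients also subtracts λ from every surviving one. A small NumPy illustration (coefficient values are made up for the example):

```python
import numpy as np

def lasso_orthonormal(ols, lam):
    """Lasso fit for an orthonormal design: soft-threshold each OLS coefficient."""
    return np.sign(ols) * np.maximum(np.abs(ols) - lam, 0.0)

ols = np.array([4.0, 3.0, 0.4, -0.3])    # two strong signals, two noise coefficients
small_lam = lasso_orthonormal(ols, 0.2)  # keeps the noise terms, lightly shrinks signals
big_lam = lasso_orthonormal(ols, 0.5)    # removes the noise, but signals lose a full 0.5
```

A λ small enough to avoid over-shrinking the strong coefficients leaves the noise coefficients in the model, while a λ large enough to remove them biases the signals, which is the trade-off the abstract points to.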
Smooth James-Stein thresholding-based estimators enjoy smoothness like ridge regression and perform variable selection like the lasso. They have added flexibility thanks to more than one regularization parameter (like the adaptive lasso), and the ability to select these parameters well thanks to an unbiased and smooth estimate of the risk. The motivation is a gravitational wave burst detection proble...
In a linear regression model with homoscedastic Normal noise, I consider James–Stein type shrinkage in the estimation of nuisance parameters associated with control variables. For at least three control variables and exogenous treatment, I show that the standard least-squares estimator is dominated with respect to squared-error loss in the treatment effect even among unbiased estimators and even...
The Reverse Stein Effect is identified and illustrated: A statistician who shrinks his/her data toward a point chosen without reliable knowledge about the underlying value of the parameter to be estimated, but based instead upon the observed data, will not be protected by the minimax property of shrinkage estimators such as that of James and Stein, but instead will likely incur a greater error th...
Estimating a covariance matrix is an important task in applications where the number of variables is larger than the number of observations. In the literature, shrinkage approaches for estimating a high-dimensional covariance matrix are employed to circumvent the limitations of the sample covariance matrix. A new family of nonparametric Stein-type shrinkage covariance estimators is proposed who...
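The general Stein-type recipe these estimators build on, linear shrinkage of the sample covariance toward a structured target, can be sketched briefly. A minimal version with a user-chosen weight and a scaled-identity target (the paper's estimators choose the weight nonparametrically from the data; names here are illustrative):

```python
import numpy as np

def shrink_cov(X, alpha):
    """Linear shrinkage of the sample covariance toward a scaled identity target.

    X is an (n, p) data matrix; alpha in (0, 1] is the shrinkage weight.
    For p > n the sample covariance S is singular, but the shrunk estimate
    is positive definite whenever alpha > 0 and the data are non-degenerate.
    """
    n, p = X.shape
    S = np.cov(X, rowvar=False)   # sample covariance (singular when p > n)
    mu = np.trace(S) / p          # average variance -> target mu * I
    return (1.0 - alpha) * S + alpha * mu * np.eye(p)
```

This is the standard way shrinkage circumvents the singularity of the sample covariance in the p > n regime the abstract describes: the identity component lifts every eigenvalue away from zero.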
The zero-inflated negative binomial model is an appropriate choice for count response variables exhibiting excess zeros and over-dispersion simultaneously. This paper addresses parameter estimation in the zero-inflated negative binomial model when there are many parameters, some of which have no influence on the response variable. We propose estimators based on linear shrinkage, pretest, shrinkage pretest, Stein-type, and positive Stei...
In Kuriki and Takemura (1997a) we established a general theory of James-Stein type shrinkage to convex sets with smooth boundary. In this paper we show that our results can be generalized to the case where shrinkage is toward smooth non-convex cones. A primary example of this shrinkage is descriptive principal component analysis, where one shrinks small singular values of the data matrix. Here ...
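The principal-component example mentioned above amounts to shrinking the spectrum of the data matrix. A minimal sketch of singular-value soft-thresholding (one common way to shrink small singular values; this is an illustration, not the paper's exact estimator):

```python
import numpy as np

def shrink_singular_values(A, tau):
    """Soft-threshold the singular values of A, sending small ones to 0.

    Small singular values are removed entirely, so the result has reduced
    rank; surviving singular values are each shrunk by tau.
    """
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```

Shrinking toward the set of low-rank matrices is a non-convex target, which is the kind of smooth non-convex cone the paper's generalization covers.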