Search results for: penalized regression

Number of results: 319,670

Journal: Computational Statistics & Data Analysis, 2009
Jiguo Cao, James O. Ramsay

We propose the generalized profiling method to estimate the multiple regression functions in the framework of penalized spline smoothing, where the regression functions and the smoothing parameter are estimated in two nested levels of optimization. The corresponding gradients and Hessian matrices are worked out analytically, using the Implicit Function Theorem if necessary, which leads to fast and...
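As a rough illustration of the two nested levels described above, the sketch below fits a penalized B-spline smoother in an inner penalized least-squares step and chooses the smoothing parameter in an outer loop. It is not the paper's generalized profiling method: the analytic gradients and the Implicit Function Theorem step are replaced by a simple GCV grid search. It assumes NumPy and SciPy >= 1.8; all names are illustrative.

```python
import numpy as np
from scipy.interpolate import BSpline

# Simulated data: noisy observations of a smooth function on [0, 1].
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 200))
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=x.size)

# Cubic B-spline design matrix on a clamped knot sequence (SciPy >= 1.8).
t = np.concatenate((np.zeros(3), np.linspace(0, 1, 20), np.ones(3)))
B = BSpline.design_matrix(x, t, k=3).toarray()

# Second-order difference penalty on the spline coefficients (P-spline style).
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)
P = D.T @ D

def inner_fit(lam):
    """Inner level: penalized least squares for a fixed smoothing parameter."""
    return np.linalg.solve(B.T @ B + lam * P, B.T @ y)

def gcv(lam):
    """Outer level: generalized cross-validation score as a function of lam."""
    S = B @ np.linalg.solve(B.T @ B + lam * P, B.T)   # smoother ("hat") matrix
    resid = y - S @ y
    edf = np.trace(S)                                  # effective degrees of freedom
    return (resid @ resid / len(y)) / (1 - edf / len(y)) ** 2

# Outer optimization: here a plain grid search over the smoothing parameter.
lams = np.logspace(-4, 2, 30)
best_lam = lams[np.argmin([gcv(l) for l in lams])]
fitted = B @ inner_fit(best_lam)
```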

Journal: Journal of the American Statistical Association, 2009
Hua Liang, Runze Li

This article focuses on variable selection for partially linear models when the covariates are measured with additive errors. We propose two classes of variable selection procedures, penalized least squares and penalized quantile regression, using the nonconvex penalized principle. The first procedure corrects the bias in the loss function caused by the measurement error by applying the so-call...
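The nonconvex-penalty idea behind these procedures can be illustrated with the closed-form SCAD thresholding rule of Fan & Li (2001), which applies to penalized least squares under an orthonormal design. This is only a minimal sketch of that general principle; it does not include the partially linear structure or the measurement-error bias correction discussed in the abstract, and the toy data and tuning value are arbitrary.

```python
import numpy as np

def scad_threshold(z, lam, a=3.7):
    """Closed-form SCAD thresholding (Fan & Li, 2001) applied to per-coordinate
    least-squares estimates z under an orthonormal design."""
    z = np.asarray(z, dtype=float)
    out = np.empty_like(z)
    small = np.abs(z) <= 2 * lam                      # soft-thresholding region
    mid = (np.abs(z) > 2 * lam) & (np.abs(z) <= a * lam)
    large = np.abs(z) > a * lam                       # no shrinkage for large effects
    out[small] = np.sign(z[small]) * np.maximum(np.abs(z[small]) - lam, 0.0)
    out[mid] = ((a - 1) * z[mid] - np.sign(z[mid]) * a * lam) / (a - 2)
    out[large] = z[large]
    return out

# Toy example: sparse coefficients, orthonormal design so the rule is exact.
rng = np.random.default_rng(1)
n, p = 200, 10
X, _ = np.linalg.qr(rng.normal(size=(n, p)))         # orthonormal columns
beta = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta + rng.normal(scale=0.5, size=n)

z = X.T @ y                                           # per-coordinate OLS estimates
beta_hat = scad_threshold(z, lam=0.75)
print(np.round(beta_hat, 2))                          # small coefficients shrunk to zero
```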

Journal: Statistical Applications in Genetics and Molecular Biology, 2008
A. Geert Heidema, Nico Nagelkerke

To discriminate between breast cancer patients and controls, we used a three-step approach to obtain our decision rule. First, we ranked the mass/charge values using random forests, because they generate importance indices that take possible interactions into account. We observed that the top ranked variables consisted of highly correlated contiguous mass/charge values, which were grouped in the...
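A minimal sketch of the first two steps on synthetic data, assuming scikit-learn. The abstract's grouping rule is cut off above, so the correlation-based grouping below is an illustrative stand-in, and the 0.8 threshold is arbitrary.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for mass/charge intensities: blocks of contiguous,
# highly correlated features; only the first block is informative.
rng = np.random.default_rng(2)
n, n_blocks, block = 200, 10, 5
latent = rng.normal(size=(n, n_blocks))
X = np.repeat(latent, block, axis=1) + 0.1 * rng.normal(size=(n, n_blocks * block))
y = (latent[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)

# Step 1: rank variables with random forest importance indices.
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
order = np.argsort(rf.feature_importances_)[::-1]

# Step 2: walk down the ranking and merge a variable into an existing group
# when it is highly correlated with that group's first member.
corr = np.corrcoef(X, rowvar=False)
groups, threshold = [], 0.8
for j in order[:20]:                      # top-ranked variables only
    for g in groups:
        if abs(corr[j, g[0]]) >= threshold:
            g.append(j)
            break
    else:
        groups.append([j])

print(groups)  # contiguous indices tend to land in the same group
```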

2008
Nicole Krämer, Anne-Laure Boulesteix, Gerhard Tutz

We propose a novel framework that combines penalization techniques with Partial Least Squares (PLS). We focus on two important applications. (1) We combine PLS with a roughness penalty to estimate high-dimensional regression problems with functional predictors and scalar response. (2) Starting with an additive model, we expand each variable in terms of a generous number of B-Spline basis functi...
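A rough sketch of application (2), assuming scikit-learn >= 1.0: each predictor is expanded in a B-spline basis and the enlarged design is then reduced with PLS components. The explicit roughness penalty of the proposed framework is not reproduced here; the sketch only mirrors the basis-expansion-plus-PLS structure on a toy additive signal.

```python
import numpy as np
from sklearn.preprocessing import SplineTransformer
from sklearn.cross_decomposition import PLSRegression
from sklearn.pipeline import make_pipeline

# Synthetic additive signal in three predictors.
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(300, 3))
y = np.sin(np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.3 * rng.normal(size=300)

# Expand each variable in a generous B-spline basis, then compress the
# resulting high-dimensional design with a few PLS components.
model = make_pipeline(
    SplineTransformer(n_knots=10, degree=3),   # per-variable B-spline expansion
    PLSRegression(n_components=4),
)
model.fit(X, y)
print(model.score(X, y))                        # in-sample R^2 of the fit
```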

Journal: Biometrika, 2010
Chunming Zhang, Yuan Jiang, Yi Chai

Regularization methods are characterized by loss functions measuring data fits and penalty terms constraining model parameters. The commonly used quadratic loss is not suitable for classification with binary responses, whereas the loglikelihood function is not readily applicable to models where the exact distribution of observations is unknown or not fully specified. We introduce the penalized ...
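The contrast drawn above can be made concrete with a small experiment comparing a ridge-penalized quadratic loss (treating the 0/1 labels as continuous responses) against the penalized log-likelihood (deviance) loss of logistic regression. This only illustrates the two familiar losses mentioned in the abstract, not the paper's own penalized loss (which the snippet cuts off), and it assumes scikit-learn.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Binary responses generated from a sparse logistic model.
rng = np.random.default_rng(4)
X = rng.normal(size=(400, 20))
beta = np.concatenate([np.array([2.0, -1.5, 1.0]), np.zeros(17)])
prob = 1 / (1 + np.exp(-(X @ beta)))
y = rng.binomial(1, prob)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Quadratic loss + L2 penalty: regresses the 0/1 labels as if continuous.
ridge = Ridge(alpha=1.0).fit(X_tr, y_tr)
acc_quadratic = accuracy_score(y_te, (ridge.predict(X_te) > 0.5).astype(int))

# Log-likelihood (deviance) loss + L2 penalty: penalized logistic regression.
logit = LogisticRegression(C=1.0, penalty="l2").fit(X_tr, y_tr)
acc_deviance = accuracy_score(y_te, logit.predict(X_te))

print(acc_quadratic, acc_deviance)
```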

2013
Cheryl J. Flynn, Jeffrey S. Simonoff

It has been shown that AIC-type criteria are asymptotically efficient selectors of the tuning parameter in non-concave penalized regression methods under the assumption that the population variance is known or that a consistent estimator is available. We relax this assumption to prove that AIC itself is asymptotically efficient and we study its performance in finite samples. In classical regres...
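A minimal sketch of AIC-based selection of the tuning parameter, using scikit-learn's LassoLarsIC. Note that the lasso penalty is convex rather than non-concave, so this only illustrates the idea of AIC as a selector of the penalty level, not the exact setting studied in the paper.

```python
import numpy as np
from sklearn.linear_model import LassoLarsIC

# Sparse linear model.
rng = np.random.default_rng(5)
n, p = 150, 30
X = rng.normal(size=(n, p))
beta = np.concatenate([np.array([3.0, -2.0, 1.5]), np.zeros(p - 3)])
y = X @ beta + rng.normal(size=n)

# AIC picks the penalty level along the lasso path; no held-out data or
# cross-validation is required, only an estimate of the error variance.
model = LassoLarsIC(criterion="aic").fit(X, y)
print(model.alpha_)                     # selected tuning parameter
print(np.flatnonzero(model.coef_))      # indices of the retained variables
```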
