Degrees of Freedom Tests for Smoothing Splines

Author

  • Eva Cantoni
Abstract

When using smoothing splines to estimate a function, the user faces the problem of choosing the smoothing parameter. Several techniques are available for selecting this parameter according to certain optimality criteria. Here, we take a different point of view and propose a technique for choosing between two alternatives, for example allowing for two different levels of degrees of freedom. The problem is addressed in the framework of a mixed-effects model, whose assumptions ensure that the resulting estimator is unbiased. A likelihood-ratio-type test statistic is proposed, and its exact distribution is derived. Tests of linearity and overall effect follow directly. We then extend this idea to additive models, where it provides a more attractive alternative than multi-parameter optimisation, and where it gives exact distributional results that can be used in an analysis-of-deviance-type approach. Examples on real data and a simulation study of level and power complete the paper.
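The core idea of the abstract, comparing two fits at different levels of effective degrees of freedom via a likelihood-ratio-type statistic, can be sketched numerically. The sketch below is not the paper's mixed-effects construction: it uses a penalized B-spline (P-spline) smoother as a stand-in for a smoothing spline, with the knot grid, penalty order, and the two lambda values chosen arbitrarily for illustration. Effective degrees of freedom are taken as the trace of the hat matrix of each linear smoother.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.3, n)

# Cubic B-spline basis on an arbitrary knot grid (P-spline approximation
# to a smoothing spline; a modeling assumption, not the paper's setup).
k = 3
interior = np.linspace(0, 1, 20)[1:-1]
t = np.concatenate(([0.0] * (k + 1), interior, [1.0] * (k + 1)))
B = BSpline.design_matrix(x, t, k).toarray()

# Second-order difference penalty on the spline coefficients.
D = np.diff(np.eye(B.shape[1]), n=2, axis=0)
P = D.T @ D

def fit(lam):
    # Hat matrix of the penalized least-squares smoother S_lam.
    H = B @ np.linalg.solve(B.T @ B + lam * P, B.T)
    yhat = H @ y
    df = np.trace(H)                 # effective degrees of freedom
    rss = np.sum((y - yhat) ** 2)    # residual sum of squares
    return df, rss

df0, rss0 = fit(lam=10.0)    # smoother fit: fewer degrees of freedom
df1, rss1 = fit(lam=0.01)    # rougher fit: more degrees of freedom

# Likelihood-ratio-type statistic comparing the two df levels,
# scaled by an error-variance estimate from the richer fit.
sigma2 = rss1 / (n - df1)
T = (rss0 - rss1) / sigma2
print(round(df0, 1), round(df1, 1), round(T, 1))
```

In the paper the null distribution of the statistic is derived exactly under the mixed-effects formulation; here one would have to calibrate `T` by simulation or an approximate chi-square reference.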


Similar resources

Rejoinder: Boosting Algorithms: Regularization, Prediction and Model Fitting

We are grateful that Hastie points out the connection to degrees of freedom for LARS which leads to another—and often better—definition of degrees of freedom for boosting in generalized linear models. As Hastie writes and as we said in the paper, our formula for degrees of freedom is only an approximation: the cost of searching, for example, for the best variable in componentwise linear least s...


Inference Using Shape-Restricted Regression Splines

Regression splines are smooth, flexible, and parsimonious nonparametric function estimators. They are known to be sensitive to knot number and placement, but if assumptions such as monotonicity or convexity may be imposed on the regression function, the shape-restricted regression splines are robust to knot choices. Monotone regression splines were introduced by Ramsay [Statist. Sci. 3 (1998) 42...


Discussion of “Boosting Algorithms: Regularization, Prediction and Model Fitting” by Peter Bühlmann and Torsten Hothorn

We congratulate the authors (hereafter BH) for an interesting take on the boosting technology, and for developing a modular computational environment in R for exploring their models. Their use of low-degree-of-freedom smoothing splines as a base learner provides an interesting approach to adaptive additive modeling. The notion of “Twin Boosting” is interesting as well; besides the adaptive lass...


Hybrid Adaptive Splines

An adaptive spline method for smoothing is proposed which combines features from both regression spline and smoothing spline approaches. One of its advantages is the ability to vary the amount of smoothing in response to the inhomogeneous curvature of true functions at different locations. This method can be applied to many multivariate function estimation problems, which is illustrated in this pap...


Comparing Smoothing Techniques for Fitting the Nonlinear Effect of Covariate in Cox Models

BACKGROUND AND OBJECTIVE The Cox model is a popular model in survival analysis which assumes a linear effect of the covariate on the log hazard function. However, continuous covariates can affect the hazard through more complicated nonlinear functional forms, and Cox models with continuous covariates are therefore prone to misspecification due to not fitting the correct functional form for continuous covar...




Journal title:

Volume   Issue 

Pages  -

Publication date: 2000