Sliced Inverse Moment Regression Using Weighted Chi-Squared Tests for Dimension Reduction∗

Authors

  • Zhishen Ye
  • Jie Yang
Abstract

We propose a new class of dimension reduction methods based on the first two inverse moments, called Sliced Inverse Moment Regression (SIMR), and develop corresponding weighted chi-squared tests for the dimension of the regression. In essence, SIMR methods are linear combinations of Sliced Inverse Regression (SIR) and a method built on a new candidate matrix designed to recover the entire inverse second-moment subspace. Theoretically, SIMR, like Sliced Average Variance Estimate (SAVE), is more capable of recovering the complete central dimension reduction subspace than SIR and Principal Hessian Directions (pHd), and can therefore substitute for SIR, pHd, SAVE, or any linear combination of them at a theoretical level. A simulation study shows that SIMR with the weighted chi-squared test may have consistently greater power than SIR, pHd, and SAVE.
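To make the construction concrete, the following is a minimal sketch, not the paper's exact procedure: it assumes the SIMR candidate matrix is a weighted combination alpha * M_SIR + (1 - alpha) * M_2 of the SIR matrix (built from slice means of standardized predictors) and a second-inverse-moment matrix (built from slice second moments). The weighting, the exact form of M_2, and the accompanying test in the paper may differ; only the slicing mechanics are illustrated.

```python
# Hedged sketch of a SIMR-style candidate matrix (assumed form, not the
# paper's definition).  Requires numpy; assumes Sigma is well conditioned.
import numpy as np

def simr_directions(X, y, n_slices=10, alpha=0.5, n_dir=2):
    n, p = X.shape
    # Standardize predictors: Z = (X - mean) Sigma^{-1/2}
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt

    # Slice the data on the ordered response
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)

    M_sir = np.zeros((p, p))  # first inverse moment (SIR) candidate matrix
    M_2 = np.zeros((p, p))    # second inverse moment candidate matrix (assumed form)
    for idx in slices:
        ph = len(idx) / n
        Zh = Z[idx]
        mh = Zh.mean(axis=0)            # E[Z | slice]
        Sh = (Zh.T @ Zh) / len(idx)     # E[Z Z' | slice]
        M_sir += ph * np.outer(mh, mh)
        M_2 += ph * (Sh - np.eye(p)) @ (Sh - np.eye(p))

    # SIMR: a weighted combination of the two candidate matrices
    M = alpha * M_sir + (1 - alpha) * M_2
    w, V = np.linalg.eigh(M)
    B = Sigma_inv_sqrt @ V[:, np.argsort(w)[::-1][:n_dir]]  # back to X scale
    return B / np.linalg.norm(B, axis=0)
```

With X of shape (n, p) and a univariate response y, simr_directions(X, y) returns a p-by-n_dir matrix whose columns estimate directions in the central dimension reduction subspace; setting alpha = 1 reduces the sketch to plain SIR, and alpha = 0 uses only the second-moment matrix.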


Similar Articles

Sufficient Dimension Reduction via Inverse Regression: A Minimum Discrepancy Approach

A family of dimension-reduction methods, the inverse regression (IR) family, is developed by minimizing a quadratic objective function. An optimal member of this family, the inverse regression estimator (IRE), is proposed, along with inference methods and a computational algorithm. The IRE has at least three desirable properties: (1) Its estimated basis of the central dimension reduction subspa...


Determining the dimension in Sliced

Sliced inverse regression and principal Hessian directions (Li, 1991, 1992) aim to reduce the dimensionality of regression problems. An important step in these methods is the determination of a suitable dimension. While statistical tests based on the nullity of eigenvalues are usually suggested, we here focus on the quality of the estimation of the effective dimension reduction (edr) spaces. Essential...
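For reference, the eigenvalue-nullity test mentioned here is, in its standard form, Li's (1991) sequential chi-squared test for the SIR dimension: under normal predictors, n times the sum of the p - d smallest eigenvalues of the SIR matrix is approximately chi-squared with (p - d)(H - d - 1) degrees of freedom, where H is the number of slices. The sketch below implements that sequential procedure; the function name and interface are illustrative only.

```python
# Minimal sketch of the sequential chi-squared test for the SIR dimension
# (Li, 1991), assuming normally distributed predictors.
import numpy as np
from scipy.stats import chi2

def sir_dimension_test(eigenvalues, n, n_slices, level=0.05):
    """Return the smallest d for which 'dimension = d' is not rejected."""
    lam = np.sort(eigenvalues)[::-1]
    p = len(lam)
    for d in range(p):
        stat = n * lam[d:].sum()               # n * sum of smallest p - d eigenvalues
        df = (p - d) * (n_slices - d - 1)      # degrees of freedom of the reference chi-squared
        if df <= 0 or chi2.sf(stat, df) > level:
            return d
    return p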


Testing Predictor Contributions in Sufficient Dimension Reduction

We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower...


Efficiency loss and the linearity condition in dimension reduction

Linearity, sometimes jointly with constant variance, is routinely assumed in the context of sufficient dimension reduction. It is well understood that, when these conditions do not hold, blindly using them may lead to inconsistency in estimating the central subspace and the central mean subspace. Surprisingly, we discover that even if these conditions do hold, using them will bring efficiency l...


Resistant Dimension Reduction

Existing dimension reduction (DR) methods such as ordinary least squares (OLS) and sliced inverse regression (SIR) often perform poorly in the presence of outliers. Ellipsoidal trimming can be used to create outlier resistant DR methods that can also give useful results when the assumption of linearly related predictors is violated. Theory for SIR and OLS is reviewed, and it is shown that sever...
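Since this abstract only names the technique, here is a minimal sketch of the simplest form of ellipsoidal trimming before a DR fit, assuming classical Mahalanobis distances and an OLS fit on the retained cases; the trimming fractions and robust location/scatter estimators used in the paper may differ.

```python
# Hedged sketch: drop the fraction of cases with the largest Mahalanobis
# distances (ellipsoidal trimming), then fit OLS on the remainder.
import numpy as np

def trimmed_ols_direction(X, y, trim=0.1):
    mu = X.mean(axis=0)
    Sigma_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', X - mu, Sigma_inv, X - mu)  # squared Mahalanobis distances
    keep = d2 <= np.quantile(d2, 1 - trim)                    # retain the central ellipsoid
    Xk, yk = X[keep], y[keep]
    # OLS slope on the trimmed data estimates one direction in the central subspace
    design = np.column_stack([np.ones(keep.sum()), Xk])
    beta = np.linalg.lstsq(design, yk, rcond=None)[0][1:]
    return beta / np.linalg.norm(beta)
```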




Publication year: 2007