MM algorithms for distance covariance based sufficient dimension reduction and sufficient variable selection

Authors

Abstract

Sufficient dimension reduction (SDR) using distance covariance (DCOV) was recently proposed as an approach to dimension-reduction problems. Compared with other SDR methods, it is model-free: it requires neither estimation of the link function nor any particular distributional assumptions on the predictors (see Sheng and Yin, 2013, 2016). However, the DCOV-based method involves optimizing a nonsmooth, nonconvex objective function over the Stiefel manifold. To tackle this numerical challenge, we equivalently reformulate the original problem as a DC (difference of convex functions) program and construct an iterative algorithm based on the majorization-minimization (MM) principle. At each step of the MM algorithm, we inexactly solve the quadratic subproblem on the Stiefel manifold by taking one iteration of Riemannian Newton's method. The algorithm can also be readily extended to sufficient variable selection (SVS) based on distance covariance. We establish its convergence property under some regularity conditions. Simulation studies show that our algorithm drastically improves computational efficiency and is robust across various settings compared with existing methods. Supplemental materials for this article are available.
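
To make the optimization problem concrete, the following is a minimal NumPy sketch (not the authors' code) of the sample distance-covariance criterion and of its maximization over matrices B with orthonormal columns, i.e. over the Stiefel manifold. The projected finite-difference ascent below is only an illustrative stand-in for the DC/MM scheme with inexact Riemannian Newton steps described in the abstract; the function names dcov_sq, stiefel_retract and dcov_sdr, the step sizes, and the toy model are assumptions of this sketch.

```python
import numpy as np


def dcov_sq(u, v):
    """Squared sample distance covariance between samples u (n x d) and v (n x q)."""
    a = np.linalg.norm(u[:, None, :] - u[None, :, :], axis=2)  # pairwise distances of u
    b = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=2)  # pairwise distances of v
    A = a - a.mean(axis=0) - a.mean(axis=1, keepdims=True) + a.mean()  # double centering
    B = b - b.mean(axis=0) - b.mean(axis=1, keepdims=True) + b.mean()
    return (A * B).mean()


def stiefel_retract(M):
    """Map a p x d matrix to a matrix with orthonormal columns via thin QR."""
    Q, R = np.linalg.qr(M)
    s = np.sign(np.diag(R))
    s[s == 0] = 1.0  # avoid zeroing a column in the (rare) degenerate case
    return Q * s


def dcov_sdr(X, Y, d, n_iter=200, step=0.5, h=1e-5, seed=0):
    """Maximize dcov_sq(X @ B, Y) over B with orthonormal columns.

    This is a crude projected finite-difference ascent, NOT the MM /
    Riemannian-Newton algorithm of the paper; it only illustrates the
    objective function and the Stiefel constraint.
    """
    n, p = X.shape
    rng = np.random.default_rng(seed)
    B = stiefel_retract(rng.standard_normal((p, d)))
    obj = lambda M: dcov_sq(X @ M, Y)
    for _ in range(n_iter):
        # finite-difference approximation of the Euclidean gradient at B
        G = np.zeros_like(B)
        for i in range(p):
            for j in range(d):
                E = np.zeros_like(B)
                E[i, j] = h
                G[i, j] = (obj(B + E) - obj(B - E)) / (2.0 * h)
        gnorm = np.linalg.norm(G)
        if gnorm < 1e-10:
            break
        # backtracking line search along the retracted ascent direction
        f0, t = obj(B), step
        while t > 1e-6:
            B_new = stiefel_retract(B + t * G / gnorm)
            if obj(B_new) > f0:
                break
            t *= 0.5
        if obj(B_new) <= f0:  # no ascent step found: stop
            break
        B = B_new
    return B


if __name__ == "__main__":
    # toy single-index model: Y depends on X only through its first coordinate
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 6))
    Y = X[:, [0]] ** 2 + 0.1 * rng.standard_normal((200, 1))
    B_hat = dcov_sdr(X, Y, d=1)
    print("estimated direction (should load mostly on coordinate 1):", B_hat.ravel())
```

In this toy example the response depends on the predictors only through their first coordinate via a symmetric link, a setting where the model-free DCOV criterion needs no knowledge of the link function.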

Similar articles

Likelihood-based Sufficient Dimension Reduction

We obtain the maximum likelihood estimator of the central subspace under conditional normality of the predictors given the response. Analytically and in simulations we found that our new estimator can perform much better than sliced inverse regression, sliced average variance estimation and directional regression, and that it seems quite robust to deviations from normality.

Sufficient Dimension Reduction Summaries

Observational studies assessing causal or non-causal relationships between an explanatory measure and an outcome can be complicated by hosts of confounding measures. Large numbers of confounders can lead to several biases in conventional regression-based estimation. Inference is more easily conducted if we reduce the number of confounders to a more manageable number. We discuss use of sufficien...

Tensor sufficient dimension reduction.

A tensor is a multiway array. With the rapid development of science and technology in the past decades, large amounts of tensor observations are now routinely collected, processed, and stored in many scientific and commercial activities. Colorimetric sensor array (CSA) data are one such example. Driven by the need to address data analysis challenges that arise in CSA data, we pro...

Sufficient dimension reduction for censored predictors.

Motivated by a study conducted to evaluate the associations between 51 inflammatory markers and lung cancer risk, we propose several approaches of varying computational complexity for analyzing multiple correlated markers that are also censored due to lower and/or upper limits of detection, using likelihood-based sufficient dimension reduction (SDR) methods. We extend the theory and the likelihood-b...

Sufficient Dimension Reduction for Longitudinal Data

Correlation structure contains important information about longitudinal data. Existing sufficient dimension reduction approaches assuming independence may lead to substantial loss of efficiency. We apply the quadratic inference function to incorporate the correlation information and apply the transformation method to recover the central subspace. The proposed estimators are shown to be consiste...


Journal

Journal title: Computational Statistics & Data Analysis

Year: 2021

ISSN: 0167-9473, 1872-7352

DOI: https://doi.org/10.1016/j.csda.2020.107089