Large dimensional analysis and optimization of robust shrinkage covariance matrix estimators

Authors

  • Romain Couillet
  • Matthew R. McKay
Abstract

This article studies two regularized robust estimators of scatter matrices proposed in parallel in (Chen et al., 2011) and (Pascal et al., 2013), based on Tyler's robust M-estimator (Tyler, 1987) and on Ledoit and Wolf's shrinkage covariance matrix estimator (Ledoit and Wolf, 2004). These hybrid estimators have the advantage of conveying (i) robustness to outliers or impulsive samples and (ii) small-sample-size adequacy to the classical sample covariance matrix estimator. We consider here the case of i.i.d. elliptical zero-mean samples in the regime where both the sample and population sizes are large. We demonstrate that, in this setting, the estimators under study asymptotically behave similarly to well-understood random matrix models. This characterization allows us to derive optimal shrinkage strategies for estimating the population scatter matrix, improving significantly upon the empirical shrinkage method proposed in (Chen et al., 2011).
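
As a concrete illustration, the following is a minimal numerical sketch of a regularized Tyler fixed-point iteration of the kind analyzed in the paper, close in form to the estimator of Pascal et al. (2013): each iterate mixes a weighted sample scatter matrix with the identity through a shrinkage parameter rho. The function name, the stopping rule, and the absence of the trace normalization used by Chen et al. (2011) are illustrative assumptions, not the authors' exact algorithm.

    import numpy as np

    def regularized_tyler(X, rho, n_iter=200, tol=1e-8):
        # Illustrative sketch of a regularized Tyler scatter estimator.
        # X:   (N, n) array of n zero-mean samples of dimension N
        # rho: shrinkage parameter in (0, 1]; rho = 1 returns the identity
        N, n = X.shape
        C = np.eye(N)
        for _ in range(n_iter):
            # Quadratic forms x_i^T C^{-1} x_i, one value per sample.
            q = np.einsum('ij,ij->j', X, np.linalg.solve(C, X))
            # Weighted sample scatter (N/n) * sum_i x_i x_i^T / q_i,
            # shrunk toward the identity matrix.
            C_new = (1 - rho) * (N / n) * (X / q) @ X.T + rho * np.eye(N)
            if np.linalg.norm(C_new - C, 'fro') <= tol * np.linalg.norm(C, 'fro'):
                return C_new
            C = C_new
        return C

In practice one would sweep rho and select the value optimizing a chosen performance measure; the asymptotic characterization derived in the paper is what makes this choice analytically tractable in the large-N, large-n regime.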


Similar articles

Shrinkage Estimators for High-Dimensional Covariance Matrices

As high-dimensional data becomes ubiquitous, standard estimators of the population covariance matrix become difficult to use. Specifically, when the number of samples is small (large p, small n), the sample covariance matrix is not positive definite. In this paper we explore some recent estimators of sample covariance matrices in the large p, small n setting, namely shrinkage estimat...


Nonparametric Stein-type shrinkage covariance matrix estimators in high-dimensional settings

Estimating a covariance matrix is an important task in applications where the number of variables is larger than the number of observations. In the literature, shrinkage approaches for estimating a high-dimensional covariance matrix are employed to circumvent the limitations of the sample covariance matrix. A new family of nonparametric Stein-type shrinkage covariance estimators is proposed who...


Covariance Matrix Estimation for Reinforcement Learning

One of the goals in scaling reinforcement learning (RL) pertains to dealing with high-dimensional and continuous state-action spaces. In order to tackle this problem, recent efforts have focused on harnessing well-developed methodologies from statistical learning, estimation theory and empirical inference. A key related challenge is tuning the many parameters and efficiently addressing numerical...


Shrinkage Estimation of Large Dimensional Precision Matrix Using Random Matrix Theory

This paper considers ridge-type shrinkage estimation of a large dimensional precision matrix. The asymptotic optimal shrinkage coefficients and the theoretical loss are derived. Data-driven estimators for the shrinkage coefficients are also constructed based on the asymptotic results from random matrix theory. The new method is distribution-free and requires no assumption on the structure of the covarianc...


Improved Estimation of the Covariance Matrix of Stock Returns With an Application to Portfolio Selection

This paper proposes to estimate the covariance matrix of stock returns by an optimally weighted average of two existing estimators: the sample covariance matrix and single-index covariance matrix. This method is generally known as shrinkage, and it is standard in decision theory and in empirical Bayesian statistics. Our shrinkage estimator can be seen as a way to account for extra-market covari...
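
Concretely, the weighted average described above is a convex combination of the sample covariance matrix and a structured target; the brief sketch below (a hypothetical helper in which the single-index target and the weight delta are supplied by the caller rather than estimated optimally as in the cited paper) illustrates the operation.

    import numpy as np

    def shrink_covariance(sample_cov, target, delta):
        # Convex combination of the sample covariance matrix and a structured
        # target (e.g. a single-index model covariance); delta lies in [0, 1].
        # The cited paper chooses delta in a data-driven, optimal way; here it
        # is simply an input.
        return delta * np.asarray(target) + (1.0 - delta) * np.asarray(sample_cov)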



Journal:
  • J. Multivariate Analysis

Volume: 131  Issue: 

Pages: -

Publication date: 2014