Matrix Factor Analysis: From Least Squares to Iterative Projection

Authors

Abstract

In this article, we study large-dimensional matrix factor models and estimate the factor loading matrices and the factor score matrix by minimizing a square loss function. Interestingly, the resultant estimators coincide with the Projected Estimators (PE) in Yu et al., which were proposed from the perspective of simultaneous reduction of the dimensionality and of the magnitudes of the idiosyncratic error matrix. In other words, we provide a least-squares interpretation of the PE for the matrix factor model, which parallels the least-squares interpretation of PCA for the vector factor model. We derive the convergence rates of the theoretical minimizers under sub-Gaussian tails. Considering robustness to heavy tails of the idiosyncratic errors, we further extend the least squares to minimizing the Huber loss function, which leads to a weighted iterative projection approach to compute and learn the parameters. We also derive the convergence rates of the theoretical minimizers of the Huber loss function under bounded fourth or even (2+ϵ)th moments of the idiosyncratic errors. We conduct extensive numerical studies to investigate the empirical performance of the proposed Huber estimators relative to state-of-the-art ones. The Huber estimators perform robustly and much better than existing ones when the data are heavy-tailed, and as a result can be used as a safe replacement in practice. An application to a Fama-French financial portfolio dataset demonstrates the advantage of the Huber estimator.
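For readers less familiar with the setup, the objects named in the abstract can be written out explicitly. The following is a sketch of the standard matrix factor model together with the square-loss and Huber-loss objectives the abstract refers to; the notation (R, C, F_t, E_t, tau) is the conventional one in this literature, not necessarily the paper's exact notation.

% Matrix factor model for p1 x p2 observations X_t, t = 1, ..., T:
%   R is the p1 x k1 row loading matrix, C the p2 x k2 column loading matrix,
%   F_t the k1 x k2 factor score matrix, E_t the idiosyncratic error matrix.
X_t = R F_t C^\top + E_t

% Least-squares estimation minimizes the average squared Frobenius error:
\min_{R,\,C,\,\{F_t\}} \ \frac{1}{T}\sum_{t=1}^{T} \bigl\lVert X_t - R F_t C^\top \bigr\rVert_F^2

% The robust extension replaces the squared error by an elementwise Huber loss
% with threshold \tau, which downweights large idiosyncratic errors:
\min_{R,\,C,\,\{F_t\}} \ \frac{1}{T}\sum_{t=1}^{T}\sum_{i,j}
    H_\tau\bigl( X_{t,ij} - (R F_t C^\top)_{ij} \bigr),
\qquad
H_\tau(u) = \begin{cases} u^2/2, & \lvert u\rvert \le \tau, \\ \tau\lvert u\rvert - \tau^2/2, & \lvert u\rvert > \tau. \end{cases}

Alternating minimization of the Huber objective over R, C, and the F_t turns each update into a weighted least-squares step with Huber-derived weights, which is the usual way a Huber objective leads to an iteratively reweighted (here, weighted iterative projection) scheme.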


Similar Articles

Gradient Projection Iterative Sketch for Large-Scale Constrained Least-Squares

We propose a randomized first-order optimization algorithm, Gradient Projection Iterative Sketch (GPIS), and an accelerated variant, for efficiently solving large-scale constrained Least Squares (LS) problems. We provide the first theoretical convergence analysis for both algorithms. An efficient implementation using a tailored linesearch scheme is also proposed. We demonstrate our methods’ computational e...
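The abstract describes combining random sketching with projected gradient steps for constrained LS. Below is a minimal Python illustration of that general idea; it is not the authors' GPIS algorithm, and the Gaussian sketch, fixed step size, and nonnegativity constraint are choices made only for this example.

import numpy as np

def sketched_projected_gradient(A, b, m=200, step=None, iters=300, seed=0):
    """Projected gradient for min ||Ax - b||^2 s.t. x >= 0, using a fresh
    Gaussian sketch of the data at every iteration. Schematic illustration
    of sketch-based constrained LS, not the GPIS algorithm itself."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2) ** 2      # safe step for the exact gradient
    for _ in range(iters):
        S = rng.standard_normal((m, n)) / np.sqrt(m)  # E[S.T @ S] = I
        SA, Sb = S @ A, S @ b                         # sketched data
        grad = SA.T @ (SA @ x - Sb)                   # unbiased estimate of the gradient
        x = np.maximum(x - step * grad, 0.0)          # project onto {x >= 0}
    return x

# toy usage: x_hat approximately recovers the nonnegative ground truth
rng = np.random.default_rng(1)
A = rng.standard_normal((1000, 20))
x_true = np.maximum(rng.standard_normal(20), 0.0)
b = A @ x_true + 0.01 * rng.standard_normal(1000)
x_hat = sketched_projected_gradient(A, b)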


Iterative Reweighted Least Squares

Describes a powerful optimization algorithm which iteratively solves a weighted least squares approximation problem in order to solve an L_p approximation problem. Methods of approximating one function by another, or of approximating measured data by the output of a mathematical or computer model, are extraordinarily useful and ubiquitous. In this note, we present a very powerful ...
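The recipe described here, solving an L_p problem through a sequence of weighted least-squares problems, is short enough to sketch directly. The following Python snippet is an illustrative implementation of the classical IRLS iteration; the helper name and the toy data are ours, not the paper's.

import numpy as np

def irls_lp(A, b, p=1.0, iters=50, eps=1e-8):
    """Iteratively reweighted least squares for min_x ||Ax - b||_p.
    Each step solves a weighted LS problem with weights |r_i|^(p-2),
    the classical IRLS recipe for L_p approximation."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]          # start from the L2 solution
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)     # eps guards against division by zero
        W = np.sqrt(w)[:, None]
        x = np.linalg.lstsq(W * A, W.ravel() * b, rcond=None)[0]
    return x

# toy usage: the L1 fit is far less sensitive to the gross outlier than the L2 fit
rng = np.random.default_rng(0)
A = np.column_stack([np.ones(50), rng.uniform(0.0, 1.0, 50)])
b = A @ np.array([1.0, 2.0]) + 0.05 * rng.standard_normal(50)
b[0] += 10.0                                          # gross outlier
print(irls_lp(A, b, p=1.0))                           # close to [1, 2]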


Iterative least-squares solutions of coupled Sylvester matrix equations

In this paper, we present a general family of iterative methods to solve linear equations, which includes the well-known Jacobi and Gauss–Seidel iterations as its special cases. The methods are extended to solve coupled Sylvester matrix equations. In our approach, we regard the unknown matrices to be solved as the system parameters to be identified, and propose a least-squares iterative algorit...
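To make the idea concrete, here is a minimal gradient-type least-squares iteration for a single Sylvester equation AX + XB = C; the paper treats coupled systems of such equations, and the conservative step size below is a choice made for this sketch, not taken from the paper.

import numpy as np

def sylvester_gradient_iteration(A, B, C, iters=5000):
    """Least-squares iteration for AX + XB = C.
    Minimizes f(X) = 0.5 * ||A X + X B - C||_F^2 by gradient descent;
    the gradient is A.T @ R + R @ B.T with residual R = A X + X B - C."""
    X = np.zeros((A.shape[1], B.shape[0]))
    # The operator X -> AX + XB has norm at most ||A||_2 + ||B||_2,
    # so this step size guarantees monotone descent.
    mu = 1.0 / (np.linalg.norm(A, 2) + np.linalg.norm(B, 2)) ** 2
    for _ in range(iters):
        R = A @ X + X @ B - C
        X -= mu * (A.T @ R + R @ B.T)
    return X

# toy usage on a well-conditioned example
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 4 * np.eye(4)
B = rng.standard_normal((3, 3)) + 4 * np.eye(3)
X_true = rng.standard_normal((4, 3))
C = A @ X_true + X_true @ B
X_hat = sylvester_gradient_iteration(A, B, C)
print(np.linalg.norm(X_hat - X_true))   # small error after convergence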


Rank-based Least-squares Independent Component Analysis

In this paper, we propose a nonparametric rank-based alternative to the previously developed least-squares independent component analysis algorithm. The basic idea is to estimate the squared-loss mutual information, which is used as the objective function of the algorithm, based on its copula density version. Therefore, no marginal densities have to be estimated. We provide an empirical evaluation of th...
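The central point, that working with ranks gives the copula transform directly so that no marginal densities need to be estimated, can be illustrated in a few lines of Python. This is only the rank-to-copula step, not the full ICA algorithm, and the function name is ad hoc.

import numpy as np
from scipy.stats import rankdata

def empirical_copula_transform(X):
    """Map each column of X (n samples x d variables) to (0, 1) via ranks.
    The transformed data follow the empirical copula of X, so dependence
    can be measured without estimating any marginal density."""
    n = X.shape[0]
    return np.column_stack([rankdata(X[:, j]) / (n + 1) for j in range(X.shape[1])])

# toy usage: heavy-tailed marginals, but the copula scale is unaffected
rng = np.random.default_rng(0)
Z = rng.standard_t(df=2, size=(500, 2))
U = empirical_copula_transform(Z)
print(U.min(), U.max())   # all values lie strictly inside (0, 1)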



Journal

Journal title: Journal of Business & Economic Statistics

Year: 2023

ISSN: 1537-2707, 0735-0015

DOI: https://doi.org/10.1080/07350015.2023.2191676