Stochastic Variance-reduced Gradient Descent for Low-rank Matrix Recovery from Linear Measurements
Authors
Abstract
We study the problem of estimating low-rank matrices from linear measurements (a.k.a. matrix sensing) through nonconvex optimization. We propose an efficient stochastic variance reduced gradient descent algorithm to solve a nonconvex formulation of matrix sensing. Our algorithm is applicable to both noisy and noiseless settings. With noisy observations, we prove that our algorithm converges to the unknown low-rank matrix at a linear rate up to the minimax optimal statistical error. In the noiseless setting, our algorithm is guaranteed to converge linearly to the unknown low-rank matrix and achieves exact recovery with optimal sample complexity. Most notably, the overall computational complexity of our proposed algorithm, defined as the iteration complexity times the per-iteration time complexity, is lower than that of state-of-the-art algorithms based on gradient descent. Experiments on synthetic data corroborate the superiority of the proposed algorithm over state-of-the-art alternatives.
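To illustrate the variance-reduced scheme the abstract describes, here is a minimal NumPy sketch of SVRG applied to the factorized matrix-sensing objective f(U) = (1/2n) Σᵢ (⟨Aᵢ, UUᵀ⟩ − yᵢ)². The step size, warm start, and epoch count are illustrative assumptions, not the paper's exact algorithm (which uses a specifically designed semi-stochastic gradient and initialization):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, n = 10, 2, 400                      # dimension, rank, number of measurements

# Ground truth M* = U* U*^T and Gaussian sensing matrices A_i (toy setup)
U_star = rng.normal(size=(d, r))
M_star = U_star @ U_star.T
A = rng.normal(size=(n, d, d))
y = np.einsum('nij,ij->n', A, M_star)     # noiseless measurements y_i = <A_i, M*>

def grad_i(U, i):
    """Gradient of f_i(U) = 0.5 * (<A_i, U U^T> - y_i)^2."""
    res = np.sum(A[i] * (U @ U.T)) - y[i]
    return res * (A[i] + A[i].T) @ U

def full_grad(U):
    """Average gradient over all n measurements."""
    res = np.einsum('nij,ij->n', A, U @ U.T) - y
    return (np.einsum('n,nij->ij', res, A + A.transpose(0, 2, 1)) @ U) / n

U = U_star + 0.05 * rng.normal(size=(d, r))   # assumed warm start near the truth
rel_err0 = np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star)

eta, epochs = 2e-4, 30                    # illustrative step size / epoch count
for _ in range(epochs):
    U_snap = U.copy()
    g_snap = full_grad(U_snap)            # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # variance-reduced (semi-stochastic) gradient estimate
        v = grad_i(U, i) - grad_i(U_snap, i) + g_snap
        U -= eta * v

rel_err = np.linalg.norm(U @ U.T - M_star) / np.linalg.norm(M_star)
print(rel_err0, '->', rel_err)
```

In the noiseless regime the variance of the estimate `v` vanishes as `U` approaches the truth, which is what enables the linear convergence rate claimed above.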
Similar references
A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery
We propose a generic framework based on a new stochastic variance-reduced gradient descent algorithm for accelerating nonconvex low-rank matrix recovery. Starting from an appropriate initial estimator, our proposed algorithm performs projected gradient descent based on a novel semi-stochastic gradient specifically designed for low-rank matrix recovery. Based upon the mild restricted strong conv...
Global Optimality of Local Search for Low Rank Matrix Recovery
We show that there are no spurious local minima in the non-convex factorized parametrization of low-rank matrix recovery from incoherent linear measurements. With noisy measurements we show all local minima are very close to a global optimum. Together with a curvature bound at saddle points, this yields a polynomial time global convergence guarantee for stochastic gradient descent ...
Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low rank matrix recovery, which concerns recovering an m × n rank-r matrix from p < mn linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Then, based on this connection, we prove that if the restricted isometr...
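The iterative-hard-thresholding connection mentioned above can be made concrete: the rank-r projection such algorithms build on is a truncated SVD. A minimal sketch of the operator and one generic IHT step for measurements yᵢ = ⟨Aᵢ, M⟩ (an illustration of the general technique, not any specific paper's algorithm):

```python
import numpy as np

def hard_threshold(X, r):
    """Project X onto the set of matrices of rank at most r via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def iht_step(X, A, y, r, eta):
    """One iterative-hard-thresholding step: gradient step on
    0.5 * sum_i (<A_i, X> - y_i)^2, then re-project onto rank r."""
    res = np.einsum('nij,ij->n', A, X) - y
    return hard_threshold(X - eta * np.einsum('n,nij->ij', res, A), r)
```

Riemannian variants replace the plain gradient step with a step on the manifold of rank-r matrices, but the projection structure is the same.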
Riemannian stochastic variance reduced gradient
Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions. In this paper, we propose a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to a manifold search space. The key challenges of averaging, adding, and subtracting multiple gradients are addressed with r...