A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery
Authors
Abstract
We propose a generic framework based on a new stochastic variance-reduced gradient descent algorithm for accelerating nonconvex low-rank matrix recovery. Starting from an appropriate initial estimator, our proposed algorithm performs projected gradient descent based on a novel semi-stochastic gradient specifically designed for low-rank matrix recovery. Under mild restricted strong convexity and smoothness conditions, we derive a projected notion of the restricted Lipschitz continuous gradient property, and prove that our algorithm enjoys a linear convergence rate to the unknown low-rank matrix with improved computational complexity. Moreover, our algorithm can be applied to both noiseless and noisy observations, attaining the optimal sample complexity and the minimax optimal statistical rate, respectively. We further illustrate the superiority of our generic framework through several specific examples, both theoretically and experimentally.
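The abstract describes the update only at a high level: a projected gradient step driven by a semi-stochastic (variance-reduced) gradient. Below is a minimal, hedged sketch in the SVRG style for the matrix-sensing loss f(X) = (1/2n) Σ_i (⟨A_i, X⟩ − y_i)²; the function names, step size, and epoch structure are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def svd_project(X, r):
    """Project X onto the set of rank-r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r, :]

def svrg_matrix_sensing(A, y, r, X0, eta=0.1, n_outer=20, n_inner=None, rng=None):
    """SVRG-style projected gradient descent for the matrix-sensing loss
    f(X) = (1/2n) * sum_i (<A_i, X> - y_i)^2.  Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(y)
    n_inner = n if n_inner is None else n_inner

    def grad_i(X, i):
        # gradient of the i-th component loss: (<A_i, X> - y_i) * A_i
        return (np.sum(A[i] * X) - y[i]) * A[i]

    def full_grad(X):
        return sum(grad_i(X, i) for i in range(n)) / n

    X_snap = np.array(X0, dtype=float)          # snapshot (reference) iterate
    for _ in range(n_outer):
        mu = full_grad(X_snap)                  # full gradient at the snapshot
        X = X_snap.copy()
        for _ in range(n_inner):
            i = rng.integers(n)
            # semi-stochastic gradient: unbiased, with variance that shrinks
            # as the iterate approaches the snapshot
            v = grad_i(X, i) - grad_i(X_snap, i) + mu
            X = svd_project(X - eta * v, r)     # projected gradient step
        X_snap = X
    return X_snap
```

The point mirrored here is that the semi-stochastic gradient v = ∇f_i(X) − ∇f_i(X̃) + ∇f(X̃) is unbiased and its variance vanishes as the iterate approaches the snapshot X̃, which is what makes a constant step size and a linear convergence rate plausible under restricted strong convexity and smoothness.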
Similar Resources
A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery
A. Additional Applications and Experimental Results. In this section, we present the application of our generic framework to one-bit matrix completion, as well as additional experimental results for matrix sensing. A.1. One-bit Matrix Completion. Compared with standard matrix completion, in one-bit matrix completion we only observe the sign of each noisy entry of the unknown low-rank matrix X* (Davenport...
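For concreteness, here is a minimal sketch of the one-bit observation model the snippet refers to: for each sampled entry, only the sign of the noisy entry is revealed. The sampling rate, Gaussian noise, and function name are illustrative assumptions.

```python
import numpy as np

def one_bit_observations(X_star, obs_prob=0.2, noise_std=0.5, rng=None):
    """Generate one-bit observations: for each sampled index (i, j), only
    sign(X_star[i, j] + noise) is observed.  Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    d1, d2 = X_star.shape
    mask = rng.random((d1, d2)) < obs_prob        # sampled index set Omega
    noise = rng.normal(scale=noise_std, size=(d1, d2))
    signs = np.sign(X_star + noise)               # only the sign is revealed
    rows, cols = np.nonzero(mask)
    return rows, cols, signs[rows, cols]
```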
A Unified Variance Reduction-Based Framework for Nonconvex Low-Rank Matrix Recovery
We propose a generic framework based on a new stochastic variance-reduced gradient descent algorithm for accelerating nonconvex low-rank matrix recovery. Starting from an appropriate initial estimator, our proposed algorithm performs projected gradient descent based on a novel semi-stochastic gradient specifically designed for low-rank matrix recovery. Based upon the mild restricted strong conv...
Stochastic Variance-reduced Gradient Descent for Low-rank Matrix Recovery from Linear Measurements
We study the problem of estimating low-rank matrices from linear measurements (a.k.a. matrix sensing) through nonconvex optimization. We propose an efficient stochastic variance-reduced gradient descent algorithm to solve the nonconvex optimization problem arising in matrix sensing. Our algorithm is applicable to both noisy and noiseless settings. In the case of noisy observations, we prove that our a...
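The linear-measurement model mentioned in the snippet, y_i = ⟨A_i, X*⟩ + ε_i, can be simulated in a few lines; Gaussian sensing matrices and the function name below are illustrative assumptions. The output pairs naturally with a solver such as the SVRG-style sketch shown earlier.

```python
import numpy as np

def gaussian_sensing(X_star, n, noise_std=0.0, rng=None):
    """Simulate matrix-sensing measurements y_i = <A_i, X_star> + eps_i
    with i.i.d. Gaussian sensing matrices A_i.  Illustrative sketch only."""
    rng = np.random.default_rng() if rng is None else rng
    d1, d2 = X_star.shape
    A = rng.normal(size=(n, d1, d2))              # sensing matrices A_1, ..., A_n
    eps = rng.normal(scale=noise_std, size=n)     # noiseless case when noise_std = 0
    y = np.einsum('nij,ij->n', A, X_star) + eps   # y_i = <A_i, X_star> + eps_i
    return A, y
```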
Face Recognition Based Rank Reduction SVD Approach
Standard face recognition algorithms that use conventional feature extraction techniques often suffer from performance degradation. Recently, singular value decomposition and low-rank matrix methods have been applied in many applications, including pattern recognition and feature extraction. The main objective of this research is to design an efficient face recognition approach by combining many tech...
A Nonconvex Free Lunch for Low-Rank plus Sparse Matrix Recovery
We study the problem of low-rank plus sparse matrix recovery. We propose a generic and efficient nonconvex optimization algorithm based on projected gradient descent and a double thresholding operator, with much lower computational complexity. Compared with existing convex-relaxation-based methods, the proposed algorithm recovers the low-rank plus sparse matrices for free, without incurring any a...
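As a hedged illustration of what a projection-plus-double-thresholding step can look like in the low-rank plus sparse setting, the sketch below hard-thresholds singular values for the low-rank part and entry magnitudes of the residual for the sparse part; the function name and the exact splitting rule are assumptions and may differ from the cited work.

```python
import numpy as np

def double_threshold(M, r, s):
    """Split M into a rank-r part L (top-r singular directions) and an
    s-sparse part S (largest-magnitude residual entries).  Sketch only."""
    U, sig, Vt = np.linalg.svd(M, full_matrices=False)
    L = (U[:, :r] * sig[:r]) @ Vt[:r, :]           # rank-r hard thresholding
    R = M - L
    idx = np.argsort(np.abs(R), axis=None)[::-1][:s]
    S = np.zeros_like(R)
    S.flat[idx] = R.flat[idx]                      # keep the s largest entries
    return L, S
```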