Lecture 10: Applications of Low Rank Approximation in Optimization
Abstract
Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications. In this lecture we describe applications of low-rank approximation in optimization. First, let us give a short overview of the last lecture: we defined the operator norm ‖·‖_2 and the Frobenius norm ‖·‖_F of a matrix, and we showed that the best rank-k approximation of a given matrix M is the one obtained by keeping the singular vectors corresponding to the k largest singular values of M.
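The recalled fact is the Eckart–Young theorem: truncating the SVD at rank k minimizes both the operator-norm and Frobenius-norm error over all rank-k matrices. A minimal NumPy sketch (matrix and variable names are ours, chosen for illustration) checking this numerically:

```python
import numpy as np

# Sample matrix; any real matrix works here.
rng = np.random.default_rng(0)
M = rng.standard_normal((8, 6))
k = 2

# Truncated SVD: keep the top-k singular triplets of M.
U, s, Vt = np.linalg.svd(M, full_matrices=False)
M_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# By Eckart-Young, the operator-norm error equals sigma_{k+1},
# and the Frobenius error is sqrt(sum of the remaining sigma_i^2).
op_err = np.linalg.norm(M - M_k, 2)
fro_err = np.linalg.norm(M - M_k, 'fro')
assert np.isclose(op_err, s[k])
assert np.isclose(fro_err, np.sqrt(np.sum(s[k:] ** 2)))
```

The assertions verify the two error identities; the same truncation is optimal for both norms simultaneously, which is what makes SVD-based low-rank approximation so convenient in the optimization applications that follow.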