Finding Low-rank Solutions of Sparse Linear Matrix Inequalities using Convex Optimization
Authors
Abstract
This paper is concerned with the problem of finding a low-rank solution of an arbitrary sparse linear matrix inequality (LMI). To this end, we map the sparsity of the LMI problem into a graph. We develop a theory relating the rank of the minimum-rank solution of the LMI problem to the sparsity of its underlying graph. Furthermore, we propose three graph-theoretic convex programs to obtain a low-rank solution. Two of these convex optimization problems need a tree decomposition of the sparsity graph, which is an NP-hard problem in the worst case. The third one does not rely on any computationally expensive graph analysis and is always polynomial-time solvable. The results of this work can be readily applied to three separate problems of minimum-rank matrix completion, conic relaxation for polynomial optimization, and affine rank minimization. The results are finally illustrated in two applications: optimal distributed control and nonlinear optimization for electrical networks.
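As a rough illustration of the kind of convex surrogate involved (not the paper's graph-theoretic programs, whose construction requires the sparsity graph and its tree decomposition), the sketch below poses a small sparse LMI feasibility problem and minimizes the trace of the positive semidefinite variable, a standard convex heuristic for promoting low rank. The data matrices A1, A2 and the bounds b1, b2 are hypothetical.

```python
# Minimal sketch: a small sparse LMI feasibility problem with trace
# minimization as a convex low-rank surrogate (illustrative data only).
import cvxpy as cp
import numpy as np

n = 4
# Hypothetical sparse data matrices A_k and bounds b_k defining constraints
# trace(A_k @ X) <= b_k; each A_k has only a few nonzero entries.
A1 = np.zeros((n, n)); A1[0, 1] = A1[1, 0] = 1.0
A2 = np.zeros((n, n)); A2[2, 3] = A2[3, 2] = 1.0
b1, b2 = -1.0, -1.0

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,                    # X must be positive semidefinite
               cp.trace(A1 @ X) <= b1,
               cp.trace(A2 @ X) <= b2]
# For a PSD variable, trace(X) equals the nuclear norm, so minimizing it
# tends to return a low-rank feasible point.
problem = cp.Problem(cp.Minimize(cp.trace(X)), constraints)
problem.solve(solver=cp.SCS)

print("rank of X:", np.linalg.matrix_rank(X.value, tol=1e-6))
```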
Similar references
Finding the Largest Low-Rank Clusters With Ky Fan 2-k-Norm and ℓ1-Norm
We propose a convex optimization formulation with the Ky Fan 2-k-norm and ℓ1-norm to find k largest approximately rank-one submatrix blocks of a given nonnegative matrix that has low-rank block diagonal structure with noise. We analyze low-rank and sparsity structures of the optimal solutions using properties of these two matrix norms. We show that, under certain hypotheses, with high probabili...
Mathematics of sparsity (and a few other things)
In the last decade, there has been considerable interest in understanding when it is possible to find structured solutions to underdetermined systems of linear equations. This paper surveys some of the mathematical theories, known as compressive sensing and matrix completion, that have been developed to find sparse and low-rank solutions via convex programming techniques. Our exposition emphasi...
Tensor completion and low-n-rank tensor recovery via convex optimization
In this paper we consider sparsity on a tensor level, as given by the n-rank of a tensor. In the important sparse-vector approximation problem (compressed sensing) and the low-rank matrix recovery problem, using a convex relaxation technique proved to be a valuable solution strategy. Here, we will adapt these techniques to the tensor setting. We use the n-rank of a tensor as sparsity measure an...
Strongly Convex Programming for Principal Component Pursuit
In this paper, we address strongly convex programming for principal component pursuit with reduced linear measurements, which decomposes a superposition of a low-rank matrix and a sparse matrix from a small set of linear measurements. We first provide sufficient conditions under which the strongly convex models lead to exact low-rank and sparse matrix recovery; second, we also give suggesti...
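For context, the sketch below solves the classical principal component pursuit problem with full (not reduced) measurements: it splits an observed matrix into a low-rank part and a sparse part via nuclear-norm and ℓ1-norm penalties. The synthetic data and the weight lam are illustrative choices, not taken from the paper above.

```python
# Minimal sketch of classical principal component pursuit: M = L + S with
# L low-rank and S sparse, recovered by nuclear-norm + l1-norm minimization.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 20, 20, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # low-rank part
M[rng.random((m, n)) < 0.05] += 10.0                           # sparse corruptions

L = cp.Variable((m, n))
S = cp.Variable((m, n))
lam = 1.0 / np.sqrt(max(m, n))  # common weighting from the PCP literature
objective = cp.Minimize(cp.normNuc(L) + lam * cp.norm1(S))
cp.Problem(objective, [L + S == M]).solve(solver=cp.SCS)

# The leading singular values of L should dominate, reflecting its low rank.
print("leading singular values of L:",
      np.linalg.svd(L.value, compute_uv=False)[:4].round(3))
```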
Least-Squares Covariance Matrix Adjustment
We consider the problem of finding the smallest adjustment to a given symmetric n × n matrix, as measured by the Euclidean or Frobenius norm, so that it satisfies some given linear equalities and inequalities, and in addition is positive semidefinite. This least-squares covariance adjustment problem is a convex optimization problem, and can be efficiently solved using standard methods when the ...
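A minimal sketch of this least-squares covariance adjustment problem follows, with hypothetical equality and inequality constraints standing in for the "given linear equalities and inequalities" mentioned above.

```python
# Minimal sketch: find the PSD matrix closest in Frobenius norm to a given
# symmetric G, subject to illustrative linear constraints on its entries.
import cvxpy as cp
import numpy as np

n = 5
rng = np.random.default_rng(1)
G = rng.standard_normal((n, n)); G = (G + G.T) / 2  # given symmetric matrix

X = cp.Variable((n, n), symmetric=True)
constraints = [X >> 0,              # positive semidefinite
               cp.diag(X) == 1,     # example equality: unit diagonal
               X[0, 1] >= 0]        # example inequality on one entry
problem = cp.Problem(cp.Minimize(cp.norm(X - G, "fro")), constraints)
problem.solve(solver=cp.SCS)

print("adjustment size:", np.linalg.norm(X.value - G, "fro"))
```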
Journal: SIAM Journal on Optimization
Volume: 27
Issue: -
Pages: -
Publication year: 2017