Sparse recovery under weak moment assumptions
Abstract
We prove that i.i.d. random vectors satisfying a rather weak moment assumption can be used as measurement vectors in Compressed Sensing, and that the number of measurements required for exact reconstruction matches the best possible estimate, the one exhibited by a random Gaussian matrix. We then show that this moment condition is necessary, up to a log log factor. In addition, we explore the Compatibility Condition and the Restricted Eigenvalue Condition in the noisy setup, as well as properties of neighbourly random polytopes.
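For orientation (these are the standard notions behind the abstract, not statements taken from the paper itself): exact reconstruction refers to recovering an s-sparse vector x_0 in R^N from the measurements y = A x_0 by basis pursuit,

    min ||x||_1   subject to   A x = y,

and the Gaussian benchmark is that a matrix A with i.i.d. standard Gaussian entries permits this recovery, with high probability and simultaneously for all s-sparse x_0, once the number of measurements m is at least C s log(N/s) for an absolute constant C, which is the optimal order.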
Similar papers
A remark on weaken restricted isometry property in compressed sensing
The restricted isometry property (RIP) has become well known in the compressed sensing community. Recently, a weakened version of the RIP was proposed for exact sparse recovery under weak moment assumptions. In this note, we prove that the weakened RIP is also sufficient for stable and robust sparse recovery by linking it with a recently introduced robust width property in compressed sensing. Moreover,...
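For reference, the standard RIP of order s with constant δ_s requires

    (1 − δ_s) ||x||_2^2 ≤ ||A x||_2^2 ≤ (1 + δ_s) ||x||_2^2   for every s-sparse x;

the weakened variant referred to above relaxes this two-sided control (roughly, the upper estimate is only required in a much weaker form), and its precise formulation is given in the cited work.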
The lower tail of random quadratic forms, with applications to ordinary least squares and restricted eigenvalue properties
Finite sample properties of random covariance-type matrices have been the subject of much research. In this paper we focus on the “lower tail” of such a matrix, and prove that it is subgaussian under a simple fourth moment assumption on the one-dimensional marginals of the random vectors. A similar result holds for more general sums of random positive semidefinite matrices, and the (relatively ...
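In formula form (a standard description of the setup rather than a quotation from the paper): for i.i.d. random vectors X_1, ..., X_n, the “lower tail” controls

    inf_{||v||_2 = 1}  (1/n) Σ_{i=1}^n ⟨X_i, v⟩^2

from below in terms of E⟨X, v⟩^2, and the fourth moment assumption takes the form E⟨X, v⟩^4 ≤ C (E⟨X, v⟩^2)^2 for every direction v. A lower bound of this type, restricted to a suitable cone of approximately sparse directions, is exactly what the restricted eigenvalue condition asks of a design matrix.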
Weak Recovery Conditions from Graph Partitioning Bounds and Order Statistics
We study a weaker formulation of the nullspace property which guarantees recovery of sparse signals from linear measurements by ℓ1 minimization. We require this condition to hold only with high probability, given a distribution on the nullspace of the coding matrix A. Under some assumptions on the distribution of the reconstruction error, we show that testing these weak conditions means boundin...
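For reference, the standard (deterministic) nullspace property of order s asks that every nonzero h in the nullspace of A satisfy

    ||h_S||_1 < ||h_{S^c}||_1   for every index set S with |S| ≤ s,

which is equivalent to exact recovery of all s-sparse signals by ℓ1 minimization; the weak version studied above requires this only with high probability under a distribution on the nullspace.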
Block-sparse Solutions using Kernel Block RIP and its Application to Group Lasso
We propose the kernel block restricted isometry property (KB-RIP) as a generalization of the well-studied RIP and prove a variety of results. First, we present a “sum-of-norms”-minimization based formulation of the sparse recovery problem and prove that under suitable conditions on the KB-RIP, it recovers the optimal sparse solution exactly. The Group Lasso formulation, widely used as a good heuristic, ...
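To make the two programs mentioned above concrete (standard formulations, with groups of coordinates written G_1, ..., G_k): the “sum-of-norms” recovery problem is

    min Σ_j ||x_{G_j}||_2   subject to   A x = y,

while the Group Lasso is its penalized least-squares counterpart,

    min (1/2) ||y − A x||_2^2 + λ Σ_j ||x_{G_j}||_2.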
Sparse Matrix Factorization
We investigate the problem of factoring a matrix into several sparse matrices and propose an algorithm for this under randomness and sparsity assumptions. This problem can be viewed as a simplification of the deep learning problem where finding a factorization corresponds to finding edges in different layers and also values of hidden units. We prove that under certain assumptions on a sparse li...
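Written out (a generic statement of the problem rather than the paper's exact model): given a matrix M, one seeks a factorization

    M ≈ A_1 A_2 ⋯ A_L

in which every factor A_i is sparse; in the deep learning analogy above, the supports of the A_i play the role of edges between consecutive layers and the intermediate products play the role of hidden-unit values.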