A Sharp Sufficient Condition for Sparsity Pattern Recovery
Abstract:
The sufficient number of noisy linear measurements required for exact and approximate sparsity pattern (support set) recovery in the high-dimensional setting is derived. Although this problem has been addressed in the recent literature, considerable gaps remain between those results and the exact limits of perfect support set recovery. To narrow this gap, the sufficient condition is sharpened in this paper. A specific form of a joint typicality decoder is used for the support recovery task. Two performance metrics are considered for validating the recovery: one requires exact support recovery, while the other only requires partial support recovery. First, an upper bound is obtained on the error probability of sparsity pattern recovery. Next, using this upper bound, the sufficient number of measurements for reliable support recovery is derived. It is shown that the sufficient condition for reliable support recovery depends on three key parameters of the problem: the noise variance, the minimum nonzero entry of the unknown sparse vector, and the sparsity level. Simulations are performed for different sparsity rates, noise variances, and distortion levels. The results show that, in all the cases considered, the proposed methodology significantly improves the convergence rate of the upper bound on the error probability of support recovery, which leads to a lower error probability bound than previously proposed bounds.
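As context for the abstract above, the following is a minimal sketch of how a joint-typicality-style support decoder can be implemented. The model (y = Xβ + w with i.i.d. Gaussian X and noise variance sigma2), the residual-energy statistic, and the threshold delta are illustrative assumptions, not the specific decoder analyzed in the paper.

```python
# Minimal sketch of a joint-typicality-style support decoder (illustrative
# assumptions: y = X @ beta + w with i.i.d. Gaussian X, w ~ N(0, sigma2*I),
# and a known sparsity level k; this is not the paper's exact decoder).
import itertools
import numpy as np

def jt_decode(y, X, k, sigma2, delta):
    """Exhaustively test every size-k support S: regress y on the columns X_S
    and call S 'typical' when the per-dimension residual energy is within
    delta of the noise variance sigma2. Returns the typical support with the
    smallest deviation, or None if no support is typical."""
    m, n = X.shape
    best_support, best_gap = None, delta
    for S in itertools.combinations(range(n), k):
        Xs = X[:, list(S)]
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        resid = y - Xs @ coef
        gap = abs(resid @ resid / (m - k) - sigma2)
        if gap < best_gap:
            best_support, best_gap = set(S), gap
    return best_support

# Toy usage with hypothetical parameters.
rng = np.random.default_rng(0)
m, n, k, sigma2 = 64, 16, 2, 0.01
X = rng.normal(0.0, 1.0 / np.sqrt(m), (m, n))   # columns have unit norm on average
beta = np.zeros(n)
beta[[3, 11]] = 1.0, -1.0
y = X @ beta + rng.normal(0.0, np.sqrt(sigma2), m)
print(jt_decode(y, X, k, sigma2, delta=0.007))   # expected: {3, 11}
```

Note that the exhaustive search over all size-k supports is exponential in the signal dimension; decoders of this type are used as analytical tools for deriving sufficient conditions rather than as practical recovery algorithms.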
similar resources
Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
Consider the n-dimensional vector y = Xβ + ε, where β ∈ R^p has only k nonzero entries and ε ∈ R^n is Gaussian noise. This can be viewed as a linear system with sparsity constraints, corrupted by noise. We find a non-asymptotic upper bound on the probability that the optimal decoder for β declares a wrong sparsity pattern, given any generic perturbation matrix X. In the case when X is randomly dr...
Tight Sufficient Conditions on Exact Sparsity Pattern Recovery
A noisy underdetermined system of linear equations is considered in which a sparse vector (a vector with a few nonzero elements) is subject to measurement. The measurement matrix elements are drawn from a Gaussian distribution. We study the information-theoretic constraints on exact support recovery of a sparse vector from the measurement vector and matrix. We compute a tight, sufficient condit...
Necessary and Sufficient Conditions on Sparsity Pattern Recovery
The problem of detecting the sparsity pattern of a k-sparse vector in R^n from m random noisy measurements is of interest in many areas such as system identification, denoising, pattern recognition, and compressed sensing. This paper addresses the scaling of the number of measurements m, with signal dimension n and sparsity level (number of nonzero entries) k, for asymptotically reliable detection. We show a neces...
Efficient Sparsity Pattern Recovery
The theory of compressed sensing shows that sparsity pattern (or support) of a sparse signal can be recovered from a small number of appropriate linear projections (samples). Unfortunately, as soon as noise is added, the number of required samples exceeds the full signal dimension, rendering compressed sensing ineffective. In recent work, we have shown that this can be fixed if a small distorti...
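The trade-off described in this snippet is usually made precise by fixing a distortion level and requiring the estimated support to match the true one up to that distortion. The sketch below scores one such partial-recovery criterion; the specific definition (maximum of the missed-detection and false-alarm fractions) is an assumption for illustration, not necessarily the measure used in the cited work.

```python
# Hedged example of a partial support-recovery criterion: recovery at
# distortion level alpha succeeds if at most an alpha-fraction of the true
# support is missed and at most an alpha-fraction of the estimate is
# spurious. (Definition assumed for illustration only.)
def support_distortion(true_support, est_support):
    true_s, est_s = set(true_support), set(est_support)
    missed = len(true_s - est_s) / max(len(true_s), 1)       # missed detections
    false_alarms = len(est_s - true_s) / max(len(est_s), 1)  # spurious indices
    return max(missed, false_alarms)

def approx_recovery_ok(true_support, est_support, alpha):
    return support_distortion(true_support, est_support) <= alpha

# Example: one of three true indices is missed and one spurious index appears,
# so the distortion is 1/3, which passes at alpha = 0.4.
print(support_distortion({2, 5, 9}, {2, 5, 7}))               # 0.333...
print(approx_recovery_ok({2, 5, 9}, {2, 5, 7}, alpha=0.4))    # True
```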
Sharp thresholds for high-dimensional and noisy recovery of sparsity
The problem of consistently estimating the sparsity pattern of a vector β* ∈ R^p based on observations contaminated by noise arises in various contexts, including subset selection in regression, structure estimation in graphical models, sparse approximation, and signal denoising. We analyze the behavior of l1-constrained quadratic programming (QP), also referred to as the Lasso, for recovering th...
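As a concrete illustration of the l1-constrained approach mentioned in this snippet, the sketch below estimates the support by solving a Lasso problem and thresholding the fitted coefficients. It assumes scikit-learn is available; the regularization weight and threshold are ad hoc choices for the example, not the theoretically prescribed values from the cited analysis.

```python
# Hedged sketch of Lasso-based sparsity pattern estimation: solve an
# l1-penalized least-squares problem and keep the indices whose fitted
# coefficients exceed a small threshold. All parameters are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
m, n, k, sigma = 80, 200, 5, 0.1
X = rng.normal(0.0, 1.0, (m, n))             # Gaussian measurement matrix
beta = np.zeros(n)
true_support = np.sort(rng.choice(n, size=k, replace=False))
beta[true_support] = 1.0
y = X @ beta + rng.normal(0.0, sigma, m)

lasso = Lasso(alpha=0.2, max_iter=10_000)    # ad hoc regularization weight
lasso.fit(X, y)
est_support = np.flatnonzero(np.abs(lasso.coef_) > 0.1)
print(true_support.tolist(), est_support.tolist())
```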
Sparsity Pattern Recovery in Compressed Sensing
Sparsity Pattern Recovery in Compressed Sensing, by Galen Reeves. Doctor of Philosophy in Engineering (Electrical Engineering and Computer Sciences), University of California, Berkeley. Professor Michael Gastpar, Chair. The problem of recovering sparse signals from a limited number of measurements is now ubiquitous in signal processing, statistics, and machine learning. A natural question of fundame...
Journal title
Volume 8, Issue 1
Pages 13-23
Publication date: 2020-01-01