Uncertainty Principles and Sparse Signal Representations Using Overcomplete Representations

Author

  • Todd P. Coleman
Abstract

This discussion covers sparse representations of signals in R^N. The sparsity of a signal is quantified by the number of nonzero components in its representation. Such representations are useful in signal processing, lossy source coding, image processing, and related fields. We first present an uncertainty principle relating the sparsity of any two different orthonormal-basis representations of a signal S. Next, we consider describing a signal through an overcomplete dictionary formed from a pair of orthonormal bases. Because the description is overcomplete, many possible representations exist; the hope is that the sparsest representation is far better than any representation using a single orthonormal basis. The uncertainty principle can be exploited to provide conditions under which the sparsest overcomplete representation is unique. We then consider the optimization search for that representation, which in general is nonconvex and combinatorial. However, if the sparsest representation is unique and sufficiently sparse, it can be found using a linear programming formulation, which is considerably more computationally affordable. The notion of 'sufficiently sparse' depends on the pair of orthonormal bases, and in particular on their mutual incoherence. We then explore typical mutual incoherence between pairs of bases, and discuss some (idealized) applications.

We consider a signal S ∈ R^N of unit l2 energy. We are given two different orthonormal bases A and B, described as matrices whose columns are the orthonormal vectors, i.e. A = [a_1 a_2 ... a_N] and B = [b_1 b_2 ... b_N]. We note that in each of the bases individually, S is uniquely ...
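The mutual-incoherence quantity that governs "sufficiently sparse" can be made concrete with a short sketch. The example below is not from the article: the spike/Hadamard basis pair, the helper name mutual_incoherence, and N = 64 are assumptions for illustration. It computes mu(A, B) = max |<a_i, b_j>| and the standard Donoho–Huo threshold (1 + 1/mu)/2, below which the sparsest joint representation is unique and recoverable by the linear programming formulation mentioned above.

```python
import numpy as np

def mutual_incoherence(A, B):
    """mu(A, B) = max over i, j of |<a_i, b_j>| for orthonormal columns of A, B."""
    return np.max(np.abs(A.T @ B))

N = 64
A = np.eye(N)                        # spike (identity) basis

# Sylvester construction of an N x N Hadamard matrix, then normalize
# so the columns are orthonormal (a maximally "Fourier-like" basis).
H = np.array([[1.0]])
while H.shape[0] < N:
    H = np.block([[H, H], [H, -H]])
B = H / np.sqrt(N)

mu = mutual_incoherence(A, B)        # equals 1/sqrt(N) = 0.125 for N = 64
# A joint representation with fewer than (1 + 1/mu)/2 nonzeros is the
# unique sparsest one, and l1 minimization (linear programming) finds it.
threshold = 0.5 * (1.0 + 1.0 / mu)   # 4.5 for N = 64
print(mu, threshold)
```

This pair achieves the smallest possible incoherence, mu = 1/sqrt(N); for the analogous spike/Fourier pair the uncertainty bound is tight, since a picket-fence signal of sqrt(N) equally spaced spikes has exactly sqrt(N) nonzeros in both bases.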


Similar resources

On sparse signal representations

An elementary proof of a basic uncertainty principle concerning pairs of representations of R^N vectors in different orthonormal bases is provided. The result, slightly stronger than stated before, has a direct impact on the uniqueness property of the sparse representation of such vectors using pairs of orthonormal bases as overcomplete dictionaries. The main contribution in this paper is the im...


Training sparse natural image models with a fast Gibbs sampler of an extended state space

We present a new learning strategy based on an efficient blocked Gibbs sampler for sparse overcomplete linear models. Particular emphasis is placed on statistical image modeling, where overcomplete models have played an important role in discovering sparse representations. Our Gibbs sampler is faster than general purpose sampling schemes while also requiring no tuning as it is free of parameter...


Image Classification via Sparse Representation and Subspace Alignment

Image representation is a crucial problem in image processing, where there exist many low-level representations of an image, e.g., SIFT, HOG and so on. But there is a missing link across low-level and high-level semantic representations. In fact, traditional machine learning approaches, e.g., non-negative matrix factorization, sparse representation and principal component analysis are employed to d...


Efficient Sparse Coding in Early Sensory Processing: Lessons from Signal Recovery

Sensory representations are not only sparse, but often overcomplete: coding units significantly outnumber the input units. For models of neural coding, this overcompleteness poses a computational challenge for shaping the signal processing channels as well as for using the large and sparse representations in an efficient way. We argue that higher-level overcompleteness becomes computationally tr...


An EM algorithm for learning sparse and overcomplete representations

An expectation-maximization (EM) algorithm for learning sparse and overcomplete representations is presented in this paper. We show that the estimation of the conditional moments of the posterior distribution can be accomplished by maximum a posteriori estimation. The approximate conditional moments enable the development of an EM algorithm for learning the overcomplete basis vectors and inferr...



Journal:

Volume   Issue

Pages   -

Publication date: 2002