Dictionary Learning with Large Step Gradient Descent for Sparse Representations
Abstract
This work presents a new algorithm for dictionary learning. Existing algorithms such as MOD and K-SVD often fail to find the best dictionary because they get trapped in a local minimum. Olshausen and Field's Sparsenet algorithm relies on fixed-step projected gradient descent; with the right step, it can avoid local minima and converge towards the global minimum. The problem then becomes finding the right step size. In this work we give the expression of the optimal step for the gradient descent, but the step we actually use is twice as large as that optimal step. This large step allows the descent to bypass local minima and yields significantly better results than existing algorithms. The algorithms are compared on synthetic data: our method outperforms existing algorithms both in approximation quality and in perfect recovery rate when an oracle support for the sparse representation is provided.
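To make the dictionary update concrete, here is a minimal sketch (not the authors' reference implementation) of one large-step projected gradient update, assuming the usual least-squares objective 0.5*||Y - D X||_F^2 with the sparse codes X held fixed. For this quadratic, the exact line-search step along the negative gradient G is alpha* = ||G||_F^2 / ||G X||_F^2 (the paper's own expression for the optimal step may differ); following the abstract, the sketch takes a step twice that size and then re-projects every atom onto the unit sphere. The function and parameter names (large_step_dictionary_update, step_scale) are illustrative.

import numpy as np

def large_step_dictionary_update(Y, D, X, step_scale=2.0):
    # One projected gradient step on the dictionary D for 0.5*||Y - D X||_F^2,
    # with the sparse codes X held fixed.
    R = Y - D @ X                            # current residual
    G = -R @ X.T                             # gradient of the objective w.r.t. D
    GX = G @ X
    denom = np.sum(GX * GX)
    if denom == 0.0:                         # moving along -G does not change D X
        return D
    alpha_opt = np.sum(G * G) / denom        # exact line-search step for this quadratic
    D_new = D - step_scale * alpha_opt * G   # step twice as large as the optimal one
    norms = np.linalg.norm(D_new, axis=0)    # projection: rescale atoms to unit norm
    norms[norms == 0.0] = 1.0
    return D_new / norms

In a full learning loop, this update would alternate with a sparse coding stage (for instance OMP, or a least-squares fit on the oracle support when one is available) until the dictionary stabilizes.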
Similar works
Thresholded Smoothed-ℓ0 (SL0) Dictionary Learning for Sparse Representations
In this paper, we suggest using a modified version of the Smoothed-ℓ0 (SL0) algorithm in the sparse representation step of iterative dictionary learning algorithms. In addition, we use a steepest descent for updating a non-unit column-norm dictionary instead of a unit column-norm dictionary. Moreover, to make the dictionary learning task more blind, we estimate the average number of active atoms in t...
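For context, SL0 (Smoothed-ℓ0) approximates the ℓ0 penalty by a smooth Gaussian surrogate and alternates small gradient steps on that surrogate with projections onto the constraint A s = x, while gradually shrinking the smoothing parameter sigma. Below is a minimal sketch of the baseline SL0 iteration for a single signal, not the thresholded variant proposed above; parameter names and defaults (sigma_min, sigma_decrease, mu, inner_iters) are illustrative.

import numpy as np

def sl0(A, x, sigma_min=1e-3, sigma_decrease=0.5, mu=2.0, inner_iters=3):
    # Baseline Smoothed-l0: approximate the sparsest s satisfying A s = x.
    A_pinv = np.linalg.pinv(A)
    s = A_pinv @ x                           # start from the minimum-norm solution
    sigma = 2.0 * np.max(np.abs(s))          # initial sigma, large relative to |s_i|
    while sigma > sigma_min:
        for _ in range(inner_iters):
            delta = s * np.exp(-s**2 / (2 * sigma**2))   # equals -sigma^2 * grad F_sigma(s)
            s = s - mu * delta                           # gradient-ascent step on F_sigma
            s = s - A_pinv @ (A @ s - x)                 # project back onto {s : A s = x}
        sigma *= sigma_decrease
    return s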
Incrementally Built Dictionary Learning for Sparse Representation
Extracting sparse representations with Dictionary Learning (DL) methods has led to interesting image and speech recognition results. DL has recently been extended to supervised learning (SDL) by using the dictionary for feature extraction and classification. One challenge with SDL is imposing diversity for extracting more discriminative features. To this end, we propose Incrementally Built Dict...
Global optimization of factor models and dictionary learning using alternating minimization
Learning new representations in machine learning is often tackled using a factorization of the data. For many such problems, including sparse coding and matrix completion, learning these factorizations can be difficult, both in terms of efficiency and in guaranteeing that the solution is a global minimum. Recently, a general class of objectives has been introduced, called induced regularized factor mo...
Stochastic Coordinate Coding and Its Application for Drosophila Gene Expression Pattern Annotation
Drosophila melanogaster has been established as a model organism for investigating the fundamental principles of developmental gene interactions. The gene expression patterns of Drosophila melanogaster can be documented as digital images, which are annotated with anatomical ontology terms to facilitate pattern discovery and comparison. The automated annotation of gene expression pattern images ...