Robust Kernel Dictionary Learning Using a Whole Sequence Convergent Algorithm
Authors
Abstract
Kernel sparse coding is an effective strategy for capturing the non-linear structure of data samples. However, how to learn a robust kernel dictionary remains an open problem. In this paper, we propose a new optimization model that learns a robust kernel dictionary while isolating outliers in the training samples. The model is based on decomposing the reconstruction error into small dense noise and large sparse outliers, with the outlier term formulated as the product of the sample matrix in the feature space and a diagonal coefficient matrix; this formulation makes the dictionary learning straightforward to kernelize. To solve the resulting non-convex optimization problem, we develop a whole-sequence convergent algorithm that guarantees the obtained solution sequence is a Cauchy sequence. Experimental results show that the proposed robust kernel dictionary learning method yields significant performance improvements.
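As a rough illustration of the decomposition described above (a sketch, not the paper's exact objective; the symbols A, X, S and the weights λ1, λ2 are assumptions introduced here), let Y denote the training samples, Φ(Y) their image in the feature space, and Φ(Y)A the kernel dictionary. Isolating large sparse outliers from small dense noise then leads to a problem of the general form

\min_{A,\,X,\,S}\; \big\| \Phi(Y) - \Phi(Y)AX - \Phi(Y)S \big\|_F^2 \;+\; \lambda_1 \|X\|_1 \;+\; \lambda_2 \|\operatorname{diag}(S)\|_1, \qquad S \text{ diagonal},

where X holds the sparse codes, the diagonal matrix S captures the outlier coefficients, and the Frobenius term absorbs the small dense noise. Because every factor enters through Φ(Y), the objective expands entirely in terms of the kernel matrix K = Φ(Y)ᵀΦ(Y), which is what makes the dictionary update kernelizable.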
Similar resources
Design of Non-Linear Discriminative Dictionaries for Image Classification
In recent years there has been growing interest in designing dictionaries for image classification. These methods, however, neglect the fact that data of interest often has non-linear structure. Motivated by the fact that this non-linearity can be handled by the kernel trick, we propose learning of dictionaries in the high-dimensional feature space which are simultaneously reconstructive and di...
Speech Enhancement using Adaptive Data-Based Dictionary Learning
In this paper, a speech enhancement method based on sparse representation of data frames is presented. Speech enhancement is one of the most widely applied areas of signal processing. The objective of a speech enhancement system is to improve either the intelligibility or the quality of speech signals. This process is carried out using speech signal processing techniques ...
Multiple Kernel Learning: A Unifying Probabilistic Viewpoint
We present a probabilistic viewpoint on multiple kernel learning that unifies well-known regularised risk approaches and recent advances in approximate Bayesian inference relaxations. The framework proposes a general objective function suitable for regression, robust regression and classification that is a lower bound of the marginal likelihood and contains many regularised risk approaches as special ...
ℓ0 norm based dictionary learning by proximal methods with global convergence
Sparse coding and dictionary learning have seen applications in many vision tasks and are usually formulated as non-convex optimization problems. Many iterative methods have been proposed to tackle such problems. However, it remains an open problem to find a method that is not only practically fast but also globally convergent. In this paper, we propose a fast proxim...
Discriminative Dictionaries and Projections for Visual Classification
Title of Dissertation: Sparse Representation, Discriminative Dictionaries and Projections for Visual Classification. Ashish Shrivastava, Doctor of Philosophy, 2015. Dissertation directed by: Professor Rama Chellappa, Department of Electrical and Computer Engineering. Developments in sensing and communication technologies have led to an explosion in the availability of visual data from multiple sour...