The Mathematics of Divergence Based Online Learning in Vector Quantization
Authors
Abstract
We propose the use of divergences in gradient descent learning of supervised and unsupervised vector quantization as an alternative to the squared Euclidean distance. The approach is based on the determination of the Fréchet derivatives of the divergences, which can be plugged directly into the online learning rules. We provide the mathematical foundation of the respective framework. This framework includes the usual gradient descent learning of prototypes as well as parameter optimization and relevance learning for improved performance.
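To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of unsupervised online vector quantization in which the squared Euclidean distance is replaced by the generalized Kullback-Leibler divergence D(x‖w) = Σᵢ xᵢ log(xᵢ/wᵢ) − xᵢ + wᵢ. Its derivative with respect to w, namely 1 − x/w (elementwise), plays the role of the Fréchet derivative and is plugged into a winner-takes-all update. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def gkl_divergence(x, w):
    # generalized Kullback-Leibler divergence D(x || w)
    # for strictly positive vectors x, w
    return np.sum(x * np.log(x / w) - x + w)

def gkl_gradient(x, w):
    # elementwise derivative of D(x || w) with respect to w,
    # used in place of the squared-Euclidean gradient (w - x)
    return 1.0 - x / w

def online_vq(data, n_prototypes=2, eta=0.05, epochs=50, seed=0):
    # online winner-takes-all vector quantization with a
    # divergence-based gradient step (illustrative sketch)
    rng = np.random.default_rng(seed)
    d = data.shape[1]
    # initialize prototypes as random positive normalized vectors
    W = rng.random((n_prototypes, d)) + 0.1
    W /= W.sum(axis=1, keepdims=True)
    for _ in range(epochs):
        for x in rng.permutation(data):
            # winner: prototype with the smallest divergence to x
            k = np.argmin([gkl_divergence(x, w) for w in W])
            # gradient descent step on the winning prototype only
            W[k] -= eta * gkl_gradient(x, W[k])
            W[k] = np.clip(W[k], 1e-12, None)  # keep entries positive
    return W
```

Note that, unlike the Euclidean case, the divergence-based update only makes sense for non-negative (e.g. histogram or spectral) data, which is exactly the setting targeted by the framework.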
Similar Papers
Divergence based Learning Vector Quantization
We suggest the use of alternative distance measures for similarity based classification in Learning Vector Quantization. Divergences can be employed whenever the data consists of non-negative normalized features, which is the case for, e.g., spectral data or histograms. As examples, we derive gradient based training algorithms in the framework of Generalized Learning Vector Quantization based o...
A Higher Order Online Lyapunov-Based Emotional Learning for Rough-Neural Identifiers
To enhance the performance of rough-neural networks (R-NNs) in system identification, a new stable learning algorithm based on emotional learning is developed for them. This algorithm facilitates error convergence by increasing the memory depth of R-NNs. To this end, an emotional signal, as a linear combination of the identification error and its differences, is used to achie...
NGTSOM: A Novel Data Clustering Algorithm Based on Game Theoretic and Self-Organizing Map
Identifying clusters is an important aspect of data analysis. This paper proposes a novel data clustering algorithm to increase the clustering accuracy. A novel game theoretic self-organizing map (NGTSOM) and neural gas (NG) are used in combination with Competitive Hebbian Learning (CHL) to improve the quality of the map and provide a better vector quantization (VQ) for clustering data. Different ...
Divergence-based classification in learning vector quantization
We discuss the use of divergences in dissimilarity based classification. Divergences can be employed whenever vectorial data consists of non-negative, potentially normalized features. This is, for instance, the case in spectral data or histograms. In particular, we introduce and study Divergence Based Learning Vector Quantization (DLVQ). We derive cost function based DLVQ schemes for the family...
Maximum Likelihood Topographic Map Formation
We introduce a new unsupervised learning algorithm for kernel-based topographic map formation of heteroscedastic Gaussian mixtures that allows for a unified account of distortion error (vector quantization), log-likelihood, and Kullback-Leibler divergence.