2D Dimensionality Reduction Methods without Loss
Authors
Abstract:
In this paper, several two-dimensional extensions of principal component analysis (PCA) and linear discriminant analysis (LDA) have been applied in a lossless dimensionality reduction framework for the face recognition application. In this framework, the benefits of dimensionality reduction were used to improve the performance of the predictive model, a support vector machine (SVM) classifier, while the loss of useful information was minimized using the projection penalty idea. Well-known face databases were used to train and evaluate the proposed methods. The experimental results indicated that, on average, the proposed methods achieved higher classification accuracy than classification based on Euclidean distance, and also than methods that first extract features with a dimensionality reduction technique and then use an SVM classifier as the predictive model.
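As a rough illustration of the two-stage baseline the abstract compares against (2DPCA feature extraction followed by an SVM classifier), the sketch below uses NumPy and scikit-learn. The projection-penalty formulation proposed in the paper is not reproduced here, and the image sizes, component count, and synthetic data are assumptions for illustration only.

```python
# A minimal sketch of the two-stage baseline: 2DPCA features, then a linear SVM.
# The coupled projection-penalty method of the paper is NOT implemented here;
# array shapes and parameter values are illustrative assumptions.
import numpy as np
from sklearn.svm import SVC

def fit_2dpca(images, n_components):
    """images: array of shape (n_samples, height, width)."""
    mean_image = images.mean(axis=0)
    centered = images - mean_image
    # Image covariance matrix (width x width), averaged over samples.
    cov = np.einsum('nhw,nhv->wv', centered, centered) / len(images)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Keep the eigenvectors with the largest eigenvalues.
    projection = eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]
    return mean_image, projection

def transform_2dpca(images, mean_image, projection):
    # Each image is projected to (height x n_components), then flattened.
    feats = (images - mean_image) @ projection
    return feats.reshape(len(images), -1)

# Hypothetical usage with synthetic "face" images.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 32, 32))
y_train = rng.integers(0, 10, size=100)

mean_img, W = fit_2dpca(X_train, n_components=8)
clf = SVC(kernel='linear').fit(transform_2dpca(X_train, mean_img, W), y_train)
pred = clf.predict(transform_2dpca(X_train, mean_img, W))
```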
Similar resources
Multilevel dimensionality-reduction methods
When data sets are multilevel (group nesting or repeated measures), different sources of variation must be identified. In the framework of unsupervised analyses, multilevel simultaneous component analysis (MSCA) has recently been proposed as the most satisfactory option for analyzing multilevel data. MSCA estimates submodels for the different levels in the data and thereby separates the “within”-s...
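The "within"/"between" separation that MSCA builds on can be illustrated with a simple decomposition: subtract each group's mean to obtain the within-group part, treat the centered group means as the between-group part, and fit a separate PCA to each. This is only a sketch of the underlying idea, not the MSCA estimation procedure; the group structure and data sizes below are assumptions.

```python
# Illustrative within/between decomposition for multilevel data, followed by
# a separate PCA per level. Not the full MSCA procedure; sizes are assumptions.
import numpy as np
from sklearn.decomposition import PCA

def split_levels(X, groups):
    """X: (n_samples, n_features); groups: integer group label per sample."""
    grand_mean = X.mean(axis=0)
    between = np.zeros_like(X)
    for g in np.unique(groups):
        between[groups == g] = X[groups == g].mean(axis=0)
    within = X - between              # deviations from each group's own mean
    return between - grand_mean, within

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))
groups = np.repeat(np.arange(6), 10)  # 6 groups, 10 samples each (assumed)

between_part, within_part = split_levels(X, groups)
pca_between = PCA(n_components=2).fit(between_part)
pca_within = PCA(n_components=2).fit(within_part)
```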
Convolutional 2D LDA for Nonlinear Dimensionality Reduction
Representing high-volume and high-order data is an essential problem, especially in the machine learning field. Although existing two-dimensional (2D) discriminant analysis achieves promising performance, its single, linear projection features make it difficult to analyze more complex data. In this paper, we propose a novel convolutional two-dimensional linear discriminant analysis (2D LDA) meth...
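For contrast with the convolutional variant described above, the following sketch implements classical one-directional 2D LDA (between- and within-class image scatter matrices followed by a generalized eigenproblem). The convolutional architecture proposed in that paper is not reproduced; all shapes and the small regularization term are assumptions.

```python
# A minimal sketch of classical (non-convolutional) 2D LDA in one direction.
# Shapes, class counts, and the regularizer are illustrative assumptions.
import numpy as np
from scipy.linalg import eigh

def fit_2dlda(images, labels, n_components):
    """images: (n_samples, height, width); labels: class label per sample."""
    overall_mean = images.mean(axis=0)
    w = images.shape[2]
    S_b = np.zeros((w, w))   # between-class image scatter
    S_w = np.zeros((w, w))   # within-class image scatter
    for c in np.unique(labels):
        class_imgs = images[labels == c]
        class_mean = class_imgs.mean(axis=0)
        diff = class_mean - overall_mean
        S_b += len(class_imgs) * diff.T @ diff
        centered = class_imgs - class_mean
        S_w += np.einsum('nhw,nhv->wv', centered, centered)
    # Generalized eigenproblem S_b v = lambda S_w v; keep largest eigenvalues.
    eigvals, eigvecs = eigh(S_b, S_w + 1e-6 * np.eye(w))
    return eigvecs[:, np.argsort(eigvals)[::-1][:n_components]]

rng = np.random.default_rng(0)
imgs = rng.normal(size=(40, 16, 16))
labs = rng.integers(0, 4, size=40)
W = fit_2dlda(imgs, labs, n_components=4)
feats = imgs @ W   # projected features, shape (40, 16, 4)
```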
Spectral Methods for Dimensionality Reduction
How can we search for low dimensional structure in high dimensional data? If the data is mainly confined to a low dimensional subspace, then simple linear methods can be used to discover the subspace and estimate its dimensionality. More generally, though, if the data lies on (or near) a low dimensional submanifold, then its structure may be highly nonlinear, and linear methods are bound to fai...
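The linear case mentioned here can be illustrated with PCA: fit it to the data and estimate the subspace dimensionality from the cumulative explained variance. The 95% threshold and the synthetic data below are illustrative assumptions, not values from the paper.

```python
# Estimate the dimensionality of a (nearly) linear subspace with PCA.
# Threshold and data are assumptions chosen only to illustrate the idea.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 50-dimensional data that actually lies near a 3-dimensional subspace.
latent = rng.normal(size=(200, 3))
X = latent @ rng.normal(size=(3, 50)) + 0.01 * rng.normal(size=(200, 50))

pca = PCA().fit(X)
cumulative = np.cumsum(pca.explained_variance_ratio_)
estimated_dim = int(np.searchsorted(cumulative, 0.95) + 1)
print(estimated_dim)   # expected to be close to 3
```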
Dimensionality reduction methods for molecular simulations
Molecular simulations produce very high-dimensional data sets with millions of data points. As analysis methods are often unable to cope with so many dimensions, it is common to use dimensionality reduction and clustering methods to reach a reduced representation of the data. Yet these methods often fail to capture the most important features necessary for the construction of a Markov model. Her...
Using Dimensionality Reduction Methods in Text Clustering
High dimensionality of the feature space is one of the major concerns, owing to computational complexity and accuracy considerations, in text clustering. Therefore, various dimension reduction methods have been introduced in the literature to select an informative subset (or sublist) of features. As each dimension reduction method uses a different strategy (aspect) to select a subset of featu...
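One common way to combine dimensionality reduction with text clustering is to reduce TF-IDF features with truncated SVD before k-means. Note that this sketch uses projection rather than the feature-subset selection compared in that paper, and the toy corpus, component count, and cluster count are assumptions.

```python
# TF-IDF -> truncated SVD (latent semantic analysis) -> k-means clustering.
# A projection-based sketch; the paper discussed above compares
# feature-selection strategies instead. Corpus and parameters are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

docs = [
    "face recognition with principal component analysis",
    "support vector machines for image classification",
    "molecular dynamics simulation of proteins",
    "markov models for molecular simulations",
]

pipeline = make_pipeline(
    TfidfVectorizer(stop_words='english'),
    TruncatedSVD(n_components=2),
    KMeans(n_clusters=2, n_init=10, random_state=0),
)
cluster_labels = pipeline.fit_predict(docs)
print(cluster_labels)
```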
Statistical Learning Methods Including Dimensionality Reduction
This special issue ‘Statistical learning methods including dimensionality reduction’ is concerned with situations where the main statistical problem of interest, for example regression, discrimination (supervised classification), and clustering (unsupervised classification), is to be combined with dimension reduction methods. The general objective of this special issue is to collect and present...
Journal title
Volume 7, Issue 1
Pages 201-210
Publication date: 2019-03-01