L1-norm-based (2D)PCA
Author
Abstract
Traditional bidirectional two-dimensional (2D) principal component analysis ((2D)PCA-L2) is sensitive to outliers because its objective function is the least-squares criterion based on the L2-norm. This paper proposes a simple but effective L1-norm-based bidirectional 2D principal component analysis ((2D)PCA-L1), which jointly takes advantage of the merits of bidirectional 2D subspace learning and an L1-norm-based distance criterion. Experimental results on two popular face databases show that the proposed method is more robust to outliers than several methods based on principal component analysis in the fields of data compression and object recognition.
Keywords: bidirectional two-dimensional principal component analysis; L2-norm; outliers; L1-norm; optimization
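To make the idea concrete, below is a minimal NumPy sketch of how an L1-norm-based bidirectional 2D PCA could be set up. It is not the paper's exact algorithm: it assumes a Kwak-style greedy sign-flipping solver for the L1-norm objective sum_i ||X_i w||_1, and the function names (pca_l1_2d, bidirectional_pca_l1), the deflation step, and the convergence tolerance are illustrative choices.

```python
import numpy as np

def pca_l1_2d(images, n_components, n_iter=100, tol=1e-6, rng=None):
    """Greedy L1-norm 2DPCA sketch: find projection vectors w that
    (locally) maximize sum_i ||X_i @ w||_1 over image matrices X_i,
    using a sign-flipping fixed-point iteration (assumed solver)."""
    rng = np.random.default_rng(rng)
    # Stack all image rows into one (N*m, n) matrix; the 2D objective
    # sum_i ||X_i w||_1 equals the 1D L1-PCA objective on these rows.
    rows = np.vstack([np.asarray(img, dtype=float) for img in images])
    n_cols = rows.shape[1]
    W = np.zeros((n_cols, n_components))
    for k in range(n_components):
        w = rng.standard_normal(n_cols)
        w /= np.linalg.norm(w)
        for _ in range(n_iter):
            signs = np.sign(rows @ w)
            signs[signs == 0] = 1.0          # avoid degenerate zero signs
            w_new = rows.T @ signs
            w_new /= np.linalg.norm(w_new)
            if np.linalg.norm(w_new - w) < tol:
                w = w_new
                break
            w = w_new
        W[:, k] = w
        rows = rows - np.outer(rows @ w, w)  # deflate before next component
    return W

def bidirectional_pca_l1(images, d_rows, d_cols):
    """Bidirectional (2D)PCA-L1 sketch: learn a right projection from the
    images and a left projection from their transposes, then compress
    each image X as  W_left.T @ X @ W_right."""
    W_right = pca_l1_2d(images, d_cols)                 # column direction
    W_left = pca_l1_2d([x.T for x in images], d_rows)   # row direction
    return W_left, W_right

if __name__ == "__main__":
    # Tiny usage example on random "images"
    imgs = [np.random.default_rng(i).standard_normal((32, 32)) for i in range(20)]
    W_l, W_r = bidirectional_pca_l1(imgs, d_rows=8, d_cols=8)
    feature = W_l.T @ imgs[0] @ W_r           # 8 x 8 compressed representation
    reconstruction = W_l @ feature @ W_r.T    # approximate reconstruction
    print(feature.shape, reconstruction.shape)
```

The point of the bidirectional step is that projecting on both sides compresses an m x n image to a small d_rows x d_cols matrix, while the L1-norm objective keeps single corrupted images from dominating the learned projections the way a squared-error criterion would.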
Related Works
Robust Sparse 2D Principal Component Analysis for Object Recognition
In this paper, we extensively investigate robust sparse two-dimensional principal component analysis (RS2DPCA), which exploits semantic and structural information and suppresses outliers. RS2DPCA combines the advantages of sparsity, the 2D data format, and the L1-norm for data analysis. We also prove that RS2DPCA can offer a good solution for seeking sparse 2D principal components. To verify the pe...
L1-norm Principal-Component Analysis in L2-norm-reduced-rank Data Subspaces
Standard Principal-Component Analysis (PCA) is known to be very sensitive to outliers among the processed data. On the other hand, it has recently been shown that L1-norm-based PCA (L1-PCA) exhibits sturdy resistance against outliers, while it performs similarly to standard PCA when applied to nominal or smoothly corrupted data. Exact calculation of the K L1-norm Principal Components (L1-PCs) of ...
A pure L1-norm principal component analysis
The L1 norm has been applied in numerous variations of principal component analysis (PCA). L1-norm PCA is an attractive alternative to traditional L2-based PCA because it can impart robustness in the presence of outliers and is indicated for models where standard Gaussian assumptions about the noise may not apply. Of all the previously-proposed PCA schemes that recast PCA as an optimization pro...
An efficient algorithm for L1-norm principal component analysis
Principal component analysis (PCA), also called the Karhunen-Loève transform, has been widely used for dimensionality reduction, denoising, feature selection, subspace detection, and other purposes. However, traditional PCA minimizes the sum of squared errors and suffers from both outliers and large feature noise. L1-norm-based PCA (more precisely, L1,1-norm) is more robust. Yet, the optimizatio...
L1-norm Kernel PCA
We present the first model and algorithm for L1-norm kernel PCA. While L2-norm kernel PCA has been widely studied, there has been no work on L1-norm kernel PCA. For this non-convex and non-smooth problem, we offer geometric understandings through reformulations and present an efficient algorithm where the kernel trick is applicable. To attest the efficiency of the algorithm, we provide a conver...