Nonlinear Component Analysis as a Kernel Eigenvalue Problem
Authors
Bernhard Schölkopf, Alexander Smola, Klaus-Robert Müller
Abstract
A new method for performing a nonlinear form of principal component analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map—for instance, the space of all possible five-pixel products in 16×16 images. We give the derivation of the method and present experimental results on polynomial feature extraction for pattern recognition.
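To make the construction concrete, here is a minimal NumPy sketch of the kernel PCA procedure the abstract describes, using a degree-5 polynomial kernel k(x, y) = (x · y)^5, which corresponds to the space of five-pixel products mentioned above. The function name and random data are illustrative, not from the paper.

```python
# Minimal kernel PCA sketch (NumPy). The degree-5 polynomial kernel
# k(x, y) = (x . y)^5 corresponds to the space of all five-pixel
# products mentioned in the abstract.
import numpy as np

def kernel_pca(X, n_components=2, degree=5):
    n = X.shape[0]
    K = (X @ X.T) ** degree                  # kernel (Gram) matrix
    # Center the kernel matrix in feature space: Kc = H K H, H = I - 1/n
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigendecomposition; eigh returns eigenvalues in ascending order
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    # Scale coefficient vectors so the feature-space eigenvectors have
    # unit length: lambda_k * (alpha_k . alpha_k) = 1
    alphas = eigvecs[:, :n_components] / np.sqrt(eigvals[:n_components])
    # Projections of the training points onto the principal components
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 256))          # e.g. 100 flattened 16x16 images
scores = kernel_pca(X, n_components=2)
print(scores.shape)                          # (100, 2)
```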
Similar Articles
Orthogonal Series Density Estimation and the Kernel Eigenvalue Problem
Kernel principal component analysis has been introduced as a method of extracting a set of orthonormal nonlinear features from multivariate data, and many impressive applications are being reported within the literature. This article presents the view that the eigenvalue decomposition of a kernel matrix can also provide the discrete expansion coefficients required for a nonparametric orthogonal...
Multivariate Statistical Monitoring of Nonlinear Biological Processes Using Kernel PCA
In this paper, a new nonlinear process monitoring technique based on kernel principal component analysis (KPCA) is developed. In recent years, KPCA has emerged as a way to tackle the nonlinear monitoring problem. KPCA can efficiently compute principal components in high-dimensional feature spaces by the use of integral operator and nonlinear kernel functions. The basic idea of KPCA is to first m...
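As an illustration of how such nonlinear scores might drive a monitoring chart, the sketch below computes a Hotelling-style T² statistic from kernel principal component scores. The kernel choice, component count, and percentile control limit are assumptions for the example, not the cited paper's specific design.

```python
# Sketch of a T^2-style monitoring statistic on KPCA scores (simplified;
# chi-square/F threshold approximations are omitted in favor of an
# empirical percentile limit, an assumption for this example).
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 10))            # normal operating data
K = (X @ X.T + 1.0) ** 2                      # inhomogeneous quadratic kernel
H = np.eye(len(X)) - np.ones((len(X), len(X))) / len(X)
Kc = H @ K @ H                                # centered kernel matrix
lam, V = np.linalg.eigh(Kc)
lam, V = lam[::-1][:5], V[:, ::-1][:, :5]     # retain 5 components
T = Kc @ (V / np.sqrt(lam))                   # nonlinear score vectors
# Scores on component k have sample variance lam_k / (n - 1), so the
# Hotelling-style statistic standardizes each score before summing.
t2 = ((T ** 2) * (len(X) - 1) / lam).sum(axis=1)
limit = np.percentile(t2, 99)                 # empirical control limit
print(t2.shape, limit)
```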
Kernel Isomap
Isomap [4] is a manifold learning algorithm, which extends classical multidimensional scaling (MDS) by considering approximate geodesic distance instead of Euclidean distance. The approximate geodesic distance matrix can be interpreted as a kernel matrix, which implies that Isomap can be solved by a kernel eigenvalue problem. However, the geodesic distance kernel matrix is not guaranteed to be ...
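The kernel interpretation comes from the classical MDS construction: double-centering the squared distance matrix, B = -1/2 H D^2 H, yields a Gram-like matrix, which for geodesic distances need not be positive semidefinite. A minimal sketch, with eigenvalue clipping as one simple remedy (an assumption for illustration; several fixes exist in the literature):

```python
import numpy as np

def mds_kernel(D):
    """Double-center a distance matrix into an MDS-style kernel."""
    n = D.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * H @ (D ** 2) @ H               # B = -1/2 H D^2 H
    # Geodesic distances need not yield a PSD B; clip negative
    # eigenvalues as one simple remedy.
    w, V = np.linalg.eigh(B)
    return (V * np.clip(w, 0.0, None)) @ V.T

rng = np.random.default_rng(4)
X = rng.standard_normal((50, 3))
D = np.linalg.norm(X[:, None] - X[None, :], axis=-1)  # stand-in for geodesics
B = mds_kernel(D)
w, V = np.linalg.eigh(B)
coords = V[:, -2:] * np.sqrt(w[-2:])          # 2-D embedding from top eigenpairs
```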
Kernel Discriminative Analysis for Speech Recognition
Linear Discriminative Analysis techniques have been used in pattern recognition to map feature vectors to achieve optimal classification. Kernel Discriminative Analysis (KDA) seeks to introduce nonlinearity in this approach by mapping the features to a nonlinear space before applying LDA. The formulation is expressed as the resolution of an eigenvalue problem. Using a different kernel, one c...
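A common way to cast this is the two-class kernel Fisher discriminant, whose leading generalized eigenvector has a closed form. The following NumPy sketch is a simplified, hypothetical implementation (RBF kernel, ridge regularization), not necessarily the cited paper's exact formulation.

```python
# Minimal two-class kernel Fisher discriminant sketch (NumPy).
import numpy as np

def kfd(X, y, gamma=1.0, reg=1e-3):
    # RBF kernel matrix
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)              # kernelized class means
    m1 = K[:, idx1].mean(axis=1)
    # Within-class scatter: N = sum_j K_j (I - 1/n_j) K_j^T
    N = np.zeros_like(K)
    for idx in (idx0, idx1):
        Kj, nj = K[:, idx], len(idx)
        N += Kj @ (np.eye(nj) - np.ones((nj, nj)) / nj) @ Kj.T
    # Leading generalized eigenvector: alpha ~ (N + reg*I)^{-1} (m0 - m1)
    alpha = np.linalg.solve(N + reg * np.eye(len(K)), m0 - m1)
    return alpha, K

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(2, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
alpha, K = kfd(X, y)
proj = K @ alpha                              # 1-D discriminative projections
```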
Comparison of input and feature space nonlinear kernel nuisance attribute projections for speaker verification
Nuisance attribute projection (NAP) is an effective method to reduce session variability in SVM-based speaker verification systems. As the expanded feature space of nonlinear kernels is usually high- or infinite-dimensional, it is difficult to find nuisance directions via conventional eigenvalue analysis and to do projection directly in the feature space. In this paper, two different approaches...
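For intuition, here is a simplified input-space NAP sketch: nuisance directions are estimated as the top eigenvectors of within-speaker scatter and removed with P = I - U U^T. The direction-finding criterion is an assumption for illustration; the feature-space variants the paper compares are not shown.

```python
# Simplified linear (input-space) NAP sketch. Hypothetical minimal
# variant: within-speaker deviations of session vectors are taken as
# a proxy for session (nuisance) variability.
import numpy as np

def nap_projection(X, speakers, n_nuisance=2):
    S = np.zeros((X.shape[1], X.shape[1]))
    for spk in np.unique(speakers):
        Xs = X[speakers == spk]
        D = Xs - Xs.mean(axis=0)              # within-speaker deviations
        S += D.T @ D
    w, V = np.linalg.eigh(S)
    U = V[:, ::-1][:, :n_nuisance]            # top nuisance directions
    return np.eye(X.shape[1]) - U @ U.T       # projection removing them

rng = np.random.default_rng(3)
X = rng.standard_normal((40, 8))              # e.g. 40 supervectors, dim 8
speakers = np.repeat(np.arange(8), 5)         # 8 speakers, 5 sessions each
P = nap_projection(X, speakers)
X_clean = X @ P                               # session-compensated vectors
```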
Journal: Neural Computation
Volume: 10, Issue: -
Pages: -
Publication date: 1998