Statistical properties of kernel principal component analysis
Authors
Abstract
Similar resources
Kernel Principal Component Analysis
A new method for performing a nonlinear form of Principal Component Analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all possible d-pixel products in images. We give the derivation of the method and present experimental...
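As a concrete illustration of the kernel idea in this snippet (not taken from the paper itself): for the polynomial kernel k(x, y) = (x·y)^d, the implicit feature space is spanned by all degree-d products of input coordinates, i.e. the d-pixel products mentioned above. The sketch below checks this equivalence numerically for d = 2; the kernel choice, input size, and function names are illustrative assumptions.

```python
import numpy as np
from itertools import product

def degree2_features(x):
    # Explicit feature map: all ordered 2-pixel products x_i * x_j.
    return np.array([x[i] * x[j] for i, j in product(range(len(x)), repeat=2)])

rng = np.random.default_rng(0)
x, y = rng.normal(size=16), rng.normal(size=16)   # e.g. a tiny 4x4 "image", flattened

implicit = np.dot(x, y) ** 2                                   # kernel evaluation, O(n)
explicit = np.dot(degree2_features(x), degree2_features(y))    # explicit map, O(n^2) features
print(np.isclose(implicit, explicit))                          # True
```

The point of the identity is that the left-hand side never forms the quadratic feature vector, which is what makes high-dimensional feature spaces tractable.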
Robust Kernel Principal Component Analysis
Kernel Principal Component Analysis (KPCA) is a popular generalization of linear PCA that allows non-linear feature extraction. In KPCA, data in the input space is mapped to a (usually higher-dimensional) feature space where the data can be linearly modeled. The feature space is typically induced implicitly by a kernel function, and linear PCA in the feature space is performed via the kernel trick...
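A minimal KPCA sketch along the lines of this description, assuming an RBF kernel (the abstract does not fix one): the kernel matrix is centred in feature space, eigendecomposed, and new points are projected via kernel evaluations only, never forming the feature map explicitly. Function names and parameters are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kpca_fit(X, n_components=2, gamma=1.0):
    # Centre the kernel matrix in feature space, then eigendecompose it.
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)                 # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:n_components]
    alphas = vecs[:, top] / np.sqrt(vals[top])      # scale so feature-space axes have unit norm
    return K, alphas

def kpca_transform(X_train, K_train, alphas, X_new, gamma=1.0):
    # Project new points using kernel evaluations only (the kernel trick).
    n, m = X_train.shape[0], X_new.shape[0]
    K_new = rbf_kernel(X_new, X_train, gamma)
    one_n = np.full((n, n), 1.0 / n)
    one_mn = np.full((m, n), 1.0 / n)
    K_new_c = K_new - one_mn @ K_train - K_new @ one_n + one_mn @ K_train @ one_n
    return K_new_c @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
K, alphas = kpca_fit(X, n_components=2)
Z = kpca_transform(X, K, alphas, X[:5])
print(Z.shape)   # (5, 2): nonlinear principal component scores
```

Note that projecting a new point requires kernel evaluations against every training point, which is the lack of sparsity the Sparse KPCA snippet below refers to.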
Streaming Kernel Principal Component Analysis
Kernel principal component analysis (KPCA) provides a concise set of basis vectors which capture nonlinear structures within large data sets, and is a central tool in data analysis and learning. To allow for nonlinear relations, typically a full n × n kernel matrix is constructed over n data points, but this requires too much space and time for large values of n. Techniques such as the Nyström ...
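A rough sketch of the Nyström idea named above, under illustrative assumptions (RBF kernel, uniformly sampled landmarks, toy data): the n × n kernel matrix is approximated from an n × m cross-kernel and an m × m landmark kernel, so the full matrix never has to be formed.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(5000, 2))            # n = 5000 points
m = 100                                   # number of landmarks, m << n
landmarks = X[rng.choice(len(X), size=m, replace=False)]

C = rbf_kernel(X, landmarks)                               # n x m cross-kernel
W_pinv = np.linalg.pinv(rbf_kernel(landmarks, landmarks))  # m x m landmark kernel, inverted

# The n x n matrix is kept in factored form, K ~= C @ W_pinv @ C.T.
# Check the quality of the approximation on a small subsample only.
sub = rng.choice(len(X), size=300, replace=False)
K_exact = rbf_kernel(X[sub], X[sub])
K_sub = C[sub] @ W_pinv @ C[sub].T
err = np.linalg.norm(K_exact - K_sub) / np.linalg.norm(K_exact)
print(f"relative Frobenius error on subsample: {err:.3f}")
```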
Distributed Kernel Principal Component Analysis
Kernel Principal Component Analysis (KPCA) is a key technique in machine learning for extracting the nonlinear structure of data and pre-processing it for downstream learning algorithms. We study the distributed setting in which there are multiple workers, each holding a set of points, who wish to compute the principal components of the union of their pointsets. Our main result is a communication...
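The paper's communication-efficient KPCA algorithm is not reproduced here; as a much simpler point of comparison, the sketch below shows the textbook distributed scheme for linear PCA, in which each worker ships only its d × d scatter matrix (plus count and sum) to a coordinator, so communication is independent of how many points each worker holds. All names and sizes are illustrative.

```python
import numpy as np

def local_stats(X_w):
    # Each worker summarises its shard by count, coordinate sum, and scatter matrix.
    return X_w.shape[0], X_w.sum(axis=0), X_w.T @ X_w

def coordinator_pca(stats, n_components=2):
    # Merge the summaries into the global covariance, then eigendecompose it.
    n = sum(s[0] for s in stats)
    mean = sum(s[1] for s in stats) / n
    cov = sum(s[2] for s in stats) / n - np.outer(mean, mean)
    vals, vecs = np.linalg.eigh(cov)
    return vecs[:, np.argsort(vals)[::-1][:n_components]]

rng = np.random.default_rng(0)
X = rng.normal(size=(4000, 5)) @ rng.normal(size=(5, 5))   # full dataset
workers = np.array_split(X, 4)                             # four workers, disjoint shards
components = coordinator_pca([local_stats(W) for W in workers])
print(components.shape)   # (5, 2): top-2 principal directions of the union
```

For KPCA the same trick does not apply directly, since the kernel matrix couples points held by different workers, which is what makes the distributed kernel setting nontrivial.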
Sparse Kernel Principal Component Analysis
'Kernel' principal component analysis (PCA) is an elegant nonlinear generalisation of the popular linear data analysis method, where a kernel function implicitly defines a nonlinear transformation into a feature space wherein standard PCA is performed. Unfortunately, the technique is not 'sparse', since the components thus obtained are expressed in terms of kernels associated with every training...
Journal
Journal title: Machine Learning
Year: 2006
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-006-6895-9