Search results for: functional principal component analysis

Number of results: 3,758,525

2009
Jason Morton

Multivariate Gaussian data is completely characterized by its mean and covariance, yet modern non-Gaussian data makes higher-order statistics such as cumulants inevitable. For univariate data, the third and fourth scalar-valued cumulants are relatively well-studied as skewness and kurtosis. For multivariate data, these cumulants are tensor-valued, higher-order analogs of the covariance matrix c...
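As a concrete illustration of the univariate case mentioned above, here is a minimal NumPy sketch (the gamma-distributed sample is an arbitrary assumption, not data from the paper) that computes the standardized third and fourth cumulants, i.e. skewness and excess kurtosis:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=1.0, size=10_000)  # skewed, non-Gaussian sample (illustrative)

# Central moments
mu = x.mean()
m2 = np.mean((x - mu) ** 2)   # variance (second cumulant)
m3 = np.mean((x - mu) ** 3)   # third central moment = third cumulant
m4 = np.mean((x - mu) ** 4)

skewness = m3 / m2 ** 1.5                 # standardized third cumulant
excess_kurtosis = m4 / m2 ** 2 - 3.0      # fourth cumulant divided by sigma^4

print(skewness, excess_kurtosis)          # both are ~0 for Gaussian data; clearly nonzero here
```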

2016
Ikko Yamane Florian Yger Maxime Berar Masashi Sugiyama

Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We address this issue by casting PCA into a multitask framework, and in doing so, we show how to solve several related PCA problems simultaneously. Hence, we propose ...
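The abstract is truncated, so the sketch below illustrates only the stated motivation rather than the authors' multitask algorithm: with few samples, the leading eigenvector of the sample covariance is unreliable, and naively pooling several related datasets (an assumption made here purely for illustration) stabilizes it:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_per_task, n_tasks = 50, 10, 8      # illustrative sizes: few samples per task

true_dir = np.zeros(d)
true_dir[0] = 1.0                        # dominant direction shared by all tasks

def leading_eigvec(X):
    """Leading eigenvector of the sample covariance of X (rows are samples)."""
    C = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(C)             # eigenvalues in ascending order
    return V[:, -1]

# Each task: isotropic noise plus strong variance along the shared direction.
tasks = [rng.normal(size=(n_per_task, d))
         + 3.0 * rng.normal(size=(n_per_task, 1)) * true_dir
         for _ in range(n_tasks)]

single = leading_eigvec(tasks[0])        # one small task on its own
pooled = leading_eigvec(np.vstack(tasks))  # naive pooling of the related tasks

# Alignment with the true direction (1.0 is perfect); pooling is more reliable.
print(abs(single @ true_dir), abs(pooled @ true_dir))
```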

1997
Bernhard Schölkopf Alexander J. Smola Klaus-Robert Müller

A new method for performing a nonlinear form of Principal Component Analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in high-dimensional feature spaces, related to input space by some nonlinear map; for instance the space of all possible d-pixel products in images. We give the derivation of the method and present experimenta...
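A minimal NumPy sketch of kernel PCA in the spirit of this method; the RBF kernel, its bandwidth, and the toy two-ring dataset are illustrative assumptions rather than choices taken from the paper:

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=0.5):
    """Kernel PCA with an RBF kernel: nonlinear principal components of X (rows = samples)."""
    # Gram matrix K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * sq_dists)

    # Center the kernel matrix in feature space: K <- K - 1K - K1 + 1K1
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition (ascending order), then keep the leading components
    eigvals, eigvecs = np.linalg.eigh(Kc)
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    alphas = eigvecs[:, :n_components] / np.sqrt(np.maximum(eigvals[:n_components], 1e-12))
    return Kc @ alphas  # projections of the training points

# Toy usage: two concentric rings, which linear PCA cannot separate.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 200)
r = np.repeat([1.0, 3.0], 100)
X = np.c_[r * np.cos(theta), r * np.sin(theta)] + 0.05 * rng.normal(size=(200, 2))
Z = rbf_kernel_pca(X, n_components=2, gamma=0.5)
print(Z.shape)  # (200, 2)
```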

2002
Kristin M. Branson Sameer Agarwal

Many tasks involving high-dimensional data, such as face recognition, suffer from the curse of dimensionality: the number of training samples required to accurately learn a classifier increases exponentially with the dimensionality of the data. Structured Principal Component Analysis (SPCA) reduces the dimensionality of the data by choosing a small number of features to represent larger sets of...

Journal: CoRR, 2017
Abubakar Abid Vivek Kumar Bagaria Martin J. Zhang James Y. Zou

We present a new technique called contrastive principal component analysis (cPCA) that is designed to discover low-dimensional structure that is unique to a dataset, or enriched in one dataset relative to other data. The technique is a generalization of standard PCA, for the setting where multiple datasets are available – e.g. a treatment and a control group, or a mixed versus a homogeneous pop...
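A short sketch of the contrastive idea as described: project onto the top eigenvectors of the target covariance minus a weighted background covariance. The function name, the fixed contrast weight alpha, and the random input matrices are assumptions made for illustration:

```python
import numpy as np

def contrastive_pca(target, background, n_components=2, alpha=1.0):
    """Directions with high variance in `target` but low variance in `background`.

    Rows are samples, columns are features; `alpha` trades off the two terms.
    """
    target = target - target.mean(axis=0)
    background = background - background.mean(axis=0)
    C_target = np.cov(target, rowvar=False)
    C_background = np.cov(background, rowvar=False)

    # Top eigenvectors of the contrastive covariance C_target - alpha * C_background
    eigvals, eigvecs = np.linalg.eigh(C_target - alpha * C_background)
    components = eigvecs[:, ::-1][:, :n_components]
    return target @ components  # low-dimensional projection of the target data

# Hypothetical usage with random "treatment" and "control" matrices:
rng = np.random.default_rng(0)
treatment = rng.normal(size=(300, 20))
control = rng.normal(size=(300, 20))
Z = contrastive_pca(treatment, control, n_components=2, alpha=2.0)
print(Z.shape)  # (300, 2)
```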

2013
A. Akinduko

Principal component analysis (PCA) is an important tool for exploring data. The conventional approach to PCA leads to a solution that favours structures with large variances. This makes it sensitive to outliers and can obscure interesting underlying structures. One of the equivalent definitions of PCA is that it seeks the subspaces that maximize the sum of squared pairwise distances between d...
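The equivalence alluded to above can be checked numerically: for 1-D projections of centered data, the sum of squared pairwise distances equals 2n² times the projected variance, so the leading principal direction maximizes both criteria. A small NumPy check on synthetic data (assumed here for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))  # correlated toy data
X = X - X.mean(axis=0)
n = X.shape[0]

# Leading principal direction from the sample covariance
_, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
w = eigvecs[:, -1]

def sum_sq_pairwise(z):
    """Sum over all ordered pairs (i, j) of (z_i - z_j)**2 for 1-D projections z."""
    return np.sum((z[:, None] - z[None, :]) ** 2)

proj_pca = X @ w
v = rng.normal(size=5)
proj_rand = X @ (v / np.linalg.norm(v))

# Identity: sum of squared pairwise distances = 2 * n**2 * variance of the projection,
# so maximizing one criterion is the same as maximizing the other.
print(np.isclose(sum_sq_pairwise(proj_pca), 2 * n ** 2 * proj_pca.var()))
# The PCA direction therefore beats an arbitrary direction on this criterion:
print(sum_sq_pairwise(proj_pca) > sum_sq_pairwise(proj_rand))
```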

Journal: IEEE Transactions on Neural Networks, 1999
Jie Luo Bo Hu Xieting Ling Ruey-Wen Liu

Conventional blind signal separation algorithms do not exploit any asymmetric information about the input sources, so the convergence point of a single output is unpredictable. However, in most applications we are usually interested in only one or two of the source signals, and prior information is almost always available. In this paper, a principal independent component analysis (PI...

2018
Thibaud Taillefumier

Measured data can be arranged in a data matrix where each column is a data sample. Analyzing, and hopefully understanding, the result of an experiment often consists in uncovering regularity or structure in the data matrix. Unfortunately, measured data is often "messy" in the sense that it is too high-dimensional for us to detect structure by direct inspection, and in the sense that noise and/or redundancy often impair data visualization. Pr...
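Continuing this setup, where each column of the data matrix is a sample, a minimal PCA sketch via the SVD of the centered matrix; the dimensions and the number of retained components are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 100, 40                       # d-dimensional measurements, n samples (illustrative)
X = rng.normal(size=(d, n))          # data matrix: each column is a data sample

# Center each row (feature) across samples, then take the SVD
X_centered = X - X.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

k = 2
components = U[:, :k]                # principal directions in the d-dimensional space
scores = components.T @ X_centered   # low-dimensional coordinates of each sample
explained = S[:k] ** 2 / np.sum(S ** 2)

print(scores.shape, explained)       # (2, 40) and the fraction of variance explained
```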
