Search results for: principal component
Number of results: 700522
We consider the online version of the well known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of vectors X = [x1, . . . , xn] in R^{d×n} and a target dimension k < d; the output is a set of vectors Y = [y1, . . . , yn] in R^{k×n} that minimize min_Φ ‖X − ΦY‖_F, where Φ is restricted to be an isometry. The global minimum of this quantity, OPT_k, is obtain...
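The offline objective in the abstract above has a closed-form solution via the SVD: the optimal isometry Φ consists of the top-k left singular vectors of X, and the residual equals the norm of the discarded singular values. A minimal sketch, assuming small synthetic data (the dimensions and random seed are illustrative, not from the abstract):

```python
import numpy as np

# Assumed toy setup matching the abstract's formulation: X is d x n,
# target dimension k < d (no mean-centering, as in the stated objective).
rng = np.random.default_rng(0)
d, n, k = 6, 40, 2
X = rng.standard_normal((d, n))

# min_Phi ||X - Phi Y||_F over isometries Phi (d x k) is attained by the
# top-k left singular vectors of X; the best Y is then Phi^T X.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :k]        # isometry: Phi^T Phi = I_k
Y = Phi.T @ X         # k x n reduced representation

opt_k = np.linalg.norm(X - Phi @ Y)   # OPT_k, the best rank-k residual
tail = np.sqrt(np.sum(s[k:] ** 2))    # norm of the discarded singular values
```

The equality `opt_k == tail` (up to floating point) is the Eckart–Young theorem, which is why the online variant studied in the paper is measured against this offline optimum.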
Efficient and compact representation of images is a fundamental problem in computer vision. Principal Component Analysis (PCA) has been widely used for image representation and has been successfully applied to many computer vision algorithms. In this paper, we propose a method that uses Haar-like binary box functions to span a subspace which approximates the PCA subspace. The proposed method ca...
Conventional blind signal separation algorithms do not adopt any asymmetric information of the input sources, thus the convergence point of a single output is always unpredictable. However, in most of the applications, we are usually interested in only one or two of the source signals and prior information is almost always available. In this paper, a principal independent component analysis (PI...
where each column is a data sample. Analyzing—and hopefully understanding— the result of an experiment often consists in uncovering regularity or structure in the data matrix. Unfortunately, measured data is often “messy” in the sense that it is too high-dimensional for us to detect structure by direct inspection and in the sense that noise and/or redundancy often impairs data visualization. Pr...
This article presents a unified theory for analysis of components in discrete data, and compares the methods with techniques such as independent component analysis (ICA), non-negative matrix factorisation (NMF) and latent Dirichlet allocation (LDA). The main families of algorithms discussed are mean field, Gibbs sampling, and Rao-Blackwellised Gibbs sampling. Applications are presented for voti...
We study the extraction of nonlinear data models in high-dimensional spaces with modified self-organizing maps. We present a general algorithm which maps low-dimensional lattices into high-dimensional data manifolds without violation of topology. The approach is based on a new principle exploiting the specific dynamical properties of the first-order phase transition induced by the noise of the data ...
Principal component analysis (PCA) is a multivariate technique that analyzes a data table in which observations are described by several inter-correlated quantitative dependent variables. Its goal is to extract the important information from the table, to represent it as a set of new orthogonal variables called principal components, and to display the pattern of similarity of the observations a...
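The extraction step described above — turning inter-correlated variables into new orthogonal ones ordered by the information they carry — can be sketched in a few lines. This is an illustrative example on assumed random data, not code from the cited article:

```python
import numpy as np

# Assumed toy data table: 100 observations described by 4 correlated
# quantitative variables (the mixing matrix induces the correlation).
rng = np.random.default_rng(1)
table = rng.standard_normal((100, 4)) @ rng.standard_normal((4, 4))

# Center each variable, then take the SVD of the centered table.
centered = table - table.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

scores = centered @ Vt.T          # observations on the principal components
explained = s**2 / np.sum(s**2)   # share of variance per component

# The new variables are orthogonal: their covariance matrix is diagonal.
cov = scores.T @ scores / (len(table) - 1)
```

Plotting the first two columns of `scores` gives the similarity map of observations mentioned in the abstract, since those components capture the largest share of `explained` variance.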
Multivariate Gaussian data is completely characterized by its mean and covariance, yet modern non-Gaussian data makes higher-order statistics such as cumulants inevitable. For univariate data, the third and fourth scalar-valued cumulants are relatively well-studied as skewness and kurtosis. For multivariate data, these cumulants are tensor-valued, higher-order analogs of the covariance matrix c...
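For the univariate case the abstract refers to, the third and fourth standardized cumulants are just skewness and excess kurtosis, computable directly from moments. A small sketch on an assumed exponential sample (whose theoretical skewness is 2 and excess kurtosis is 6):

```python
import numpy as np

# Assumed sample: exponential data is skewed and heavy-tailed, so its
# higher-order cumulants are far from the Gaussian values of zero.
rng = np.random.default_rng(2)
x = rng.exponential(size=50_000)

z = (x - x.mean()) / x.std()
skewness = np.mean(z**3)             # third standardized cumulant
excess_kurtosis = np.mean(z**4) - 3  # fourth cumulant over sigma^4
```

In the multivariate setting the abstract discusses, these scalars become a d×d×d skewness tensor and a d×d×d×d kurtosis tensor, which is what makes their estimation and storage costly.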
Principal Component Analysis (PCA) is a canonical and well-studied tool for dimensionality reduction. However, when few data are available, the poor quality of the covariance estimator at its core may compromise its performance. We leverage this issue by casting the PCA into a multitask framework, and doing so, we show how to solve simultaneously several related PCA problems. Hence, we propose ...