Search results for: principal component analyzing technique
Number of results: 1,349,475
Mixtures of Principal Component Analyzers can be used to model high dimensional data that lie on or near a low dimensional manifold. By linearly mapping the PCA subspaces to one global low dimensional space, we obtain a ‘global’ low dimensional coordinate system for the data. As shown by Roweis et al., ensuring consistent global low-dimensional coordinates for the data can be expressed as a pen...
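The snippet ends before the penalty term is defined, but the local-subspace idea can still be illustrated. The sketch below is a crude hard-assignment stand-in for a mixture of principal component analyzers, not Roweis et al.'s penalized-likelihood formulation: cluster the data, then fit one local PCA per cluster. The synthetic manifold and all names are illustrative assumptions.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # Assumed toy data: points near a curved 1-D manifold embedded in 3-D.
    t = rng.uniform(0, 3 * np.pi, 500)
    X = np.c_[np.cos(t), np.sin(t), 0.3 * t] + 0.05 * rng.normal(size=(500, 3))

    # Hard assignment of points to local regions -- a rough stand-in for
    # the responsibilities of a probabilistic mixture of PCAs.
    labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)

    # One local PCA per region; each gives a low-dimensional local chart.
    local_models = {k: PCA(n_components=1).fit(X[labels == k]) for k in range(6)}
    for k, pca in local_models.items():
        print(f"cluster {k}: local direction {pca.components_[0].round(2)}")

Stitching these local charts into a single consistent global coordinate system is exactly the step the abstract attributes to the penalized formulation.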
Two quite different forms of nonlinear principal component analysis have been proposed in the literature. The first one is associated with the names of Guttman, Burt, Hayashi, Benzécri, McDonald, De Leeuw, Hill, Nishisato. We call it multiple correspondence analysis. The second form has been discussed by Kruskal, Shepard, Roskam, Takane, Young, De Leeuw, Winsberg, Ramsay. We call it no...
Principal Component Analysis (PCA) is a feature extraction approach based directly on a whole vector pattern; it obtains a set of projections that achieve the best reconstruction of the original data in the mean-squared-error sense. In this paper, the progressive PCA (PrPCA) is proposed, which progressively extracts features from a given set of data with large dimensionality, and the e...
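PrPCA itself is not specified in the snippet. As a related, standard technique for extracting principal components from large-dimensional data without decomposing the whole matrix at once, scikit-learn's IncrementalPCA gives a concrete point of reference; this is a sketch under assumed data, not the authors' method.

    import numpy as np
    from sklearn.decomposition import IncrementalPCA

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 500))  # assumed data: many samples, large dimensionality

    # Feed the data in batches so the full matrix never has to be
    # decomposed at once; the components are refined batch by batch.
    ipca = IncrementalPCA(n_components=10, batch_size=200)
    for batch in np.array_split(X, 10):
        ipca.partial_fit(batch)

    Z = ipca.transform(X)  # projected features
    print(Z.shape, ipca.explained_variance_ratio_.round(3))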
Principal component analysis (PCA) is a dimensionality reduction modeling technique that transforms a set of process variables by rotating their axes of representation. Maximum Likelihood PCA (MLPCA) is an extension that accounts for different noise contributions in each variable. Neither PCA nor its extensions utilize external information about the model or data such as the range or distributi...
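To make the "rotating their axes of representation" view concrete, here is plain PCA from scratch: center the variables, eigendecompose the sample covariance, and rotate onto the leading eigenvectors. This is a minimal sketch of standard PCA only; MLPCA's per-variable noise weighting is not shown.

    import numpy as np

    def pca(X, k):
        """Project X (n samples x p variables) onto its top-k principal axes."""
        Xc = X - X.mean(axis=0)             # center each variable
        C = np.cov(Xc, rowvar=False)        # p x p sample covariance
        order = np.argsort(np.linalg.eigh(C)[0])[::-1][:k]
        W = np.linalg.eigh(C)[1][:, order]  # rotation (loadings), top-k directions
        return Xc @ W, W

    rng = np.random.default_rng(2)
    X = rng.normal(size=(300, 5)) @ rng.normal(size=(5, 5))  # assumed test data
    scores, W = pca(X, 2)
    print(scores.shape, np.allclose(W.T @ W, np.eye(2)))  # W is an orthonormal rotation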
In many experiments, the data points collected live in high-dimensional observation spaces, yet can be assigned a set of labels or parameters. In electrophysiological recordings, for instance, the responses of populations of neurons generally depend on mixtures of experimentally controlled parameters. The heterogeneity and diversity of these parameter dependencies can make visualization and int...
We consider a problem involving estimation of a high-dimensional covariance matrix that is the sum of a diagonal matrix and a low-rank matrix, and making a decision based on the resulting estimate. Such problems arise, for example, in portfolio management, where a common approach employs principal component analysis (PCA) to estimate factors used in constructing the low-rank term of the covaria...
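One common construction matching this description, sketched here under assumed synthetic data: take the top-k eigenpairs of the sample covariance as the low-rank factor term and place the residual variances on the diagonal. This is illustrative only, not the paper's estimator or decision rule.

    import numpy as np

    def diag_plus_lowrank(X, k):
        """Estimate Sigma ~ D + B B^T from samples X (n x p) via PCA."""
        S = np.cov(X, rowvar=False)            # sample covariance
        vals, vecs = np.linalg.eigh(S)
        top = np.argsort(vals)[::-1][:k]
        B = vecs[:, top] * np.sqrt(vals[top])  # low-rank factor loadings
        D = np.diag(np.diag(S - B @ B.T))      # residual variances on the diagonal
        return D, B

    rng = np.random.default_rng(3)
    F = rng.normal(size=(1000, 3)) @ rng.normal(size=(3, 20))  # 3 common factors
    X = F + rng.normal(size=(1000, 20))                        # idiosyncratic noise
    D, B = diag_plus_lowrank(X, 3)
    print(np.linalg.norm(np.cov(X, rowvar=False) - (D + B @ B.T)))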
Given a set of signals, a classical construction of an optimal truncatable basis for representing them is the principal component analysis (PCA, for short) approach. When the information about the signals one would like to represent is a more general property, like smoothness, a different basis should be considered. One example is the Fourier basis, which is optimal for represen...
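The optimality meant here is that the top-k left singular vectors of the signal matrix minimize Frobenius reconstruction error over all rank-k orthonormal bases. A small numerical check follows, with a low-frequency Fourier basis included for contrast; the smooth random signals are an assumed setup.

    import numpy as np

    rng = np.random.default_rng(4)
    n, m, k = 128, 400, 10
    # Assumed signal class: smooth signals built from low-pass random spectra.
    freqs = np.fft.rfftfreq(n)
    spectrum = (rng.normal(size=(m, freqs.size))
                + 1j * rng.normal(size=(m, freqs.size))) / (1 + 50 * freqs)
    S = np.fft.irfft(spectrum, n=n).T          # n x m matrix of signals

    def trunc_err(S, basis_k):
        P = basis_k @ basis_k.conj().T         # projector onto the truncated basis
        return np.linalg.norm(S - P @ S)

    U, _, _ = np.linalg.svd(S, full_matrices=False)
    pca_basis = U[:, :k]                        # top-k principal components
    dft = np.fft.fft(np.eye(n)) / np.sqrt(n)    # orthonormal DFT matrix
    idx = np.argsort(np.minimum(np.arange(n), n - np.arange(n)))[:k]
    fourier_basis = dft[:, idx]                 # k lowest-frequency columns

    print("PCA reconstruction error:    ", trunc_err(S, pca_basis))
    print("Fourier reconstruction error:", trunc_err(S, fourier_basis))

On the training signals the PCA error is never larger; the abstract's point is that a fixed basis like Fourier can be preferable when only a property such as smoothness, rather than the signals themselves, is known.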
We consider the online version of the well-known Principal Component Analysis (PCA) problem. In standard PCA, the input to the problem is a set of vectors X = [x_1, ..., x_n] in R^(d×n) and a target dimension k < d; the output is a set of vectors Y = [y_1, ..., y_n] in R^(k×n) that minimize min_Φ ‖X − ΦY‖_F, where Φ is restricted to be an isometry. The global minimum of this quantity, OPT_k, is obtain...
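In the offline setting, OPT_k is attained by taking Φ to be the top-k left singular vectors of X and Y = Φ^T X; a quick numerical check of that fact (the online algorithm itself is not reproduced here):

    import numpy as np

    rng = np.random.default_rng(5)
    d, n, k = 50, 200, 5
    X = rng.normal(size=(d, n))  # assumed input matrix

    # Offline optimum: Phi = top-k left singular vectors, Y = Phi^T X.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Phi = U[:, :k]
    Y = Phi.T @ X
    opt_k = np.linalg.norm(X - Phi @ Y)

    # OPT_k equals the energy in the discarded singular values.
    print(opt_k, np.sqrt((s[k:] ** 2).sum()))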