Search results for: principal components analysis (PCA)

Number of results: 498,150

2005
Stephen McIntyre, Ross McKitrick

[1] The "hockey stick" shaped temperature reconstruction of Mann et al. (1998, 1999) has been widely applied. However, it has not been previously noted in print that, prior to their principal components (PCs) analysis on tree ring networks, they carried out an unusual data transformation which strongly affects the resulting PCs. Their method, when tested on persistent red noise, nearly always ...
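
A minimal sketch of the kind of experiment described above, on synthetic data: it contrasts conventional full-period centering with centering on only the final segment of the record before PCA, applied to AR(1) "red noise" series. The series length, AR coefficient, and window length are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, n_series, phi = 600, 70, 0.7        # assumed sizes and persistence
proxies = np.zeros((n_years, n_series))
eps = rng.standard_normal((n_years, n_series))
for t in range(1, n_years):                  # persistent red noise
    proxies[t] = phi * proxies[t - 1] + eps[t]

def leading_pc(X, rows):
    Xc = X - X[rows].mean(axis=0)            # the centering choice is the only difference
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[0]

pc_full  = leading_pc(proxies, slice(None))       # centered on the full record
pc_short = leading_pc(proxies, slice(-80, None))  # centered on the last 80 "years" only
```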

2013
Mehdi Yousefian, Yeganeh Yagoubi

Tigris, are widespread in the west of Iran and live mostly in rivers and lakes. They are omnivorous fish. This study aims to investigate the population structure of C. reginus using principal component analysis. The study was conducted in three rivers that are the most remote headstreams of the Gamasiab River in the western part of Iran. Seven quantitative traits were measured for each specimen. After logarithmi...
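
A minimal sketch of the analysis pipeline described above, assuming placeholder data: log-transform seven quantitative traits and run a PCA over the specimens. None of the original measurements are reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)
traits = rng.lognormal(mean=1.0, sigma=0.3, size=(120, 7))  # 120 specimens x 7 traits (assumed)
X = np.log(traits)                          # logarithmic transformation
X -= X.mean(axis=0)                         # center each trait
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = eigvals.argsort()[::-1]
scores = X @ eigvecs[:, order]              # specimen scores on the PCs
explained = eigvals[order] / eigvals.sum()  # proportion of variance per PC
```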

2000
Peter J. B. Hancock

A system that uses an underlying genetic algorithm to evolve faces in response to user selection is described. The descriptions of faces used by the system are derived from a statistical analysis of a set of faces. The faces used for generation are transformed to an average shape by defining locations around each face and morphing. The shape-free images and shape vectors are then separately sub...
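
A minimal sketch of the statistical analysis implied above, with random placeholder images: flattened shape-free faces are decomposed into principal components ("eigenfaces"), so each face reduces to a short coefficient vector that a genetic algorithm could mutate and recombine. Image size and count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
faces = rng.random((50, 64 * 64))                 # 50 shape-free images, 64x64 pixels (assumed)
mean_face = faces.mean(axis=0)
_, _, Vt = np.linalg.svd(faces - mean_face, full_matrices=False)
eigenfaces = Vt[:20]                              # keep the first 20 components
coeffs = (faces - mean_face) @ eigenfaces.T       # compact description of each face
reconstructed = mean_face + coeffs @ eigenfaces   # approximate faces from the coefficients
```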

Journal: Pattern Recognition, 1998
Roger D. Boyle

Principal Components Analysis (PCA) is of great use in the representation of multi-dimensional data sets, often providing a useful compression mechanism. Sometimes, input data sets are drawn from disparate domains, such that components of the input are heterogeneous, making them difficult to compare in scale. When this occurs, it is possible for one component to dominate another in the PCA at the exp...
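
A minimal sketch of the scaling problem noted above, on synthetic data: when columns live on very different scales, the largest-variance column dominates the leading PC unless the data are standardised first.

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.column_stack([rng.normal(0, 1000, 200),    # e.g. a length in millimetres
                     rng.normal(0, 0.01, 200)])   # e.g. a length in kilometres

def first_axis(data):
    centred = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[0]

print(first_axis(X))                   # essentially [1, 0]: the first column dominates
print(first_axis(X / X.std(axis=0)))   # standardised: both columns contribute
```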

2015
Namrata Vaswani, Chenlu Qiu, Brian Lois, Han Guo, Jinchun Zhan

This work studies the problem of sequentially recovering a sparse vector S_t and a vector from a low-dimensional subspace, L_t, from knowledge of their sum M_t := L_t + S_t. If the primary goal is to recover the low-dimensional subspace in which the L_t's lie, then the problem is one of online or recursive robust principal components analysis (PCA). An example of where such a problem might arise is in ...
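
A minimal sketch of the projection idea behind recursive robust PCA, not the authors' algorithm: given an estimate P of the low-dimensional subspace, projecting M_t onto the orthogonal complement of P removes most of L_t, and the sparse part S_t is estimated from the residual by thresholding. All dimensions and the threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n, r = 100, 5
P, _ = np.linalg.qr(rng.standard_normal((n, r)))     # assumed subspace estimate

def recover_step(M_t, P, thresh=0.5):
    resid = M_t - P @ (P.T @ M_t)                    # project out the subspace component
    S_hat = np.where(np.abs(resid) > thresh, resid, 0.0)  # sparse estimate by thresholding
    L_hat = M_t - S_hat                              # remaining low-dimensional part
    return L_hat, S_hat

L_t = P @ rng.standard_normal(r)
S_t = np.zeros(n); S_t[rng.choice(n, 5, replace=False)] = 3.0
L_hat, S_hat = recover_step(L_t + S_t, P)
```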

Journal: International Journal of Neural Systems, 2009
Ezequiel López-Rubio, Juan Miguel Ortiz-de-Lazcano-Lobato

We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be...
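
A minimal sketch of the "local PCA at each unit" idea, with the probabilistic machinery and the automatic choice of basis size omitted: samples are assigned to their nearest unit (the competitive step), and a separate PCA basis is fitted inside each cluster. Data and sizes are assumed.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.standard_normal((300, 10))
centers = X[rng.choice(len(X), 4, replace=False)]        # 4 competitive units

winners = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=-1), axis=1)
local_bases = {}
for k in range(len(centers)):
    cluster = X[winners == k]
    Xc = cluster - cluster.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    local_bases[k] = vt[:2]                              # 2 principal directions per unit (assumed)
```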

2008
C. Verde

This work proposes a method for fault isolation by means of a structured model characterization with isolation capability, together with residual generation through dynamic principal component analysis. Specifically, the characterization is obtained using graph-theory tools and is expressed in terms of known variables and subsets of constraints. Thus, in the absence of explicit analytical models, ...
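
A minimal sketch of a generic dynamic-PCA residual generator, in the spirit of the approach above (the graph-theoretic structural analysis is not reproduced): stack time-lagged copies of the measured variables, fit PCA on fault-free data, and monitor the squared prediction error (SPE) of new samples as a residual. Lag count and retained components are assumptions.

```python
import numpy as np

def lag_matrix(X, lags):
    # row t of the result holds [x_t, x_{t+1}, ..., x_{t+lags}] side by side
    return np.hstack([X[i:len(X) - lags + i] for i in range(lags + 1)])

rng = np.random.default_rng(6)
train = rng.standard_normal((500, 3))             # fault-free data, 3 measured variables
Xd = lag_matrix(train, lags=2)
mu, sd = Xd.mean(axis=0), Xd.std(axis=0)
_, _, vt = np.linalg.svd((Xd - mu) / sd, full_matrices=False)
P = vt[:4].T                                      # 4 retained loading vectors

def spe_residual(window):
    x = (lag_matrix(window, lags=2) - mu) / sd
    r = x - x @ P @ P.T                           # part not explained by the PCA model
    return (r ** 2).sum(axis=1)

print(spe_residual(rng.standard_normal((50, 3))).mean())
```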

Journal: Neurocomputing, 2005
César Ignacio García-Osorio, Colin Fyfe

Support Vector Machines are supervised regression and classification machines which have the nice property of automatically identifying which of the data points are most important in creating the machine. Kernel Principal Component Analysis (KPCA) is a related technique in that it also relies on linear operations in a feature space but does not have this ability to identify important points. Sp...
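
A minimal sketch of standard kernel PCA, assuming an RBF kernel and synthetic data: build the kernel matrix, double-centre it in feature space, and use its leading eigenvectors as nonlinear components.

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((150, 2))
gamma = 1.0                                            # assumed RBF bandwidth
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-gamma * sq_dists)                          # RBF kernel matrix
n = len(K)
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                                         # centring in feature space
eigvals, eigvecs = np.linalg.eigh(Kc)
order = eigvals.argsort()[::-1][:2]                    # two leading components
alphas = eigvecs[:, order] / np.sqrt(np.maximum(eigvals[order], 1e-12))
projections = Kc @ alphas                              # kernel PC scores of the training points
```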

1999
Tal Steinherz Nathan Intrator Ehud Rivlin

Skew detection via principal components is proposed as an effective method for images which contain parts other than text. It is shown that the negative of the image leads to much more robust results and that the computation time involved is still practical. This method is also shown to be effective for single-word skew detection.
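
A minimal sketch of the basic idea, on a synthetic "page": treat the foreground (text) pixels of the negative image as a 2-D point cloud and read the skew angle off the orientation of its first principal axis. The fake skewed text block below is an assumption made only to keep the snippet runnable.

```python
import numpy as np

rng = np.random.default_rng(8)
true_skew = np.deg2rad(5)                              # 5-degree skew, assumed
xs = rng.uniform(0, 200, 2000)
ys = rng.choice(np.arange(0, 100, 10), 2000) + rng.normal(0, 0.5, 2000)
pts = np.column_stack([xs * np.cos(true_skew) - ys * np.sin(true_skew),
                       xs * np.sin(true_skew) + ys * np.cos(true_skew)])

pts -= pts.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(pts, rowvar=False))
axis = eigvecs[:, eigvals.argmax()]                         # first principal axis
estimated_skew = np.degrees(np.arctan2(axis[1], axis[0]))   # close to 5 (mod 180)
```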

2013
Michael B. Richman, Andrew E. Mercer, Lance M. Leslie, Charles A. Doswell, Chad M. Shafer

Until recently, computational power was insufficient to diagonalize atmospheric datasets of order 10^10 elements. Eigenanalysis of tens of thousands of variables can now achieve massive data compression for spatial fields with strong correlation properties. Application of eigenanalysis to 26,394 variable dimensions, for three severe weather datasets (tornado, hail and wind), retains 9-11 princip...
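
A minimal sketch of PCA-based compression of a spatial dataset, with random data and sizes far smaller than the 26,394-dimension fields mentioned above: keep roughly ten leading components and reconstruct from them.

```python
import numpy as np

rng = np.random.default_rng(9)
fields = rng.standard_normal((400, 2000))      # 400 cases x 2000 grid points (assumed)
mean = fields.mean(axis=0)
U, S, Vt = np.linalg.svd(fields - mean, full_matrices=False)
k = 10                                         # retained principal components
scores = U[:, :k] * S[:k]                      # 400 x 10: the compressed representation
approx = mean + scores @ Vt[:k]                # reconstruction from k components
variance_kept = (S[:k] ** 2).sum() / (S ** 2).sum()
```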
