Search results for: canonical correlation
Number of results: 434,433
We give a probabilistic interpretation of canonical correlation analysis (CCA) as a latent variable model for two Gaussian random vectors. Our interpretation is similar to the probabilistic interpretation of principal component analysis (Tipping and Bishop, 1999; Roweis, 1998). In addition, we can interpret Fisher linear discriminant analysis (LDA) as CCA between appropriately defined vectors.
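For reference, the latent-variable model usually associated with probabilistic CCA posits one shared Gaussian factor driving both blocks. The sketch below uses my own symbols (W_1, W_2, Psi_1, Psi_2, d), which need not match the paper's notation.

```latex
% Shared-latent-factor model for two Gaussian random vectors x and y.
\begin{aligned}
z &\sim \mathcal{N}(0, I_d), \qquad 1 \le d \le \min(\dim x, \dim y),\\
x \mid z &\sim \mathcal{N}(W_1 z + \mu_1, \Psi_1),\\
y \mid z &\sim \mathcal{N}(W_2 z + \mu_2, \Psi_2).
\end{aligned}
```

Maximum-likelihood estimates of W_1 and W_2 then span the same subspaces as the classical canonical directions, paralleling the probabilistic PCA construction cited above.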
You want to find a linear combination of the x coordinates that correlates well over the data with an (in general, different) linear combination of the y coordinates. In fact, you want to find the best such matched pair of linear combinations on the x and y sides, that is, the one yielding the largest coefficient of correlation. But why stop there? Once you have the best pair, you can ask for t...
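As a concrete illustration of the "best matched pair of linear combinations", here is a minimal NumPy sketch of sample CCA via whitening and an SVD of the cross-covariance; the synthetic data, the small ridge term `reg`, and the variable names are mine.

```python
import numpy as np

def cca(X, Y, n_components=2, reg=1e-8):
    """Classical linear CCA: whiten each block, then an SVD of the
    whitened cross-covariance gives the paired directions and their
    correlations, in decreasing order."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    Sxx = X.T @ X / (n - 1) + reg * np.eye(X.shape[1])
    Syy = Y.T @ Y / (n - 1) + reg * np.eye(Y.shape[1])
    Sxy = X.T @ Y / (n - 1)

    def inv_sqrt(S):
        # inverse square root via symmetric eigendecomposition
        w, V = np.linalg.eigh(S)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    K = inv_sqrt(Sxx) @ Sxy @ inv_sqrt(Syy)
    U, s, Vt = np.linalg.svd(K)
    A = inv_sqrt(Sxx) @ U[:, :n_components]      # weights for the x side
    B = inv_sqrt(Syy) @ Vt.T[:, :n_components]   # weights for the y side
    return A, B, s[:n_components]                # s holds the canonical correlations

# usage: check that the first canonical correlation matches the projections
rng = np.random.default_rng(0)
Z = rng.standard_normal((500, 2))
X = Z @ rng.standard_normal((2, 5)) + 0.5 * rng.standard_normal((500, 5))
Y = Z @ rng.standard_normal((2, 4)) + 0.5 * rng.standard_normal((500, 4))
A, B, rho = cca(X, Y)
print(rho[0], np.corrcoef((X - X.mean(0)) @ A[:, 0], (Y - Y.mean(0)) @ B[:, 0])[0, 1])
```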
Two new methods for dealing with missing values in generalized canonical correlation analysis are introduced. The first approach, which does not require iterations, is a generalization of the Test Equating method available for principal component analysis. In the second approach, missing values are imputed in such a way that the generalized canonical correlation analysis objective function does...
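Neither of the two methods in the abstract is spelled out here, but the general flavour of imputing missing values consistently with a canonical-correlation fit can be sketched as an EM-style loop. The helper below, with scikit-learn's `CCA` and my own function name `impute_with_cca`, is only such a generic sketch, not either proposed method.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def impute_with_cca(X, Y, n_components=2, n_iter=25):
    """Generic sketch: refit CCA and refill NaN entries from the
    low-rank reconstruction until the fill-ins stabilise."""
    X, Y = X.astype(float).copy(), Y.astype(float).copy()
    mask_x, mask_y = np.isnan(X), np.isnan(Y)
    # start from column means of the observed entries
    X[mask_x] = np.take(np.nanmean(X, axis=0), np.nonzero(mask_x)[1])
    Y[mask_y] = np.take(np.nanmean(Y, axis=0), np.nonzero(mask_y)[1])
    for _ in range(n_iter):
        mx, my = X.mean(axis=0), Y.mean(axis=0)
        cca = CCA(n_components=n_components, scale=False).fit(X - mx, Y - my)
        Xs, Ys = cca.transform(X - mx, Y - my)
        X_hat = Xs @ cca.x_loadings_.T + mx      # low-rank reconstructions
        Y_hat = Ys @ cca.y_loadings_.T + my
        X[mask_x], Y[mask_y] = X_hat[mask_x], Y_hat[mask_y]
    return X, Y
```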
Fisher's linear discriminant analysis (LDA) is one of the most ...
Canonical correlation analysis (CCA) is a classical multivariate method concerned with describing linear dependencies between sets of variables. After a short exposition of the linear sample CCA problem and its analytical solution, the article proceeds with a detailed characterization of its geometry. Projection operators are used to illustrate the relations between canonical vectors and variat...
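The "linear sample CCA problem and its analytical solution" mentioned above can be written compactly in the standard textbook form, with sample covariances denoted by hats (my notation):

```latex
% Sample CCA: maximise the correlation of the projections a^T x and b^T y.
\max_{a,\,b}\; a^\top \hat\Sigma_{xy}\, b
\quad \text{subject to} \quad
a^\top \hat\Sigma_{xx}\, a \;=\; b^\top \hat\Sigma_{yy}\, b \;=\; 1.

% Stationarity yields a generalised eigenproblem; \rho is the canonical correlation.
\hat\Sigma_{xx}^{-1}\hat\Sigma_{xy}\hat\Sigma_{yy}^{-1}\hat\Sigma_{yx}\, a = \rho^{2} a,
\qquad
\hat\Sigma_{yy}^{-1}\hat\Sigma_{yx}\hat\Sigma_{xx}^{-1}\hat\Sigma_{xy}\, b = \rho^{2} b.
```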
In the multi-view regression problem, we have a regression problem where the input variable (which is a real vector) can be partitioned into two different views, where it is assumed that either view of the input is sufficient to make accurate predictions — this is essentially (a significantly weaker version of) the co-training assumption for the regression problem. We provide a semi-supervised ...
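A rough sketch of the standard CCA-based recipe for this setting is: learn the shared subspace from the unlabelled paired views, then regress in that subspace using the few labels. The code below illustrates only that recipe; the synthetic data, the scikit-learn estimators, and the 50-label split are assumptions, not the estimator or the analysis of the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import Ridge

# Hypothetical data: two views of the same examples; only a few are labelled.
rng = np.random.default_rng(1)
z = rng.standard_normal((2000, 3))                      # shared signal
X1 = z @ rng.standard_normal((3, 20)) + rng.standard_normal((2000, 20))
X2 = z @ rng.standard_normal((3, 15)) + rng.standard_normal((2000, 15))
y = z[:, 0] + 0.1 * rng.standard_normal(2000)
labelled = np.arange(50)                                # pretend only 50 labels

# Step 1: CCA on all paired views (no labels needed) to find the shared subspace.
cca = CCA(n_components=3).fit(X1, X2)
# Step 2: ordinary ridge regression on the canonical projection of view 1.
T1 = cca.transform(X1)
model = Ridge(alpha=1.0).fit(T1[labelled], y[labelled])
print("R^2 on held-out points:", model.score(T1[1000:], y[1000:]))
```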
Recently, multi-view feature extraction has attracted great interest, and Canonical Correlation Analysis (CCA) is a powerful technique for finding the linear correlation between two view variable sets. However, CCA does not consider the structure and cross-view information in feature extraction, which is very important for subsequent tasks. In this paper, a new approach called Canonical Sparse ...
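The method name in the abstract is truncated, so the sketch below only shows a generic sparse-CCA building block: alternating soft-thresholded power iterations on the cross-covariance with fixed thresholds `t_u`, `t_v` (my parameters), not the approach proposed above.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding operator used to induce sparsity."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_cca_pair(X, Y, t_u=0.1, t_v=0.1, n_iter=100):
    """One sparse canonical pair via alternating soft-thresholded power
    iterations on the cross-covariance (a penalized-matrix-decomposition
    style sketch with fixed thresholds)."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    K = X.T @ Y / (len(X) - 1)
    u = np.ones(X.shape[1]) / np.sqrt(X.shape[1])
    v = np.ones(Y.shape[1]) / np.sqrt(Y.shape[1])
    for _ in range(n_iter):
        u = soft(K @ v, t_u);   u /= np.linalg.norm(u) + 1e-12
        v = soft(K.T @ u, t_v); v /= np.linalg.norm(v) + 1e-12
    return u, v
```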
We review a new method of performing Canonical Correlation Analysis (CCA) with Artificial Neural Networks. We have previously [5, 4] compared its capabilities with standard statistical methods on simple data sets where the maximum correlations are given by linear filters. In this paper, we extend the method by implementing a very precise set of constraints which allow multiple correlations to b...
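A minimal way to see CCA done "with Artificial Neural Networks" is to train two linear units by gradient ascent on the sample correlation of their outputs. The PyTorch sketch below does exactly that for a single canonical pair; it does not implement the constraint scheme for multiple correlations described above, and the data are hypothetical.

```python
import torch

# Two linear "networks" trained by gradient ascent on the sample correlation
# of their outputs; the maximiser recovers the first canonical pair.
torch.manual_seed(0)
z = torch.randn(1000, 1)
X = torch.cat([z, torch.randn(1000, 4)], dim=1) @ torch.randn(5, 5)
Y = torch.cat([z, torch.randn(1000, 3)], dim=1) @ torch.randn(4, 4)

wx = torch.nn.Linear(5, 1, bias=False)
wy = torch.nn.Linear(4, 1, bias=False)
opt = torch.optim.Adam(list(wx.parameters()) + list(wy.parameters()), lr=1e-2)

for step in range(2000):
    a, b = wx(X).squeeze(), wy(Y).squeeze()
    a, b = a - a.mean(), b - b.mean()
    corr = (a * b).mean() / (a.std() * b.std() + 1e-8)
    loss = -corr                      # maximise the correlation
    opt.zero_grad(); loss.backward(); opt.step()

print("learned correlation:", corr.item())
```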
Conventional speaker-independent HMMs ignore speaker differences and pool the speech data in a single observation space. As a result, the output probability distributions of the HMMs become vague, which degrades recognition accuracy. To solve this problem, we construct a speaker subspace for each individual speaker and correlate them by o-space canonical correlation analys...