First-order approximation of Gram-Schmidt orthonormalization beats deflation in coupled PCA learning rules

Author

  • Ralf Möller
Abstract

In coupled learning rules for principal component analysis, eigenvectors and eigenvalues are simultaneously estimated in a coupled system of equations. Coupled single-neuron rules have favorable convergence properties. For the estimation of multiple eigenvectors, orthonormalization methods have to be applied: either full Gram-Schmidt orthonormalization, its first-order approximation as used in Oja’s Stochastic Gradient Ascent algorithm, or deflation as in Sanger’s Generalized Hebbian Algorithm. This paper reports the observation that a first-order approximation of Gram-Schmidt orthonormalization is superior to the standard deflation procedure in coupled learning rules. The first-order approximation exhibits a smaller orthonormality error and produces eigenvectors and eigenvalues of better quality. This improvement is essential for applications where multiple principal eigenvectors have to be estimated simultaneously rather than sequentially. Moreover, loss of orthonormality may have a harmful effect on subsequent processing stages, such as the computation of distance measures for competition in local PCA methods.
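The two orthonormalization strategies contrasted in the abstract can be illustrated in the classical (non-coupled) single-layer setting: Sanger's GHA deflates the input by all eigenvector estimates up to and including the current one, while Oja's SGA uses a first-order Gram-Schmidt term that doubles the contribution of the preceding vectors. The sketch below is a minimal illustration under assumed toy data and learning-rate settings, not the coupled rules studied in the paper itself:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy zero-mean data with a known diagonal covariance (assumption).
C = np.diag([5.0, 3.0, 1.0, 0.5])
X = rng.multivariate_normal(np.zeros(4), C, size=5000)

def train(rule, eta=0.01, m=2):
    """Estimate the first m principal eigenvectors with GHA or SGA."""
    W = rng.standard_normal((4, m)) * 0.1
    for x in X:
        y = W.T @ x
        for j in range(m):
            if rule == "gha":
                # Deflation (Sanger): subtract reconstruction from units 0..j.
                resid = x - W[:, : j + 1] @ y[: j + 1]
            else:
                # First-order Gram-Schmidt (Oja's SGA): own term once,
                # preceding units weighted twice.
                resid = x - y[j] * W[:, j] - 2 * (W[:, :j] @ y[:j])
            W[:, j] += eta * y[j] * resid
    return W

for rule in ("gha", "sga"):
    W = train(rule)
    err = np.linalg.norm(W.T @ W - np.eye(W.shape[1]))
    print(rule, "orthonormality error:", round(err, 4))
```

With these toy settings both rules converge to an approximately orthonormal basis of the leading eigenvectors; the paper's point is that in the coupled setting the first-order Gram-Schmidt variant keeps the orthonormality error smaller than deflation does.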


Related articles

Coupled Singular Value Decomposition of a Cross-Covariance Matrix

We derive coupled on-line learning rules for the singular value decomposition (SVD) of a cross-covariance matrix. In coupled SVD rules, the singular value is estimated alongside the singular vectors, and the effective learning rates for the singular vector rules are influenced by the singular value estimates. In addition, we use a first-order approximation of Gram-Schmidt orthonormalization as ...


Interlocking of learning and orthonormalization in RRLSA

In sequential principal component analyzers based on deflation of the input vector, deviations from orthogonality of the previous eigenvector estimates may entail a severe loss of orthogonality in the next stages. A combination of the learning method with subsequent Gram-Schmidt orthonormalization solves this problem, but increases the computational effort. For the “robust recursive least squar...


Iterative Gram-Schmidt orthonormalization for efficient parameter estimation

We present an efficient method for estimating nonlinearly entered parameters of a linear signal model corrupted by additive noise. The method uses the Gram-Schmidt orthonormalization procedure in combination with a number of iterations to de-bias and re-balance the coupling between non-orthogonal signal components efficiently. Projection interpretation is provided as rationale of the proposed itera...


Enhanced Gram-Schmidt Process for Improving the Stability in Signal and Image Processing

The Gram-Schmidt Process (GSP) is used to convert a non-orthogonal basis (a set of linearly independent vectors) into an orthonormal basis (a set of orthogonal, unit-length vectors). The process consists of taking each vector and then subtracting the elements in common with the previous vectors. This paper introduces an Enhanced version of the Gram-Schmidt Process (EGSP) with inverse, which is ...
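The classical process described above, taking each vector and subtracting its components along the previously processed vectors, can be sketched as follows (a minimal illustration, not the Enhanced variant the snippet introduces):

```python
import numpy as np

def gram_schmidt(V):
    """Classical Gram-Schmidt: orthonormalize the columns of V.

    Each column has its projections onto the already-orthonormalized
    columns subtracted, then is scaled to unit length.
    """
    Q = np.zeros_like(V, dtype=float)
    for j in range(V.shape[1]):
        v = V[:, j].astype(float)
        for k in range(j):
            # Remove the component of v along the k-th orthonormal vector.
            v -= (Q[:, k] @ V[:, j]) * Q[:, k]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

V = np.array([[1.0, 1.0],
              [0.0, 1.0],
              [1.0, 0.0]])
Q = gram_schmidt(V)
print(np.round(Q.T @ Q, 6))  # approximately the identity matrix
```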


Two-step demodulation based on the Gram-Schmidt orthonormalization method.

This Letter presents an efficient, fast, and straightforward two-step demodulating method based on a Gram-Schmidt (GS) orthonormalization approach. The phase-shift value does not have to be known and can take any value inside the range (0, 2π), excluding the singular case where it corresponds to π. The proposed method is based on determining an orthonormalized interferogram basis from the two supplie...




Journal:
  • Neurocomputing

Volume 69, Issue –

Pages –

Publication date: 2006