On Computation of Approximate Joint Block-Diagonalization Using Ordinary AJD

Authors

  • Petr Tichavský
  • Arie Yeredor
  • Zbynek Koldovský
Abstract

Approximate joint block diagonalization (AJBD) of a set of matrices has applications in blind source separation, e.g., when the signal mixtures contain mutually independent subspaces of dimension higher than one. The main message of this paper is that certain ordinary approximate joint diagonalization (AJD) methods (which were originally derived for "degenerate" subspaces of dimension 1) can also be used successfully for AJBD, but not all of them are equally well suited. In particular, we prove that when the set is exactly jointly block-diagonalizable, perfect block-diagonalization is attainable by the recently proposed AJD algorithm "U-WEDGE" (uniformly weighted exhaustive diagonalization with Gauss iterations), but this basic consistency property is not shared by some other popular AJD algorithms. In addition, we show using simulations that, in the more general noisy case, the subspace identification accuracy of U-WEDGE compares favorably to that of its competitors.
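
The abstract's central claim, that an ordinary AJD routine such as U-WEDGE attains perfect block-diagonalization whenever the input set is exactly jointly block-diagonalizable, can be made concrete with a small numerical sketch. The Python/NumPy code below is only an illustrative, simplified re-implementation of a U-WEDGE-style Gauss iteration written for this summary; the function name uwedge_like, the whitening-based initialization, the row normalization, and the two 2-dimensional blocks of the demo are assumptions of the sketch and are not taken from the paper or from the authors' reference implementation.

```python
import numpy as np


def uwedge_like(M, n_iter=200, tol=1e-12):
    """Illustrative sketch of a U-WEDGE-style AJD (not the authors' reference code).

    M : (K, N, N) array of symmetric target matrices.
    Returns W such that W @ M[k] @ W.T is as diagonal as the data permit.
    """
    K, N, _ = M.shape
    # Initialization: symmetric whitening-like transform built from M[0] (a simplification).
    d, E = np.linalg.eigh(M[0])
    W = E @ np.diag(1.0 / np.sqrt(np.abs(d))) @ E.T

    for _ in range(n_iter):
        R = W @ M @ W.T                       # congruence-transformed set, shape (K, N, N)
        D = np.einsum('kii->ki', R)           # current diagonals d_k = diag(R_k)
        A = np.eye(N)
        # Gauss iteration: for every pair (i, j) solve the 2x2 normal equations of
        #   sum_k (R_k[i, j] - a_ij * d_k[j] - a_ji * d_k[i])^2  ->  min
        for i in range(N):
            for j in range(i + 1, N):
                di, dj, rij = D[:, i], D[:, j], R[:, i, j]
                H = np.array([[dj @ dj, di @ dj],
                              [di @ dj, di @ di]])
                rhs = np.array([dj @ rij, di @ rij])
                A[i, j], A[j, i] = np.linalg.solve(H, rhs)
        W = np.linalg.solve(A, W)             # W <- A^{-1} W
        # Fix the scaling by normalizing rows so that |diag(W M_0 W^T)| = 1.
        scale = np.sqrt(np.abs(np.einsum('ij,jk,ik->i', W, M[0], W)))
        W = W / scale[:, None]
        if np.max(np.abs(A - np.eye(N))) < tol:
            break
    return W


# Demo: an exactly jointly block-diagonalizable set with two 2-D blocks (demo assumption).
rng = np.random.default_rng(0)
N, K, blocks = 4, 10, [(0, 2), (2, 4)]
A_true = rng.standard_normal((N, N))
M = np.zeros((K, N, N))
for k in range(K):
    B = np.zeros((N, N))
    for lo, hi in blocks:
        S = rng.standard_normal((hi - lo, hi - lo))
        B[lo:hi, lo:hi] = S + S.T             # symmetric block-diagonal "source" matrix
    M[k] = A_true @ B @ A_true.T

W = uwedge_like(M)
G = np.abs(W @ A_true)                        # global (gain) matrix
for i in range(N):
    print(f"row {i}: energy per block =",
          [round(float(np.linalg.norm(G[i, lo:hi])), 3) for lo, hi in blocks])
```

If the consistency property holds as stated, each row of the gain matrix W @ A_true should carry essentially all of its energy within a single block, so the independent subspaces are identified up to row permutation and arbitrary mixing inside each block.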

Related articles

Non-orthogonal tensor diagonalization

Tensor diagonalization means transforming a given tensor into an exactly or nearly diagonal form by multiplying it by non-orthogonal invertible matrices along selected dimensions of the tensor. It is closely related to approximate joint diagonalization (AJD) of a set of matrices. In this paper, we derive (1) a new algorithm for a symmetric AJD, which is called two-sided symmetric diagonal...
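
The operation described here, multiplying a tensor by invertible matrices along selected modes, also makes the stated link to AJD visible. The short NumPy sketch below (the tensor T, the transform W, and the sizes are illustrative assumptions; it does not reproduce the two-sided symmetric algorithm of the cited paper) shows that transforming a third-order tensor along its first two modes by the same matrix is precisely a congruence transform of each slice, i.e. the operation appearing in AJD.

```python
import numpy as np

rng = np.random.default_rng(1)
n, K = 4, 5
T = rng.standard_normal((n, n, K))    # third-order tensor; its slices T[:, :, k] are matrices
W = rng.standard_normal((n, n))       # an invertible transform (estimated by a real algorithm)

# Multiplying T by W along its first two modes ...
T_new = np.einsum('ia,jb,abk->ijk', W, W, T)

# ... is the congruence transform W @ T[:, :, k] @ W.T of every slice, which is exactly
# the operation that appears in approximate joint diagonalization (AJD) of {T[:, :, k]}.
assert np.allclose(T_new[:, :, 0], W @ T[:, :, 0] @ W.T)
```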


Approximate Joint Diagonalization Using a Natural Gradient Approach

We present a new algorithm for non-unitary approximate joint diagonalization (AJD), based on a “natural gradient”-type multiplicative update of the diagonalizing matrix, complemented by step-size optimization at each iteration. The advantages of the new algorithm over existing non-unitary AJD algorithms lie in its ability to accommodate non-positive-definite matrices (compared to Pham’s algorit...


"Weighting for more": Enhancing characteristic-function based ICA with asymptotically optimal weighting

The CHaracteristic-function-Enabled Source Separation (CHESS) method for independent component analysis (ICA) is based on approximate joint diagonalization (AJD) of Hessians of the observations’ empirical log-characteristic function, taken at selected off-origin “processing points”. As previously observed in other contexts, the AJD performance can be significantly improved by optimal weighting, ...


Least-Squares Joint Diagonalization of a matrix set by a congruence transformation

Approximate joint diagonalization (AJD) is an important analytic tool underlying numerous independent component analysis (ICA) and other blind source separation (BSS) methods, and it is therefore finding more and more applications in medical imaging analysis. In this work we present a new AJD algorithm named SDIAG (Spheric Diagonalization). It imposes no constraint either on the input matrices or on t...


A new General Weighted Least-Squares Algorithm for Approximate Joint Diagonalization

Independent component analysis (ICA) and other blind source separation (BSS) methods are important tools for multi-channel processing of electroencephalographic data and have found numerous applications in brain-computer interfaces. A number of solutions to the BSS problem are achieved by approximate joint diagonalization (AJD) algorithms, so the goodness of the solution depends o...




Publication year: 2012