Strong Rank Revealing Cholesky Factorization
Authors: M. Gu and L. Miranian
Abstract
For any symmetric positive definite n × n matrix A, we introduce a definition of strong rank revealing Cholesky (RRCh) factorization similar to the notion of strong rank revealing QR factorization developed in the joint work of Gu and Eisenstat. There are certain key properties attached to strong RRCh factorization, the importance of which is discussed by Higham in the context of backward stability in his work on Cholesky decomposition of semidefinite matrices. We prove the existence of a pivoting strategy which, if applied in addition to standard Cholesky decomposition, leads to a strong RRCh factorization, and we present two algorithms that use pivoting strategies based on the idea of local maximum volumes to compute a strong RRCh decomposition.
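The abstract refers to a pivoting strategy applied on top of standard Cholesky decomposition. As a point of reference only, the sketch below shows Cholesky with diagonal (greatest remaining diagonal) pivoting in NumPy; the function name, tolerance, and stopping rule are illustrative assumptions, and this is not the paper's local-maximum-volume strategy.

```python
import numpy as np

def pivoted_cholesky(A, tol=1e-12):
    # Illustrative sketch: Cholesky with diagonal pivoting, so that
    # A[np.ix_(piv, piv)] ~= L @ L.T, stopping once the largest remaining
    # Schur-complement diagonal entry drops below tol (numerical rank).
    A = np.asarray(A, dtype=float).copy()
    n = A.shape[0]
    L = np.zeros((n, n))
    piv = np.arange(n)
    for k in range(n):
        # Updated diagonal of the Schur complement for rows k..n-1.
        d = np.diag(A)[k:] - np.sum(L[k:, :k] ** 2, axis=1)
        j = k + int(np.argmax(d))
        if d[j - k] <= tol:
            return L[:, :k], piv, k
        # Symmetric swap of rows/columns k and j, and of the computed part of L.
        A[[k, j], :] = A[[j, k], :]
        A[:, [k, j]] = A[:, [j, k]]
        L[[k, j], :k] = L[[j, k], :k]
        piv[[k, j]] = piv[[j, k]]
        L[k, k] = np.sqrt(d[j - k])
        L[k+1:, k] = (A[k+1:, k] - L[k+1:, :k] @ L[k, :k]) / L[k, k]
    return L, piv, n
```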
Similar resources
Robust Approximate Cholesky Factorization of Rank-Structured Symmetric Positive Definite Matrices
Given a symmetric positive definite matrix A, we compute a structured approximate Cholesky factorization A ≈ R^T R up to any desired accuracy, where R is an upper triangular hierarchically semiseparable (HSS) matrix. The factorization is stable, robust, and efficient. The method compresses off-diagonal blocks with rank-revealing orthogonal decompositions. In the meantime, positive semidefinite te...
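The snippet above mentions compressing off-diagonal blocks with rank-revealing orthogonal decompositions. A minimal sketch of that general idea, using SciPy's column-pivoted QR with an illustrative tolerance (an assumption; the cited paper's actual compression routine is not reproduced here):

```python
import numpy as np
from scipy.linalg import qr

def compress_block(B, tol=1e-8):
    # Rank-revealing compression of a block B via column-pivoted QR:
    # B[:, piv] ~= Q[:, :r] @ R[:r, :], with r chosen from the decay of
    # the diagonal of R. Illustrative only.
    Q, R, piv = qr(B, mode='economic', pivoting=True)
    diag = np.abs(np.diag(R))
    r = int(np.sum(diag > tol * diag[0])) if diag.size else 0
    return Q[:, :r], R[:r, :], piv, r
```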
New Efficient and Robust HSS Cholesky Factorization of SPD Matrices
In this paper, we propose a robust Cholesky factorization method for symmetric positive definite (SPD), hierarchically semiseparable (HSS) matrices. Classical Cholesky factorizations and some semiseparable methods need to sequentially compute Schur complements. In contrast, we develop a strategy involving orthogonal transformations and approximations which avoids the explicit computation of the...
Successive Rank-Revealing Cholesky Factorizations on GPUs
We present an algorithm and its GPU implementation for fast generation of rank-revealing Cholesky factors {Rk} at output in response to a sequence of data matrices {Ak} at input. The Cholesky factors are subsequently used for calculating adaptive weight vectors as control feedback in space-time adaptive processing (STAP) and sensing systems [3]. The size of the input data matrices is m × n, wher...
LU factorization with panel rank revealing pivoting and its communication avoiding version
We present the LU decomposition with panel rank revealing pivoting (LU PRRP), an LU factorization algorithm based on strong rank revealing QR panel factorization. LU PRRP is more stable than Gaussian elimination with partial pivoting (GEPP), with a theoretical upper bound on the growth factor of (1 + τb)^(n/b), where b is the size of the panel used during the block factorization, τ is a parameter...
Row Modifications of a Sparse Cholesky Factorization
Given a sparse, symmetric positive definite matrix C and an associated sparse Cholesky factorization LDL^T, we develop sparse techniques for updating the factorization after a symmetric modification of a row and column of C. We show how the modification in the Cholesky factorization associated with this rank-2 modification of C can be computed efficiently using a sparse rank-1 technique developed...
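That snippet describes computing a rank-2 row/column modification through rank-1 techniques. For orientation, here is a minimal dense sketch of the classic rank-1 Cholesky update; the helper name is hypothetical, and the cited paper's sparse LDL^T machinery is not reproduced here.

```python
import numpy as np

def cholesky_update(L, x):
    # Dense rank-1 update sketch: given lower-triangular L with A = L L^T,
    # return an updated lower-triangular factor of A + x x^T.
    # Illustrative only; the referenced work handles the sparse LDL^T case.
    L = L.copy()
    x = np.asarray(x, dtype=float).copy()
    n = L.shape[0]
    for k in range(n):
        r = np.hypot(L[k, k], x[k])
        c = r / L[k, k]
        s = x[k] / L[k, k]
        L[k, k] = r
        if k + 1 < n:
            L[k+1:, k] = (L[k+1:, k] + s * x[k+1:]) / c
            x[k+1:] = c * x[k+1:] - s * L[k+1:, k]
    return L
```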
Journal:
Volume/Issue:
Pages: -
Publication date: 2005