Search results for: hessenberg matrix
Number of results: 364962
Hessenberg decomposition is the basic tool used in computational linear algebra to approximate the eigenvalues of a matrix. In this article, we generalize Hessenberg decomposition to continuous matrix fields over topological spaces. This works in great generality: the space is only required to be normal and to have finite covering dimension. As applications, we derive some new structure results...
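To illustrate the pointwise construction that the article generalizes, here is a minimal NumPy/SciPy sketch (not from the article) of the classical Hessenberg decomposition of a single real matrix, checking that the reduced form is orthogonally similar to the input and therefore has the same eigenvalues.

```python
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))

# A = Q @ H @ Q.T with H upper Hessenberg (zero below the first subdiagonal).
H, Q = hessenberg(A, calc_q=True)

assert np.allclose(np.tril(H, -2), 0.0)   # Hessenberg structure
assert np.allclose(Q @ H @ Q.T, A)        # orthogonal similarity

# Similarity preserves the spectrum, which is why this reduction is the
# standard first step of dense eigenvalue algorithms.
lam_A = np.linalg.eigvals(A)
lam_H = np.linalg.eigvals(H)
assert all(np.min(np.abs(lam_H - l)) < 1e-10 for l in lam_A)
```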
This paper describes a parallel Hessenberg reduction in the context of multicore architectures using tile algorithms. The Hessenberg reduction is very often used as a pre-processing step in solving dense linear algebra problems, such as the standard eigenvalue problem. Although expensive, orthogonal transformations are accepted techniques and commonly used for this reduction because they guaran...
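As a reference point for the orthogonal transformations mentioned above, the following is a small sequential sketch of Householder-based Hessenberg reduction in NumPy; it is illustrative only, not the paper's tile algorithm, and the function name householder_hessenberg is ours.

```python
import numpy as np

def householder_hessenberg(A):
    """Return H (upper Hessenberg) and orthogonal Q with Q.T @ A @ Q = H."""
    H = A.astype(float)
    n = H.shape[0]
    Q = np.eye(n)
    for k in range(n - 2):
        x = H[k + 1:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])
        norm_v = np.linalg.norm(v)
        if norm_v == 0.0:
            continue                      # column already reduced
        v /= norm_v
        # Apply the reflector P = I - 2 v v.T as a similarity transform.
        H[k + 1:, k:] -= 2.0 * np.outer(v, v @ H[k + 1:, k:])
        H[:, k + 1:] -= 2.0 * np.outer(H[:, k + 1:] @ v, v)
        Q[:, k + 1:] -= 2.0 * np.outer(Q[:, k + 1:] @ v, v)
    return H, Q

A = np.random.default_rng(1).standard_normal((5, 5))
H, Q = householder_hessenberg(A)
assert np.allclose(np.tril(H, -2), 0.0)
assert np.allclose(Q.T @ A @ Q, H)
```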
We consider the numerical construction of a unitary Hessenberg matrix from spectral data using an inverse QR algorithm. Any unitary upper Hessenberg matrix H with nonnegative subdiagonal elements can be represented by 2n - 1 real parameters. This representation, which we refer to as the Schur parameterization of H, facilitates the development of efficient algorithms for this class of matrices. We...
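A hedged sketch, assuming Gragg's standard Schur parameterization (the paper's exact conventions may differ): a unitary upper Hessenberg matrix with nonnegative subdiagonal is built from parameters gamma_1, ..., gamma_n with |gamma_k| < 1 for k < n and |gamma_n| = 1, which accounts for the 2n - 1 real degrees of freedom.

```python
import numpy as np

def unitary_hessenberg(gammas):
    """H = G_1(gamma_1) ... G_{n-1}(gamma_{n-1}) * diag(1, ..., 1, -gamma_n)."""
    n = len(gammas)
    H = np.eye(n, dtype=complex)
    for k, g in enumerate(gammas[:-1]):
        s = np.sqrt(1.0 - abs(g) ** 2)               # complementary parameter, >= 0
        G = np.eye(n, dtype=complex)
        G[k:k + 2, k:k + 2] = [[-g, s], [s, np.conj(g)]]
        H = H @ G                                     # accumulate G_1 G_2 ... G_{n-1}
    H[:, -1] *= -gammas[-1]                           # trailing unimodular factor
    return H

rng = np.random.default_rng(3)
g = 0.6 * (rng.standard_normal(4) + 1j * rng.standard_normal(4))
g /= 1.0 + np.abs(g)                                  # force |gamma_k| < 1
gammas = np.append(g, np.exp(1j * rng.uniform(0, 2 * np.pi)))

H = unitary_hessenberg(gammas)
assert np.allclose(H @ H.conj().T, np.eye(len(gammas)))   # unitary
assert np.allclose(np.tril(H, -2), 0.0)                   # upper Hessenberg
assert np.all(np.diag(H, -1).real >= -1e-12)              # nonnegative subdiagonal
assert np.allclose(np.diag(H, -1).imag, 0.0)
```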
Two inverse eigenvalue problems are discussed. First, given the eigenvalues and a weight vector, an extended Hessenberg matrix is computed. This matrix represents the recurrences linked to a (rational) Arnoldi inverse problem. It is well-known that the matrix capturing the recurrence coefficients is of Hessenberg form in the standard Arnoldi case. Considering, however, rational functions and adm...
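For contrast with the inverse problem, here is a short sketch of the standard (polynomial) Arnoldi process, whose recurrence coefficients fill an upper Hessenberg matrix; this is textbook material, not the paper's algorithm.

```python
import numpy as np

def arnoldi(A, v, k):
    """k steps of Arnoldi: A @ V[:, :k] = V @ H with H of size (k+1) x k."""
    n = A.shape[0]
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(k):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)        # breakdown not handled in this sketch
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 8))
V, H = arnoldi(A, rng.standard_normal(8), 5)
assert np.allclose(A @ V[:, :5], V @ H)        # Arnoldi recurrence
assert np.allclose(np.tril(H, -2), 0.0)        # coefficients form a Hessenberg matrix
```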
In this paper we describe how to compute the eigenvalues of a unitary rank structured matrix in two steps. First we perform a reduction of the given matrix into Hessenberg form, next we compute the eigenvalues of this resulting Hessenberg matrix via an implicit QR-algorithm. Along the way, we explain how the knowledge of a certain ‘shift’ correction term to the structure can be used to speed up ...
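A simplified sketch of the two-step scheme, not the paper's rank-structured implicit QR: reduce to Hessenberg form with SciPy, then drive the subdiagonal to zero with explicitly shifted QR steps and deflation. A symmetric input is assumed so the Hessenberg form is tridiagonal and a Wilkinson shift suffices.

```python
import numpy as np
from scipy.linalg import hessenberg

def eig_via_hessenberg_qr(A, tol=1e-12, max_iter=500):
    """Step 1: reduce A to Hessenberg form. Step 2: shifted QR with deflation."""
    H = hessenberg(A)
    n = H.shape[0]
    eigs = []
    for _ in range(max_iter):
        if n == 1:
            eigs.append(H[0, 0])
            break
        if abs(H[n - 1, n - 2]) < tol:          # bottom subdiagonal has converged
            eigs.append(H[n - 1, n - 1])        # deflate and continue on the rest
            H = H[:n - 1, :n - 1]
            n -= 1
            continue
        # Wilkinson shift from the trailing 2x2 block.
        a, b, c = H[n - 2, n - 2], H[n - 1, n - 2], H[n - 1, n - 1]
        d = 0.5 * (a - c)
        mu = c - np.sign(d if d != 0 else 1.0) * b * b / (abs(d) + np.hypot(d, b))
        Q, R = np.linalg.qr(H - mu * np.eye(n))
        H = R @ Q + mu * np.eye(n)              # one explicit shifted QR step
    return np.sort(np.array(eigs))

rng = np.random.default_rng(5)
M = rng.standard_normal((6, 6))
A = M + M.T                                     # symmetric, so the spectrum is real
assert np.allclose(eig_via_hessenberg_qr(A), np.sort(np.linalg.eigvalsh(A)))
```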
In many scientific applications, eigenvalues of a matrix have to be computed. By first reducing a matrix from fully dense to Hessenberg form, eigenvalue computations with the QR algorithm become more efficient. Previous work on shared memory architectures has shown that the Hessenberg reduction is in some cases most efficient when performed in two stages: First reduce the matrix to block Hessen...
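For reference, a tiny sketch (ours, with an assumed band-size convention) of the intermediate block Hessenberg structure produced by stage one: only the first b subdiagonals may be nonzero, and stage two chases that band down to a single subdiagonal.

```python
import numpy as np

def is_block_hessenberg(A, b):
    """True if A vanishes below its b-th subdiagonal (b = 1: ordinary Hessenberg)."""
    return np.allclose(np.tril(A, -(b + 1)), 0.0)

n, b = 8, 2
stage_one_output = np.triu(np.random.default_rng(2).standard_normal((n, n)), -b)
print(is_block_hessenberg(stage_one_output, b))   # True: band of width b below the diagonal
print(is_block_hessenberg(stage_one_output, 1))   # generally False until stage two runs
```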
We present a new algorithm for solving the Sylvester-Observer Equation: AX - XH = (0; C). The algorithm embodies two main computational phases: the solution of a series of independent equation systems, and a series of matrix-matrix multiplications. The algorithm is, thus, well suited for parallel and high performance computing. By reducing the coefficient matrix A to lower Hessenberg form, one can ...
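As a dense baseline only (not the paper's parallel Hessenberg-based algorithm), an equation of the form AX - XH = F can be checked with SciPy's general Sylvester solver; the particular right-hand side (0; C) and the choice of H below are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(6)
n, m = 6, 3
A = rng.standard_normal((n, n))
H = np.triu(rng.standard_normal((m, m)), -1)      # an upper Hessenberg H (assumed)
C = rng.standard_normal((1, m))
F = np.vstack([np.zeros((n - 1, m)), C])          # right-hand side of the form (0; C)

# solve_sylvester solves A X + X B = Q, so pass B = -H to get A X - X H = F.
X = solve_sylvester(A, -H, F)
assert np.allclose(A @ X - X @ H, F)
```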
We present an efficient design and implementation for the solution of the Sylvester Observer matrix equation on a distributed memory computer. It consists of two main computationally intensive segments, and one of a lesser complexity: the reduction of the system matrix A to a Hessenberg form, the solution of n shifted Hessenberg systems and a number of matrix-matrix multiplies. The Hessenberg reduc...
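The shifted Hessenberg solves mentioned here admit an O(n^2) kernel; the following sketch (ours, not the distributed implementation) performs Gaussian elimination with partial pivoting, which only has to eliminate the single subdiagonal entry in each column of the Hessenberg coefficient matrix.

```python
import numpy as np

def solve_shifted_hessenberg(H, shift, b):
    """Solve (H - shift*I) x = b for upper Hessenberg H in O(n^2) operations."""
    n = H.shape[0]
    U = H - shift * np.eye(n)
    x = b.astype(float)
    for k in range(n - 1):
        # Partial pivoting between the two candidate rows k and k+1.
        if abs(U[k + 1, k]) > abs(U[k, k]):
            U[[k, k + 1], k:] = U[[k + 1, k], k:]
            x[[k, k + 1]] = x[[k + 1, k]]
        m = U[k + 1, k] / U[k, k]
        U[k + 1, k:] -= m * U[k, k:]              # only one row to update per column
        x[k + 1] -= m * x[k]
    # Back-substitution on the resulting upper triangular system.
    for k in range(n - 1, -1, -1):
        x[k] = (x[k] - U[k, k + 1:] @ x[k + 1:]) / U[k, k]
    return x

rng = np.random.default_rng(7)
n = 7
H = np.triu(rng.standard_normal((n, n)), -1)
b = rng.standard_normal(n)
shift = 0.3
x = solve_shifted_hessenberg(H, shift, b)
assert np.allclose((H - shift * np.eye(n)) @ x, b)
```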