Search results for: eigenvalues and vectors

Number of results: 16,837,282

Journal: :Math. Comput. 2001
Zhongxiao Jia G. W. Stewart

This paper concerns the Rayleigh–Ritz method for computing an approximation to an eigenspace X of a general matrix A from a subspace W that contains an approximation to X. The method produces a pair (N, X̃) that purports to approximate a pair (L, X), where X is a basis for X and AX = XL. In this paper we consider the convergence of (N, X̃) as the sine of the angle between X and W approaches zero. ...
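The extraction step described above can be sketched in a few lines of NumPy. This is a minimal illustration of the generic Rayleigh–Ritz procedure, not the paper's analysis; the matrix A and subspace basis W below are illustrative.

```python
import numpy as np

def rayleigh_ritz(A, W):
    """Rayleigh-Ritz extraction: project A onto the subspace spanned by
    the columns of W and lift eigenpairs of the small matrix back up."""
    Q, _ = np.linalg.qr(W)           # orthonormal basis for the subspace
    H = Q.conj().T @ A @ Q           # small projected matrix
    theta, S = np.linalg.eig(H)      # Ritz values, primitive Ritz vectors
    X = Q @ S                        # Ritz vectors in the original space
    return theta, X

# illustrative check: W exactly spans an eigenspace of a diagonal A,
# so the Ritz values recover the corresponding eigenvalues 1 and 2
A = np.diag([1.0, 2.0, 3.0, 4.0])
W = np.eye(4)[:, :2]
theta, X = rayleigh_ritz(A, W)
print(np.sort(theta.real))
```

When W only approximately contains the eigenspace, the Ritz pairs are approximations whose accuracy degrades with the angle between the spaces, which is exactly the regime the paper studies.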

2003
Sadataka Furui

The transition from the chaotic to the quasi-periodic phase in a modified Lorenz model is analyzed by performing a contact transformation such that the trajectory in R is projected onto R. The relative torsion number and the characteristics of the template are measured using the eigenvectors of the Jacobian instead of vectors on a moving frame along the closed trajectory. Application to the circulation of a flu...

2008
Chunchen Liu Xiaolong Yuan Arjun Mullaguru Jeffrey Fan

In this paper, we propose a novel model-order reduction technique via rational transfer-function fitting and eigenmode analysis that takes residues into account. We define a constant that serves as the key in the sorting algorithm, one of the correlations used to order the eigenvalues. It is demonstrated that the accuracy of eigenmode analysis is improved when residues are considered. The proposed algorithm is ...

Functional data analysis is a relatively new and rapidly growing area of statistics. This is partly due to technological advancements which have made it possible to generate new types of data that are in the form of curves. Because the data are functions, they lie in function spaces, which are of infinite dimension. To analyse functional data, one way, which is widely used, is to employ princip...

2008
Mario Castagnino Fernando Lombardo

It is demonstrated that almost any S-matrix of quantum field theory in curved spaces possesses an infinite set of complex poles (or branch cuts). These poles can be transformed into complex eigenvalues, the corresponding eigenvectors being Gamow vectors. All this formalism, which is heuristic in ordinary Hilbert space, becomes a rigorous one within the framework of a properly chosen rigged Hilbert...

Journal: :Lecture Notes in Computer Science 2022

Abstract The physical properties of the Hubbard model can be understood by solving the eigenvalue problem for the Hamiltonian derived from the model. Since the Hamiltonian is a large sparse matrix, an iterative method is usually utilized for such problems. One effectual solver for this problem is the LOBPCG (Locally Optimal Block Preconditioned Conjugate Gradient) method. Tuning strategies on GPU systems, when all vectors are stored in device memory, have...
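SciPy ships an LOBPCG implementation (`scipy.sparse.linalg.lobpcg`) that illustrates the block-iterative approach the abstract refers to. The sketch below uses a simple diagonal sparse matrix as a stand-in for a Hubbard Hamiltonian; the matrix, block size, and tolerances are illustrative.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n = 100
# illustrative large sparse symmetric matrix standing in for a Hubbard
# Hamiltonian; its eigenvalues are simply 1, 2, ..., n
A = diags(np.arange(1, n + 1.0)).tocsr()

rng = np.random.default_rng(0)
X = rng.normal(size=(n, 3))       # random block of starting vectors

# iterate on the whole block at once; only matrix-vector products with
# the sparse A are needed, which is what maps well onto GPUs
evals, evecs = lobpcg(A, X, tol=1e-6, maxiter=200, largest=True)
print(np.sort(evals)[::-1])       # three largest eigenvalues of A
```

The method never factorizes A: it only applies A to a small block of vectors per iteration, which is why keeping all vectors resident in GPU device memory, as the abstract discusses, is the main tuning concern.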

2001
Jan Marthedal Rasmussen

This thesis deals with linear ill-posed problems related to compact operators, and iterative Krylov subspace methods for solving discretized versions of these problems. Linear compact operators in infinite-dimensional Hilbert spaces will be investigated, and several results on the singular values and eigenvalues of such operators will be presented. A large subset of linear compact operators consists of integral o...
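The role of singular values in such discretized ill-posed problems can be seen in a minimal truncated-SVD example. This is a generic illustration using the classically ill-conditioned Hilbert matrix, not the thesis's Krylov methods; the truncation threshold is an assumption for the sketch.

```python
import numpy as np

# discretized integral operator: the Hilbert matrix, a classic
# ill-posed test problem with rapidly decaying singular values
n = 12
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)

x_true = np.ones(n)
b = A @ x_true

# truncated SVD: keep only singular values above a threshold, since the
# tiny singular values of a compact operator amplify noise on inversion
U, s, Vt = np.linalg.svd(A)
k = int(np.sum(s > 1e-8))                    # effective numerical rank
x_tsvd = Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

print(np.linalg.norm(A @ x_tsvd - b))        # residual after truncation
```

Krylov methods like those studied in the thesis achieve a similar regularizing effect implicitly, by building approximations in subspaces aligned with the dominant singular directions.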

2002
JASON C. GOODMAN JOHN MARSHALL

The authors explore the use of the "neutral vectors" of a linearized version of a global quasigeostrophic atmospheric model with realistic mean flow in the study of the nonlinear model's low-frequency variability. Neutral vectors are the (right) singular vectors of the linearized model's tendency matrix that have the smallest eigenvalues; they are also the patterns that exhibit the largest re...
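Computing such neutral vectors amounts to taking the right singular vectors with the smallest singular values. A minimal NumPy sketch, with an arbitrary random matrix standing in for the linearized tendency matrix:

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.normal(size=(6, 6))   # stand-in for the linearized tendency matrix

# np.linalg.svd orders singular values decreasingly, so the rows of Vt
# at the end are the right singular vectors with the smallest values:
# the "neutral vectors"
U, s, Vt = np.linalg.svd(L)
neutral = Vt[-1]

# a neutral vector is the direction least damped/amplified by L:
# ||L v|| equals the smallest singular value for this unit vector v
print(np.linalg.norm(L @ neutral), s[-1])
```

Because the tendency matrix barely changes these patterns, they persist longest, which is why they are candidates for the model's low-frequency variability.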

2007
Bappaditya Mandal Xudong Jiang Alex ChiChung Kot

This work proposes a method which enables us to perform kernel Fisher discriminant analysis in the whole eigenspace for face recognition. It employs the ratio of eigenvalues to decompose the entire kernel feature space into two subspaces: a reliable subspace spanned mainly by the facial variation and an unreliable subspace due to finite number of training samples. Eigenvectors are then scaled u...
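The eigenvalue-ratio decomposition described above can be sketched generically in NumPy. This is an illustration of splitting an eigenspace by eigenvalue ratio, not the paper's exact criterion; the threshold and data are assumptions.

```python
import numpy as np

def split_eigenspace(cov, ratio_threshold=1e-3):
    """Split the eigenspace of a covariance matrix into a 'reliable' part
    (large eigenvalues) and an 'unreliable' part (small eigenvalues),
    using the ratio of each eigenvalue to the largest one.
    The threshold is illustrative, not the paper's criterion."""
    evals, evecs = np.linalg.eigh(cov)
    evals, evecs = evals[::-1], evecs[:, ::-1]     # descending order
    reliable = evals / evals[0] >= ratio_threshold
    return evecs[:, reliable], evecs[:, ~reliable]

# illustrative: 5 samples in 50 dimensions gives a rank-deficient
# covariance, mimicking the finite-training-sample situation
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 50))
cov = X.T @ X
R, Urel = split_eigenspace(cov)
print(R.shape[1], Urel.shape[1])   # reliable vs unreliable directions
```

With far fewer samples than dimensions, most eigenvalues are numerically zero, so most of the space falls into the unreliable subspace spanned by sampling noise rather than genuine variation.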

Journal: :CoRR 2016
Sandrine Dallaporta Yohann de Castro

Restricted Isometry Constants (RICs) are a pivotal notion in Compressed Sensing as these constants finely assess how a linear operator is conditioned on the set of sparse vectors and hence how it performs in stable and robust sparse regression (SRSR). While it is an open problem to construct deterministic matrices with apposite RICs, one can prove that such matrices exist using random matrices ...

[Chart: number of search results per year]
