Search results for: eigenvalues and vectors
Number of results: 16,837,282
In this work, the eigenspaces of unitary Cayley graphs and certain Hamming graphs are considered. It is shown that these graph classes are closely related and admit particularly simple eigenspace bases for all eigenvalues, namely bases containing only vectors with entries from the set {0, 1, −1}. A direct consequence is that the considered graph classes are integral.
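The integrality claim is easy to check numerically. A minimal sketch (the modulus n = 12 and the numpy-based check are illustrative choices, not taken from the paper):

```python
import numpy as np
from math import gcd

# Unitary Cayley graph X_n: vertices 0..n-1, edge i ~ j iff gcd(i - j, n) == 1.
n = 12
A = np.array([[1 if gcd(i - j, n) == 1 else 0 for j in range(n)] for i in range(n)])

# An integral graph has only integer eigenvalues; verify numerically.
eigvals = np.linalg.eigvalsh(A)
print(np.round(eigvals, 6))
print("all integral:", np.allclose(eigvals, np.round(eigvals)))
```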
We apply local laws of random matrices and free probability theory to study the spectral properties of two kernel-based sensor fusion algorithms, nonparametric canonical correlation analysis (NCCA) and alternating diffusion (AD), for simultaneously recorded high-dimensional datasets under the null hypothesis. The matrix of interest is the product of the kernel matrices associated with the datasets, which may not be diagonalizable ...
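For orientation, a rough sketch of the kind of matrix being analyzed (the Gaussian kernels, bandwidth, and synthetic data below are assumptions for illustration, not the paper's setup): the product of two kernel matrices built from simultaneously recorded datasets is generally non-symmetric, and its spectrum is the object of study.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q = 200, 5, 7
X = rng.standard_normal((n, p))   # first sensor's recordings
Y = rng.standard_normal((n, q))   # second sensor's recordings (same n samples)

def gaussian_kernel(Z, bandwidth=1.0):
    # pairwise squared distances, then Gaussian affinity
    sq = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq / (2 * bandwidth ** 2))

K1, K2 = gaussian_kernel(X), gaussian_kernel(Y)
M = K1 @ K2                       # product kernel matrix; generally non-symmetric
eigvals = np.linalg.eigvals(M)    # in theory the product may even be non-diagonalizable
print(np.sort(eigvals.real)[::-1][:5])
```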
We give a brief description of a non-symmetric Lanczos algorithm that does not require strict bi-orthogonality among the generated vectors. We show how the vectors generated are algebraically related to the “Controllable Space” and “Observable Space” of a related linear dynamical system. The algorithm described is particularly appropriate for large sparse systems. 1. Introduction The Lanczos Algori...
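For reference, a minimal two-sided (bi-orthogonal) Lanczos sketch is shown below; note that the algorithm in the snippet above deliberately relaxes strict bi-orthogonality, so this textbook variant with simplistic breakdown handling only illustrates the general idea.

```python
import numpy as np

def nonsymmetric_lanczos(A, v, w, k):
    """Two-sided Lanczos: builds V, W with W.T @ V ≈ I and W.T @ A @ V ≈ T
    tridiagonal. Simplified; no look-ahead for serious breakdowns."""
    n = A.shape[0]
    V = np.zeros((n, k)); W = np.zeros((n, k))
    alpha = np.zeros(k); beta = np.zeros(k); gamma = np.zeros(k)
    s = w @ v
    V[:, 0] = v / np.sqrt(abs(s)); W[:, 0] = w * np.sign(s) / np.sqrt(abs(s))
    for j in range(k):
        alpha[j] = W[:, j] @ (A @ V[:, j])
        r  = A @ V[:, j]   - alpha[j] * V[:, j] - (beta[j - 1]  * V[:, j - 1] if j > 0 else 0)
        s_ = A.T @ W[:, j] - alpha[j] * W[:, j] - (gamma[j - 1] * W[:, j - 1] if j > 0 else 0)
        if j == k - 1:
            break
        d = s_ @ r
        if abs(d) < 1e-14:
            raise RuntimeError("serious breakdown; look-ahead would be needed")
        gamma[j] = np.sqrt(abs(d)); beta[j] = d / gamma[j]
        V[:, j + 1] = r / gamma[j]; W[:, j + 1] = s_ / beta[j]
    T = np.diag(alpha) + np.diag(beta[:k - 1], 1) + np.diag(gamma[:k - 1], -1)
    return V, W, T

A = np.random.default_rng(1).standard_normal((50, 50))
V, W, T = nonsymmetric_lanczos(A, np.ones(50), np.ones(50), 20)
print(np.sort(np.linalg.eigvals(T).real)[-3:])  # Ritz values approximating extremal eigenvalues
```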
One of the most important number sequences in mathematics is the Fibonacci sequence. Besides mathematics, the Fibonacci sequence is applied in other branches of science such as physics and the arts. In fact, there exists a wonderful relation between aesthetics and this sequence. The Fibonacci sequence has an important characteristic, namely the golden number. In this thesis, the golden number is observed ...
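An illustrative sketch (not from the thesis) of how the golden number connects to the eigenvalue theme of this listing: it is the dominant eigenvalue of the Fibonacci Q-matrix, and also the limit of ratios of consecutive Fibonacci numbers.

```python
import numpy as np

# The Fibonacci Q-matrix: its dominant eigenvalue is the golden number
# phi = (1 + sqrt(5)) / 2; the other eigenvalue is 1 - phi.
Q = np.array([[1, 1], [1, 0]])
print("eigenvalues of Q:", np.linalg.eigvals(Q))
print("golden number   :", (1 + np.sqrt(5)) / 2)

# Ratios of consecutive Fibonacci numbers converge to phi.
a, b = 1, 1
for _ in range(20):
    a, b = b, a + b
print("F(n+1)/F(n)     :", b / a)
```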
In this paper we consider two closely related problems: estimation of eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and the problem of estimating the eigenvalues and eigenvectors of the covariance matrix for high-dimensional Gaussian vectors. In Peng and Paul (2007), a restricted maximum likelihood (REML) approach has bee...
In this paper we consider two closely related problems: estimation of eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and the problem of estimating the eigenvalues and eigenvectors of the covariance matrix for high-dimensional Gaussian vectors. In [A geometric approach to maximum likelihood estimation of covariance kernel fro...
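For contrast with the REML approach discussed in these entries, the naive baseline is to eigendecompose the sample covariance matrix. A minimal sketch under assumed dimensions (the spiked covariance model below is illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 500                        # fewer samples than dimensions
# True covariance: a few spiked eigenvalues on top of the identity.
U, _ = np.linalg.qr(rng.standard_normal((p, 3)))
Sigma = np.eye(p) + U @ np.diag([20.0, 10.0, 5.0]) @ U.T

X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)
S = np.cov(X, rowvar=False)            # p x p sample covariance
eigvals, eigvecs = np.linalg.eigh(S)
print("top sample eigenvalues:", eigvals[-4:][::-1])
print("true top eigenvalues  :", [21.0, 11.0, 6.0, 1.0])
```

In the high-dimensional regime (p comparable to or larger than n) the top sample eigenvalues are noticeably biased, which is what motivates structured estimators such as REML.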
Unless specified otherwise, all vectors in this lecture live in R^n, and all matrices are symmetric and live in R^{n×n}. For two vectors v, w, let v · w = Σ_i v_i w_i denote their inner product, and v ≥ 0 indicate that all v_i ≥ 0. For two matrices A and B, denote by A • B their inner product, thinking of them as vectors in R^{n²}, i.e. A • B = Σ_{ij} A_{ij} B_{ij} = Tr(A^T B). Here Tr(·) denotes the trace of a matrix....
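A quick numerical check of the inner-product identity just defined (a minimal sketch; the random symmetric matrices are only for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # symmetric, as in the lecture
B = rng.standard_normal((n, n)); B = (B + B.T) / 2

entrywise  = np.sum(A * B)        # sum_ij A_ij B_ij
trace_form = np.trace(A.T @ B)    # Tr(A^T B)
print(entrywise, trace_form)      # the two agree up to rounding
```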
This thesis deals with the computation of a small set of exterior eigenvalues of a given large sparse matrix on present (and future) supercomputers using a Block Jacobi-Davidson method. The main idea of the method is to operate on blocks of vectors and to combine several sparse matrix-vector multiplications with different vectors in a single computation. Block vector calculations and in particul...
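A minimal block subspace-iteration sketch, not the Block Jacobi-Davidson method of the thesis, showing the key ingredient of multiplying a sparse matrix by a whole block of vectors at once:

```python
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
n, k = 2000, 4                                    # matrix size, block width
A = sp.random(n, n, density=1e-3, format="csr", random_state=0)
A = (A + A.T) * 0.5                               # symmetric for simplicity

V = np.linalg.qr(rng.standard_normal((n, k)))[0]  # orthonormal starting block
for _ in range(200):
    W = A @ V                                     # one sparse matrix times a block of vectors
    V, _ = np.linalg.qr(W)                        # re-orthonormalize the block

# Rayleigh-Ritz step: approximations to the exterior (largest-magnitude) eigenvalues
T = V.T @ (A @ V)
print(np.sort(np.linalg.eigvalsh(T)))
```

Working with the block V rather than with single vectors is what allows the sparse matrix to be streamed through memory once per iteration while serving several right-hand vectors.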
In this paper we consider two closely related problems: estimation of eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and the problem of estimating the eigenvalues and eigenvectors of the covariance matrix for high-dimensional Gaussian vectors. In [23], a restricted maximum likelihood (REML) approach has been developed to d...
It has been known for a long time that the sets of integer vectors that are recognizable by finite-state automata are those that can be defined in an extension of Presburger arithmetic. In this paper, we address the problem of deciding whether the closure of a linear transformation preserves the recognizable nature of sets of integer vectors. We solve this problem by introducing an original ext...