Search results for: eigenvectors and gram

Number of results: 16,834,237

Journal: :Neural networks : the official journal of the International Neural Network Society 2007
Luc Hoegaerts, Lieven De Lathauwer, Ivan Goethals, Johan A. K. Suykens, Joos Vandewalle, Bart De Moor

The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods (e.g. kernel Principal Component Analysis, feature approximation, denoising, compression, prediction) in machine learning. Yet in the case of dynamic and/or large-scale data, the batch calculation nature and computational demands of the eigenvector decomposition limit th...
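A minimal sketch, in Python/NumPy, of the batch eigendecomposition of a kernel Gram matrix that such methods build on, written as plain kernel PCA; the RBF kernel, the bandwidth gamma, and the helper names are illustrative assumptions, not the incremental algorithm of the paper.

import numpy as np

def rbf_gram(X, gamma=1.0):
    # Symmetric kernel Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    K = rbf_gram(X, gamma)
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n   # center in feature space
    eigvals, eigvecs = np.linalg.eigh(Kc)                 # batch eigendecomposition
    order = np.argsort(eigvals)[::-1][:n_components]      # dominant eigenvectors
    alphas = eigvecs[:, order] / np.sqrt(np.maximum(eigvals[order], 1e-12))
    return Kc @ alphas                                    # projections of the data

X = np.random.rand(200, 3)
Z = kernel_pca(X, n_components=2, gamma=0.5)

The np.linalg.eigh call is the batch step whose cost the abstract flags as limiting for dynamic or large-scale data.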

Journal: :J. Multivariate Analysis 2013
Lo-Bin Chang, Zhidong Bai, Su-Yun Huang, Chii-Ruey Hwang

Many kernel-based learning algorithms have the computational load scaled with the sample size n due to the column size of a full kernel Gram matrix K. This article considers the Nyström low-rank approximation. It uses a reduced kernel K̂, which is n×m, consisting of m columns (say columns i1, i2, ..., im) randomly drawn from K. This approximation takes the form K ≈ K̂UK̂ᵀ, where U is the reduced ...
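A minimal sketch of a Nyström approximation in this form, assuming U is taken as the pseudo-inverse of the m×m block of K on the sampled columns; the uniform sampling, the RBF kernel in the example, and the function name are illustrative, not the paper's exact construction.

import numpy as np

def nystrom(K, m, rng=None):
    # Approximate an n x n Gram matrix K as K_hat @ U @ K_hat.T
    rng = np.random.default_rng(rng)
    n = K.shape[0]
    idx = rng.choice(n, size=m, replace=False)   # columns i1, ..., im drawn from K
    K_hat = K[:, idx]                            # n x m reduced kernel
    W = K[np.ix_(idx, idx)]                      # m x m block on the sampled indices
    U = np.linalg.pinv(W)                        # reduced m x m middle matrix
    return K_hat @ U @ K_hat.T

# Example: rank-20 approximation of a 500 x 500 RBF Gram matrix
X = np.random.rand(500, 5)
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2)
K20 = nystrom(K, m=20, rng=0)
print(np.linalg.norm(K - K20) / np.linalg.norm(K))   # relative approximation error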

2015
HOWARD C. ELMAN

We study random eigenvalue problems in the context of spectral stochastic finite elements. In particular, given a parameter-dependent, symmetric positive-definite matrix operator, we explore the performance of algorithms for computing its eigenvalues and eigenvectors represented using polynomial chaos expansions. We formulate a version of stochastic inverse subspace iteration, which is based on...
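A minimal sketch of plain (deterministic) inverse subspace iteration for a symmetric positive-definite matrix, the building block that the stochastic formulation extends to polynomial chaos coefficients; the matrix, subspace size, and iteration count here are illustrative assumptions.

import numpy as np

def inverse_subspace_iteration(A, k, iters=50):
    # Approximate the k smallest eigenpairs of a symmetric positive-definite A
    n = A.shape[0]
    Q, _ = np.linalg.qr(np.random.rand(n, k))     # random orthonormal start
    for _ in range(iters):
        Y = np.linalg.solve(A, Q)                 # apply A^{-1} to the current subspace
        Q, _ = np.linalg.qr(Y)                    # re-orthonormalize
    T = Q.T @ A @ Q                               # Rayleigh-Ritz projection
    vals, vecs = np.linalg.eigh(T)
    return vals, Q @ vecs

A = np.random.rand(200, 200)
A = A @ A.T + 200.0 * np.eye(200)                 # make it symmetric positive-definite
vals, vecs = inverse_subspace_iteration(A, k=3)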

Journal: :CoRR 2012
Hiroyuki Ishigami, Kinji Kimura, Yoshimasa Nakamura

A new inverse iteration algorithm that can be used to compute all the eigenvectors of a real symmetric tridiagonal matrix on parallel computers is developed. The classical inverse iteration uses modified Gram-Schmidt orthogonalization, which is sequential and causes a bottleneck in parallel computing. In this paper, the use of the compact WY representation is proposed in the...
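A minimal sketch of classical inverse iteration with the sequential modified Gram-Schmidt reorthogonalization that the compact WY representation is meant to replace; the shift offset, the dense solve, and the example matrix are illustrative assumptions, not the paper's parallel algorithm.

import numpy as np

def inverse_iteration_mgs(T, shifts, iters=3):
    # Eigenvectors of a symmetric tridiagonal T for given approximate eigenvalues
    n = T.shape[0]
    V = np.empty((n, len(shifts)))
    for j, mu in enumerate(shifts):
        v = np.random.rand(n)
        for _ in range(iters):
            w = np.linalg.solve(T - (mu - 1e-10) * np.eye(n), v)   # shifted solve
            for i in range(j):                                     # modified Gram-Schmidt
                w -= (V[:, i] @ w) * V[:, i]                       # against earlier vectors
            v = w / np.linalg.norm(w)
        V[:, j] = v
    return V

# Example: the standard -1, 2, -1 tridiagonal matrix
n = 50
T = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
shifts = np.linalg.eigvalsh(T)[:5]            # stand-ins for bisection estimates
V = inverse_iteration_mgs(T, shifts)
print(np.linalg.norm(V.T @ V - np.eye(5)))    # near-orthonormal computed eigenvectors

The inner projection loop is the sequential part the abstract identifies as the parallel bottleneck.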

An efficient technique is presented for optimum design of structures with both natural frequency and complex frequency response constraints. The main idea is to reduce the number of dynamic analyses by introducing high-quality approximations. Eigenvalues are approximated using the Rayleigh quotient. Eigenvectors are also approximated for the evaluation of eigenvalues and frequency responses. A tw...
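A minimal sketch of the Rayleigh-quotient eigenvalue approximation mentioned above, assuming the generalized structural eigenproblem K x = λ M x with stiffness K and mass M; the toy matrices and the perturbation size are illustrative.

import numpy as np

def rayleigh_quotient(K, M, x):
    # Approximate eigenvalue from an approximate eigenvector x
    return (x @ K @ x) / (x @ M @ x)

n = 30
K = np.diag(np.arange(1.0, n + 1))            # toy stiffness matrix
M = np.eye(n)                                 # toy mass matrix
exact = np.eye(n)[:, 4]                       # exact eigenvector for eigenvalue 5
approx = exact + 1e-2 * np.random.rand(n)     # perturbed (approximate) eigenvector
print(rayleigh_quotient(K, M, approx))        # close to 5.0; error is second order
                                              # in the eigenvector error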

Journal: :Proceedings of the Glasgow Mathematical Association 1963

Journal: :Notices of the American Mathematical Society 2016

Journal: Journal of the Earth and Space Physics 2020

The Computed Magnetic Gradient Tensor (CMGT) includes the first derivatives of the three components of the magnetic field of a body. In the eigenvector analysis of Gravity Gradient Tensors (GGT) for a line of poles and a point pole, the eigenvectors of the largest eigenvalues (first eigenvectors) point precisely toward the Center of Mass (COM) of the body. However, due to the nature of the magnetic field, it i...
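A minimal sketch of the GGT property used as the baseline above: for a single point source, the eigenvector of the largest eigenvalue of the gravity gradient tensor points along the line to the source. The point-source tensor formula, the coordinates, and the unit-free constant are illustrative assumptions.

import numpy as np

def point_source_ggt(obs, src, gm=1.0):
    # Gravity gradient tensor at observation point obs due to a point source at src
    r = np.asarray(src, float) - np.asarray(obs, float)
    d = np.linalg.norm(r)
    return gm * (3.0 * np.outer(r, r) - d ** 2 * np.eye(3)) / d ** 5

obs = np.array([10.0, -3.0, 5.0])
src = np.array([2.0, 1.0, -4.0])
T = point_source_ggt(obs, src)

vals, vecs = np.linalg.eigh(T)
first = vecs[:, np.argmax(vals)]                   # eigenvector of the largest eigenvalue
to_src = (src - obs) / np.linalg.norm(src - obs)
print(abs(first @ to_src))                         # ~1.0: it points toward the source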

Journal: :International Journal of Nano Dimension
A. Zambare, T. Nerpagar, N. Chaudhari, P. Manchalwad, S. Harke
Department of Biotechnology, MGM's Jawaharlal Nehru Engineering College, MGM Campus, N-6, CIDCO, Aurangabad (MS), India

In this study, spherical silver nanoparticles (SNPs) were synthesized by a chemical reduction method from the metal precursor silver nitrate in the presence of an anionic surfactant and a strong reducing agent. In this experimental work, SNPs were synthesized in the presence of different concentrations of a stabilizing agent, and the effect of the stabilizing agent on the size distribution of the SNPs was observed. Further anti...

2013
Lo-Bin Chang, Zhidong Bai, Su-Yun Huang, Chii-Ruey Hwang

• Many kernel-based learning algorithms have a computational load that scales with the sample size.
• The Nyström low-rank approximation is designed to reduce the computation.
• We propose the spectrum decomposition condition with a theoretical justification.
• Asymptotic error bounds on eigenvalues and eigenvectors are derived (a numerical check is sketched below).
• Numerical experiments are provided for the covariance kernel and Wishart matrix.
AMS subject cla...
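A minimal numerical check, not from the paper, of how the leading eigenvalues of a Nyström approximation track those of the full Gram matrix; the RBF kernel, the sample sizes, and the uniform column sampling are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.random((400, 4))
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2)                                   # full 400 x 400 RBF Gram matrix

m = 40
idx = rng.choice(400, size=m, replace=False)      # m columns drawn uniformly from K
K_hat = K[:, idx]                                 # reduced n x m kernel
U = np.linalg.pinv(K[np.ix_(idx, idx)])           # reduced m x m middle matrix
K_approx = K_hat @ U @ K_hat.T                    # Nystrom approximation of K

top = 5
ev_full = np.linalg.eigvalsh(K)[::-1][:top]       # leading eigenvalues of K
ev_nys = np.linalg.eigvalsh(K_approx)[::-1][:top]
print(np.abs(ev_full - ev_nys) / ev_full)         # relative eigenvalue errors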

Chart: number of search results per year
