Search results for: eigenvalue gradient method

Number of results: 1,735,319

2008
Haijun Wu Zhimin Zhang

Gradient recovery has been widely used for a posteriori error estimates (see Ainsworth & Oden, 2000; Babuška & Strouboulis, 2001; Chen & Xu, 2007; Fierro & Veeser, 2006; Zhang, 2007; Zienkiewicz et al., 2005; Zienkiewicz & Zhu, 1987, 1992a,b). Recently, it has been employed to enhance the eigenvalue approximations by the finite-element method under certain mesh conditions (see Naga et al., 2006...
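As a concrete illustration of the recovery idea (not the polynomial-preserving or patch-recovery operators analysed in the cited works), the following minimal NumPy sketch post-processes the piecewise-constant gradient of a 1-D linear finite-element function into a continuous nodal gradient by weighted averaging; the function name and the averaging rule are illustrative assumptions.

```python
import numpy as np

def recovered_gradient_1d(x, u):
    """Recover a nodal gradient from a piecewise-linear FE function.

    x : (n+1,) mesh nodes, u : (n+1,) nodal values.
    Element gradients are constant; the recovered nodal gradient is a
    length-weighted average of the gradients on the elements sharing each
    node (one-sided at the boundary).  This mimics simple gradient
    recovery by averaging.
    """
    h = np.diff(x)                   # element lengths
    g_elem = np.diff(u) / h          # constant gradient per element
    g_node = np.empty_like(u)
    g_node[0], g_node[-1] = g_elem[0], g_elem[-1]
    # interior nodes: average of the two adjacent element gradients
    g_node[1:-1] = (h[:-1] * g_elem[:-1] + h[1:] * g_elem[1:]) / (h[:-1] + h[1:])
    return g_node

# Example: interpolate u(x) = sin(pi x); the recovered nodal gradient is a
# noticeably better approximation of pi*cos(pi x) than the raw element
# gradients are away from the element midpoints.
x = np.linspace(0.0, 1.0, 11)
g = recovered_gradient_1d(x, np.sin(np.pi * x))
print(np.max(np.abs(g[1:-1] - np.pi * np.cos(np.pi * x[1:-1]))))
```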

Journal: Journal of Linear and Topological Algebra (JLTA)
M. Nili Ahmadabadi, Department of Mathematics, Islamic Azad University, Najafabad Branch, Iran

In this paper, a fundamentally new method, based on the definition, is introduced for numerical computation of eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices. Some examples are provided to show the accuracy and reliability of the proposed method. It is shown that the proposed method gives other sequences than those of existing methods, but they still are convergent to t...
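The snippet does not spell out the authors' construction, so the sketch below only illustrates the generic idea of computing eigenvalues "from the definition", i.e. as roots of the characteristic determinant det(A − λI), located by a sign-change scan refined with Brent's method. It is an assumed stand-in, not the paper's algorithm.

```python
import numpy as np
from scipy.optimize import brentq

def eigenvalues_by_definition(A, lam_min, lam_max, num=2000):
    """Locate real eigenvalues as roots of f(lam) = det(A - lam*I).

    Purely illustrative: scan [lam_min, lam_max] for sign changes of the
    characteristic determinant and refine each bracket with Brent's method.
    Works for simple real eigenvalues (e.g. a symmetric A with distinct
    eigenvalues); repeated eigenvalues do not produce a sign change.
    """
    f = lambda lam: np.linalg.det(A - lam * np.eye(A.shape[0]))
    grid = np.linspace(lam_min, lam_max, num)
    vals = np.array([f(t) for t in grid])
    roots = []
    for a, b, fa, fb in zip(grid[:-1], grid[1:], vals[:-1], vals[1:]):
        if fa * fb < 0:
            roots.append(brentq(f, a, b))
    return np.array(roots)

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                          # symmetric -> real spectrum
print(np.sort(eigenvalues_by_definition(A, -10, 10)))
print(np.sort(np.linalg.eigvalsh(A)))      # reference values
```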

2009
Andrew V. Knyazev

Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver, the gradient iterative method with a fixed step size, for symmetric generalized eigenvalue problems, where we use the gradient of the Rayleigh quotient as an optimization direction. We prove a known sharp and simple...
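A minimal NumPy sketch of this kind of iteration, assuming B is symmetric positive definite and allowing an optional preconditioner T; the fixed step size omega and the simple B-normalization are illustrative choices, not taken from the paper.

```python
import numpy as np

def gradient_eigensolver(A, B, T=None, omega=1.0, tol=1e-10, maxit=500, seed=0):
    """Fixed-step (preconditioned) gradient iteration for the smallest
    eigenpair of the symmetric generalized problem A x = lam B x.

    A, B : symmetric matrices, B positive definite.
    T    : optional preconditioner applied to the residual (default: identity).
    The search direction is the gradient of the Rayleigh quotient
    rho(x) = (x.A x)/(x.B x), i.e. the residual r = A x - rho(x) B x.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0])
    x /= np.sqrt(x @ (B @ x))               # B-normalize the start vector
    for _ in range(maxit):
        rho = (x @ (A @ x)) / (x @ (B @ x))  # Rayleigh quotient
        r = A @ x - rho * (B @ x)            # gradient direction (up to scaling)
        if np.linalg.norm(r) < tol:
            break
        d = r if T is None else T @ r        # preconditioned residual
        x = x - omega * d
        x /= np.sqrt(x @ (B @ x))            # keep the B-norm equal to one
    return (x @ (A @ x)) / (x @ (B @ x)), x

# Example: diagonal test problem; the iteration converges to the smallest
# eigenvalue.  The fixed step omega must be small enough for stability
# (roughly omega < 2/lambda_max without preconditioning).
A = np.diag([1.0, 2.0, 5.0, 10.0])
B = np.eye(4)
lam, _ = gradient_eigensolver(A, B, omega=0.15)
print(lam)   # approximately 1.0
```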

2001
Andrew V. Knyazev Klaus Neymeyr

In two previous papers by Neymeyr: A geometric theory for preconditioned inverse iteration I: Extrema of the Rayleigh quotient, LAA 322 (1-3), 61-85, 2001, and A geometric theory for preconditioned inverse iteration II: Convergence estimates, LAA 322 (1-3), 87-104, 2001, a sharp, but cumbersome, convergence rate estimate was proved for a simple preconditioned eigensolver, which computes the s...

Journal: Journal of Industrial and Management Optimization, 2022

Tensor eigenvalue complementarity problems, as a special class of complementarity problems, are the generalization of matrix eigenvalue complementarity problems to higher order. In recent years, tensor eigenvalue complementarity problems have been studied extensively. The research mainly focuses on analysis, theory and algorithms. In this paper, we investigate solution methods for four kinds of tensor eigenvalue complementarity problems with different structures. By utilizing an eq...
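The truncated snippet does not reveal the four problem classes or the reformulation the authors use; as a small anchor, the sketch below merely checks whether a candidate pair (λ, x) satisfies one common formulation of the tensor eigenvalue complementarity conditions for third-order tensors. It is an assumption-laden illustration, not the paper's solution method.

```python
import numpy as np

def txm1(T, x):
    """Contract a 3rd-order tensor with x twice: (T x^2)_i = sum_jk T_ijk x_j x_k."""
    return np.einsum('ijk,j,k->i', T, x, x)

def is_teicp_solution(lam, x, A, B, tol=1e-8):
    """Check one common form of the tensor eigenvalue complementarity conditions:
       x >= 0,  w = lam * (B x^2) - (A x^2) >= 0,  x . w = 0,  x != 0,
    for 3rd-order tensors A and B."""
    w = lam * txm1(B, x) - txm1(A, x)
    return (np.all(x >= -tol) and np.all(w >= -tol)
            and abs(x @ w) <= tol and np.linalg.norm(x) > tol)

# Example: diagonal tensors.  With A_iii = a_i and B_iii = 1, the pair
# (lam = a_1, x = e_1) satisfies the complementarity conditions.
n = 3
A = np.zeros((n, n, n)); B = np.zeros((n, n, n))
for i, a in enumerate([2.0, 3.0, 5.0]):
    A[i, i, i] = a
    B[i, i, i] = 1.0
x = np.array([1.0, 0.0, 0.0])
print(is_teicp_solution(2.0, x, A, B))   # True
```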

2003
Kenneth F. Alvin

An iterative procedure is presented for computing eigenvector sensitivities due to finite element model parameter variations. The present method is a Preconditioned Conjugate Projected Gradient-based technique and is intended to utilize the existing matrix factorizations developed for an iterative eigensolution such as Lanczos or Subspace Iteration. As such, this technique can be integrated int...
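As a rough illustration (assuming a symmetric matrix with a simple eigenvalue, and a direct pseudo-inverse solve in place of the paper's preconditioned conjugate projected gradient iteration), the following sketch computes eigenvalue and eigenvector sensitivities from the standard first-order formulas.

```python
import numpy as np

def eigenpair_sensitivity(A, dA, i=0):
    """Sensitivity of the i-th eigenpair of a symmetric matrix A(p)
    with respect to a parameter p, given dA = dA/dp.

    Standard formulas for a simple eigenvalue:
        dlam = v.T dA v,
        (A - lam I) dv = -(dA - dlam I) v,  with v.T dv = 0.
    The singular system is solved here with a pseudo-inverse restricted to
    the orthogonal complement of v; the cited work uses a preconditioned,
    projected conjugate-gradient iteration for this step instead.
    """
    lam, V = np.linalg.eigh(A)
    l, v = lam[i], V[:, i]
    dlam = v @ dA @ v
    rhs = -(dA @ v - dlam * v)                    # orthogonal to v by construction
    dv = np.linalg.pinv(A - l * np.eye(len(v))) @ rhs
    dv -= (v @ dv) * v                            # enforce v.T dv = 0
    return dlam, dv

# Finite-difference check of the eigenvalue sensitivity on a small example.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4)); A = (M + M.T) / 2
N = rng.standard_normal((4, 4)); dA = (N + N.T) / 2
dlam, dv = eigenpair_sensitivity(A, dA)
eps = 1e-6
fd = (np.linalg.eigvalsh(A + eps * dA)[0] - np.linalg.eigvalsh(A - eps * dA)[0]) / (2 * eps)
print(dlam, fd)   # the two numbers should agree closely
```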

N. Aghazadeh Y. Gholizade Atani

In this paper, we present an edge detection method based on the wavelet transform and the Hessian matrix of the image at each pixel. Many methods based on the wavelet transform use it to approximate the gradient of the image and detect edges by searching for the modulus maxima of the gradient vectors. In our scheme, we use the wavelet transform to approximate the Hessian matrix of the image at each pixel as well. ...
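A hedged sketch of the Hessian-eigenvalue idea, with second derivatives approximated by plain finite differences (np.gradient) rather than the wavelet transform used in the paper; the thresholding and edge-selection step is omitted.

```python
import numpy as np

def hessian_eigenvalues(image):
    """Per-pixel Hessian eigenvalues of a 2-D image.

    Second derivatives are approximated with finite differences; for a
    symmetric 2x2 Hessian [[Ixx, Ixy], [Ixy, Iyy]] the eigenvalues have a
    closed form via trace and determinant.
    """
    Iy, Ix = np.gradient(image)        # derivatives along rows (y) and columns (x)
    Iyy, Iyx = np.gradient(Iy)
    Ixy, Ixx = np.gradient(Ix)
    tr = Ixx + Iyy
    det = Ixx * Iyy - Ixy * Iyx
    disc = np.sqrt(np.maximum(tr**2 / 4 - det, 0.0))
    return tr / 2 + disc, tr / 2 - disc   # larger, smaller eigenvalue per pixel

# Example: a vertical step edge; the magnitude of the dominant Hessian
# eigenvalue peaks near the edge, which is what an edge detector thresholds.
img = np.zeros((32, 32)); img[:, 16:] = 1.0
l1, l2 = hessian_eigenvalues(img)
print(np.unravel_index(np.argmax(np.abs(l1) + np.abs(l2)), img.shape))
```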

Journal: CoRR, 2013
Steven Thomas Smith

The techniques and analysis presented in this thesis provide new methods to solve optimization problems posed on Riemannian manifolds. These methods are applied to the subspace tracking problem found in adaptive signal processing and adaptive control. A new point of view is offered for the constrained optimization problem. Some classical optimization techniques on Euclidean space are generalize...
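As a small illustration of optimization on a Riemannian manifold in this spirit (not the thesis's own algorithms), the sketch below runs steepest descent for the Rayleigh quotient on the unit sphere, projecting the gradient onto the tangent space and using normalization as a simple retraction.

```python
import numpy as np

def sphere_rayleigh_descent(A, step=0.1, maxit=1000, tol=1e-10, seed=0):
    """Riemannian steepest descent on the unit sphere for f(x) = x.T A x.

    The Riemannian gradient is the Euclidean gradient projected onto the
    tangent space at x; each step is retracted back to the sphere by
    normalization.  Minimizing the Rayleigh quotient this way yields the
    smallest eigenpair of a symmetric matrix.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(A.shape[0]); x /= np.linalg.norm(x)
    for _ in range(maxit):
        g = 2 * (A @ x)                   # Euclidean gradient of x.T A x
        rg = g - (x @ g) * x              # projection onto the tangent space at x
        if np.linalg.norm(rg) < tol:
            break
        x = x - step * rg
        x /= np.linalg.norm(x)            # retraction back to the sphere
    return x @ A @ x, x

A = np.diag([1.0, 3.0, 6.0, 9.0])
val, vec = sphere_rayleigh_descent(A, step=0.05)
print(val)   # approximately 1.0
```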

2011
Ingolf Busch Oliver G. Ernst Elisabeth Ullmann

We present two expansions for the gradient of a random field. In the first approach, we differentiate its truncated Karhunen-Loève expansion. In the second approach, the Karhunen-Loève expansion of the random field gradient is computed directly. Both strategies require the solution of dense, symmetric matrix eigenvalue problems which can be handled efficiently by combining hierarchical matrix tec...
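A minimal sketch of the common building block, the truncated discrete Karhunen-Loève expansion obtained from a dense symmetric covariance eigenproblem; the covariance kernel, grid, and truncation level are illustrative assumptions, and no hierarchical-matrix acceleration is attempted.

```python
import numpy as np

def kl_expansion(C, n_terms):
    """Truncated discrete Karhunen-Loève expansion from a covariance matrix.

    C is the dense symmetric covariance matrix of the field sampled at the
    grid points.  The symmetric eigenvalue problem gives the KL modes; the
    n_terms modes with the largest eigenvalues are kept.
    """
    lam, V = np.linalg.eigh(C)                 # eigenvalues in ascending order
    idx = np.argsort(lam)[::-1][:n_terms]      # keep the largest eigenvalues
    return lam[idx], V[:, idx]

# Example: exponential covariance on a 1-D grid.  A realization of the
# truncated field is u = sum_k sqrt(lam_k) * xi_k * phi_k with xi_k ~ N(0,1).
x = np.linspace(0.0, 1.0, 200)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)
lam, phi = kl_expansion(C, n_terms=10)
xi = np.random.default_rng(0).standard_normal(10)
u = phi @ (np.sqrt(lam) * xi)
print(lam[:3])                                 # leading KL eigenvalues
```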

2009
Andrew V. Knyazev

Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver, the gradient iterative method with a fixed step size, for symmetric generalized eigenvalue problems, where we use the gradient of the Rayleigh quotient as an optimization direction. A sharp convergence rate bound fo...

[Chart: number of search results per year]