Search results for: eigenvalue gradient method

Number of results: 1735319

2005
Maxim Larin, Valery Il’in

in which A is a large sparse symmetric positive definite matrix, λ is an eigenvalue and u is a corresponding eigenvector. The evaluation of one or more of the smallest eigenpairs is of much practical interest for describing the characteristics of physical phenomena. For example, the smallest eigenvalues characterize the base frequencies of vibrating mechanical structures. Typically, the matrix A is a discret...
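As a rough illustration of the setting described above (not the authors' preconditioned method), the sketch below minimises the Rayleigh quotient of a symmetric positive definite matrix by steepest descent, with each step done exactly via a Rayleigh-Ritz solve on the plane spanned by the iterate and the residual. It assumes NumPy; the function name, test matrix, and iteration counts are illustrative choices.

```python
import numpy as np

def smallest_eigenpair_gradient(A, x0, steps=500, tol=1e-10):
    """Steepest descent on the Rayleigh quotient q(x) = (x' A x) / (x' x).

    For symmetric positive definite A the minimiser of q is an eigenvector
    for the smallest eigenvalue.  Each step does an exact "line search" via
    Rayleigh-Ritz on the 2-D space spanned by the iterate and the residual.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        Ax = A @ x
        lam = x @ Ax                          # Rayleigh quotient (x has unit norm)
        r = Ax - lam * x                      # gradient direction (up to a factor 2)
        if np.linalg.norm(r) < tol:
            break
        V, _ = np.linalg.qr(np.column_stack([x, r]))
        w, Z = np.linalg.eigh(V.T @ A @ V)    # 2x2 projected eigenproblem
        x = V @ Z[:, 0]                       # Ritz vector of the smallest Ritz value
        x /= np.linalg.norm(x)
    return x @ (A @ x), x

# Example: a random SPD matrix with a clear gap after the smallest eigenvalue.
# (For discretised operators the gradient method is usually combined with a
# preconditioner, which is the point of the paper summarised above.)
n = 200
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.concatenate(([1.0], np.linspace(4.0, 10.0, n - 1)))) @ Q.T
lam, x = smallest_eigenpair_gradient(A, rng.standard_normal(n))
print(lam)   # close to 1.0, the smallest eigenvalue
```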

2005
Ilya Lashuk, Merico E. Argentati, Evgueni Ovtchinnikov, Andrew V. Knyazev

We present preliminary results of an ongoing project to develop implementations of the Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) method for symmetric eigenvalue problems for the hypre and PETSc software packages. hypre and PETSc provide high quality domain decomposition and multigrid preconditioning for parallel computers. Our LOBPCG implementation for hypre is publicly available in hy...
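For readers who want to try LOBPCG without the hypre/PETSc stack, SciPy ships an independent implementation (scipy.sparse.linalg.lobpcg). The sketch below applies it to a 1-D Laplacian with a simple diagonal preconditioner; both the test matrix and the preconditioner are modest stand-ins for the multigrid and domain decomposition preconditioning mentioned above.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lobpcg

# 1-D Laplacian as a sparse SPD test matrix.
n = 100
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")

rng = np.random.default_rng(0)
X = rng.standard_normal((n, 4))          # block of 4 starting vectors

# M: a (here trivial) diagonal preconditioner; in practice a multigrid or
# domain decomposition solve, which is what the hypre/PETSc setting provides.
M = sp.diags(1.0 / A.diagonal())

vals, vecs = lobpcg(A, X, M=M, largest=False, tol=1e-8, maxiter=200)
print(vals)   # approximations to the 4 smallest eigenvalues
```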

Journal: SIAM J. Matrix Analysis Applications, 2009
Andrew V. Knyazev, Klaus Neymeyr

Preconditioned eigenvalue solvers (eigensolvers) are gaining popularity, but their convergence theory remains sparse and complex. We consider the simplest preconditioned eigensolver, the gradient iterative method with a fixed step size, for symmetric generalized eigenvalue problems, where we use the gradient of the Rayleigh quotient as an optimization direction. A sharp convergence rate bound fo...
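A minimal NumPy sketch of the kind of iteration analysed here: a fixed-step preconditioned gradient method for A x = λ B x that steps along the preconditioned residual, i.e. along the preconditioned gradient of the Rayleigh quotient. The diagonal test matrices, the preconditioner T approximating the inverse of A, the step size τ = 1, and the function name are illustrative; in general the step size must be chosen with care, which is exactly what a sharp convergence bound addresses.

```python
import numpy as np

def preconditioned_gradient_eig(A, B, T, x0, tau=1.0, iters=200):
    """Fixed-step preconditioned gradient iteration for A x = lambda B x.

    A, B symmetric (B positive definite), T an SPD preconditioner.  One step:
        x <- x - tau * T (A x - lambda(x) B x),
    where lambda(x) = (x' A x) / (x' B x) is the Rayleigh quotient whose
    gradient (up to scaling) is the residual A x - lambda(x) B x.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        lam = (x @ (A @ x)) / (x @ (B @ x))
        r = A @ x - lam * (B @ x)
        x = x - tau * (T @ r)
        x /= np.linalg.norm(x)
    lam = (x @ (A @ x)) / (x @ (B @ x))
    return lam, x

# Example: diagonal A with eigenvalues 1..n, B = I, and T ~ inverse of A.
n = 100
d = np.arange(1.0, n + 1)
A, B, T = np.diag(d), np.eye(n), np.diag(1.0 / d)
lam, x = preconditioned_gradient_eig(A, B, T, np.random.default_rng(1).standard_normal(n))
print(lam)   # close to 1.0, the smallest eigenvalue
```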

2017
Hongtao Chen, Hailong Guo, Zhimin Zhang, Qingsong Zou

In this article, we construct a C0 linear finite element method for two fourth-order eigenvalue problems: the biharmonic and the transmission eigenvalue problems. The basic idea of our construction is to use a gradient recovery operator to compute the higher-order derivatives of a C0 piecewise linear function, which do not exist in the classical sense. For the biharmonic eigenvalue problem, the op...
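The recovery idea can be illustrated in one dimension. The sketch below (a simplified NumPy stand-in, not the article's operator or method) recovers nodal derivative values of a C0 piecewise linear function by averaging adjacent element slopes; since the recovered gradient is again piecewise linear, applying the operator twice yields an approximation to the second derivative that does not exist classically.

```python
import numpy as np

def recover_gradient(x, u):
    """Recover nodal values of u' from a C0 piecewise-linear function.

    x : increasing node coordinates, u : nodal values.  On each element the
    derivative of u is constant; at interior nodes we take a length-weighted
    average of the two adjacent element slopes, at the endpoints we copy the
    neighbouring slope.  The result is again a C0 piecewise-linear function,
    so the operator can be applied twice to get second derivatives.
    """
    h = np.diff(x)                    # element lengths
    s = np.diff(u) / h                # element slopes (piecewise constant u')
    g = np.empty_like(u)
    g[1:-1] = (h[:-1] * s[:-1] + h[1:] * s[1:]) / (h[:-1] + h[1:])
    g[0], g[-1] = s[0], s[-1]
    return g

# Example: recovered first and second derivatives of sin on a uniform mesh.
x = np.linspace(0.0, np.pi, 101)
u = np.sin(x)
du = recover_gradient(x, u)        # ~ cos(x) at the nodes
d2u = recover_gradient(x, du)      # ~ -sin(x): the recovered second derivative
print(np.max(np.abs(d2u[1:-1] + np.sin(x)[1:-1])))   # small recovery error
```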

2015
Jiashang Jiang

Model updating is an inverse eigenvalue problem which concerns the modification of an existing but inaccurate model with measured modal data. In this paper, an efficient gradient-based iterative method for updating the mass, damping and stiffness matrices simultaneously using a few complex measured modal data is developed. Convergence analysis indicates that the iterative solutions always co...
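As a much-simplified illustration of a gradient-based update (undamped case, stiffness matrix only, real modal data, rather than the paper's simultaneous mass/damping/stiffness update with complex data), one can descend on the Frobenius-norm residual of the measured eigenpairs. The helper name, matrix sizes, and step size below are hypothetical choices, assuming NumPy.

```python
import numpy as np

def update_stiffness(K0, M, Phi, Lam, step=1e-2, iters=2000):
    """Gradient-descent sketch of model updating (undamped, stiffness only).

    Minimises F(K) = || K Phi - M Phi Lam ||_F^2 over symmetric K, so the
    updated stiffness matrix reproduces the measured modal data (Lam, Phi).
    dF/dK = 2 R Phi' with residual R = K Phi - M Phi Lam; the gradient is
    symmetrised so the iterates stay symmetric.
    """
    K = K0.copy()
    for _ in range(iters):
        R = K @ Phi - M @ Phi @ Lam
        G = 2.0 * R @ Phi.T
        G = 0.5 * (G + G.T)              # project onto symmetric matrices
        K -= step * G
    return K

# Example: "measure" three modes of a true model, perturb K, then update it back.
rng = np.random.default_rng(0)
n, m = 8, 3
Q = rng.standard_normal((n, n)); K_true = Q @ Q.T + n * np.eye(n)
M = np.eye(n)
w, V = np.linalg.eigh(K_true)
Lam, Phi = np.diag(w[:m]), V[:, :m]                 # "measured" modal data
P = rng.standard_normal((n, n))
K0 = K_true + 0.25 * (P + P.T)                      # inaccurate initial model
K = update_stiffness(K0, M, Phi, Lam)
print(np.linalg.norm(K @ Phi - M @ Phi @ Lam))      # small residual after updating
```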

Journal: Journal of Functional Analysis, 1995

1998
Andreas Stathopoulos, Yousef Saad

The (Jacobi-)Davidson method, which is a popular preconditioned extension to the Arnoldi method for solving large eigenvalue problems, is often used with restarting. This has significant performance shortcomings, since important components of the invariant subspace may be discarded. One way of saving more information at restart is through “thick” restarting, a technique that involves keeping mo...
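A generic NumPy sketch of the thick-restart idea, not a reproduction of the paper's algorithm or implementation: a Davidson-type loop with a diagonal preconditioner that, when the search space reaches its maximum size, collapses it onto several Ritz vectors instead of a single one. The function name, block sizes, and test matrix are illustrative.

```python
import numpy as np

def davidson_thick_restart(A, nev=4, m_max=20, keep=8, iters=30, tol=1e-8):
    """Davidson-type eigensolver with "thick" restarting (illustrative sketch).

    A : symmetric matrix (dense here, for simplicity).  The search subspace is
    grown with diagonally preconditioned residuals; when it reaches m_max
    columns it is collapsed onto the `keep` best Ritz vectors rather than one,
    so invariant-subspace information a plain restart would discard is kept.
    """
    n = A.shape[0]
    d = A.diagonal()
    rng = np.random.default_rng(0)
    V = np.linalg.qr(rng.standard_normal((n, nev)))[0]   # initial orthonormal basis
    for _ in range(iters):
        theta, S = np.linalg.eigh(V.T @ A @ V)            # Ritz values/vectors
        X = V @ S[:, :nev]                                # nev best Ritz vectors
        R = A @ X - X * theta[:nev]                       # block of residuals
        if np.all(np.linalg.norm(R, axis=0) < tol):
            return theta[:nev], X
        if V.shape[1] + nev > m_max:                      # thick restart
            V = V @ S[:, :keep]
        for j in range(nev):                              # Davidson expansion
            t = R[:, j] / (d - theta[j] + 1e-12)          # diagonal preconditioner
            t -= V @ (V.T @ t)                            # orthogonalise against V
            t -= V @ (V.T @ t)                            # second pass for safety
            nt = np.linalg.norm(t)
            if nt > 1e-12:
                V = np.column_stack([V, t / nt])
    return theta[:nev], X

# Example: lowest eigenvalues of a diagonally dominant random symmetric matrix.
n = 400
rng = np.random.default_rng(1)
B = rng.standard_normal((n, n))
A = np.diag(np.arange(1.0, n + 1)) + 0.01 * (B + B.T)
vals, _ = davidson_thick_restart(A)
print(vals)   # approximations to the 4 smallest eigenvalues
```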

1996
B. Bunk

provides the smallest eigenvalue, together with the corresponding eigenvector. The minimum of a functional can be found iteratively by the method of conjugate gradients (CG) [1]; its application to the special case of the Ritz functional has been worked out by Geradin [2] and Fried [3]. If a small number of the lowest eigenvalues is required instead, q(x) may be minimised repeatedly, restricting x to ...
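A sketch along these lines, assuming NumPy (function names, deflation shift, and test matrix are illustrative, not the cited algorithm): nonlinear conjugate gradients on the Ritz functional q(x) = (x'Ax)/(x'x), with the line search done exactly by a Rayleigh-Ritz step on span{x, p}; lower eigenvalues are then found by repeating the minimisation with the already-found eigenvectors deflated out of the way.

```python
import numpy as np

def rayleigh_quotient_cg(A, x0, iters=200, tol=1e-10):
    """Conjugate-gradient minimisation of the Ritz functional q(x) = x'Ax / x'x.

    A is symmetric; the minimum of q is the smallest eigenvalue and the
    minimiser a corresponding eigenvector.  The search direction mixes the
    gradient with the previous direction (Polak-Ribiere), and the line search
    is done exactly by Rayleigh-Ritz on span{x, p}.
    """
    x = x0 / np.linalg.norm(x0)
    Ax = A @ x
    lam = x @ Ax
    g = 2.0 * (Ax - lam * x)                 # gradient of q at unit-norm x
    p, g_old = -g, g
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        V, _ = np.linalg.qr(np.column_stack([x, p]))
        w, Z = np.linalg.eigh(V.T @ A @ V)   # exact minimisation over span{x, p}
        x, lam = V @ Z[:, 0], w[0]
        g = 2.0 * (A @ x - lam * x)
        beta = max(0.0, g @ (g - g_old) / (g_old @ g_old))   # Polak-Ribiere+
        p = -g + beta * p
        g_old = g
    return lam, x

# Example: the three lowest eigenvalues by repeated minimisation; eigenvectors
# already found are deflated by a large rank-one shift, which effectively
# restricts the minimisation to their orthogonal complement.
n = 200
rng = np.random.default_rng(2)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(np.linspace(1.0, 100.0, n)) @ Q.T
found = []
for _ in range(3):
    A_defl = A + sum((200.0 * np.outer(v, v) for v in found), np.zeros_like(A))
    lam, v = rayleigh_quotient_cg(A_defl, rng.standard_normal(n))
    found.append(v)
    print(lam)   # the three lowest eigenvalues, in turn
```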

Journal: CoRR, 2017
Weinan E, Bing Yu

We propose a deep learning based method, the Deep Ritz Method, for numerically solving variational problems, particularly the ones that arise from partial differential equations. The Deep Ritz method is naturally nonlinear, naturally adaptive and has the potential to work in rather high dimensions. The framework is quite simple and fits well with the stochastic gradient descent method used in d...
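A minimal Deep Ritz style sketch, assuming PyTorch; the network size, sampler, optimiser settings, and the test problem -u'' = π² sin(πx) on (0,1) with homogeneous Dirichlet data are illustrative choices, not the authors' setup. The Ritz energy is estimated by Monte Carlo sampling of the domain and minimised with stochastic gradient descent (Adam), which is the fit with SGD the abstract refers to.

```python
import torch

# Deep Ritz sketch for -u'' = f on (0,1), u(0) = u(1) = 0, with f = pi^2 sin(pi x),
# whose exact solution is u(x) = sin(pi x).  The energy
#   E[u] = integral of (0.5 u'^2 - f u) dx
# is estimated by Monte Carlo sampling and minimised over the network weights.
# Boundary conditions are built in via u(x) = x (1 - x) * net(x).

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

def u(x):
    return x * (1.0 - x) * net(x)            # satisfies u(0) = u(1) = 0 exactly

for step in range(5000):
    x = torch.rand(256, 1, requires_grad=True)           # Monte Carlo sample of (0,1)
    ux = u(x)
    du = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    f = torch.pi ** 2 * torch.sin(torch.pi * x)
    energy = (0.5 * du ** 2 - f * ux).mean()              # Ritz energy estimate
    opt.zero_grad()
    energy.backward()
    opt.step()

# Compare with the exact solution sin(pi x) at a few points.
xt = torch.linspace(0.05, 0.95, 5).reshape(-1, 1)
print(u(xt).detach().squeeze())
print(torch.sin(torch.pi * xt).squeeze())
```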

Chart: number of search results per year