Search results for: weakly chebyshev subspace

Number of results: 64870

Journal: J. Comput. Physics, 2010
Yunkai Zhou

We propose a block Davidson-type subspace iteration using Chebyshev polynomial filters for large symmetric/Hermitian eigenvalue problems. The method consists of three essential components. The first is an adaptive procedure for constructing efficient block Chebyshev polynomial filters; the second is an inner–outer restart technique inside a Chebyshev–Davidson iteration that reduces the computati...
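
The abstract is cut off, but its core ingredient, a block Chebyshev polynomial filter applied through the three-term recurrence, can be sketched compactly. The NumPy snippet below is a minimal illustration under assumed names (`chebyshev_filter`, a random dense symmetric test matrix); the paper's actual filter adds scaling to avoid overflow and an adaptive choice of the degree and filter bounds.

```python
import numpy as np

def chebyshev_filter(A, X, degree, a, b):
    """Apply a degree-`degree` Chebyshev polynomial filter to the block X.

    [a, b] is the unwanted part of the spectrum of the symmetric matrix A;
    it is mapped onto [-1, 1], where Chebyshev polynomials stay bounded,
    while the wanted (lower) eigenvalues are amplified.  A production code
    would add the scaling from the paper to prevent overflow at high degree.
    """
    e = (b - a) / 2.0            # half-width of the unwanted interval
    c = (b + a) / 2.0            # centre of the unwanted interval
    Y = (A @ X - c * X) / e      # degree-1 term of the three-term recurrence
    for _ in range(2, degree + 1):
        Y_new = 2.0 * (A @ Y - c * Y) / e - X
        X, Y = Y, Y_new
    return Y

# Toy usage: push a random block toward the five lowest eigenvectors.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
A = (A + A.T) / 2                          # symmetric test matrix
eigs = np.linalg.eigvalsh(A)
X = rng.standard_normal((200, 5))          # random starting block
Xf = chebyshev_filter(A, X, degree=10, a=eigs[5], b=eigs[-1])
Q, _ = np.linalg.qr(Xf)                    # orthonormalise the filtered block
```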

Journal: Physical Review E: Statistical, Nonlinear, and Soft Matter Physics, 2006
Yunkai Zhou, Yousef Saad, Murilo L. Tiago, James R. Chelikowsky

Solving the Kohn-Sham eigenvalue problem constitutes the most computationally expensive part in self-consistent density functional theory (DFT) calculations. In a previous paper, we have proposed a nonlinear Chebyshev-filtered subspace iteration method, which avoids computing explicit eigenvectors except at the first self-consistent-field (SCF) iteration. The method may be viewed as an approach...
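
For context, the step that lets such a method avoid explicit eigenvectors after the first SCF iteration is a Rayleigh-Ritz projection onto the Chebyshev-filtered subspace. The sketch below shows only that projection on a toy symmetric matrix; the function name and the random stand-in "Hamiltonian" are illustrative assumptions, not the authors' code.

```python
import numpy as np

def rayleigh_ritz(H, Q):
    """One Rayleigh-Ritz step: project H onto span(Q), diagonalise the small
    projected matrix, and rotate the basis.  In a Chebyshev-filtered SCF loop
    this cheap step stands in for an explicit eigensolve of the full matrix
    after the first iteration."""
    Hs = Q.T @ (H @ Q)            # small projected matrix
    w, U = np.linalg.eigh(Hs)     # Ritz values and vectors
    return w, Q @ U               # rotated, approximately eigen, basis

# Toy usage with a random symmetric stand-in for the Hamiltonian:
rng = np.random.default_rng(1)
H = rng.standard_normal((300, 300))
H = (H + H.T) / 2
Q, _ = np.linalg.qr(rng.standard_normal((300, 8)))   # orthonormal trial subspace
ritz_values, ritz_vectors = rayleigh_ritz(H, Q)
```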

2016
Phani Motamarri, Vikram Gavini, Michael Ortiz

We present a spectrum-splitting approach to conduct all-electron Kohn-Sham density functional theory (DFT) calculations by employing Fermi-operator expansion of the Kohn-Sham Hamiltonian. The proposed approach splits the subspace containing the occupied eigenspace into a core-subspace, spanned by the core eigenfunctions, and its complement, the valence-subspace, and thereby enables an efficient...
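
A plain (non-split) Fermi-operator expansion can be sketched in a few lines: expand the Fermi-Dirac function in Chebyshev polynomials over the Hamiltonian's spectral range and evaluate the series with the three-term recurrence. The snippet below is such a baseline sketch with assumed names and a random test matrix; the paper's spectrum-splitting refinement, which treats the core eigenspace separately to lower the required degree, is not reproduced here.

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev
from scipy.special import expit

def fermi_operator(H, mu, kT, deg=150):
    """Plain Fermi-operator expansion: approximate the density matrix f(H),
    with f the Fermi-Dirac function, by a Chebyshev series evaluated through
    the three-term recurrence.  The spectrum-splitting refinement of the
    paper (separate treatment of the core eigenspace) is not shown here."""
    w = np.linalg.eigvalsh(H)                 # toy spectral bounds; real codes estimate these
    emin, emax = w[0], w[-1]
    fermi = lambda e: expit(-(e - mu) / kT)
    c = Chebyshev.interpolate(fermi, deg, domain=[emin, emax]).coef
    n = H.shape[0]
    Hs = (2.0 * H - (emax + emin) * np.eye(n)) / (emax - emin)   # spectrum in [-1, 1]
    Tprev, Tcur = np.eye(n), Hs
    P = c[0] * Tprev + c[1] * Tcur
    for ck in c[2:]:
        Tprev, Tcur = Tcur, 2.0 * Hs @ Tcur - Tprev
        P += ck * Tcur
    return P

# Toy check: the "electron count" trace(P) should match a direct evaluation on eigenvalues.
rng = np.random.default_rng(4)
H = rng.standard_normal((150, 150))
H = (H + H.T) / 2
P = fermi_operator(H, mu=0.0, kT=1.0)
w = np.linalg.eigvalsh(H)
print(np.trace(P), np.sum(expit(-w / 1.0)))
```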

Journal: Numerical Lin. Alg. with Applic., 2000
Luca Bergamaschi, Marco Vianello

In this paper we compare Krylov subspace methods with Chebyshev series expansion for approximating the matrix exponential operator on large, sparse, symmetric matrices. Experimental results on very large negative-definite matrices, arising from (2D and 3D) FE and FD spatial discretizations of linear parabolic PDEs, demonstrate that the Chebyshev method can be an effective alternative...
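
For a symmetric matrix with spectrum in [lambda_min, lambda_max], the Chebyshev expansion of exp(A)v has modified Bessel functions as coefficients. The sketch below illustrates the idea on a small dense negative-definite matrix; the function name, the toy matrix, and the use of exact eigenvalue bounds are assumptions made for brevity, not the authors' implementation.

```python
import numpy as np
from scipy.special import ive        # exponentially scaled modified Bessel functions
from scipy.linalg import expm        # dense reference for the toy comparison

def expm_chebyshev(A, v, m=40):
    """Approximate exp(A) @ v by a degree-m Chebyshev series for a symmetric
    matrix A.  The coefficients are modified Bessel functions; the scaled
    ive keeps everything well-sized when the spectrum is strongly negative."""
    lam = np.linalg.eigvalsh(A)                       # toy bounds; real codes estimate these
    c, h = (lam[-1] + lam[0]) / 2.0, (lam[-1] - lam[0]) / 2.0
    B = (A - c * np.eye(len(A))) / h                  # spectrum of B lies in [-1, 1]
    scale = np.exp(c + h)                             # exp(c) * I_k(h) = exp(c + h) * ive(k, h)
    Tprev, Tcur = v, B @ v                            # T_0(B) v and T_1(B) v
    y = scale * (ive(0, h) * Tprev + 2.0 * ive(1, h) * Tcur)
    for k in range(2, m + 1):
        Tprev, Tcur = Tcur, 2.0 * (B @ Tcur) - Tprev
        y += 2.0 * scale * ive(k, h) * Tcur
    return y

# Toy comparison on a small dense negative-definite matrix:
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 100))
A = -(A @ A.T) / 100 - 0.1 * np.eye(100)
v = rng.standard_normal(100)
print(np.linalg.norm(expm_chebyshev(A, v) - expm(A) @ v))
```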

2013
M. M. Hosseini

In this paper, an Adomian decomposition method using Chebyshev orthogonal polynomials is proposed to solve a well-known class of weakly singular Volterra integral equations. Comparison with the collocation method using polynomial spline approximation with Legendre-Radau points reveals that the Adomian decomposition method using Chebyshev orthogonal polynomials is of high accuracy and reduces th...

Journal: Numerical Lin. Alg. with Applic., 2000
Yousef Saad

The convergence behavior of a number of algorithms based on minimizing residual norms over Krylov subspaces is not well understood. Residual or error bounds currently available are either too loose or depend on unknown constants which can be very large. In this paper we take another look at traditional as well as alternative ways of obtaining upper bounds on residual norms. In particular, we d...

Journal: SIAM J. Matrix Analysis Applications, 2004
Christopher A. Beattie, Mark Embree, John Rossi

The performance of Krylov subspace eigenvalue algorithms for large matrices can be measured by the angle between a desired invariant subspace and the Krylov subspace. We develop general bounds for this convergence that include the effects of polynomial restarting and impose no restrictions concerning the diagonalizability of the matrix or its degree of nonnormality. Associated with a desired se...
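
The quantity these bounds control, the largest principal angle between a Krylov subspace and a desired invariant subspace, is easy to measure numerically. The toy sketch below does just that with scipy.linalg.subspace_angles; the Krylov basis routine, the random symmetric matrix, and the choice of target subspace are illustrative assumptions, not material from the paper.

```python
import numpy as np
from scipy.linalg import subspace_angles

def krylov_basis(A, v, m):
    """Orthonormal basis of the Krylov subspace K_m(A, v)."""
    V = np.empty((len(v), m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(1, m):
        w = A @ V[:, j - 1]
        for _ in range(2):                       # two Gram-Schmidt passes for stability
            w -= V[:, :j] @ (V[:, :j].T @ w)
        V[:, j] = w / np.linalg.norm(w)
    return V

# Largest principal angle between K_m(A, v) and the invariant subspace
# of the three largest eigenvalues, for growing m:
rng = np.random.default_rng(3)
A = rng.standard_normal((200, 200))
A = (A + A.T) / 2
_, U = np.linalg.eigh(A)
target = U[:, -3:]                               # desired invariant subspace
v = rng.standard_normal(200)
for m in (5, 10, 20, 30):
    angles = subspace_angles(krylov_basis(A, v, m), target)
    print(m, angles[0])                          # angles are returned in descending order
```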

2010
Wu Li

In this paper we show some very interesting properties of weak Chebyshev subspaces and use them to simplify Pinkus's characterization of A-subspaces of C[a, b]. As a consequence we obtain that if the metric projection PG from C[a, b] onto a finite-dimensional subspace G has a continuous selection and elements of G have no common zeros on (a, b), then G is an A-subspace.

N. Eftekhari

Let T be a compact Hausdorff topological space and let M denote an n-dimensional subspace of C(T), the space of real-valued continuous functions on T, equipped with the uniform norm. Zukhovitskii [7] attributes the Basic Theorem to E. Ya. Remez and gives a proof by duality. He also gives a proof due to Shnirel'man, which uses Helly's Theorem. The present paper obtains a...

2008
Andrew J. Wathen, Tyrone Rees, Victor Pereyra

It is widely believed that Krylov subspace iterative methods are better than Chebyshev semi-iterative methods. When the solution of a linear system with a symmetric and positive definite coefficient matrix is required, then the Conjugate Gradient method will compute the optimal approximate solution from the appropriate Krylov subspace; that is, it will implicitly compute the optimal polynomial. ...
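
Below is a minimal side-by-side sketch of the two methods the abstract compares, assuming exact eigenvalue bounds are available for the Chebyshev semi-iteration (the point being that CG needs no such bounds). The test system, tolerances, and helper names are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg, spsolve

def chebyshev_iteration(A, b, lmin, lmax, tol=1e-10, maxit=500):
    """Chebyshev semi-iterative method for an SPD system A x = b, given
    eigenvalue bounds lmin <= lambda(A) <= lmax.  Unlike CG it needs these
    bounds, but it uses no inner products."""
    theta, delta = (lmax + lmin) / 2.0, (lmax - lmin) / 2.0
    sigma1 = theta / delta
    x = np.zeros_like(b)
    r = b - A @ x
    rho, d = 1.0 / sigma1, r / theta
    for _ in range(maxit):
        x = x + d
        r = r - A @ d
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        rho_new = 1.0 / (2.0 * sigma1 - rho)
        d = rho_new * rho * d + (2.0 * rho_new / delta) * r
        rho = rho_new
    return x

# Toy comparison on a shifted 1D Laplacian whose eigenvalue bounds are known exactly:
n = 500
A = diags([-1, 3, -1], [-1, 0, 1], shape=(n, n), format="csc")
lmin = 3 - 2 * np.cos(np.pi / (n + 1))
lmax = 3 + 2 * np.cos(np.pi / (n + 1))
b = np.ones(n)
x_ref = spsolve(A, b)
x_cheb = chebyshev_iteration(A, b, lmin, lmax)
x_cg, _ = cg(A, b)                    # default (looser) tolerance
print("Chebyshev error:", np.linalg.norm(x_cheb - x_ref))
print("CG error:       ", np.linalg.norm(x_cg - x_ref))
```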
