Search results for: chebyshev acceleration technique

Number of results: 658,725

Journal: Iranian Journal of Optimization, 2009
S.H. Nasseri, H. Attari

In this paper, the Chebyshev acceleration technique is used to solve the fuzzy linear system (FLS). This method is discussed in detail and followed by a summary of some other acceleration techniques. Moreover, we show that in some situations where methods such as Jacobi, Gauss-Seidel, SOR and conjugate gradient are divergent, our proposed method is applicable and the acquired results are illustrate...


2014
Ruiping Wen, Guoyan Meng, Chuanlong Wang

In this paper, we present a parallel quasi-Chebyshev acceleration applied to the nonoverlapping multisplitting iterative method for linear systems whose coefficient matrix is either an H-matrix or a symmetric positive definite matrix. First, m parallel iterations are implemented on m different processors. Second, based on the l1-norm or l2-norm, the m optimization models are parallelly treat...

2003

All kinematic studies of human motion employ measurement techniques which introduce noise into displacement data. Commonly, the data, as time-related functions, are differentiated to produce velocity and acceleration information. Unfortunately, differentiation amplifies the noise present to such an extent that additional signal treatment is essential. The study was conducted to compare film gen...
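The noise-amplification effect this abstract describes is easy to reproduce numerically. The sketch below is illustrative only (the signal, sampling rate, and noise level are invented, not taken from the study): it differentiates a noisy displacement trace with central differences and compares the noise floor before and after differentiation.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)                   # 1 s of data at 200 Hz (assumed)
dt = t[1] - t[0]

clean = np.sin(2 * np.pi * t)                    # "true" displacement
noisy = clean + rng.normal(0.0, 0.001, t.size)   # small additive measurement noise

vel_clean = np.gradient(clean, dt)               # central-difference velocity
vel_noisy = np.gradient(noisy, dt)
acc_noisy = np.gradient(vel_noisy, dt)           # second derivative is noisier still

# Noise level before vs. after one differentiation
err_disp = np.std(noisy - clean)                 # ~ the injected noise std
err_vel = np.std(vel_noisy - vel_clean)          # amplified by ~1/dt
```

Because central differences divide the noise by the (small) sampling interval, `err_vel` comes out orders of magnitude larger than `err_disp`, which is why smoothing or filtering is applied before differentiating kinematic data.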

Journal: Journal of Computational and Applied Mathematics, 2021

In this paper, by introducing a class of relaxed filtered Krylov subspaces, we propose a subspace method for computing eigenvalues with largest real parts, and corresponding eigenvectors, of non-symmetric matrices. As by-products, generalizations of the Chebyshev–Davidson method for solving eigenvalue problems are also presented. We give a convergence analysis of the complex Chebyshev polynomial, which plays a significant rol...

2012
Qun Lin, Wujian Peng

An acceleration scheme based on stationary iterative methods is presented for solving linear systems of equations. Unlike the Chebyshev semi-iterative method, which requires accurate estimation of the bounds on the iteration matrix eigenvalues, we use a wide range of Chebyshev-like polynomials for the accelerating process without estimating the bounds of the iteration matrix. A detailed error analysis is...
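For reference, the classical Chebyshev semi-iteration that this snippet contrasts against can be sketched as follows. This is a minimal illustration following the standard recurrence (as in Saad's textbook presentation); the matrix, right-hand side, and spectral bounds are invented for the example and are assumptions, not from the paper.

```python
import numpy as np

def chebyshev_solve(A, b, lam_min, lam_max, x0=None, iters=40):
    """Chebyshev semi-iteration for Ax = b, assuming A is symmetric positive
    definite with all eigenvalues inside [lam_min, lam_max]. Each step uses
    only these fixed spectral bounds (no inner products, unlike CG)."""
    x = np.zeros_like(b, dtype=float) if x0 is None else x0.astype(float)
    theta = 0.5 * (lam_max + lam_min)   # center of the eigenvalue interval
    delta = 0.5 * (lam_max - lam_min)   # half-width of the interval
    sigma = theta / delta
    rho = 1.0 / sigma
    r = b - A @ x
    d = r / theta                        # first correction
    for _ in range(iters):
        x = x + d
        r = r - A @ d
        rho_new = 1.0 / (2.0 * sigma - rho)
        # three-term Chebyshev recurrence for the next correction
        d = rho_new * rho * d + (2.0 * rho_new / delta) * r
        rho = rho_new
    return x

# Usage: small SPD system; eigenvalues of A are (7 ± sqrt(5))/2 ≈ 2.38 and 4.62,
# so [2, 5] is a valid (if loose) pair of bounds.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = chebyshev_solve(A, b, lam_min=2.0, lam_max=5.0)
```

The point the abstract makes is visible in the signature: `lam_min` and `lam_max` must be supplied up front, and a poor estimate degrades (or destroys) convergence, which motivates bound-free acceleration schemes.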

Journal: :IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences 2022

Deep unfolding is a promising deep-learning technique whose network architecture is based on expanding the recursive structure of existing iterative algorithms. Although convergence acceleration is a remarkable advantage of deep unfolding, its theoretical aspects have not been revealed yet. The first half of this study details the convergence analysis of deep-unfolded gradient descent (DUGD), whose trainable parameters are step s...
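As a structural illustration only: in deep unfolding, each network "layer" is one iteration of the underlying algorithm with its own trainable parameter. The sketch below shows that structure for gradient descent on a least-squares objective; the matrix, data, and step sizes are invented stand-ins (real DUGD learns the per-layer step sizes by training, which is omitted here).

```python
import numpy as np

def dugd(A, y, gammas, x0=None):
    """Deep-unfolded gradient descent structure for min 0.5*||y - Ax||^2.
    Each entry of `gammas` is one unfolded layer's (here fixed, normally
    trainable) step size."""
    x = np.zeros(A.shape[1]) if x0 is None else x0.copy()
    for g in gammas:                        # one "layer" per step size
        x = x - g * (A.T @ (A @ x - y))     # gradient step on the objective
    return x

# Usage: a trivially solvable system, with 60 identical placeholder step sizes.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
y = np.array([1.0, 2.0])
x = dugd(A, y, gammas=[0.2] * 60)           # converges to the solution [1, 1]
```

Training would replace the constant `gammas` with learned values, and the convergence-acceleration question the abstract studies is precisely how those learned step sizes compare with classical (e.g. Chebyshev-type) step-size schedules.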

1997
Vincent Heuveline, Miloud Sadkane

We propose a restarted Arnoldi method with Faber polynomials and discuss its use for computing the rightmost eigenvalues of large non-Hermitian matrices. We illustrate, with the help of some practical test problems, the benefit obtained from the Faber acceleration by comparing this method with the Chebyshev-based acceleration. A comparison with the implicitly restarted Arnoldi method is also ...
