Search results for: dai liao conjugate gradient method

Number of results: 1,758,206

Journal: :J. Applied Mathematics 2012
Jin-kui Liu Xianglin Du Kairong Wang

Journal: :J. Comput. Physics 2009
Jianke Yang

In this paper, Newton-conjugate-gradient methods are developed for solitary wave computations. These methods are based on Newton iterations, coupled with conjugate-gradient iterations to solve the resulting linear Newton-correction equation. When the linearization operator is self-adjoint, the preconditioned conjugate-gradient method is proposed to solve this linear equation. If the lineariz...
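The inner-outer structure this abstract describes (outer Newton iterations, with plain conjugate gradient solving each self-adjoint Newton-correction equation) can be sketched as follows. The cubic test system, the matrix `A`, and all tolerances below are illustrative assumptions, not taken from the paper:

```python
def cg_solve(matvec, b, tol=1e-10, max_iter=50):
    # Plain conjugate gradient for a symmetric positive definite operator.
    x = [0.0] * len(b)
    r = b[:]
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = matvec(p)
        alpha = rs / sum(pi * ai for pi, ai in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]      # SPD linear part (illustrative)
b = [1.0, -2.0]

def F(u):
    # Residual F(u) = A u + u^3 - b; the mild cubic keeps the problem nonlinear.
    return [sum(A[i][j] * u[j] for j in range(2)) + u[i] ** 3 - b[i]
            for i in range(2)]

def jacobian_matvec(u, v):
    # J(u) v with J(u) = A + diag(3 u_i^2), which is self-adjoint and SPD here.
    return [sum(A[i][j] * v[j] for j in range(2)) + 3.0 * u[i] ** 2 * v[i]
            for i in range(2)]

def newton_cg(u, tol=1e-12, max_iter=30):
    for _ in range(max_iter):
        r = F(u)
        if sum(ri * ri for ri in r) < tol:
            break
        # Newton-correction equation J(u) delta = -F(u), solved by inner CG.
        delta = cg_solve(lambda v: jacobian_matvec(u, v), [-ri for ri in r])
        u = [ui + di for ui, di in zip(u, delta)]
    return u

root = newton_cg([0.0, 0.0])
```

Because the Jacobian here is symmetric positive definite, plain CG is applicable for the inner solve; for non-self-adjoint linearizations, the abstract's truncated continuation suggests a different inner solver would be needed.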

Journal: :SIAM Journal on Optimization 1999
Yu-Hong Dai Ya-Xiang Yuan

Conjugate gradient methods are widely used for unconstrained optimization, especially for large-scale problems. However, the strong Wolfe conditions are usually used in the analyses and implementations of conjugate gradient methods. This paper presents a new version of the conjugate gradient method, which converges globally provided the line search satisfies the standard Wolfe conditions. The condit...
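The method this abstract describes is commonly known as the Dai-Yuan conjugate gradient method, with update parameter beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). A minimal sketch follows; the 2-variable quadratic, the Armijo backtracking line search (the paper analyses the standard Wolfe conditions, which are not implemented here), the restart safeguards, and all tolerances are illustrative assumptions:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def f(x):
    # Illustrative test function: f(x, y) = (x - 1)^2 + 2 (y + 2)^2.
    return (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 2.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)]

def backtracking(x, d, g, c=1e-4, shrink=0.5):
    # Simple Armijo backtracking, used here only to keep the sketch short.
    alpha, fx, gd = 1.0, f(x), dot(g, d)
    for _ in range(60):
        if f([xi + alpha * di for xi, di in zip(x, d)]) <= fx + c * alpha * gd:
            break
        alpha *= shrink
    return alpha

def dai_yuan_cg(x0, tol=1e-8, max_iter=500):
    x = x0[:]
    g = grad(x)
    d = [-gi for gi in g]
    for _ in range(max_iter):
        if dot(g, g) < tol * tol:
            break
        if dot(g, d) >= 0.0:
            d = [-gi for gi in g]    # safeguard: restart if not a descent direction
        alpha = backtracking(x, d, g)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - go for gn, go in zip(g_new, g)]
        denom = dot(d, y)
        # Dai-Yuan update: beta = ||g_{k+1}||^2 / (d_k^T y_k); fall back to
        # steepest descent when the denominator degenerates.
        beta = dot(g_new, g_new) / denom if abs(denom) > 1e-12 else 0.0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_min = dai_yuan_cg([0.0, 0.0])   # converges near the minimizer (1, -2)
```

Under a Wolfe line search the denominator d_k^T y_k stays positive and the restart safeguards become unnecessary; they are included only because this sketch substitutes Armijo backtracking.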

2008
Neculai Andrei

Conjugate gradient algorithms are very powerful methods for solving large-scale unconstrained optimization problems, characterized by low memory requirements and strong local and global convergence properties. Over 25 variants of different conjugate gradient methods are known. In this paper we propose a fundamentally different method, in which the well-known parameter β_k is computed by an appro...

H. Attari S.H. Nasseri

In this paper, the Chebyshev acceleration technique is used to solve the fuzzy linear system (FLS). This method is discussed in detail, followed by a summary of some other acceleration techniques. Moreover, we show that in some situations where methods such as Jacobi, Gauss-Seidel, SOR, and conjugate gradient are divergent, our proposed method is applicable, and the acquired results are illustrate...
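For reference, the classical Chebyshev iteration on an ordinary (crisp) SPD system looks as follows; the fuzzy extension discussed in the abstract is not reproduced here, and the test matrix and spectral bounds are illustrative assumptions:

```python
def matvec(A, v):
    return [sum(a * vi for a, vi in zip(row, v)) for row in A]

def chebyshev_iteration(A, b, lam_min, lam_max, n_iter=40):
    # Chebyshev (semi-)iterative method for A x = b. Unlike CG it needs
    # bounds [lam_min, lam_max] enclosing the spectrum of the SPD matrix A,
    # but it involves no inner products, which aids parallel implementation.
    d = (lam_max + lam_min) / 2.0
    c = (lam_max - lam_min) / 2.0
    x = [0.0] * len(b)
    r = b[:]                                  # residual of the zero initial guess
    p, alpha = None, None
    for k in range(n_iter):
        if k == 0:
            p = r[:]
            alpha = 1.0 / d
        else:
            beta = 0.5 * (c * alpha) ** 2 if k == 1 else (c * alpha / 2.0) ** 2
            alpha = 1.0 / (d - beta / alpha)
            p = [ri + beta * pi for ri, pi in zip(r, p)]
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        Ap = matvec(A, p)
        r = [ri - alpha * ai for ri, ai in zip(r, Ap)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]      # SPD; eigenvalues (7 ± sqrt(5))/2 lie in [2, 5]
x = chebyshev_iteration(A, [1.0, 1.0], lam_min=2.0, lam_max=5.0)
```

The exact solution of this small system is (2/11, 3/11), which the iteration approaches at a rate governed by the ratio lam_max/lam_min.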

2005
William W. Hager Hongchao Zhang

This paper reviews the development of different versions of nonlinear conjugate gradient methods, with special attention given to global convergence properties.

Journal: :SIAM Journal on Optimization 2013
Yu-Hong Dai Cai-Xia Kou

In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed, which can avoid a numerical drawback of the Wolfe line search and guarantee the global convergence of the conjugate gradient method under mild condi...

Chart of the number of search results per year
