Search results for: sufficient descent directions
Number of results: 286567
The selection of appropriate beam irradiation directions in radiotherapy – the beam angle optimization (BAO) problem – is very important for the quality of the treatment, both for improving tumor irradiation and for better organ sparing. However, the BAO problem is still not solved satisfactorily and, most of the time, beam directions continue to be selected manually in clinical practice, which req...
Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate t...
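As an illustration of the poll step such methods rely on, here is a minimal coordinate-search sketch (one standard direct-search instance; the function name and parameters are my own, not taken from the paper above). It polls the positive spanning set {+e_i, -e_i}, accepts any point giving simple decrease, and halves the step size after an unsuccessful poll:

    import numpy as np

    def coordinate_search(f, x0, step=1.0, tol=1e-8, max_iter=1000):
        """Poll the positive spanning set {+e_i, -e_i}; shrink the step on failure."""
        x = np.asarray(x0, dtype=float)
        n = x.size
        dirs = np.vstack([np.eye(n), -np.eye(n)])   # 2n poll directions
        fx = f(x)
        for _ in range(max_iter):
            improved = False
            for d in dirs:
                trial = x + step * d
                ft = f(trial)
                if ft < fx:                         # simple decrease accepted
                    x, fx, improved = trial, ft, True
                    break
            if not improved:
                step *= 0.5                         # unsuccessful poll
                if step < tol:
                    break
        return x, fx

For example, coordinate_search(lambda v: float((v ** 2).sum()), [2.0, -1.5]) drives the iterate to the origin.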
Conjugate gradient methods have attracted considerable attention because they can be applied directly to large-scale unconstrained optimization problems. In order to incorporate second-order information of the objective function into conjugate gradient methods, Dai and Liao (2001) proposed a conjugate gradient method based on the secant condition. However, their method does not necessarily generate a de...
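For context, the Dai-Liao (2001) direction referenced here is standard: with $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and a parameter $t > 0$,

    d_{k+1} = -g_{k+1} + \beta_k^{DL} d_k,
    \qquad
    \beta_k^{DL} = \frac{g_{k+1}^{T}(y_k - t\,s_k)}{d_k^{T} y_k},

and the property the truncated sentence alludes to is the sufficient descent condition $g_k^{T} d_k \le -c\,\|g_k\|^2$ for some constant $c > 0$, which this $\beta_k^{DL}$ does not guarantee in general.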
This paper describes a new algorithm for the solution of nonconvex unconstrained optimization problems, with the property of converging to points satisfying second-order necessary optimality conditions. The algorithm is based on a procedure which, from two descent directions, a Newton-type direction and a direction of negative curvature, selects in each iteration the linesearch model best adapt...
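For reference, convergence to second-order necessary points means the limit $x^*$ satisfies $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*) \succeq 0$. A common way (an illustration, not necessarily the exact scheme of this paper) to combine a Newton-type direction $s_k$ with a negative-curvature direction $d_k$, i.e. one with $d_k^{T} \nabla^2 f(x_k)\, d_k < 0$, is a curvilinear search of the form

    x_{k+1} = x_k + \alpha_k^2 s_k + \alpha_k d_k, \qquad \alpha_k > 0.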
In this paper, we propose a new hybrid conjugate gradient algorithm for solving unconstrained optimization problems, built as a convex combination of the Dai-Yuan, Conjugate Descent, and Hestenes-Stiefel algorithms. The method is globally convergent and satisfies the sufficient descent condition under the strong Wolfe conditions. The numerical results show that the new nonlinear conjugate gradient method is efficient and robust.
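The three classical update parameters being combined are standard (only their combination is specific to the paper, and the choice of the combination weight is not shown in this snippet); with $g_k = \nabla f(x_k)$ and $y_{k-1} = g_k - g_{k-1}$,

    \beta_k^{DY} = \frac{\|g_k\|^2}{d_{k-1}^{T} y_{k-1}}, \qquad
    \beta_k^{CD} = \frac{\|g_k\|^2}{-g_{k-1}^{T} d_{k-1}}, \qquad
    \beta_k^{HS} = \frac{g_k^{T} y_{k-1}}{d_{k-1}^{T} y_{k-1}},

and a convex-combination hybrid takes $\beta_k = (1 - \theta_k)\,\beta_k^{A} + \theta_k\,\beta_k^{B}$ with $\theta_k \in [0, 1]$ chosen so that $d_k = -g_k + \beta_k d_{k-1}$ satisfies the sufficient descent condition $g_k^{T} d_k \le -c\,\|g_k\|^2$, $c > 0$.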
In this paper we study the convergence of online gradient descent algorithms in reproducing kernel Hilbert spaces (RKHSs) without regularization. We establish a sufficient condition and a necessary condition for the convergence of excess generalization errors in expectation. A sufficient condition for the almost sure convergence is also given. With high probability, we provide explicit converge...
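A minimal sketch of what unregularized online gradient descent in an RKHS looks like for the squared loss (my illustration: the Gaussian kernel, step-size rule, and names are assumptions, not taken from the paper). The iterate $f_t = \sum_i a_i K(x_i, \cdot)$ is updated by $f_{t+1} = f_t - \eta_t\,(f_t(x_t) - y_t)\,K(x_t, \cdot)$, so each sample appends one coefficient:

    import numpy as np

    def gaussian_kernel(u, v, sigma=1.0):
        return np.exp(-np.sum((u - v) ** 2) / (2 * sigma ** 2))

    def online_kernel_gd(stream, eta=lambda t: 1.0 / np.sqrt(t + 1)):
        """Unregularized online kernel gradient descent for the squared loss."""
        centers, coeffs = [], []   # f_t(x) = sum_i coeffs[i] * K(centers[i], x)
        for t, (x, y) in enumerate(stream):
            x = np.asarray(x, dtype=float)
            pred = sum(a * gaussian_kernel(c, x) for c, a in zip(centers, coeffs))
            # Gradient step on (1/2)(f(x) - y)^2: the new center x enters with
            # coefficient -eta_t * (f_t(x) - y).
            centers.append(x)
            coeffs.append(-eta(t) * (pred - y))
        return centers, coeffs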