Search results for: line search methods
Number of results: 2,434,504
The Newton method is one of the best-known numerical methods in the line search class for minimizing functions. It is well known that the search direction and the step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
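To make the roles of search direction and step length concrete, here is a minimal textbook sketch of a Newton iteration with a backtracking (Armijo) line search. This is a generic illustration, not the modified Newton method the abstract refers to; the callables `f`, `grad`, `hess` and all constants are assumptions.

```python
import numpy as np

def newton_with_backtracking(f, grad, hess, x0, tol=1e-8, max_iter=50,
                             rho=0.5, c=1e-4):
    """Minimize f using Newton directions and a backtracking line search.

    Generic sketch: f, grad, hess are user-supplied callables; rho is the
    backtracking factor and c the Armijo sufficient-decrease constant.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Search direction: solve the Newton system H d = -g
        d = np.linalg.solve(hess(x), -g)
        if g @ d >= 0:
            d = -g  # fall back to steepest descent if d is not a descent direction
        # Step length: backtrack until the Armijo condition holds
        a = 1.0
        while f(x + a * d) > f(x) + c * a * (g @ d):
            a *= rho
        x = x + a * d
    return x
```

On a convex quadratic the full Newton step (`a = 1.0`) already satisfies the Armijo test, so the method converges in one iteration.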
Using the search directions of a recent class of three-term conjugate gradient methods, modified versions of the Hestenes-Stiefel and Polak-Ribiere-Polyak methods are proposed which satisfy the sufficient descent condition. The methods are shown to be globally convergent when the line search fulfills the (strong) Wolfe conditions. Numerical experiments are carried out on a set of CUTEr unconstrained opti...
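The (strong) Wolfe conditions mentioned above are a standard pair of tests on a candidate step length. A minimal checker, using the usual textbook formulation with illustrative constants `c1` and `c2` (0 < c1 < c2 < 1), could look like this:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for step length alpha along d.

    Standard textbook definition; c1, c2 are illustrative choices,
    not constants from the paper.
    """
    g0 = grad(x)
    slope0 = g0 @ d                      # directional derivative at x
    x_new = x + alpha * d
    armijo = f(x_new) <= f(x) + c1 * alpha * slope0        # sufficient decrease
    curvature = abs(grad(x_new) @ d) <= c2 * abs(slope0)   # strong curvature
    return bool(armijo and curvature)
```

A very small step typically passes the sufficient-decrease test but fails the curvature test, which is exactly what the curvature condition is designed to rule out.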
Finite-sum problems appear as the sample average approximation of a stochastic optimization problem and often arise in machine learning applications with large-scale data sets. A very popular approach to facing finite-sum problems is the gradient method. It is well known that a proper strategy to select the hyperparameters of this method (i.e. the set of a-priori selected parameters) and, in particular, the learning rate, is needed to guarantee converg...
We develop a line-search second-order algorithmic framework for minimizing finite sums. We do not make any convexity assumptions, but we require the terms of the sum to be continuously differentiable and to have Lipschitz-continuous gradients. The methods fitting into this framework combine line searches with suitably decaying step lengths. A key issue is a two-step sampling at each iteration, which allows us to control the error ...
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented for solving unconstrained optimization problems. A remarkable property of the proposed methods is that, based on an eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...
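The sufficient descent condition that these methods guarantee is usually stated as g^T d <= -c ||g||^2 for some fixed c > 0. A one-line check, with an illustrative value of `c` (not one taken from the paper), could be:

```python
import numpy as np

def sufficient_descent(g, d, c=0.01):
    """Return True if direction d satisfies the sufficient descent
    condition g^T d <= -c * ||g||^2 (c > 0 is an illustrative constant)."""
    return bool(g @ d <= -c * (g @ g))
```

The steepest-descent direction d = -g satisfies this with c = 1; the condition requires every accepted direction to be at least a fixed fraction as "downhill" as that.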
In this paper, we have outlined the surrogate management framework for the optimization of expensive functions. An initial simple iterative method, which we call the "strawman" method, illustrates how surrogates can be incorporated into optimization to stand in for the most expensive function. These ideas are made rigorous by incorporating them into the framework of pattern search methods. The SMF al...
The aim of this paper is to present a comparative numerical study between minorant functions and line search methods for computing the step size in the penalty method for linear optimization. The minorant functions were confirmed by many interesting experiments to be more beneficial than the classical methods.