Extra-Updates Criterion for the Limited Memory BFGS Algorithm for Large Scale Nonlinear Optimization
Author
Abstract
This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or might worsen the quality of this Hessian, this paper proposes an extra-updates criterion to measure this quality. Hence, extra updates are employed only when the L-BFGS Hessian approximation is poor. The presented numerical results illustrate the usefulness of this criterion and show that extra updates improve the performance of the L-BFGS method substantially.
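For context, the sketch below shows the standard L-BFGS two-loop recursion that such methods build on, together with a purely illustrative quality gate for deciding whether an extra curvature pair is worth keeping. The function names `lbfgs_two_loop` and `pair_is_useful` and the relative curvature threshold `tol` are assumptions for illustration; the gate shown is not the criterion proposed in the paper.

```python
import numpy as np

def lbfgs_two_loop(grad, s_list, y_list, gamma):
    """Standard L-BFGS two-loop recursion: returns H_k @ grad, where H_k
    is the inverse-Hessian approximation implied by the stored curvature
    pairs (s_i, y_i), ordered oldest to newest, with H_0 = gamma * I."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * s.dot(q)
        alphas.append(alpha)
        q -= alpha * y
    r = gamma * q
    # Second loop: oldest pair to newest.
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * y.dot(r)
        r += (alpha - beta) * s
    return r

def pair_is_useful(s, y, tol=1e-4):
    """Hypothetical quality gate (not the paper's criterion): accept an
    extra pair (s, y) only if its curvature s^T y is safely positive
    relative to ||s|| * ||y||, so the BFGS update stays well conditioned."""
    return s.dot(y) > tol * np.linalg.norm(s) * np.linalg.norm(y)
```

In this sketch, extra (s, y) pairs would be appended to `s_list` and `y_list` only when `pair_is_useful` accepts them, which is the general shape of gating extra updates by a measure of approximation quality.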
Similar resources
Extra-Updates Criterion for the Limited Memory BFGS Algorithm for Large Scale Nonlinear Optimization
This paper studies recent modifications of the limited memory BFGS (L-BFGS) method for solving large scale unconstrained optimization problems. Each modification technique attempts to improve the quality of the L-BFGS Hessian by employing (extra) updates in a certain sense. Because at some iterations these updates might be redundant or worsen the quality of this Hessian, this paper proposes an ...
A limited memory adaptive trust-region approach for large-scale unconstrained optimization
This study is concerned with a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique reduces the number of subproblems solved, while utilizing the structure of limited memory quasi-Newt...
Modifications of the Limited Memory BFGS Algorithm for Large-scale Nonlinear Optimization
In this paper we present two new numerical methods for unconstrained large-scale optimization. These methods apply update formulae, which are derived by considering different techniques of approximating the objective function. Theoretical analysis is given to show the advantages of using these update formulae. It is observed that these update formulae can be employed within the framework of lim...
Optimization Technology Center: Towards a Discrete Newton Method with Memory for Large Scale Optimization
A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited memory BFGS method to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton st...
Towards a Discrete Newton Method with Memory for Large-scale Optimization
A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited memory BFGS method, to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton...