Modifying the line search formula in the BFGS method to achieve global convergence

Authors

  • M. Hamzehnejad, Department of Mathematics, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran
Abstract:

Nonlinear programming problems are among the most commonly encountered optimization problems, and in most cases their objective functions are non-convex. However, algorithms based on Newton's method generally require a convexity condition to guarantee global convergence. Quasi-Newton techniques are popular alternatives because they work with an approximation of the Hessian matrix or its inverse, built only from gradient information. One of the most widely used quasi-Newton algorithms for nonlinear programming is the BFGS method. This paper presents a new line search rule for the BFGS method and proves that it yields global convergence for general (non-convex) problems without any additional conditions. Finally, the performance of the proposed algorithm is evaluated numerically.
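
For context, the sketch below shows a standard BFGS iteration with a plain Armijo backtracking line search in Python. It is not the modified line search proposed in the paper (the abstract does not state its formula), and the parameters c1, shrink, tol and the curvature safeguard are illustrative choices only.

import numpy as np

def bfgs(f, grad, x0, tol=1e-6, max_iter=200, c1=1e-4, shrink=0.5):
    """Standard BFGS with an Armijo backtracking line search (generic sketch)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                     # approximation of the inverse Hessian
    g = grad(x)

    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        d = -H @ g                    # quasi-Newton search direction

        # Armijo backtracking: accept the largest step t = shrink**k with
        # f(x + t d) <= f(x) + c1 * t * g^T d.
        t = 1.0
        while f(x + t * d) > f(x) + c1 * t * (g @ d):
            t *= shrink

        s = t * d                     # step taken
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # change in the gradient

        sy = s @ y
        if sy > 1e-10:                # skip the update when curvature is too small
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)   # BFGS inverse-Hessian update

        x, g = x_new, g_new

    return x

For example, bfgs(lambda x: float((x**2).sum()), lambda x: 2 * x, np.array([3.0, -4.0])) converges to the origin in a few iterations.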


Similar resources

The search for the self in Beckett's theatre: Waiting for Godot and Endgame

This thesis is based upon the works of Samuel Beckett, one of the greatest writers of contemporary literature. Here, I have tried to focus on one of the main themes in Beckett's works: the search for the real "me" or the real self, which is a problem not only for Beckett's man but also for each of us. I have tried to show Beckett's techniques in approaching this unattainable goal, base...


On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems

This paper is concerned with the open problem of whether the BFGS method with inexact line search converges globally when applied to nonconvex unconstrained optimization problems. We propose a cautious BFGS update and prove that the method with either Wolfe-type or Armijo-type line search converges globally if the function to be minimized has Lipschitz continuous gradients.
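
A cautious update of this kind typically skips the BFGS update whenever the curvature along the step is small relative to the gradient norm. The Python sketch below illustrates such a test; the threshold form and the default values of eps and alpha are illustrative and not taken from that paper.

import numpy as np

def cautious_update_ok(s, y, g, eps=1e-6, alpha=1.0):
    """Cautious-update test: perform the BFGS update only when
    y^T s / ||s||^2 >= eps * ||g||^alpha, so the curvature along the
    step is not too small relative to the current gradient norm.
    eps and alpha are illustrative parameter choices."""
    s = np.asarray(s, dtype=float)
    y = np.asarray(y, dtype=float)
    g = np.asarray(g, dtype=float)
    return (y @ s) / (s @ s) >= eps * np.linalg.norm(g) ** alpha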


Global convergence of online limited memory BFGS

Global convergence of an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large scale machine learning is established. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...
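
The limited-memory machinery such online variants build on is the classical two-loop recursion, which applies the inverse-Hessian approximation to a gradient using only a short history of curvature pairs. The Python sketch below is a generic illustration of that recursion, not the online algorithm of the cited paper; it assumes each stored pair satisfies y^T s > 0.

import numpy as np

def lbfgs_direction(g, pairs):
    """Two-loop recursion: return -H_k g, where H_k is the limited-memory
    BFGS inverse-Hessian approximation built from the stored curvature
    pairs (s_i, y_i), ordered oldest to newest. Assumes y_i^T s_i > 0."""
    q = np.array(g, dtype=float)
    alphas = []
    for s, y in reversed(pairs):          # newest pair first
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        q -= a * y
        alphas.append((a, rho, s, y))
    if pairs:
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)            # initial Hessian scaling H_0 = gamma * I
    for a, rho, s, y in reversed(alphas): # oldest pair first
        b = rho * (y @ q)
        q += (a - b) * s
    return -q                             # quasi-Newton descent direction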


From linguistics to literature: a linguistic approach to the study of linguistic deviations in the Turkish divan of Shahriar

Chapter I provides an overview of structural linguistics and touches upon the Saussurean dichotomies, with the final goal of exploring their relevance to the stylistic study of literature. To provide evidence for the significance of the study, Chapter II deals with the controversial issue of linguistics and literature, and presents opposing views which, at the same time, have been central to t...




Journal title

volume 5, issue 21

pages 37-46

publication date 2019-12-22

