The modified BFGS method with new secant relation for unconstrained optimization problems

Authors

Abstract:

Using Taylor's series, we propose a modified secant relation to obtain a more accurate approximation of the second-order curvature of the objective function. Based on this modified secant relation, we then present a new BFGS method for solving unconstrained optimization problems. The proposed method makes use of both gradient and function values, whereas the usual secant relation uses only gradient values. Under appropriate conditions, we show that the proposed method is globally convergent without requiring a convexity assumption on the objective function. Comparative results show the computational efficiency of the proposed method in the sense of the Dolan-Moré performance profiles.
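To make this concrete, the following is a minimal sketch of a secant relation of this type, in a form that commonly appears in the literature; the notation and the exact correction term are assumptions for illustration, not necessarily the relation proposed in this paper. Writing s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, f_k = f(x_k) and g_k = \nabla f(x_k), a Taylor expansion of f about x_{k+1} suggests replacing the usual secant relation B_{k+1} s_k = y_k with

B_{k+1} s_k = \hat{y}_k, \qquad \hat{y}_k = y_k + \frac{\theta_k}{s_k^{\top} s_k}\, s_k, \qquad \theta_k = 2\,(f_k - f_{k+1}) + (g_k + g_{k+1})^{\top} s_k.

The correction term \theta_k is where the function values enter: s_k^{\top} \hat{y}_k approximates the curvature s_k^{\top} \nabla^2 f(x_{k+1})\, s_k to higher order than s_k^{\top} y_k does, and the BFGS update formula is then applied with \hat{y}_k in place of y_k.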


similar resources

A Modified BFGS Algorithm for Unconstrained Optimization

In this paper we present a modified BFGS algorithm for unconstrained optimization. The BFGS algorithm updates an approximate Hessian which satisfies the most recent quasi-Newton equation. The quasi-Newton condition can be interpreted as the interpolation condition that the gradient value of the local quadratic model matches that of the objective function at the previous iterate. Our modified al...
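As a brief illustration of that interpolation viewpoint (a standard derivation sketched here with the usual notation s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, not text from the cited paper): the local quadratic model at x_{k+1} is

m_{k+1}(x) = f_{k+1} + g_{k+1}^{\top}(x - x_{k+1}) + \tfrac{1}{2}\,(x - x_{k+1})^{\top} B_{k+1} (x - x_{k+1}),

and requiring its gradient to match the true gradient at the previous iterate, \nabla m_{k+1}(x_k) = g_k, gives B_{k+1}(x_k - x_{k+1}) = g_k - g_{k+1}, which is exactly the quasi-Newton (secant) equation B_{k+1} s_k = y_k.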


A New Scaled Hybrid Modified BFGS Algorithms for Unconstrained Optimization

The BFGS method is a method for solving unconstrained optimization problems, and many modifications of it have been proposed. In this paper, new scaled hybrid modified BFGS algorithms are proposed and analyzed. The scaled hybrid modified BFGS can reduce the number of iterations. Results obtained by the hybrid modified BFGS algorithms are com...


Modified Limited Memory BFGS Method with Nonmonotone Line Search for Unconstrained Optimization

In this paper, we propose two limited memory BFGS algorithms with a nonmonotone line search technique for unconstrained optimization problems. The global convergence of the given methods will be established under suitable conditions. Numerical results show that the presented algorithms are more competitive than the normal BFGS method.
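For context, here is a minimal Python sketch of a nonmonotone (max-type) Armijo backtracking line search of the kind such algorithms use; the function, parameter names, and default values are illustrative assumptions, not code from the cited paper.

import numpy as np

def nonmonotone_armijo_step(f, x, d, g, f_history,
                            delta=1e-4, rho=0.5, alpha0=1.0, max_backtracks=30):
    """Backtracking line search with a nonmonotone Armijo condition:
    accept alpha once f(x + alpha*d) <= max(recent f-values) + delta*alpha*g^T d.
    f_history holds the most recent objective values (a window of length M)."""
    f_ref = max(f_history)   # reference value replaces f(x): this is what makes the search nonmonotone
    gTd = float(g @ d)       # directional derivative; d is assumed to be a descent direction (gTd < 0)
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * gTd:
            return alpha
        alpha *= rho         # shrink the step and try again
    return alpha

# Toy usage on a quadratic, with the steepest-descent direction.
f = lambda x: 0.5 * float(x @ x)
x = np.array([3.0, -4.0])
g = x.copy()                 # gradient of 0.5*||x||^2 at x
alpha = nonmonotone_armijo_step(f, x, -g, g, f_history=[f(x)])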


A modified nonmonotone BFGS algorithm for unconstrained optimization

In this paper, a modified BFGS algorithm is proposed for unconstrained optimization. The proposed algorithm has the following properties: (i) a nonmonotone line search technique is used to obtain the step size, which improves the effectiveness of the algorithm; (ii) the algorithm possesses not only global convergence but also superlinear convergence for generally convex functions...


On the Global Convergence of the BFGS Method for Nonconvex Unconstrained Optimization Problems

This paper is concerned with the open problem of whether the BFGS method with an inexact line search converges globally when applied to nonconvex unconstrained optimization problems. We propose a cautious BFGS update and prove that the method with either a Wolfe-type or an Armijo-type line search converges globally if the function to be minimized has Lipschitz continuous gradients.
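For reference, a cautious update of this type is usually stated roughly as follows (a sketch of the standard rule; the constants are placeholders): with s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the BFGS update of the Hessian approximation B_k is performed only when

\frac{y_k^{\top} s_k}{\|s_k\|^2} \ge \epsilon \, \|g_k\|^{\alpha}

for fixed constants \epsilon > 0 and \alpha > 0, and otherwise B_{k+1} = B_k is left unchanged; skipping updates that carry too little positive curvature is what lets the global convergence analysis go through without a convexity assumption.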



Journal title

volume 7, issue 1

pages 28-41

publication date 2019-01-01
