Continuous Steepest Descent Path for Traversing Non-convex Regions
Abstract
This paper revisits the idea of seeking unconstrained minima by following a continuous steepest descent path (CSDP). We are especially interested in the merits of such an approach in regions where the objective function is non-convex and Newton-like methods become ineffective. The paper combines ODE-trajectory following with trust-region ideas to give an algorithm which performs curvilinear searches on each iteration. Progress along the CSDP is governed both by the decrease in function value and by measures of the accuracy of a local quadratic model. Experience with a prototype implementation of the algorithm is promising, and the method is shown to be competitive with more conventional line search and trust region approaches. In particular, it is also shown to perform well in comparison with the superficially similar gradient-flow method proposed by Behrman.
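As a minimal illustration of the underlying idea (a sketch of ours, not the algorithm of the paper): the continuous steepest descent path solves the ODE x'(t) = -∇f(x(t)), and progress along it can be gated by a trust-region-style comparison of actual versus predicted reduction under a local quadratic model. The code below uses a forward-Euler step and a finite-difference curvature estimate; all names, constants, and acceptance thresholds are assumptions made for illustration.

```python
import numpy as np

def csdp_minimize(f, grad, x0, h0=1.0, tol=1e-6, max_iter=2000):
    """Illustrative sketch of trajectory-following for the steepest
    descent ODE x'(t) = -grad f(x(t)), with a trust-region-style
    acceptance test.  Not the algorithm from the paper."""
    x = np.asarray(x0, dtype=float)
    h = h0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        x_new = x - h * g              # forward-Euler step along -grad f
        gTg = g @ g
        # Finite-difference estimate of the curvature g^T B g, since
        # grad(x - h g) - grad(x) is approximately -h B g:
        curv = -((grad(x_new) - g) @ g) / h
        # Predicted reduction from the local quadratic model
        # m(h) = f(x) - h ||g||^2 + 0.5 h^2 g^T B g:
        pred = h * gTg - 0.5 * h * h * curv
        ared = f(x) - f(x_new)         # actual reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.25:                 # model adequate: accept the step
            x = x_new
            if rho > 0.75:             # model very good: lengthen the arc
                h = min(2.0 * h, 1e3)
        else:                          # model poor: shorten the arc
            h = 0.5 * h
    return x
```

On a convex quadratic the curvature estimate is exact, so the ratio test accepts whenever the predicted reduction is positive; the interesting behaviour, as in the paper, is in non-convex regions where the estimated curvature can be negative and the model test shortens the arc.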
Similar Papers
Traversing non-convex regions
This paper considers a method for dealing with non-convex objective functions in optimization problems. It uses the Hessian matrix and combines features of trust-region techniques and continuous steepest descent trajectory-following in order to construct an algorithm which performs curvilinear searches away from the starting point of each iteration. A prototype implementation yields promising r...
MATHEMATICAL ENGINEERING TECHNICAL REPORTS: Discrete L-/M-Convex Function Minimization Based on Continuous Relaxation
We consider the problem of minimizing a nonlinear discrete function with L-/M-convexity proposed in the theory of discrete convex analysis. For this problem, steepest descent algorithms and steepest descent scaling algorithms are known. In this paper, we use a continuous relaxation approach, which minimizes the continuous-variable version first in order to find a good initial solution for a steepes...
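The continuous-relaxation idea can be caricatured in one dimension (this toy code is ours and uses plain integer greedy descent rather than the L-/M-convex machinery of the report): minimize over the reals first, round, then run a discrete descent from that warm start.

```python
def discrete_steepest_descent(f, x0, max_iter=100):
    """Greedy descent over the integers: move to the best neighbouring
    point (x-1 or x+1) while that improves f.  A toy 1-D stand-in for
    the steepest descent algorithms of discrete convex analysis."""
    x = x0
    for _ in range(max_iter):
        best = min((x - 1, x, x + 1), key=f)
        if best == x:
            return x
        x = best
    return x

# Continuous relaxation: minimize over the reals first (closed form
# for this quadratic), then round to get a warm start.
f = lambda x: (x - 2.7) ** 2
x_cont = 2.7            # continuous minimizer
x0 = round(x_cont)      # rounded warm start for the discrete phase
x_star = discrete_steepest_descent(f, x0)
```

For a convex function the warm start typically lands at or next to the discrete minimizer, so the discrete phase terminates after very few neighbourhood checks; that is the motivation for the relaxation approach.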
An Asymptotical Variational Principle Associated with the Steepest Descent Method for a Convex Function
The asymptotical limit of the trajectory defined by the continuous steepest descent method for a proper closed convex function f on a Hilbert space is characterized in the set of minimizers of f via an asymptotical variational principle of Brezis-Ekeland type. The implicit discrete analogue (prox method) is also considered.
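The "implicit discrete analogue (prox method)" mentioned here is the proximal point iteration x_{k+1} = argmin_y f(y) + ||y - x_k||^2 / (2λ), an implicit-Euler discretization of the steepest descent trajectory. A small sketch of ours for f(x) = |x|, whose proximal map has the closed-form soft-thresholding formula:

```python
def prox_abs(v, lam):
    """Proximal map of f(x) = |x|: argmin_y |y| + (y - v)**2 / (2*lam).
    The closed form is the soft-thresholding operator."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def prox_point_method(x0, lam=1.0, iters=5):
    """Implicit discrete analogue of continuous steepest descent:
    x_{k+1} = prox_{lam * f}(x_k).  Returns the whole trajectory."""
    x = x0
    traj = [x]
    for _ in range(iters):
        x = prox_abs(x, lam)
        traj.append(x)
    return traj
```

Unlike explicit steepest descent, the implicit scheme handles the non-differentiability of |x| at 0 gracefully: the iterates reach the minimizer exactly and stay there.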
LANCS Workshop on Modelling and Solving Complex Optimisation Problems
Towards optimal Newton-type methods for nonconvex smooth optimization Coralia Cartis Coralia.Cartis (at) ed.ac.uk School of Mathematics, Edinburgh University We show that the steepest-descent and Newton methods for unconstrained non-convex optimization, under standard assumptions, may both require a number of iterations and function evaluations arbitrarily close to the steepest-descent’s global...
Convergence of the Nelder-Mead Simplex Method to a Non-Stationary Point
This paper analyses the behaviour of the Nelder-Mead simplex method for a family of examples which cause the method to converge to a non-stationary point. All the examples use continuous functions of two variables. The family of functions contains strictly convex functions with up to three continuous derivatives. In all the examples the method repeatedly applies the inside contraction step with...
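The examples in this family follow McKinnon's construction, which is piecewise polynomial in x. A sketch with one commonly quoted parameter choice (treat the specific values (τ, θ, φ) = (2, 6, 60) as an assumption here):

```python
def mckinnon(x, y, tau=2.0, theta=6.0, phi=60.0):
    """A function of the McKinnon form:
        f(x, y) = theta*phi*|x|**tau + y + y**2   for x <= 0,
                  theta*x**tau      + y + y**2   for x >  0.
    For suitable parameters it is strictly convex with continuous
    derivatives, yet Nelder-Mead started from a particular simplex
    repeatedly applies the inside contraction and converges to the
    origin, where the gradient is (0, 1) -- not a stationary point."""
    if x <= 0:
        return theta * phi * (-x) ** tau + y + y * y
    return theta * x ** tau + y + y * y
```

Note that the true minimizer lies at (0, -1/2) with value -1/4, so the origin the method converges to is strictly downhill from nowhere along the x-axis but clearly improvable in y.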