An Effective Optimization Algorithm for Locally Nonconvex Lipschitz Functions Based on Mollifier Subgradients
Abstract
We present an effective algorithm for minimization of locally nonconvex Lipschitz functions based on mollifier functions that approximate the Clarke generalized gradient. To this end, we first approximate the Clarke generalized gradient by mollifier subgradients, constructed from a set of gradients of averaged functions, and show that the convex hull of this set provides a good approximation of the Clarke generalized gradient. Using this approximation, we develop a minimization algorithm for locally Lipschitz functions. Based on the mollifier subgradient approximation, we propose a dynamic procedure that finds a search direction satisfying the Armijo condition without requiring many subgradient evaluations. We prove that the search direction procedure terminates after finitely many iterations and show how to reduce the objective function value along the obtained direction. We also prove that the first-order optimality conditions hold at every accumulation point of the sequence generated by the algorithm. Finally, we implement the algorithm in MATLAB and approximate the gradients of the averaged functions by the Monte Carlo method. The numerical results show that our algorithm is more efficient and more robust than the gradient sampling (GS) algorithm, currently regarded as a competitive method for minimization of nonconvex Lipschitz functions. MSC(2010): Primary 90C26; Secondary 47N10, 49J52.
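As a rough illustration of the main idea (and not the authors' actual implementation), the following MATLAB sketch estimates the gradient of an averaged function f_eps(x) = E[ f(x + eps*z) ] by Monte Carlo sampling and uses the estimate as a descent direction with an Armijo backtracking line search. All names and parameter values (mollifier_descent_sketch, gradf, eps0, m, the Gaussian mollifier, the tolerances) are illustrative assumptions, and the convex-hull (minimum-norm direction) step of the full algorithm is omitted.

% A minimal, illustrative sketch (not the authors' implementation):
% it estimates the gradient of the averaged (mollified) function
%   f_eps(x) = E_z[ f(x + eps*z) ],   z ~ N(0, I),
% by Monte Carlo and uses the estimate as a search direction with an
% Armijo backtracking line search.  All names and constants below are
% assumptions made for illustration only.
function x = mollifier_descent_sketch(f, gradf, x0, eps0, m, maxit)
    x    = x0(:);
    epsk = eps0;                          % mollification (smoothing) radius
    for k = 1:maxit
        % Monte Carlo estimate of the averaged-function gradient
        g = zeros(size(x));
        for i = 1:m
            z = randn(size(x));           % sample from a Gaussian mollifier
            g = g + gradf(x + epsk*z);    % gradf exists almost everywhere
        end
        g = g / m;
        if norm(g) <= 1e-6                % approximate stationarity:
            epsk = epsk / 2;              % shrink the smoothing radius
            continue
        end
        d = -g;                           % search direction
        t = 1; c = 1e-4;                  % Armijo backtracking line search
        while f(x + t*d) > f(x) + c*t*(g.'*d)
            t = t / 2;
            if t < 1e-12, break; end
        end
        x = x + t*d;
    end
end

For example, a call such as x = mollifier_descent_sketch(@(x) abs(x(1)) + 10*abs(x(2)), @(x) [sign(x(1)); 10*sign(x(2))], [3; -2], 0.1, 20, 200) should drive the iterates toward a neighborhood of the minimizer of this simple nonsmooth test function at the origin.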
Similar Articles
A Derivative-free Method for Linearly Constrained Nonsmooth Optimization
This paper develops a new derivative-free method for solving linearly constrained nonsmooth optimization problems. The objective functions in these problems are, in general, non-regular locally Lipschitz continuous functions, and the computation of their generalized subgradients is a difficult task. In this paper we suggest an algorithm for the computation of subgradients of a broad class ...
A Proximity Control Algorithm to Minimize Nonsmooth and Nonconvex Functions
We present a new proximity control bundle algorithm to minimize nonsmooth and nonconvex locally Lipschitz functions. In contrast with the traditional oracle-based methods in nonsmooth programming, our method is model-based and can accommodate cases where several Clarke subgradients can be computed at reasonable cost. We propose a new way to manage the proximity control parameter, which allows u...
A secant method for nonsmooth optimization
The notion of a secant for locally Lipschitz continuous functions is introduced and a new algorithm to locally minimize nonsmooth, nonconvex functions based on secants is developed. We demonstrate that the secants can be used to design an algorithm to find descent directions of locally Lipschitz continuous functions. This algorithm is applied to design a minimization method, called a secant met...
Bundle method for non-convex minimization with inexact subgradients and function values
We discuss a bundle method to minimize non-smooth and non-convex locally Lipschitz functions. We analyze situations where only inexact subgradients or function values are available. For suitable classes of non-smooth functions we prove convergence of our algorithm to approximate critical points.