Derivative-Free Optimization Via Proximal Point Methods

Authors

  • Warren Hare
  • Yves Lucet
Abstract

Derivative-Free Optimization (DFO) examines the challenge of minimizing (or maximizing) a function without explicit use of derivative information. Many standard techniques in DFO are based on using model functions to approximate the objective function, and then applying classic optimization methods to the model function. For example, the details behind adapting steepest descent, conjugate gradient, and quasi-Newton methods to DFO have been studied in this manner. In this paper we demonstrate that the proximal point method can also be adapted to DFO. To that end, we provide a derivative-free proximal point (DFPP) method and prove convergence of the method in a general sense. In particular, we give conditions under which the gradient values of the iterates converge to 0, and conditions under which an iterate corresponds to a stationary point of the objective function.
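The abstract does not spell out the iteration, but the flavor of a derivative-free proximal point step can be sketched: fit a local model to sampled function values, then minimize the model plus a proximal penalty. In the Python sketch below, the quadratic least-squares model, the sampling radius, and the linear solve are illustrative choices, not the authors' exact DFPP algorithm.

```python
import numpy as np

def fit_quadratic_model(f, x, radius, rng):
    """Fit a local quadratic model m(s) = c + g.s + 0.5 s'Hs to sampled
    function values around x, using least squares (no derivatives)."""
    n = len(x)
    m = (n + 1) * (n + 2) // 2                      # number of coefficients
    rows, rhs = [], []
    for _ in range(2 * m):                          # oversample for stability
        s = rng.uniform(-radius, radius, size=n)
        rows.append(np.concatenate(([1.0], s, np.outer(s, s)[np.triu_indices(n)])))
        rhs.append(f(x + s))
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    g = coef[1:n + 1]
    A = np.zeros((n, n))
    A[np.triu_indices(n)] = coef[n + 1:]
    return g, A + A.T                               # A + A.T recovers H

def dfpp_step(f, x, lam, radius, rng):
    """One proximal-point step on the model: minimize m(s) + ||s||^2/(2*lam),
    whose stationarity condition is (H + I/lam) s = -g."""
    g, H = fit_quadratic_model(f, x, radius, rng)
    s = np.linalg.solve(H + np.eye(len(x)) / lam, -g)
    return x + s

rng = np.random.default_rng(0)
f = lambda z: (z[0] - 1.0) ** 2 + 2.0 * (z[1] + 0.5) ** 2
x = np.array([3.0, 3.0])
for _ in range(20):
    x = dfpp_step(f, x, lam=1.0, radius=0.1, rng=rng)
print(x)  # approaches the minimizer (1, -0.5)
```

Because the test function here is quadratic, the least-squares fit is exact and the loop reduces to the classical proximal point iteration.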

Related Articles

Derivative-Free Methods for Mixed-Integer Constrained Optimization Problems

Methods that do not use any derivative information are becoming popular among researchers, since they make it possible to solve many real-world engineering problems. Such problems are frequently characterized by the presence of discrete variables, which can further complicate the optimization process. In this paper, we propose derivative-free algorithms for solving continuously differentiable Mixed Integ...
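The entry above is truncated, but the flavor of direct-search methods for mixed-integer problems can be illustrated with a generic poll step that tries small moves in the continuous variables and unit moves in the integer ones. This is a textbook-style sketch under those assumptions, not the algorithm proposed in that paper:

```python
import numpy as np

def mixed_poll(f, xc, xi, delta):
    """Generic mixed-integer direct-search poll: try +/-delta steps in each
    continuous variable and +/-1 steps in each integer variable; return the
    first improving trial, or the current point with success=False."""
    best = f(xc, xi)
    for j in range(len(xc)):
        for step in (delta, -delta):
            trial = xc.copy(); trial[j] += step
            if f(trial, xi) < best:
                return trial, xi, True
    for j in range(len(xi)):
        for step in (1, -1):
            trial = xi.copy(); trial[j] += step
            if f(xc, trial) < best:
                return xc, trial, True
    return xc, xi, False

# Toy instance: minimize (c - 0.3)^2 + (k - 2)^2 with k integer.
f = lambda xc, xi: (xc[0] - 0.3) ** 2 + (xi[0] - 2) ** 2
xc, xi, delta = np.array([2.0]), np.array([5]), 1.0
for _ in range(100):
    xc, xi, ok = mixed_poll(f, xc, xi, delta)
    if not ok:
        delta *= 0.5          # contract only the continuous step size
print(xc, xi)                 # approaches c = 0.3, k = 2
```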

An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

We consider an unconstrained problem of minimizing a smooth convex function that is available only through noisy observations of its values, with the noise consisting of two parts. Similar to stochastic optimization problems, the first part is of a stochastic nature. In contrast, the second part is an additive noise of unknown nature, but bounded in absolute value. In the two-point ...
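The cut-off sentence refers to two-point feedback, i.e. querying the function at two nearby points per iteration. A standard randomized two-point estimator, shown below inside a plain (non-accelerated) gradient-descent loop, illustrates the idea; the direction sampling and smoothing parameter tau are generic choices, not the paper's accelerated scheme:

```python
import numpy as np

def two_point_gradient(f, x, tau, rng):
    """Randomized two-point gradient estimator: sample a unit direction e,
    query f at x + tau*e and x - tau*e, and scale the central difference."""
    e = rng.standard_normal(len(x))
    e /= np.linalg.norm(e)
    return len(x) * (f(x + tau * e) - f(x - tau * e)) / (2.0 * tau) * e

rng = np.random.default_rng(1)
f = lambda z: np.sum(z ** 2)                 # smooth convex test function
x = np.ones(5)
for _ in range(2000):
    x -= 0.02 * two_point_gradient(f, x, tau=1e-3, rng=rng)
print(np.linalg.norm(x))                     # shrinks toward 0
```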

Randomized Similar Triangles Method: A Unifying Framework for Accelerated Randomized Optimization Methods (Coordinate Descent, Directional Search, Derivative-Free Method)

In this paper, we consider smooth convex optimization problems with simple constraints and inexactness in the oracle information, such as the value, partial, or directional derivatives of the objective function. We introduce a unifying framework that allows one to construct different types of accelerated randomized methods for such problems and to prove convergence rate theorems for them. We focus on a...

Outdoor WLAN planning via non-monotone derivative-free optimization: algorithm adaptation and case study

In this paper, we study the application of non-monotone derivative-free optimization algorithms to wireless local area network (WLAN) planning, which can be modeled as an unconstrained minimization problem. We wish to determine the access point (AP) positions that maximize coverage in order to provide connectivity to static and mobile users. As the objective function of the optimization model ...
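The truncated sentence concerns the objective function of the planning model. A toy stand-in, the uncovered fraction of user locations given candidate AP positions, is sketched below; the user grid, coverage radius, and flat parameter vector are illustrative assumptions, not the model from that paper:

```python
import numpy as np

def coverage(ap_flat, users, radius):
    """Fraction of user locations within `radius` of the nearest access point.
    `ap_flat` packs k access-point positions as a flat (2k,) vector, which is
    the form most derivative-free solvers expect."""
    aps = ap_flat.reshape(-1, 2)
    dists = np.linalg.norm(users[:, None, :] - aps[None, :, :], axis=2)
    return np.mean(dists.min(axis=1) <= radius)

# Toy instance: users on a grid, two APs; minimize the uncovered fraction.
gx, gy = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
users = np.column_stack([gx.ravel(), gy.ravel()])
objective = lambda ap_flat: 1.0 - coverage(ap_flat, users, radius=4.0)
print(objective(np.array([2.5, 5.0, 7.5, 5.0])))   # uncovered fraction
```

An objective of this kind is piecewise constant in the AP positions, which is one reason derivative-free methods are a natural fit.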

Asynchronous parallel hybrid optimization combining DIRECT and GSS

In this paper we explore hybrid parallel global optimization using DIRECT and asynchronous generating set search (GSS). Both DIRECT and GSS are derivative-free and so require only objective function values; this makes these methods applicable to a wide variety of science and engineering problems. DIRECT is a global search method that strategically divides the search space into ever-smaller rect...
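The entry above is truncated, but the local GSS component can be illustrated by its simplest serial instance, compass search; the DIRECT global phase and the asynchronous parallel machinery are omitted here, and the contraction factor and stopping tolerance are arbitrary choices:

```python
import numpy as np

def compass_search(f, x, delta=1.0, tol=1e-6, max_polls=10000):
    """Minimal serial generating set search: poll the directions +/- e_i,
    accept the first improving point, contract the step when the poll fails."""
    fx = f(x)
    directions = np.vstack([np.eye(len(x)), -np.eye(len(x))])
    for _ in range(max_polls):
        for d in directions:
            ft = f(x + delta * d)
            if ft < fx:                      # successful poll: move, keep delta
                x, fx = x + delta * d, ft
                break
        else:
            delta *= 0.5                     # failed poll: contract the mesh
            if delta < tol:
                break
    return x, fx

x, fx = compass_search(lambda z: (z[0] - 1) ** 2 + (z[1] + 2) ** 2,
                       np.array([5.0, 5.0]))
print(x, fx)    # approaches (1, -2)
```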


Journal:
  • J. Optimization Theory and Applications

Volume 160, Issue –

Pages –

Publication date: 2014