Derivative-free nonlinear optimization filter simplex

Authors

  • Aldina Correia
  • João Matias
  • Pedro Mestre
  • Carlos Serôdio
Abstract

The filter method is a technique for solving nonlinear programming problems. The filter algorithm has two phases in each iteration: the first reduces a measure of infeasibility, while the second reduces the objective function value. In real optimization problems, the objective function is often not differentiable or its derivatives are unknown. In these cases it becomes essential to use optimization methods where computing the derivatives, or verifying their existence, is not necessary: direct search methods and derivative-free methods are examples of such techniques. In this work we present a new direct search method, based on the simplex method, for general constrained optimization that combines the features of simplex and filter methods. This method neither computes nor approximates derivatives, penalty constants or Lagrange multipliers.
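
To make the filter idea concrete, the sketch below implements a generic filter acceptance test in Python: each evaluated point is summarized by a pair (h, f), where h aggregates constraint violation and f is the objective value, and a trial point is kept only if no stored pair dominates it. The function names, the particular violation measure, and the toy problem are assumptions made for illustration; this is not the authors' simplex-filter algorithm.

```python
# Illustrative sketch of a filter acceptance test for constrained optimization.
# Not the authors' algorithm; names and the violation measure are assumptions.

def violation(x, inequality_constraints):
    """Aggregate infeasibility h(x) for constraints g_i(x) <= 0."""
    return sum(max(0.0, g(x)) ** 2 for g in inequality_constraints)

def dominates(entry, candidate):
    """A filter entry (h, f) dominates a candidate if it is no worse in both measures."""
    h_e, f_e = entry
    h_c, f_c = candidate
    return h_e <= h_c and f_e <= f_c

def filter_accepts(filter_entries, candidate):
    """A trial point is acceptable if no current filter entry dominates it."""
    return not any(dominates(e, candidate) for e in filter_entries)

def filter_add(filter_entries, candidate):
    """Insert an accepted pair and drop entries that it dominates."""
    kept = [e for e in filter_entries if not dominates(candidate, e)]
    kept.append(candidate)
    return kept

# Toy problem: minimize f(x) = x0^2 + x1^2 subject to g(x) = 1 - x0 - x1 <= 0.
f = lambda x: x[0] ** 2 + x[1] ** 2
constraints = [lambda x: 1.0 - x[0] - x[1]]

filter_entries = []
for x in ([2.0, 2.0], [0.6, 0.6], [0.1, 0.1]):
    candidate = (violation(x, constraints), f(x))
    if filter_accepts(filter_entries, candidate):
        filter_entries = filter_add(filter_entries, candidate)

print(filter_entries)  # the non-dominated (h, f) pairs collected so far
```

In a two-phase iteration of the kind described in the abstract, a feasibility (restoration) step would aim to decrease h, while the optimality step searches among filter-acceptable points for a lower f.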

Similar articles

Derivative-free optimization and filter methods to solve nonlinear constrained problems

In real optimization problems, usually the analytical expression of the objective function is not known, nor its derivatives, or they are complex. In these cases it becomes essential to use optimization methods where the calculation of the derivatives, or the verification of their existence, is not necessary: the Direct Search Methods or Derivative-free Methods are one solution. When the proble...

Augmented Downhill Simplex a Modified Heuristic Optimization Method

Augmented Downhill Simplex Method (ADSM) is introduced here, that is a heuristic combination of Downhill Simplex Method (DSM) with Random Search algorithm. In fact, DSM is an interpretable nonlinear local optimization method. However, it is a local exploitation algorithm; so, it can be trapped in a local minimum. In contrast, random search is a global exploration, but less efficient. Here, rand...
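
As a rough illustration of the exploitation/exploration combination described in this entry, the sketch below pairs SciPy's Nelder-Mead simplex (local exploitation) with uniformly sampled random starting points (global exploration). It is a generic multistart hybrid written for this example, with assumed names such as multistart_simplex; it does not reproduce the ADSM procedure.

```python
# Illustrative sketch only: local simplex search (Nelder-Mead) restarted from
# random points to reduce the risk of getting trapped in a local minimum.
import numpy as np
from scipy.optimize import minimize

def multistart_simplex(f, bounds, n_starts=20, seed=0):
    rng = np.random.default_rng(seed)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    best_x, best_f = None, np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)                     # random global exploration
        res = minimize(f, x0, method="Nelder-Mead")  # local simplex exploitation
        if res.fun < best_f:
            best_x, best_f = res.x, res.fun
    return best_x, best_f

# Example: a multimodal test function (2-D Rastrigin).
rastrigin = lambda x: 10 * len(x) + sum(xi**2 - 10 * np.cos(2 * np.pi * xi) for xi in x)
x_best, f_best = multistart_simplex(rastrigin, bounds=[(-5.12, 5.12)] * 2)
print(x_best, f_best)
```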

Numerical experience with a derivative-free trust-funnel method for nonlinear optimization problems with general nonlinear constraints

A trust-funnel method is proposed for solving nonlinear optimization problems with general nonlinear constraints. It extends the one presented by Gould and Toint (Math. Prog., 122(1):155–196, 2010), originally proposed for equality-constrained optimization problems only, to problems with both equality and inequality constraints and where simple bounds are also considered. As the original one, ou...

OPTIMA Mathematical Programming Society Newsletter 79

Derivative free optimization addresses general nonlinear optimization problems in the cases when obtaining derivative information for the objective and/or the constraint functions is impractical due to computational cost or numerical inaccuracies. Applications of derivative free optimization arise often in engineering design such as circuit tuning, aircraft configuration, water pipe calibration...

A Progressive Barrier for Derivative-Free Nonlinear Programming

We propose a new algorithm for general constrained derivative-free optimization. As in most methods, constraint violations are aggregated into a single constraint violation function. As in filter methods, a threshold, or barrier, is imposed on the constraint violation function, and any trial point whose constraint violation function value exceeds this threshold is discarded from consideration. ...

Journal:
  • Applied Mathematics and Computer Science

Volume 20, Issue

Pages -

Publication date: 2010