Evolution Strategies on Noisy Functions: How to Improve Convergence Properties

Authors

  • Ulrich Hammel
  • Thomas Bäck
Abstract

Evolution Strategies are reported to be robust in the presence of noise, which in general hinders the optimization process. In this paper we discuss the influence of some of the strategy parameters and strategy variants on the convergence process, and we discuss measures for improving the convergence properties. After taking a broad look at the theory of the dynamics of a (1,λ)-ES on a simple quadratic function, we numerically investigate the influence of the parent population size and of the introduction of recombination. Finally, we compare the effects of multiple sampling of the objective function versus enlargement of the population size on the convergence precision as well as the convergence reliability, using the example of the multimodal Rastrigin function.

1 Introduction

Evolution strategies are claimed to be well suited for experimental optimization [8, 7], where optimal features of a physical system, e.g. the shape of a nozzle [6], are searched for through a series of experiments. Typically, a formal model describing the system properties appropriately is not available, so the system has to be viewed as a black box, in the sense that, given a set of parameter values, we observe a corresponding model quality. This complicates the search process, because we cannot derive analytical information such as gradients directly. Furthermore, the observations are usually disturbed, e.g. due to the limited accuracy of experimentation and observation. To be of any value in this field, an optimization strategy must be robust with respect to noise. It is this aspect of evolution strategies that we want to investigate in the following.

We consider only the case of real-valued objective functions f : M ⊆ ℝⁿ → ℝ with additive normally distributed noise, where M is called the search space:

F(x) = f(x) + δ,   (1)

where δ is a random variable with a Gaussian distribution N(0, σ). The task is to find a global optimum point x* with

∀ x ∈ M : f(x*) ≤ f(x),   (2)

by observations of F(x).
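
The noise model and the two noise-reduction measures compared in the paper (enlarging the offspring population versus multiple sampling of the objective function) can be illustrated with a small numerical sketch. The following Python snippet is not the authors' implementation: the function names, the quadratic test function, and the deterministic step-size decay (used in place of the self-adaptation a real ES would employ) are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Noise-free quadratic test function (sphere model), hypothetical choice."""
    return float(np.dot(x, x))

def F(x, noise_sigma=0.1):
    """Noisy observation F(x) = f(x) + delta, delta ~ N(0, noise_sigma)."""
    return f(x) + rng.normal(0.0, noise_sigma)

def one_comma_lambda_es(n=10, lam=10, resamples=1, sigma=0.3,
                        generations=200, noise_sigma=0.1):
    """Sketch of a (1,lambda)-ES on the noisy objective F.

    resamples > 1 averages several noisy evaluations per offspring,
    trading extra function evaluations for a lower effective noise level;
    enlarging lam is the alternative measure discussed in the paper.
    """
    parent = rng.normal(size=n)
    for _ in range(generations):
        offspring = parent + sigma * rng.normal(size=(lam, n))
        # Selection is based on the observed (noisy, possibly averaged) values.
        observed = [np.mean([F(x, noise_sigma) for _ in range(resamples)])
                    for x in offspring]
        parent = offspring[int(np.argmin(observed))]  # comma selection: old parent is discarded
        sigma *= 0.98  # crude decay instead of self-adaptation (simplification)
    return parent, f(parent)

for k in (1, 5):
    _, quality = one_comma_lambda_es(resamples=k)
    print(f"resamples={k}: residual f(x) = {quality:.3e}")
```

Running the sketch with different values of `resamples` and `lam` gives a rough feel for the trade-off studied in the paper; it makes no claim about the quantitative results reported there.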


Similar articles

Log-log Convergence for Noisy Optimization

We consider noisy optimization problems, without the assumption of variance vanishing in the neighborhood of the optimum. We show mathematically that simple rules with an exponential number of resamplings lead to a log-log convergence rate. In particular, in this case the log of the distance to the optimum is linear in the log of the number of resamplings. As well as with number of resamplings polyn...
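
As a toy illustration of such a resampling rule (not the construction analyzed in the cited paper), the sketch below averages an exponentially growing number of noisy evaluations per iteration of a simple random-search step; all names, constants, and the test function are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_f(x, noise_sigma=1.0):
    """Noisy fitness whose variance does not vanish near the optimum at 0."""
    return float(np.dot(x, x)) + rng.normal(0.0, noise_sigma)

def averaged_f(x, r):
    """Average r resamplings; the noise std shrinks roughly like 1/sqrt(r)."""
    return float(np.mean([noisy_f(x) for _ in range(r)]))

x = rng.normal(size=5)
step = 0.5
for t in range(1, 25):
    r = int(np.ceil(1.5 ** t))           # exponential resampling schedule
    candidate = x + step * rng.normal(size=x.size)
    if averaged_f(candidate, r) < averaged_f(x, r):
        x = candidate                     # accept only apparently better points
    step *= 0.9
    # log of the distance to the optimum can be tracked against log(total resamplings)
print(np.linalg.norm(x))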


Convergence Properties of Curvature Scale Space Representations

Curvature Scale Space (CSS) representations have been shown to be very useful for recognition of noisy curves of arbitrary shapes at unknown orientations and scales [10,14]. This paper contains a number of important results on the convergence properties of CSS representations and on the evolution and arc length evolution of planar curves [6,12]. The processes which convolve arc length parametri...


Towards a Theory of 'Evolution Strategies': Some Asymptotical Results from the (1,+λ)-Theory

A method for the determination of the progress rate and the probability of success for the 'Evolution Strategy' (ES) is presented. The new method is based on the asymptotical behavior of the χ-distribution and yields exact results in the case of infinite-dimensional parameter spaces. The technique is demonstrated for the (1,+λ)-ES using a spherical model including noisy quality functions. The resu...


Mutate Large, but Inherit Small! On the Analysis of Rescaled Mutations in (1̃,λ̃)-ES with Noisy Fitness Data

The paper presents the asymptotical analysis of a technique for improving the convergence of evolution strategies (ES) on noisy fitness data. This technique, which may be called "Mutate large, but inherit small", is discussed in light of the EPP (evolutionary progress principle). The derivation of the progress rate formula is sketched, its predictions are compared with experiments, and its limitat...
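
A rough sketch of the general idea behind such rescaled mutations (large test steps for selection, a scaled-down step for inheritance) is given below. This is only one illustrative reading of the technique with hypothetical parameter choices, not the algorithm or derivation of the cited paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def noisy_f(x, noise_sigma=0.5):
    """Noisy quadratic fitness (hypothetical test function)."""
    return float(np.dot(x, x)) + rng.normal(0.0, noise_sigma)

def rescaled_mutation_es(n=10, lam=10, sigma=0.1, kappa=5.0, generations=300):
    """'Mutate large, but inherit small' sketch:
    offspring are *evaluated* at large test steps kappa*sigma*z, which makes
    fitness differences easier to detect under noise, but the selected
    direction z is *inherited* with the small step sigma*z only.
    """
    parent = rng.normal(size=n)
    for _ in range(generations):
        z = rng.normal(size=(lam, n))              # mutation directions
        trial_points = parent + kappa * sigma * z  # mutate large
        observed = [noisy_f(x) for x in trial_points]
        best = int(np.argmin(observed))
        parent = parent + sigma * z[best]          # inherit small
    return parent

x = rescaled_mutation_es()
print(np.linalg.norm(x))  # distance to the optimum at the origin
```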


Fitness Noise and Localization Errors of the Optimum in General Quadratic Fitness Models

Evolutionary algorithms are generally believed to perform well in the presence of noise. Thus, they are often used for optimization in noisy environments. It comes as a surprise that hardly more than a handful of studies has dealt with the question of just how well they are doing and what can be done to improve their performance. The present paper presents empirical results regarding the behavi...




Publication date: 1994