Search results for: parameter tuning

Number of results: 260958

2017
Fabienne Comte, Jan Johannes

We consider the estimation of the slope function in functional linear regression, where scalar responses are modeled as depending on random functions. Cardot and Johannes [2010] have shown that a thresholded projection estimator can attain, up to a constant, minimax rates of convergence in a general framework which covers the prediction problem with respect to the mean squared predictio...
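For orientation, the scalar-on-function model referred to here is usually written as below; this is a standard formulation of functional linear regression, not the paper's own notation.

```latex
% Functional linear regression: Y is the scalar response, X a random function
% on [0,1], \beta the unknown slope function, and \varepsilon a noise term.
Y = \int_0^1 \beta(t)\, X(t)\, \mathrm{d}t + \varepsilon .
```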

2010
Selmar K. Smit, A. E. Eiben

We present a case study demonstrating that, using the REVAC parameter tuning method, we can greatly improve the ‘world champion’ EA (the winner of the CEC2005 competition) with little effort. For ‘normal’ EAs the margins for possible improvement are likely much bigger. Thus, the main message of this paper is that, using REVAC, great performance improvements are possible for many EAs at moderate co...

2010
Kyupil Yeon, Moon Sup Song, Yongdai Kim, Hosik Choi, Cheolwoo Park

A supervised learning algorithm aims to build a prediction model from training examples. This paradigm typically assumes that the underlying distribution and the true input-output dependency do not change. However, these assumptions often fail to hold, especially in data streams. This phenomenon is known as concept drift. We propose a new model combining algorithm for tracking co...
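The truncated abstract does not spell out the combining rule, so the following is only a minimal illustrative sketch of the general idea behind model combining under concept drift: keep several models and re-weight them by their recent error, so the ensemble follows the current concept. The names and the exponential weighting scheme are invented for illustration; this is not the authors' algorithm.

```python
import math

def update_weights(weights, losses, eta=0.5):
    """Down-weight models that did poorly on the most recent batch."""
    new = [w * math.exp(-eta * loss) for w, loss in zip(weights, losses)]
    total = sum(new)
    return [w / total for w in new]

def ensemble_predict(models, weights, x):
    """Weighted average of the member models' predictions."""
    return sum(w * m(x) for m, w in zip(models, weights))

# Toy example: two constant "models"; recent data follow the second concept,
# so its weight grows batch after batch.
models = [lambda x: 0.0, lambda x: 1.0]
weights = [0.5, 0.5]
for _ in range(5):
    losses = [(m(0) - 1.0) ** 2 for m in models]   # squared error on label 1.0
    weights = update_weights(weights, losses)
print(weights)   # weight has shifted toward the second model
```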

2009
Günter Rudolph, Mike Preuss, Jan Quadflieg

The problem of detecting suitable parameters for metaheuristic optimization algorithms has long been well known. As these nondeterministic methods, e.g. evolution strategies (ES) [1], are highly adaptable to a specific application, detecting good parameter settings is vital for their success. Performance differences of orders of magnitude (in time and/or quality) are often achieved by means of ...

2008
Peter Athron

Solving the fine-tuning problem is one of the principal motivations for supersymmetry. However, constraints on the parameter space of the Minimal Supersymmetric Standard Model (MSSM) suggest it may also require fine tuning (although to a much lesser extent). To compare this tuning with that of different extensions of the Standard Model (including other supersymmetric models), it is essential that we have a r...
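For context, the most widely used quantitative tuning measure in this literature (not necessarily the refined measure developed in this work) is the Barbieri–Giudice sensitivity of the Z-boson mass to the fundamental parameters a_i:

```latex
% Barbieri-Giudice fine-tuning measure: the largest logarithmic
% sensitivity of M_Z^2 to any fundamental parameter a_i.
\Delta = \max_i \left| \frac{\partial \ln M_Z^2}{\partial \ln a_i} \right| .
```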

Journal: Swarm and Evolutionary Computation, 2011
A. E. Eiben, Selmar K. Smit

In this paper we present a conceptual framework for parameter tuning, provide a survey of tuning methods, and discuss related methodological issues. The framework is based on a three-tier hierarchy of a problem, an evolutionary algorithm (EA), and a tuner. Furthermore, we distinguish problem instances, parameters, and EA performance measures as major factors, and discuss how tuning can be direc...
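The three-tier hierarchy can be pictured as three nested evaluation loops: the tuner proposes parameter vectors, each vector configures an EA, and the EA is scored on the problem. The sketch below is only a schematic reading of that layering; all names, the toy EA, and the random-search tuner are illustrative and not taken from the paper.

```python
import random

def problem(x):                        # tier 1: the problem (here: sphere function)
    return sum(v * v for v in x)

def run_ea(mutation_sigma, pop_size, generations=50, dim=5):
    """Tier 2: a toy (1+lambda)-style EA configured by the given parameters;
    returns its performance (best fitness found, lower is better)."""
    best = [random.uniform(-5, 5) for _ in range(dim)]
    for _ in range(generations):
        kids = [[v + random.gauss(0, mutation_sigma) for v in best]
                for _ in range(pop_size)]
        best = min(kids + [best], key=problem)
    return problem(best)

def tuner(budget=20):
    """Tier 3: a naive random-search tuner over the EA's parameters."""
    trials = [(random.uniform(0.01, 1.0), random.randint(4, 40))
              for _ in range(budget)]
    return min(trials, key=lambda p: run_ea(*p))

print(tuner())    # the best (mutation_sigma, pop_size) pair found
```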

2007
Md. Nurul Haque Mollah, Nayeema Sultana, Mihoko Minami, Shinto Eguchi

This paper discusses a new highly robust learning algorithm for exploring local principal component analysis (PCA) structures, in which the observed data follow one of several heterogeneous PCA models. The proposed method is formulated by minimizing the β-divergence. It searches for a local PCA structure based on an initial location of the shifting parameter and a value of the tuning parameter β. If the ...
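One standard form of the β-divergence being minimized, between an empirical density g and a model density f, is given below; as β → 0 it reduces to the Kullback–Leibler divergence. This is a common parameterization, not necessarily the exact normalization used in the paper.

```latex
% beta-divergence between densities g and f (one common parameterization);
% minimizing it over the model yields estimators robust to outliers.
D_\beta(g, f) = \int \left[ \frac{1}{\beta}\bigl(g(x)^{\beta} - f(x)^{\beta}\bigr)\, g(x)
  \;-\; \frac{1}{\beta + 1}\bigl(g(x)^{\beta + 1} - f(x)^{\beta + 1}\bigr) \right] dx .
```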

2012
Aldeida Aleti, Sanaz Mostaghim, Irene Moser

All existing stochastic optimisers, such as Evolutionary Algorithms, require parameterisation, which has a significant influence on the algorithm’s performance. In most cases, practitioners assign static values to the variables after an initial tuning phase. This parameter tuning method requires experience the practitioner may not have and, when done conscientiously, is rather time-consuming. Also, th...
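A classic, minimal example of adapting a parameter during the run, rather than freezing it after an offline tuning phase, is Rechenberg's 1/5 success rule for the mutation step size of a (1+1) evolution strategy. The sketch below illustrates that general idea only; it is not the adaptive method proposed in this paper.

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def one_plus_one_es(f, dim=5, iters=500, sigma=1.0):
    """(1+1)-ES with the 1/5 success rule: grow sigma when more than ~1/5 of
    recent mutations succeed, shrink it otherwise (illustrative sketch)."""
    x = [random.uniform(-5, 5) for _ in range(dim)]
    fx, successes = f(x), 0
    for t in range(1, iters + 1):
        y = [v + random.gauss(0, sigma) for v in x]
        fy = f(y)
        if fy < fx:                      # successful mutation
            x, fx = y, fy
            successes += 1
        if t % 20 == 0:                  # adapt the step size every 20 trials
            sigma *= 1.5 if successes / 20 > 0.2 else 0.82
            successes = 0
    return fx, sigma

print(one_plus_one_es(sphere))   # final fitness and the adapted step size
```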

Journal: Soft Comput., 2011
Marco Antonio Montes de Oca, Dogan Aydin, Thomas Stützle

The development cycle of high-performance optimization algorithms requires the designer to make several design decisions. These decisions range from implementation details to the setting of parameter values for testing intermediate designs. Proper parameter setting can be crucial for the effective assessment of algorithmic components because a bad parameter setting can make a good algorithmic c...

2011
Christoph Bernau, Thomas Augustin, Anne-Laure Boulesteix

High-dimensional binary classification tasks, e.g. the classification of microarray samples into normal and cancer tissues, usually involve a tuning parameter that adjusts the complexity of the applied method to the examined data set. When only the performance of the best tuning parameter value is reported, over-optimistic prediction errors are published. The contribution of this paper is twofold. Fi...
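A standard remedy for exactly this over-optimism is nested cross-validation: the tuning parameter is chosen in an inner loop, and the error is estimated on outer folds that the selection never saw. The sketch below uses scikit-learn; the synthetic dataset and the parameter grid are placeholders, and this is not the correction procedure developed in the paper.

```python
# Nested cross-validation: the outer loop estimates the error of the whole
# "tune-then-fit" procedure, so the reported error is not biased by picking
# the best tuning parameter on the same data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, cross_val_score

X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=10, random_state=0)

inner = GridSearchCV(                              # inner loop: choose C by CV
    LogisticRegression(penalty="l2", max_iter=5000),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=3,
)
outer_scores = cross_val_score(inner, X, y, cv=5)  # outer loop: honest error
print(outer_scores.mean())
```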

[Chart: number of search results per year]