Expected complexity analysis of stochastic direct-search
Authors
Abstract
This work presents the convergence rate analysis of stochastic variants of the broad class of direct-search methods of directional type. It introduces an algorithm designed to optimize differentiable objective functions f whose values can only be computed through a stochastically noisy blackbox. The proposed stochastic directional direct-search (SDDS) algorithm accepts new iterates by imposing a sufficient decrease condition on so-called probabilistic estimates of the corresponding unavailable objective function values. The accuracy of such estimates is required to hold with a sufficiently large but fixed probability $$\beta$$. The analysis of this method utilizes an existing supermartingale-based framework proposed for the convergence rates of stochastic optimization methods that use adaptive step sizes. It aims to show that the expected number of iterations required to drive the norm of the gradient below a given threshold $$\epsilon$$ is bounded in $${\mathcal {O}}\left( \epsilon ^{\frac{-p}{\min (p-1,1)}}/(2\beta -1)\right)$$ for $$p>1$$. Unlike prior analyses using the same aforementioned framework, such as those of stochastic trust-region and stochastic line search methods, SDDS does not use any gradient information to find descent directions. However, its convergence rate is similar to that of both latter methods, with a dependence on $$\epsilon$$ that also matches that of the broad class of deterministic directional direct-search methods which accept new iterates by imposing a sufficient decrease condition.
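The abstract gives no pseudocode, so the following is a minimal Python sketch of the kind of iteration it describes: noisy blackbox values are averaged into estimates, a trial point along a polling direction is accepted only under a sufficient decrease on the order of the squared step size, and the step size adapts to success or failure. The oracle `noisy_f`, the averaging sample size `m`, the forcing constant `c`, and the coordinate polling set are illustrative assumptions, not the exact SDDS algorithm or its probabilistic accuracy requirement in $$\beta$$.

```python
import numpy as np

def sdds_sketch(noisy_f, x0, alpha0=1.0, c=1e-2, m=30,
                gamma_inc=2.0, gamma_dec=0.5, max_iter=200, rng=None):
    """Sketch of a stochastic directional direct-search iteration.

    noisy_f : callable returning a noisy evaluation of the objective at a point.
    Estimates of f are formed by averaging m noisy calls (one simple way to make
    them accurate with high probability); a trial point is accepted only if its
    estimate improves on the incumbent by at least c * alpha**2.
    """
    rng = np.random.default_rng(rng)
    x, alpha = np.asarray(x0, dtype=float), alpha0
    n = x.size
    # Positive spanning set: +/- coordinate directions (a common polling choice).
    directions = np.vstack([np.eye(n), -np.eye(n)])

    def estimate(z):
        # Average m noisy evaluations to build a function-value estimate.
        return np.mean([noisy_f(z) for _ in range(m)])

    for _ in range(max_iter):
        f_x = estimate(x)
        success = False
        for d in rng.permutation(directions):
            trial = x + alpha * d
            if estimate(trial) <= f_x - c * alpha**2:  # sufficient decrease test
                x, success = trial, True
                break
        # Expand the step size after a successful iteration, shrink it otherwise.
        alpha = alpha * gamma_inc if success else alpha * gamma_dec
    return x

# Usage: minimize a noisy quadratic.
if __name__ == "__main__":
    noise = np.random.default_rng(0)
    noisy_quadratic = lambda x: float(np.sum(x**2) + 0.01 * noise.normal())
    print(sdds_sketch(noisy_quadratic, x0=[2.0, -1.5], rng=1))
```

In the paper the estimate accuracy is controlled probabilistically rather than by a fixed sample average; the sketch only mirrors the acceptance and step-size mechanics.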
Related papers
Simple Complexity Analysis of Direct Search
We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we study the direct search method (of directional type). Despite relevant research activity spanning several decades, until recently no complexity guarantees (bounds on the number of function evaluations needed to find a satisfying point) for methods of this type were established...
Simple Complexity Analysis of Simplified Direct Search
We consider the problem of unconstrained minimization of a smooth function in the derivative-free setting. In particular, we propose and study a simplified variant of the direct search method (of directional type), which we call simplified direct search (SDS). Unlike standard direct search methods, which depend on a large number of parameters that need to be tuned, SDS depends on a single scalar...
Time Complexity Analysis of the Stochastic Diffusion Search
The Stochastic Diffusion Search algorithm, an integral part of Stochastic Search Networks, is investigated. Stochastic Diffusion Search is an alternative solution for invariant pattern recognition and focus of attention. It has been shown that the algorithm can be modelled as an ergodic, finite state Markov Chain under some non-restrictive assumptions. Sub-linear time complexity for some settings...
Worst case complexity of direct search
In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, shares the worst-case complexity bound of steepest descent for the unconstrained minimization of a smooth function, more precisely that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold...
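As an illustration of that bound (my own worked instance, assuming a Lipschitz-continuous gradient and the standard forcing function $$\rho(\alpha) = c\,\alpha^2$$, not text from the cited paper):

```latex
% Sufficient decrease test for a poll direction d and step size \alpha_k:
f(x_k + \alpha_k d) \le f(x_k) - c\,\alpha_k^{2}.
% Worst-case effort to reach \|\nabla f(x_k)\| \le \epsilon, matching steepest descent:
k = \mathcal{O}\!\left(\epsilon^{-2}\right),
% i.e. the p = 2 case of \epsilon^{\frac{-p}{\min(p-1,1)}} from the abstract above,
% without the 1/(2\beta - 1) factor that the stochastic analysis adds.
```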
A direct stochastic algorithm for global search
This paper presents a new algorithm called PGSL (Probabilistic Global Search Lausanne). PGSL is founded on the assumption that optimal solutions can be identified through focusing search around sets of good solutions. Tests on benchmark problems having multi-parameter non-linear objective functions revealed that PGSL performs better than genetic algorithms and advanced algorithms for simulated annealing...
Journal
Journal title: Computational Optimization and Applications
Year: 2021
ISSN: ['0926-6003', '1573-2894']
DOI: https://doi.org/10.1007/s10589-021-00329-9