A harmonic framework for stepsize selection in gradient methods
Authors
Giulia Ferrandi, Michiel E. Hochstenbach, Nataša Krejić
Abstract
We study the use of inverse harmonic Rayleigh quotients with target for the stepsize selection in gradient methods for nonlinear unconstrained optimization problems. This not only provides an elegant and flexible framework to parametrize and reinterpret existing stepsize schemes, but it also gives inspiration for new tunable families of steplengths. In particular, we analyze and extend the adaptive Barzilai–Borwein method to a family of stepsizes. While this family exploits negative values of the target, we also consider positive targets. We present a convergence analysis for quadratic problems, extending results by Dai and Liao (IMA J Numer Anal 22(1):1–10, 2002), and carry out experiments outlining the potential of the approaches.
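To make the abstract's ingredients concrete, here is a hedged sketch (an illustration under stated assumptions, not the paper's exact parametrization or its adaptive target rule): for a quadratic f(x) = (1/2) x^T A x - b^T x, the secant pair s = x_k - x_{k-1}, y = g_k - g_{k-1} satisfies y = A s, and one standard harmonic Rayleigh quotient of A at s with target tau is theta = tau + ||y - tau*s||^2 / ((y - tau*s)^T s); its inverse at tau = 0 is the classical BB2 stepsize s^T y / y^T y. The Python sketch below runs a gradient method with this stepsize; all function names are illustrative.

```python
import numpy as np

def harmonic_target_step(s, y, tau):
    """Inverse harmonic Rayleigh quotient with target tau (illustrative form).

    theta = tau + ||y - tau*s||^2 / ((y - tau*s)^T s); for tau = 0 the
    returned step 1/theta reduces to the BB2 stepsize s^T y / y^T y.
    """
    r = y - tau * s
    theta = tau + (r @ r) / (r @ s)
    return 1.0 / theta

def gradient_method(A, b, x0, tau=0.0, tol=1e-8, max_iter=500):
    """Gradient descent on f(x) = 0.5 x^T A x - b^T x with harmonic stepsizes."""
    x = x0.astype(float)
    g = A @ x - b                           # gradient of the quadratic
    alpha = 1.0 / np.linalg.norm(A, 2)      # conservative first step: 1/lambda_max
    for k in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g         # secant pair; y = A s for quadratics
        alpha = harmonic_target_step(s, y, tau)
        x, g = x_new, g_new
    return x, k

# Usage on a random symmetric positive definite quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)
b = rng.standard_normal(50)
x_star, iters = gradient_method(A, b, np.zeros(50))
print(iters, np.linalg.norm(A @ x_star - b))
```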
Similar resources
A framework for identifying and prioritizing factors affecting customers' online shopping behavior in Iran
The purpose of this study is to identify the factors that lead customers in Iran to shop online and to investigate the importance of the discovered factors in online customers' decisions. In the identification phase, to discover the factors affecting the online shopping behavior of customers in Iran, a reference model summarizing the antecedents of online shopping proposed by Chang et al. was us...
A New Framework for Distributed Multivariate Feature Selection
Feature selection is considered an important issue in the classification domain. Selecting good features, by maximizing relevance to the class label and minimizing redundancy among the features, improves classification accuracy. However, most current feature selection algorithms only work as centralized methods. In this paper, we suggest a distributed version of the mRMR featu...
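For context on the mRMR criterion this snippet refers to, here is a minimal centralized (not distributed) greedy sketch: each step adds the feature with the highest relevance to the class label minus its average redundancy to the features already selected. The scikit-learn mutual information estimators are used as a convenient stand-in; the function and its details are assumptions, not the paper's distributed algorithm.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

def mrmr_select(X, y, k):
    """Greedy mRMR sketch: add the feature maximizing relevance to the
    label minus mean redundancy against the features selected so far."""
    relevance = mutual_info_classif(X, y, random_state=0)   # I(f_j; y)
    selected = [int(np.argmax(relevance))]
    candidates = set(range(X.shape[1])) - set(selected)
    while len(selected) < k and candidates:
        def score(j):
            # Mean mutual information between candidate j and selected features.
            red = np.mean([mutual_info_regression(X[:, [j]], X[:, s],
                                                  random_state=0)[0]
                           for s in selected])
            return relevance[j] - red
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```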
Stepsize Selection for Approximate Value Iteration and a New Optimal Stepsize Rule
Approximate value iteration is used in dynamic programming when we use random observations to estimate the value of being in a state. These observations are smoothed to approximate the expected value function, leading to the problem of choosing a stepsize (the weight given to the most recent observation). A stepsize of 1/n is a common (and provably convergent) choice. However, we prove that it ...
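A small aside that this snippet makes precise: with stepsize 1/n, the smoothing update v_n = (1 - 1/n) v_{n-1} + (1/n) v_hat_n is exactly the running average of the observations, which is why the choice is provably convergent. A minimal sketch, where the noisy-observation setup is illustrative rather than the paper's model:

```python
import numpy as np

# Smoothing noisy observations v_hat of a value with stepsize alpha_n = 1/n:
# v_n = (1 - 1/n) * v_{n-1} + (1/n) * v_hat_n, i.e. the running sample mean.
rng = np.random.default_rng(1)
true_value, v = 10.0, 0.0
for n in range(1, 10_001):
    v_hat = true_value + rng.standard_normal()   # noisy observation
    v += (1.0 / n) * (v_hat - v)                 # 1/n stepsize update
print(v)   # close to 10.0 by the law of large numbers
```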
Convergence of Conjugate Gradient Methods with a Closed-Form Stepsize Formula
Conjugate gradient methods are efficient methods for minimizing differentiable objective functions in high-dimensional spaces. However, convergent line search strategies are usually not easy to choose or to implement. Sun and colleagues (Ann. Oper. Res. 103:161–173, 2001; J. Comput. Appl. Math. 146:37–45, 2002) introduced a simple stepsize formula. However, the associated convergence domain ha...
Journal
Journal title: Computational Optimization and Applications
Year: 2023
ISSN: 0926-6003, 1573-2894
DOI: https://doi.org/10.1007/s10589-023-00455-6