Search results for: inexact search directions

Number of results: 387,345

Journal: Computational Optimization and Applications, 2013
Ellen H. Fukuda, L. M. Graña Drummond

In this work, we propose an inexact projected gradient-like method for solving smooth constrained vector optimization problems. In the unconstrained case, we retrieve the steepest descent method introduced by Graña Drummond and Svaiter. In the constrained setting, the method we present extends the exact one proposed by Graña Drummond and Iusem, since it admits relative errors on the search dire...
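The idea of a projected-gradient step that tolerates relative errors on the search direction can be sketched as follows. This is a minimal illustration under assumed box constraints and an assumed multiplicative perturbation model, not the authors' method; all names (`project`, `inexact_projected_gradient`, `rel_err`) are hypothetical.

```python
import random

def project(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n (componentwise clipping)."""
    return [min(max(xi, lo), hi) for xi in x]

def inexact_projected_gradient(grad, x, lo, hi, step=0.1,
                               rel_err=0.2, iters=500, seed=0):
    """Projected gradient descent that admits a relative error on the
    search direction: each component of the exact direction v is scaled
    by a factor in [1 - rel_err, 1 + rel_err] (assumed perturbation model)."""
    rng = random.Random(seed)
    for _ in range(iters):
        g = grad(x)
        # exact search direction: v = P(x - step * g) - x
        trial = project([xi - step * gi for xi, gi in zip(x, g)], lo, hi)
        v = [ti - xi for ti, xi in zip(trial, x)]
        # inexact direction: admit a bounded relative error on v
        d = [vi * (1.0 + rng.uniform(-rel_err, rel_err)) for vi in v]
        x = project([xi + di for xi, di in zip(x, d)], lo, hi)
    return x
```

On a separable quadratic with minimizer inside the box, the perturbed direction remains a descent direction componentwise, so the iterates still converge despite the inexactness.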

Journal: Journal of Optimization Theory and Applications, 2022

Abstract Based on a result by Taylor et al. (J Optim Theory Appl 178(2):455–476, 2018) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, an elementary analysis of general descent methods with fixed step sizes is presented. It covers variable metric methods, gradient-related search directions under angle and scaling conditions, as well as inexac...
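As a sketch of the kind of bound involved (the unperturbed baseline only; the constants for variable-metric or inexact directions differ): for an $L$-smooth, $\mu$-strongly convex $f$ and a fixed step size $h$, the tight function-value rate of plain gradient descent takes the form

```latex
f(x_k) - f_\ast \;\le\; \max\bigl\{(1-\mu h)^2,\,(1-L h)^2\bigr\}^{k}\,\bigl(f(x_0) - f_\ast\bigr),
\qquad 0 < h < \tfrac{2}{L}.
```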

2012
Paul Draghicescu, Corey Olson

Inexact search is a difficult and time-consuming task with widespread application. Acceleration of inexact search could have tremendous impact upon fields such as chemistry, meteorology, and even bioinformatics. Field-programmable gate arrays provide a means by which to accelerate this process. We demonstrate the acceleration of inexact search using the Burrows-Wheeler transform on FPGAs using t...
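The core primitive behind Burrows-Wheeler-based search is FM-index backward search; inexact variants extend it by branching over mismatches. A minimal software sketch of the exact backward-search primitive (not the FPGA design from this entry; `bwt_index`, `occ`, and `backward_search` are illustrative names):

```python
def bwt_index(text):
    """Build the BWT of text + '$' and the C table (count of characters
    lexicographically smaller than c)."""
    text += "$"
    sa = sorted(range(len(text)), key=lambda i: text[i:])  # suffix array
    bwt = "".join(text[i - 1] for i in sa)
    C, total = {}, 0
    for c in sorted(set(text)):
        C[c] = total
        total += text.count(c)
    return bwt, C

def occ(bwt, c, i):
    """Occurrences of character c in bwt[:i] (naive; real indexes use rank tables)."""
    return bwt[:i].count(c)

def backward_search(bwt, C, pattern):
    """Count occurrences of pattern via FM-index backward search."""
    lo, hi = 0, len(bwt)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + occ(bwt, c, lo)
        hi = C[c] + occ(bwt, c, hi)
        if lo >= hi:
            return 0
    return hi - lo
```

Each character of the pattern, processed right to left, narrows a suffix-array interval in O(1) rank queries; this constant per-character work is what makes the operation attractive for hardware acceleration.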

Journal: Journal of Computational and Applied Mathematics, 2023

We study unconstrained optimization problems with a nonsmooth and convex objective function in the form of a mathematical expectation. The proposed method approximates the expected objective by a sample average, with Inexact Restoration-based adaptive sample sizes. The sample size is chosen in an adaptive manner based on Inexact Restoration. The algorithm uses a line search and assumes descent directions with respect to the current approximate function. We prove a.s...

Journal: Mathematical Problems in Engineering, 2012

2011
Weijun Zhou

A hybrid HS and PRP type conjugate gradient method for smooth optimization is presented, which reduces to the classical PRP or HS method if exact line search is used, and converges globally and R-linearly for nonconvex functions with an inexact backtracking line search under standard assumptions. An inexact version of the proposed method, which admits a possible approximate gradient and/or approxi...
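For context, the two classical update parameters being hybridized are (standard definitions from the conjugate-gradient literature, not transcribed from this truncated abstract):

```latex
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
\qquad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2},
\qquad
y_k = g_{k+1} - g_k,
```

with the search direction updated as $d_{k+1} = -g_{k+1} + \beta_k d_k$.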

1995
W. Gomez

The paper studies convergence properties of numerical optimization algorithms based on the natural idea of searching along several directions, where one-processor computation is assumed. Global convergence theorems (with exact arithmetic) are proved for two classes of "descent" methods in R^n and for one class of such methods in Hilbert spaces, in the framework of unconstrained problems and where ...

2009
Bing Zhao, Shengyuan Chen

We propose a variation of the simplex-downhill algorithm specifically customized for optimizing parameters in a statistical machine translation (SMT) decoder for better end-user automatic evaluation metric scores for translations, such as versions of BLEU, TER and mixtures of them. The traditional simplex-downhill method has the advantage of derivative-free computation of objective functions, yet still gives sa...
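The baseline simplex-downhill (Nelder-Mead) procedure that this entry customizes can be sketched in a few dozen lines. This is a simplified textbook variant (single inside contraction rather than the inside/outside split), not the authors' SMT-specific version; the name `nelder_mead` and all defaults are assumptions.

```python
def nelder_mead(f, x0, step=0.5, iters=300,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Minimal derivative-free downhill-simplex minimizer."""
    n = len(x0)
    # initial simplex: x0 plus n vertices offset along each coordinate axis
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(iters):
        simplex.sort(key=f)                      # best first, worst last
        best, worst = simplex[0], simplex[-1]
        # centroid of all vertices except the worst
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [cen[i] + alpha * (cen[i] - worst[i]) for i in range(n)]
        if f(best) <= f(refl) < f(simplex[-2]):
            simplex[-1] = refl                   # accept reflection
        elif f(refl) < f(best):
            exp = [cen[i] + gamma * (refl[i] - cen[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl  # try expansion
        else:
            con = [cen[i] + rho * (worst[i] - cen[i]) for i in range(n)]
            if f(con) < f(worst):
                simplex[-1] = con                # accept contraction
            else:                                # shrink toward the best vertex
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]
```

Because only function values are compared, the same loop works when `f` is an expensive black box such as a decoder-plus-metric pipeline, which is exactly why it is popular for SMT parameter tuning.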

Journal: Mathematical Programming, 1993
Jorge Nocedal, Ya-Xiang Yuan

We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in additi...
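One common form of the Oren-Luenberger self-scaling BFGS update (a sketch from the standard literature, not transcribed from this paper) rescales the previous Hessian approximation before the usual BFGS correction:

```latex
B_{k+1} \;=\; \gamma_k \left( B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k} \right)
          \;+\; \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
\gamma_k = \frac{y_k^{\top} s_k}{s_k^{\top} B_k s_k},
```

with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$; setting $\gamma_k \equiv 1$ recovers the ordinary BFGS update.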

[Chart: number of search results per year]