Search results for: backtracking armijo line search
Number of results: 694472
In this paper, a double-step-length symmetric splitting sequential quadratic optimization (DSL-SS-SQO) algorithm for solving two-block nonconvex optimization problems with nonlinear constraints is proposed. First, at each iteration, the idea of symmetric splitting is embedded into the quadratic optimization (QO) subproblem approximating the discussed problem. As a result, the QO subproblem is split into two small-scale QOs, which can generate improved search directions for the primal variables. Second...
We consider constrained optimization problems with a nonsmooth objective function in the form of a mathematical expectation. The Sample Average Approximation (SAA) is used to estimate the objective function, and a variable sample size strategy is employed. The proposed algorithm combines an SAA subgradient with a spectral coefficient in order to provide a suitable search direction, which improves the performance of the first method, as shown by numerical results....
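As an illustration only, and not the paper's algorithm, the sketch below combines a sample-average subgradient with a Barzilai-Borwein-type spectral step length and a growing sample size on a least-absolute-deviations model; the `sample` generator, the step safeguards, and the growth factor are all assumptions chosen for the example.

```python
import numpy as np

def vss_saa_subgradient(sample, x0, n0=10, growth=1.1, max_iter=50, seed=0):
    """Variable-sample-size SAA sketch: each iteration draws a fresh sample,
    takes a subgradient of the sample-average objective, and scales it by a
    Barzilai-Borwein-type spectral coefficient; the sample size grows."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n, alpha = n0, 1.0
    x_prev, g_prev = None, None
    for _ in range(max_iter):
        A, b = sample(rng, int(n))                     # i.i.d. sample of the current size
        g = A.T @ np.sign(A @ x - b) / len(b)          # subgradient of (1/n) sum |a_i^T x - b_i|
        if g_prev is not None:
            s, y = x - x_prev, g - g_prev
            if abs(s @ y) > 1e-12:
                alpha = abs((s @ s) / (s @ y))         # BB1-type spectral step (assumed choice)
            alpha = min(max(alpha, 1e-3), 10.0)        # crude safeguard on the step length
        x_prev, g_prev = x.copy(), g.copy()
        x = x - alpha * g
        n *= growth                                    # variable sample size: grow the sample
    return x

# Usage: least-absolute-deviations regression; the population minimizer is x_true
x_true = np.array([1.0, -2.0])
def sample(rng, n):
    A = rng.standard_normal((n, 2))
    return A, A @ x_true + 0.1 * rng.standard_normal(n)

print(vss_saa_subgradient(sample, x0=np.zeros(2)))
```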
We examine the retrieval and mapping tasks of visual analogy as constraint satisfaction problems. We describe a constraint satisfaction method for the two tasks; the method organizes the source cases in a discrimination tree, uses heuristics to guide the search, performs backtracking, and searches all the source cases at once. We present an evaluation of this method for retrieval and mapping of...
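For readers unfamiliar with the underlying search procedure, here is a minimal chronological-backtracking CSP solver in Python; it omits the paper's discrimination tree and heuristics, and the map-colouring instance is a hypothetical stand-in for the analogy-mapping constraints.

```python
def backtrack(assignment, variables, domains, constraints):
    """Chronological backtracking: assign variables one at a time and undo
    (backtrack over) any assignment that violates a constraint."""
    if len(assignment) == len(variables):
        return dict(assignment)
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if all(check(assignment) for check in constraints):
            result = backtrack(assignment, variables, domains, constraints)
            if result is not None:
                return result
        del assignment[var]                    # backtrack
    return None

# Usage: a tiny map-colouring instance (illustrative, not the paper's constraints)
variables = ["A", "B", "C"]
domains = {v: ["red", "green"] for v in variables}
neighbours = [("A", "B"), ("B", "C")]
constraints = [
    (lambda a, u=u, w=w: u not in a or w not in a or a[u] != a[w])
    for u, w in neighbours
]
print(backtrack({}, variables, domains, constraints))
```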
Backtracking search is frequently applied to solve a constraint-based search problem, but it often suffers from exponential growth of computing time. We present an alternative to backtracking search: local search based on conflict minimization. We have applied this general search framework to study a benchmark constraint-based search problem, the n-queens problem. An efficient local search algorithm...
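The conflict-minimization idea is easy to make concrete. The sketch below is a generic min-conflicts local search for n-queens, not necessarily the algorithm in the paper; the step budget and tie-breaking rule are assumptions.

```python
import random

def min_conflicts_nqueens(n, max_steps=10000, seed=0):
    """Local search by conflict minimization: repeatedly pick a conflicted
    queen and move it to the column in its row with the fewest conflicts."""
    rng = random.Random(seed)
    cols = [rng.randrange(n) for _ in range(n)]        # cols[r] = column of the queen in row r

    def conflicts(r, c):
        return sum(1 for r2 in range(n) if r2 != r and
                   (cols[r2] == c or abs(cols[r2] - c) == abs(r2 - r)))

    for _ in range(max_steps):
        conflicted = [r for r in range(n) if conflicts(r, cols[r]) > 0]
        if not conflicted:
            return cols                                # solution found
        r = rng.choice(conflicted)
        scores = [conflicts(r, c) for c in range(n)]
        best = min(scores)
        cols[r] = rng.choice([c for c, s in enumerate(scores) if s == best])
    return None                                        # no solution within the step budget

print(min_conflicts_nqueens(8))
```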
A hybrid HS and PRP type conjugate gradient method for smooth optimization is presented, which reduces to the classical PRP or HS method if an exact line search is used and converges globally and R-linearly for nonconvex functions with an inexact backtracking line search under standard assumptions. An inexact version of the proposed method, which admits a possibly approximate gradient and/or approxi...
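A hedged sketch of the general idea follows: a nonlinear conjugate gradient loop with an inexact Armijo backtracking line search and an illustrative truncated hybrid of the PRP and HS coefficients; the exact hybrid rule and safeguards used in the paper may differ.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient sketch: Armijo backtracking line search
    plus a truncated hybrid of the PRP and HS conjugacy coefficients."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                                 # safeguard: restart with steepest descent
            d = -g
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:   # Armijo sufficient decrease
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_prp = (g_new @ y) / (g @ g)
        beta_hs = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-16 else 0.0
        beta = max(0.0, min(beta_prp, beta_hs))        # illustrative hybrid, truncated at zero
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on the Rosenbrock function
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
    200 * (x[1] - x[0] ** 2),
])
print(hybrid_cg(rosen, rosen_grad, [-1.2, 1.0]))
```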
In this paper, we propose two new Stochastic Strictly Contractive Peaceman-Rachford Splitting Methods (SCPRSM), called Stochastic SCPRSM (SS-PRSM) and Stochastic Conjugate Gradient SCPRSM (SCG-PRSM), for large-scale optimization problems. The two types of stochastic PRSM algorithms incorporate the stochastic variance reduced gradient (SVRG) and the conjugate gradient method, respectively. Stochasti...
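The SVRG estimator that such methods build on is standard; below is a minimal SVRG loop on unregularized least squares, not the SCPRSM schemes themselves, and the problem, step size, and epoch count are assumptions for illustration.

```python
import numpy as np

def svrg_least_squares(A, b, eta=0.02, epochs=20, seed=0):
    """Minimal SVRG loop on least squares: each epoch stores the full gradient
    at a snapshot and uses it to variance-correct per-sample gradients."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]     # gradient of 0.5 * (a_i^T w - b_i)^2
    for _ in range(epochs):
        w_snap = w.copy()
        full_grad = A.T @ (A @ w_snap - b) / n         # full gradient at the snapshot
        for _ in range(n):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_snap, i) + full_grad   # variance-reduced estimator
            w -= eta * v
    return w

# Usage: noiseless least squares with a planted solution
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
b = A @ w_true
print(np.linalg.norm(svrg_least_squares(A, b) - w_true))
```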
We propose a unifying algorithm for non-smooth non-convex optimization. The algorithm approximates the objective function by a convex model function and finds an approximate (Bregman) proximal point of the convex model. This approximate minimizer of the model function yields a descent direction, along which the next iterate is found. Complemented with an Armijo-like line search strategy, we obt...
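A rough sketch of this template follows, with a plain linearization-plus-quadratic model standing in for the convex (Bregman) model function; the tolerance, step parameters, and test function are assumptions for illustration.

```python
import numpy as np

def model_descent(f, grad, x0, t=1.0, sigma=1e-4, beta=0.5, tol=1e-6, max_iter=200):
    """Model-function descent sketch: build a simple convex model around x,
    take its proximal minimizer, form a descent direction from it, and pick
    the step size with an Armijo-like backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Minimizer of the model m(y) = f(x) + g^T (y - x) + ||y - x||^2 / (2 t)
        y = x - t * g
        d = y - x                                      # descent direction from the model minimizer
        alpha, fx = 1.0, f(x)
        while f(x + alpha * d) > fx + sigma * alpha * (g @ d):   # Armijo-like condition
            alpha *= beta
            if alpha < 1e-12:
                break
        x = x + alpha * d
    return x

# Usage on a smooth nonconvex test function
f = lambda x: np.sin(x[0]) + 0.5 * (x[1] - 1.0) ** 2
grad = lambda x: np.array([np.cos(x[0]), x[1] - 1.0])
print(model_descent(f, grad, np.array([2.0, -1.0])))
```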