Search results for: sufficient descent directions

Number of results: 286567

Journal: Comp. Opt. and Appl., 1997
Mikhail V. Solodov

We consider the recently proposed parallel variable distribution (PVD) algorithm of Ferris and Mangasarian [4] for solving optimization problems in which the variables are distributed among p processors. Each processor has the primary responsibility for updating its block of variables while allowing the remaining “secondary” variables to change in a restricted fashion along some easily computab...

Journal: Journal of AI and Data Mining, 2015
F. Alibakhshi M. Teshnehlab M. Alibakhshi M. Mansouri

The stability of the learning rate in neural network identifiers and controllers is one of the challenging issues attracting great interest from neural network researchers. This paper proposes an adaptive gradient descent algorithm with stable learning laws for a modified dynamic neural network (MDNN) and studies the stability of this algorithm. It also presents a stable learning algorithm for parameters of ...
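The paper's Lyapunov-based learning laws are not reproduced here; as a generic illustration of keeping a gradient-descent learning rate stable, the sketch below halves the rate until an Armijo sufficient-decrease test passes. The names `f` and `grad` are illustrative, not from the paper:

```python
import numpy as np

def backtracking_gd_step(x, f, grad, lr=1.0, beta=0.5, c=1e-4):
    # Halve the learning rate until the Armijo sufficient-decrease
    # condition holds, so each update is guaranteed not to diverge.
    g = grad(x)
    while f(x - lr * g) > f(x) - c * lr * float(g @ g):
        lr *= beta
    return x - lr * g, lr

# Example on f(x) = ||x||^2 (gradient 2x); lr = 1 overshoots,
# backtracking halves it to the stable value 0.5.
f = lambda x: float(x @ x)
grad = lambda x: 2 * x
x = np.array([1.0, -2.0])
for _ in range(50):
    x, _ = backtracking_gd_step(x, f, grad)
```

The backtracking loop terminates because the Armijo condition always holds for a sufficiently small step on a differentiable objective.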

2008
Rishi Graham Jorge Cortés

This paper considers robotic sensor networks performing spatial estimation tasks. We model a physical process of interest as a spatiotemporal random field with unknown mean and covariance known up to a scaling parameter. We design a distributed coordination algorithm for a heterogeneous network composed of mobile agents that take point measurements of the field and static nodes that fuse the i...

2012
Jason D. Lee Yuekai Sun Michael A. Saunders

We study inexact proximal Newton-type methods for solving convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. Proximal Newton-type methods require the solution of subproblems to obtain the search ...
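As a minimal illustration of a proximal mapping that can be evaluated efficiently, take h(x) = λ‖x‖₁, whose prox is soft-thresholding. The sketch below uses it in a first-order proximal gradient step, a simpler relative of the proximal Newton-type methods the abstract describes; all function names are illustrative:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal mapping of t*||.||_1: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient_step(x, grad_g, step, lam):
    # One step of a proximal gradient method for g(x) + lam*||x||_1:
    # gradient step on the smooth part g, then the prox of the nonsmooth part.
    return soft_threshold(x - step * grad_g(x), step * lam)

# Example: minimize 0.5*||x - b||^2 + lam*||x||_1,
# whose closed-form solution is soft_threshold(b, lam).
b = np.array([3.0, -0.5, 1.2])
lam = 1.0
x = np.zeros_like(b)
for _ in range(100):
    x = proximal_gradient_step(x, lambda v: v - b, 1.0, lam)
```

Proximal Newton-type methods replace the gradient step with a scaled (second-order) model, which makes the subproblem harder but reduces the iteration count.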

2003
Adil M. Bagirov Alexander M. Rubinov Jiapu Zhang

This paper presents a new method for solving global optimization problems. We use a local technique based on the notion of discrete gradients for finding a cone of descent directions, and then we use a global cutting angle algorithm to find a global minimum within the intersection of the cone and the feasible region. We present results of numerical experiments with well-known test problems and...

Journal: Journal of Industrial and Management Optimization, 2023

In this paper, we propose a self-adaptive derivative-free projection method for solving large-scale nonlinear monotone equations with convex constraints. The search direction satisfies the sufficient descent property, independently of any line search. Under Lipschitz continuity and monotonicity assumptions, the proposed method is shown to be globally convergent. Moreover, numerical results are reported to show the eff...
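The sufficient descent property mentioned above states, for the mapping F, that F(x)ᵀd ≤ −c‖F(x)‖² for some constant c > 0. A minimal check, using the simplest direction d = −F(x), which satisfies the property with c = 1 (values are illustrative, not from the paper):

```python
import numpy as np

def sufficient_descent(F_x, d, c=1e-4):
    # Sufficient descent property: F(x)^T d <= -c * ||F(x)||^2,
    # which holds independently of any line search.
    return float(F_x @ d) <= -c * float(F_x @ F_x)

F_x = np.array([2.0, -1.0, 0.5])
ok = sufficient_descent(F_x, -F_x, c=1.0)   # d = -F(x) satisfies it with c = 1
bad = sufficient_descent(F_x, F_x, c=1.0)   # an ascent-like direction fails it
```

In derivative-free projection methods the direction is typically −F(x) plus a correction term chosen so this inequality is preserved.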

2009
Marko M. Mäkelä Yury Nikulin József Mezei

We consider a general multiobjective optimization problem with five basic optimality principles: efficiency, weak and proper Pareto optimality, strong efficiency and lexicographic optimality. We generalize the concept of tradeoff directions, defining them via optimal surfaces of appropriate cones. In convex optimization, the contingent cone can be used for all optimality principles except lex...

Journal: Math. Meth. of OR, 2008
Adil M. Bagirov Asef Nazari Ganjehlou

In this paper a new algorithm for minimizing locally Lipschitz functions is developed. Descent directions in this algorithm are computed by solving a system of linear inequalities. The convergence of the algorithm is proved for quasidifferentiable semismooth functions. We present the results of numerical experiments with both regular and nonregular objective functions. We also compare the propo...

Journal: The Journal of Chemical Physics, 2005
Myung Won Lee Massimo Mella Andrew M Rappe

Atomic forces are calculated for first-row monohydrides and carbon monoxide within electronic quantum Monte Carlo (QMC). Accurate and efficient forces are achieved by using an improved method for moving variational parameters in variational QMC. Newton's method with singular value decomposition (SVD) is combined with steepest-descent (SD) updates along directions rejected by the SVD, after init...
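A sketch of the kind of update the abstract describes: a Newton step computed via SVD with near-zero singular values truncated, plus a steepest-descent component along the directions the SVD rejected. This is an illustration under assumed inputs (a symmetric Hessian-like matrix `H` and gradient `g`), not the paper's QMC implementation:

```python
import numpy as np

def svd_newton_sd_step(H, g, tol=1e-8, sd_step=0.1):
    # Newton step -H^{-1} g restricted to the well-conditioned subspace
    # (singular values above tol * s_max), with a steepest-descent update
    # along the rejected (near-singular) directions.
    U, s, Vt = np.linalg.svd(H)
    keep = s > tol * s[0]
    step = -Vt[keep].T @ ((U[:, keep].T @ g) / s[keep])
    rejected = Vt[~keep]
    if rejected.size:
        step += -sd_step * rejected.T @ (rejected @ g)
    return step

# Example: H is nearly singular in its second coordinate, so that
# direction gets a small SD move instead of a huge Newton move.
H = np.diag([4.0, 1e-12])
g = np.array([4.0, 2.0])
step = svd_newton_sd_step(H, g)
```

Truncating the SVD avoids amplifying statistical noise through tiny singular values, while the SD component still makes progress in the rejected subspace.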

2007
Adil M. Bagirov Asef Nazari Ganjehlou

The notion of a secant for locally Lipschitz continuous functions is introduced and a new algorithm to locally minimize nonsmooth, nonconvex functions based on secants is developed. We demonstrate that the secants can be used to design an algorithm to find descent directions of locally Lipschitz continuous functions. This algorithm is applied to design a minimization method, called a secant met...
