Search results for: quasi newton algorithm

Number of results: 844,645  

Journal: RAIRO - Operations Research, 2021

A derivative-free quasi-Newton-type algorithm is presented whose search direction is the product of a positive definite diagonal matrix and the residual vector. The algorithm is simple to implement and is able to solve large-scale systems of nonlinear equations with separable functions. The diagonal matrix is obtained simply, in a quasi-Newton manner, at each iteration. Under some suitable conditions, global R-linear convergence results are Nu...
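A minimal sketch of a scheme of this kind, assuming a componentwise secant update for the diagonal; the safeguards, bounds, and update rule here are illustrative choices, not the paper's exact algorithm:

```python
import numpy as np

def diag_qn_solve(F, x0, tol=1e-8, max_iter=500):
    """Derivative-free iteration: step = -D_k F(x_k), D_k diagonal PD (sketch)."""
    x = np.asarray(x0, float)
    d = np.ones_like(x)              # diagonal of D_0 = I
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        x_new = x - d * Fx           # search direction: -(diagonal matrix) * residual
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        # componentwise secant condition d_i ~ s_i / y_i, clipped so D_k stays PD
        mask = np.abs(y) > 1e-12
        d[mask] = np.clip(s[mask] / y[mask], 1e-6, 1e6)
        x, Fx = x_new, F_new
    return x
```

On a separable system the Jacobian is diagonal, so the componentwise secant quotient is a natural quasi-Newton approximation to its inverse.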

2000
George Biros

In this paper we follow up our discussion on algorithms suitable for the optimization of systems governed by partial differential equations. In the first part of this paper we proposed a Lagrange-Newton-Krylov-Schur method (LNKS) that uses Krylov iterations to solve the Karush-Kuhn-Tucker system of optimality conditions, but invokes a preconditioner inspired by reduced-space quasi-Newton algorit...
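For context, the Karush-Kuhn-Tucker optimality conditions form a symmetric indefinite block system; for an equality-constrained quadratic program min ½xᵀHx − cᵀx s.t. Ax = b they read [[H, Aᵀ], [A, 0]][x; λ] = [c; b]. A tiny dense illustration of that structure (a direct solve on made-up data, not the Krylov/Schur machinery of the paper):

```python
import numpy as np

# Equality-constrained QP: min 0.5*x^T H x - c^T x  s.t.  A x = b (toy data)
H = np.eye(2)                       # Hessian of the objective
c = np.array([1.0, 1.0])
A = np.array([[1.0, 1.0]])          # one linear constraint: x1 + x2 = 1
b = np.array([1.0])

# Assemble the KKT system of optimality conditions
n, m = H.shape[0], A.shape[0]
K = np.block([[H, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([c, b])

sol = np.linalg.solve(K, rhs)
x, lam = sol[:n], sol[n:]           # primal solution and Lagrange multiplier
```

Methods like LNKS avoid forming and factoring this matrix, replacing the direct solve with preconditioned Krylov iterations.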

2010
C. G. Broyden

Analyses of the convergence properties of general quasi-Newton methods are presented, particular attention being paid to how the approximate solutions and the iteration matrices approach their final values. It is further shown that when Broyden's algorithm is applied to linear systems, the error norms are majorised by a superlinearly convergent sequence of an unusual kind.
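The iteration whose convergence is analysed can be sketched as Broyden's classic rank-one ("good") update; the function `F`, initial matrix, and stopping rule below are illustrative choices:

```python
import numpy as np

def broyden_good(F, x0, tol=1e-10, max_iter=100):
    """Broyden's 'good' method: rank-one secant update of a Jacobian approximation."""
    x = np.asarray(x0, float)
    B = np.eye(x.size)               # initial Jacobian approximation B_0 = I
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)  # quasi-Newton step
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / (s @ s)   # secant update: B_{k+1} s = y
        x, Fx = x_new, F_new
    return x
```

On a nonsingular linear system the iterates terminate finitely (in exact arithmetic), consistent with the unusual superlinear majorisation described in the abstract.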

2008
Berkant Savas Lek-Heng Lim

In this report we present computational methods for the best multilinear rank approximation problem. We consider algorithms built on quasi-Newton methods operating on a product of Grassmann manifolds. Specifically, we test and compare methods based on BFGS and L-BFGS updates in local and global coordinates against the Newton-Grassmann and alternating least squares methods. The performance of the quas...
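The flat-space BFGS update these methods build on can be sketched as follows (plain Euclidean BFGS with a backtracking line search, not the Grassmann-manifold version studied in the report; `f`, `grad`, and all tolerances are illustrative):

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-8, max_iter=200):
    """Euclidean BFGS: maintain an inverse-Hessian approximation H_k (sketch)."""
    x = np.asarray(x0, float)
    H = np.eye(x.size)               # inverse-Hessian approximation H_0 = I
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                   # quasi-Newton search direction
        t = 1.0                      # backtracking on the Armijo condition
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:               # curvature condition keeps H_k PD
            rho = 1.0 / sy
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

L-BFGS replaces the dense matrix `H` with a short history of `(s, y)` pairs, which is what makes the approach viable on large problems.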

2016
Frank Curtis

An algorithm for stochastic (convex or nonconvex) optimization is presented. The algorithm is variable-metric in the sense that, in each iteration, the step is computed through the product of a symmetric positive definite scaling matrix and a stochastic (mini-batch) gradient of the objective function, where the sequence of scaling matrices is updated dynamically by the algorithm. A key feature ...
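A minimal sketch of a variable-metric stochastic step of this shape, using an AdaGrad-style diagonal SPD scaling as the dynamically updated matrix; this is an assumption for illustration, not the specific update proposed in the abstract:

```python
import numpy as np

def variable_metric_sgd(grad_batch, x0, steps=200, lr=0.5, eps=1e-8, seed=0):
    """Step: x <- x - lr * W_k g_k, with W_k a diagonal SPD scaling (sketch)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    accum = np.zeros_like(x)
    for _ in range(steps):
        g = grad_batch(x, rng)            # stochastic (mini-batch) gradient
        accum += g * g                    # running sum of squared gradients
        W = 1.0 / (np.sqrt(accum) + eps)  # diagonal of the SPD scaling matrix
        x = x - lr * W * g                # scaled stochastic step
    return x
```

Any symmetric positive definite choice of `W` keeps the step a descent direction in expectation; quasi-Newton variants instead build `W` from curvature (secant) information.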

2004
Alexander M. Bronstein Michael M. Bronstein Michael Zibulevsky

Presented here is a generalization of the modified relative Newton method, recently proposed in [1] for quasi-maximum likelihood blind source separation. The special structure of the Hessian matrix makes it possible to perform block-coordinate Newton descent, which significantly reduces the algorithm's computational complexity and boosts its performance. Simulations based on artificial and real data show that t...

2008
Rick Chartrand Valentina Staneva

We propose an algorithm for segmentation of grayscale images. Our algorithm computes a solution to the convex, unconstrained minimization problem proposed by T. Chan, S. Esedoḡlu, and M. Nikolova in [1], which is closely related to the Chan-Vese level set algorithm for the Mumford-Shah segmentation model. Up to now this problem has been solved with a gradient descent method. Our approach is a q...

[Chart: number of search results per year]