Search results for: globally convergence

Number of results: 160982

Journal: Foundations of Computational Mathematics 2011
Martin Hutzenthaler Arnulf Jentzen

Stochastic differential equations are often simulated with the Monte Carlo Euler method. Convergence of this method is well understood when the coefficients of the stochastic differential equation are globally Lipschitz continuous. The important case of superlinearly growing coefficients, however, long remained an open question. The main difficulty is that numerically weak conv...
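As an illustration of the method named in this abstract, here is a minimal Monte Carlo Euler (Euler–Maruyama) sketch in the well-understood globally Lipschitz case; the drift, diffusion, and parameters are illustrative choices, not taken from the paper:

```python
import math
import random

def euler_maruyama(mu, sigma, x0, T, n_steps, rng):
    # One Euler path for dX = mu(X) dt + sigma(X) dW.
    dt = T / n_steps
    x = x0
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        x += mu(x) * dt + sigma(x) * dw
    return x

def monte_carlo_euler(mu, sigma, x0, T, n_steps, n_paths, seed=0):
    # Monte Carlo estimate of E[X_T]: average many independent Euler paths.
    rng = random.Random(seed)
    return sum(euler_maruyama(mu, sigma, x0, T, n_steps, rng)
               for _ in range(n_paths)) / n_paths

# Geometric Brownian motion: linear, hence globally Lipschitz, coefficients;
# the exact mean is x0 * exp(0.05 * T) ~ 1.051 here.
est = monte_carlo_euler(lambda x: 0.05 * x, lambda x: 0.2 * x,
                        x0=1.0, T=1.0, n_steps=100, n_paths=2000)
```

The superlinear case the abstract addresses (e.g. cubic drift) is exactly where this naive scheme can fail to converge.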

2007
Zlatko Drmač

This paper introduces a globally convergent block (column- and row-) cyclic Jacobi method for diagonalization of Hermitian matrices and for computation of the singular value decomposition of general matrices. It is shown that a block rotation (a generalization of Jacobi's 2×2 rotation) must be computed and implemented in a particular way to guarantee global convergence. This solves a long st...
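For context, the classical scalar (2×2-rotation) cyclic Jacobi method that the block variant above generalizes can be sketched as follows; this is the textbook method, not the paper's block algorithm:

```python
import math

def jacobi_eigenvalues(A, sweeps=10):
    # Cyclic Jacobi for a real symmetric matrix: sweep over all (p, q)
    # pairs in row-cyclic order, annihilating A[p][q] with a 2x2 rotation.
    n = len(A)
    A = [row[:] for row in A]
    for _ in range(sweeps):
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p][q]) < 1e-14:
                    continue
                # Angle that zeroes the (p, q) entry of G A G^T.
                theta = 0.5 * math.atan2(2.0 * A[p][q], A[q][q] - A[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):  # rows p, q  <-  G A
                    apk, aqk = A[p][k], A[q][k]
                    A[p][k] = c * apk - s * aqk
                    A[q][k] = s * apk + c * aqk
                for k in range(n):  # columns p, q  <-  A G^T
                    akp, akq = A[k][p], A[k][q]
                    A[k][p] = c * akp - s * akq
                    A[k][q] = s * akp + c * akq
    return sorted(A[i][i] for i in range(n))

vals = jacobi_eigenvalues([[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]])
```

Global convergence of this scalar cyclic ordering is classical; the paper's contribution is establishing it for block rotations.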

Journal: Numerical Lin. Alg. with Applic. 2011
Xueping Guo Iain S. Duff

Newton-HSS methods, which are variants of inexact Newton methods distinct from Newton-Krylov methods, have been shown to be competitive methods for solving large sparse systems of nonlinear equations with positive definite Jacobian matrices [Bai and Guo, 2010]. In that paper, only local convergence was proved. In this paper, we prove a Kantorovich-type semilocal convergence. Then we introduce N...
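A minimal inexact Newton sketch of the structure described above, with a stationary Jacobi splitting as a simplified stand-in for the HSS inner iteration (the test system and all parameters are illustrative assumptions):

```python
import math

def inexact_newton(F, J, x, eta=0.1, tol=1e-10, max_outer=50, max_inner=100):
    # Inexact Newton for F(x) = 0 in R^2: solve J(x) s = -F(x) only
    # approximately, by a stationary (Jacobi-splitting) inner iteration,
    # stopping once the inner residual drops below eta * ||F(x)||.
    for _ in range(max_outer):
        fx = F(x)
        norm_f = math.hypot(fx[0], fx[1])
        if norm_f < tol:
            return x
        a = J(x)
        b = [-fx[0], -fx[1]]
        s = [0.0, 0.0]
        for _ in range(max_inner):
            s = [(b[0] - a[0][1] * s[1]) / a[0][0],
                 (b[1] - a[1][0] * s[0]) / a[1][1]]
            r0 = a[0][0] * s[0] + a[0][1] * s[1] - b[0]
            r1 = a[1][0] * s[0] + a[1][1] * s[1] - b[1]
            if math.hypot(r0, r1) <= eta * norm_f:
                break
        x = [x[0] + s[0], x[1] + s[1]]
    return x

# Example system: x^2 + y^2 = 4, x*y = 1.
F = lambda v: (v[0] ** 2 + v[1] ** 2 - 4.0, v[0] * v[1] - 1.0)
J = lambda v: [[2 * v[0], 2 * v[1]], [v[1], v[0]]]
sol = inexact_newton(F, J, [2.0, 0.5])
```

The semilocal (Kantorovich-type) analysis in the paper concerns guarantees for such outer-inner schemes from a starting point, not just near a solution.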

2014
Dušan Jakovetić

We study distributed optimization where nodes cooperatively minimize the sum of their individual, locally known, convex costs f_i(x), where the variable x ∈ R^d is global. Distributed augmented Lagrangian (AL) methods have good empirical performance on several signal processing and learning applications, but there is limited understanding of their convergence rates and how these depend on the underlying network. Thi...
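To make the problem setting concrete, here is a minimal distributed-gradient sketch (not the paper's AL method) in which each node holds a private quadratic cost and the network mixes iterates through a doubly stochastic matrix; all data and weights are illustrative:

```python
def distributed_gradient(a, W, steps=500, alpha=0.05):
    # Node i privately holds f_i(x) = 0.5 * (x - a[i])**2; the minimizer
    # of sum_i f_i is mean(a). Each iteration combines a consensus
    # (mixing) step through the doubly stochastic matrix W with a local
    # gradient step -- a distributed gradient sketch, not the AL method.
    n = len(a)
    x = [0.0] * n
    for _ in range(steps):
        mixed = [sum(W[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [mixed[i] - alpha * (x[i] - a[i]) for i in range(n)]
    return x

# Three nodes on a complete graph with uniform self-weight.
W = [[0.5, 0.25, 0.25], [0.25, 0.5, 0.25], [0.25, 0.25, 0.5]]
x = distributed_gradient([1.0, 2.0, 3.0], W)
```

With a constant step size this scheme only reaches a neighborhood of the optimum (here within O(alpha) of mean(a) = 2); sharper rate guarantees for AL-type methods are what the paper studies.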

2005

Quasi-Newton algorithms for unconstrained nonlinear minimization generate a sequence of matrices that can be considered as approximations of the objective function second derivatives. This paper gives conditions under which these approximations can be proved to converge globally to the true Hessian matrix, in the case where the Symmetric Rank One update formula is used. The rate of convergence ...
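The Symmetric Rank One update discussed above can be sketched directly; on a quadratic objective (where the secant pairs satisfy y = A s), the approximations recover the true Hessian after finitely many independent steps, which is a finite-dimensional instance of the convergence the paper studies:

```python
def sr1_update(B, s, y, eps=1e-8):
    # Symmetric Rank One update: B += r r^T / (r^T s) with r = y - B s;
    # the update is skipped when the denominator is tiny (standard safeguard).
    n = len(s)
    Bs = [sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    r = [y[i] - Bs[i] for i in range(n)]
    denom = sum(r[i] * s[i] for i in range(n))
    if abs(denom) < eps:
        return B
    return [[B[i][j] + r[i] * r[j] / denom for j in range(n)] for i in range(n)]

# On a quadratic with Hessian A, two linearly independent steps already
# reproduce A exactly (example data chosen for illustration).
A = [[3.0, 1.0], [1.0, 2.0]]
B = [[1.0, 0.0], [0.0, 1.0]]
for s in ([1.0, 0.0], [0.0, 1.0]):
    y = [sum(A[i][j] * s[j] for j in range(2)) for i in range(2)]
    B = sr1_update(B, s, y)
```

Unlike BFGS, SR1 does not preserve positive definiteness, which is why the safeguard on the denominator matters in practice.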

Journal: Math. Oper. Res. 1994
Daniel Ralph

A natural damping of Newton's method for nonsmooth equations is presented. This damping, via a path search instead of the traditional line search, enlarges the domain of convergence of Newton's method and is therefore said to be globally convergent. Convergence behavior is like that of line-search damped Newton's method for smooth equations, including Q-quadratic convergence rates under appro...
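The smooth baseline this abstract compares against, a line-search damped Newton method, can be sketched as follows (the path-search variant for nonsmooth equations is more involved; this example and its test equation are illustrative only):

```python
import math

def damped_newton(F, dF, x, tol=1e-10, max_iter=100):
    # Line-search damped Newton for a scalar equation F(x) = 0: backtrack
    # along the Newton step until |F| satisfies a sufficient-decrease test.
    for _ in range(max_iter):
        fx = F(x)
        if abs(fx) < tol:
            return x
        step = -fx / dF(x)
        t = 1.0
        while abs(F(x + t * step)) > (1.0 - 0.5 * t) * abs(fx) and t > 1e-12:
            t *= 0.5
        x += t * step
    return x

# atan(x) = 0: undamped Newton diverges from x0 = 3; the damped
# variant converges, illustrating the enlarged domain of convergence.
root = damped_newton(math.atan, lambda x: 1.0 / (1.0 + x * x), 3.0)
```

Once iterates are close to the root the full step t = 1 is always accepted, so the local quadratic rate of undamped Newton is retained.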

2008
Kouichi Taji

Newton’s method for solving variational inequalities is known to be locally quadratically convergent. By incorporating a line search strategy for the regularized gap function, Taji et al. (Mathematical Programming, 1993) proposed a modification of Newton’s method which is globally convergent and whose rate of convergence is quadratic. But the quadratic convergence has been shown only und...
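The regularized gap function used as the merit function above has a closed form via projection; a minimal sketch, with a simple box-constrained 1-D example chosen for illustration:

```python
def regularized_gap(F, proj, x, alpha=1.0):
    # Fukushima's regularized gap function for the VI: find x in C with
    # F(x)^T (y - x) >= 0 for all y in C. With y(x) = proj_C(x - F(x)/alpha),
    #   f(x) = F(x)^T (x - y(x)) - (alpha/2) * ||x - y(x)||^2,
    # f(x) >= 0 everywhere and f(x) = 0 exactly at VI solutions.
    fx = F(x)
    y = proj([x[i] - fx[i] / alpha for i in range(len(x))])
    d = [x[i] - y[i] for i in range(len(x))]
    return (sum(fx[i] * d[i] for i in range(len(x)))
            - 0.5 * alpha * sum(di * di for di in d))

# 1-D example on C = [0, 1] with F(x) = x - 2: the VI solution is x = 1.
proj = lambda z: [min(1.0, max(0.0, zi)) for zi in z]
F = lambda v: [v[0] - 2.0]
gap_at_solution = regularized_gap(F, proj, [1.0])
gap_elsewhere = regularized_gap(F, proj, [0.5])
```

Because the gap vanishes exactly at solutions, decreasing it along Newton directions is what globalizes the method in the line-search scheme of Taji et al.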

Journal: CoRR 2012
Guodong Shi Mikael Johansson Karl Henrik Johansson

In this paper, we study finite-time convergence of gossip algorithms. We show that there exists a symmetric gossip algorithm that converges in finite time if and only if the number of network nodes is a power of two, while a globally finite-time convergent gossip algorithm always exists, regardless of the number of nodes, if asymmetric gossiping is allowed. For n = 2 nodes, we prove that a fastes...
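The symmetric gossip model above is pairwise averaging; for n a power of two, a butterfly-style schedule reaches the exact average in finitely many pairings, consistent with the stated characterization (the schedule below is an illustrative example, not the paper's construction):

```python
def pairwise_gossip(values, schedule):
    # Symmetric gossip: each scheduled pair (i, j) replaces both entries
    # by their average, so the global average is invariant at every step.
    v = list(values)
    for i, j in schedule:
        v[i] = v[j] = 0.5 * (v[i] + v[j])
    return v

# n = 4 (a power of two): two "rounds" of disjoint pairings suffice
# for every node to hold the exact average.
out = pairwise_gossip([1.0, 2.0, 3.0, 4.0], [(0, 1), (2, 3), (0, 2), (1, 3)])
```

For n not a power of two (say n = 3), no finite symmetric schedule yields the exact average, which is the "only if" direction of the result.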

[Chart: number of search results per year; click to filter results by publication year]