NEW INVARIANT TO NONLINEAR SCALING QUASI-NEWTON ALGORITHMS
Authors
Abstract
Similar resources
Wide interval for efficient self-scaling quasi-Newton algorithms
This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but increase their c...
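For context, and not taken from any of these abstracts: the two-parameter self-scaling Broyden family referred to here is usually written, for step s_k = x_{k+1} - x_k and gradient difference y_k = g_{k+1} - g_k, with a scaling parameter \tau_k and a family parameter \phi_k, roughly as

B_{k+1} = \tau_k \left[ B_k - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
          + \phi_k \, (s_k^{\top} B_k s_k) \, w_k w_k^{\top} \right]
          + \frac{y_k y_k^{\top}}{y_k^{\top} s_k},
\qquad
w_k = \frac{y_k}{y_k^{\top} s_k} - \frac{B_k s_k}{s_k^{\top} B_k s_k}.

The "wide interval" in the title concerns the admissible range of (\tau_k, \phi_k) for which global and superlinear convergence can still be guaranteed; the exact interval is given in the paper and is not reproduced here.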
Self-Scaling Parallel Quasi-Newton Methods
In this paper, a new class of self-scaling quasi-Newton (SSQN) updates for solving unconstrained nonlinear optimization problems (UNOPs) is proposed. It is shown that many existing QN updates can be considered as special cases of the new family. Parallel SSQN algorithms based on this class of updates are studied. In comparison to standard serial QN methods, the proposed parallel SSQN (SSPQN) ...
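As an illustrative sketch only (the function and variable names below are my own, not the paper's), the serial building block that such parallel SSQN methods distribute is a single self-scaled quasi-Newton update; a minimal Python version of a self-scaled BFGS update of the inverse Hessian approximation, using the classical Oren-Luenberger scaling factor, could look like this:

import numpy as np

def self_scaled_bfgs_update(H, s, y):
    # One self-scaled BFGS update of the inverse Hessian approximation H.
    # s = x_{k+1} - x_k, y = grad f(x_{k+1}) - grad f(x_k).
    sy = float(s @ y)
    Hy = H @ y
    yHy = float(y @ Hy)
    if sy <= 0.0 or yHy <= 0.0:
        return H                      # curvature condition violated: skip the update
    tau = sy / yHy                    # Oren-Luenberger self-scaling factor (one classical choice)
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # Standard BFGS inverse update applied to the scaled matrix tau * H
    return V @ (tau * H) @ V.T + rho * np.outer(s, s)

A parallel SSPQN method would, roughly speaking, distribute the matrix-vector products and outer-product corrections above across processors; that part is specific to the paper and is not sketched here.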
New quasi-Newton method for solving systems of nonlinear equations
In this paper, we propose a new Broyden method for solving systems of nonlinear equations, which uses first derivatives but is more efficient than the Newton method (measured by computational time) for larger dense systems. The new method updates QR decompositions of nonsymmetric approximations of the Jacobian matrix, so it requires O(n^2) arithmetic operations per iteration in cont...
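For orientation only, here is a minimal Python sketch of the classical ("good") Broyden method that such approaches start from; the paper's actual contribution, updating a QR factorization of the Jacobian approximation instead of the matrix itself, is not reproduced here, and all names below are mine:

import numpy as np

def broyden_solve(F, x0, B0=None, tol=1e-8, max_iter=100):
    # Classical Broyden method for F(x) = 0 with a (generally nonsymmetric)
    # Jacobian approximation B, updated by a rank-one correction so that
    # the secant condition B_{k+1} s = y holds.
    x = np.asarray(x0, dtype=float)
    B = np.eye(len(x)) if B0 is None else np.asarray(B0, dtype=float)
    Fx = F(x)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)                  # quasi-Newton step: B s = -F(x)
        x_new = x + s
        F_new = F(x_new)
        y = F_new - Fx
        B += np.outer(y - B @ s, s) / float(s @ s)   # Broyden rank-one update
        x, Fx = x_new, F_new
    return x

# Example use on a hypothetical test problem:
# F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
# root = broyden_solve(F, np.array([1.0, 0.5]))

Solving B s = -F(x) from scratch costs O(n^3) per iteration; maintaining and updating a QR factorization of B after each rank-one correction, as the paper proposes, brings the per-iteration cost down to O(n^2).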
Approximate invariant subspaces and quasi-Newton optimization methods
New approximate secant equations are shown to result from the knowledge of (problem-dependent) invariant subspace information, which in turn suggests improvements in quasi-Newton methods for unconstrained minimization. A new limited-memory BFGS using approximate secant equations is then derived and its encouraging behaviour illustrated on a small collection of multilevel optimization examples. T...
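As background (standard quasi-Newton material, not specific to this paper): an ordinary quasi-Newton approximation B_{k+1} is required to satisfy the secant equation

B_{k+1} s_k = y_k, \qquad s_k = x_{k+1} - x_k, \quad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),

and the "approximate secant equations" of the abstract are, on this reading, additional conditions of the same form imposed only approximately along directions supplied by the invariant-subspace information.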
Analysis of a self-scaling quasi-Newton method
We study the self-scaling BFGS method of Oren and Luenberger (1974) for solving unconstrained optimization problems. For general convex functions, we prove that the method is globally convergent with inexact line searches. We also show that the directions generated by the self-scaling BFGS method approach Newton's direction asymptotically. This would ensure superlinear convergence if, in additi...
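The self-scaling in the Oren and Luenberger (1974) method multiplies the current inverse Hessian approximation H_k by a scalar before the BFGS correction; a commonly cited choice (used here only as an illustration) is

\tau_k = \frac{s_k^{\top} y_k}{y_k^{\top} H_k y_k},

so that \tau_k H_k better matches the curvature along the latest step before the usual BFGS update is applied.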
Journal
Journal title: International Journal of Applied Mathematics
Year: 2021
ISSN: 1311-1728, 1314-8060
DOI: 10.12732/ijam.v34i3.12