Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization

Author

  • Massimo Roma
Abstract

This paper deals with the preconditioning of truncated Newton methods for the solution of large scale nonlinear unconstrained optimization problems. We focus on preconditioners that can be naturally embedded in the framework of truncated Newton methods, i.e. that can be built without storing the Hessian matrix of the function to be minimized, using only information on the Hessian obtained through products of the Hessian matrix with a vector. In particular, we propose a diagonal preconditioner which enjoys this feature and which enables us to examine the effect of diagonal scaling on truncated Newton methods. This new preconditioner carries out a scaling strategy and is based on the concept of equilibration of the data in linear systems of equations. Extensive numerical testing shows that the proposed diagonal preconditioning strategy is very effective: on most problems considered, the resulting diagonally preconditioned truncated Newton method performs better than both the unpreconditioned method and the one using an automatic preconditioner based on limited memory quasi-Newton updating (PREQN) recently proposed in [22].
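The key constraint described in the abstract is that the preconditioner must be built from Hessian-vector products alone, without ever forming or storing the Hessian. As an illustration of that idea (not the paper's actual equilibration formula), the following sketch uses a Hutchinson-style stochastic estimator, which recovers the Hessian diagonal from matrix-vector products only; the function names and the sampling scheme are assumptions for this example.

```python
import numpy as np

def estimate_hessian_diagonal(hvp, n, num_samples=30, rng=None):
    """Estimate diag(H) using only Hessian-vector products.

    hvp: callable v -> H @ v; the matrix H itself is never formed or stored.
    Uses Rademacher probe vectors: E[v * (H @ v)] = diag(H).
    """
    rng = np.random.default_rng(rng)
    diag = np.zeros(n)
    for _ in range(num_samples):
        v = rng.choice([-1.0, 1.0], size=n)   # Rademacher probe vector
        diag += v * hvp(v)                    # elementwise product v ⊙ Hv
    return diag / num_samples

# Demo on a quadratic whose (diagonal) Hessian is badly scaled.
n = 5
d_true = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])
H = np.diag(d_true)
hvp = lambda v: H @ v

d_est = estimate_hessian_diagonal(hvp, n, num_samples=50, rng=0)
# For an exactly diagonal H, v ⊙ Hv = diag(H) for every Rademacher v,
# so the estimate matches the true diagonal exactly.
print(np.allclose(d_est, d_true))  # True

# A scaling preconditioner in the spirit of equilibration would then apply
# M^{-1} with M = max(|diag estimate|, eps) inside the inner CG iteration.
```

For non-diagonal Hessians the estimate is only approximate, and more samples reduce its variance; the point of the sketch is just that diagonal-scaling information is obtainable at the cost of a few extra Hessian-vector products per outer iteration.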


Related articles

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study is concerned with a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique leads us to decrease the number of subproblems solved, while utilizing the structure of limited memory quasi-Newt...


Preconditioning Newton-Krylov methods in nonconvex large scale optimization

We consider an iterative preconditioning technique for large scale optimization, where the objective function is possibly non-convex. First, we refer to the solution of a generic indefinite linear system by means of a Krylov subspace method, and describe the iterative construction of the preconditioner, which does not involve matrix products or matrix storage. The set of directions generated b...


Comparison of advanced large-scale minimization algorithms for the solution of inverse ill-posed problems

We compare the performance of several robust large-scale minimization algorithms for the unconstrained minimization of an ill-posed inverse problem. The parabolized Navier–Stokes equation model was used for adjoint parameter estimation. The methods compared consist of three versions of the nonlinear conjugate-gradient (CG) method, quasi-Newton Broyden–Fletcher–Goldfarb–Shanno (BFGS), the limited-mem...


Performance Profiles of Line-search Algorithms for Unconstrained Optimization

The most important line-search algorithms we consider in this paper for solving large-scale unconstrained optimization problems are the quasi-Newton, truncated Newton, and conjugate gradient methods. These methods have proved to be efficient, robust, and relatively inexpensive in terms of computation. In this paper we compare the Dolan-Moré [11] performance profiles of line-search algorithms implemented...


A nonmonotone truncated Newton-Krylov method exploiting negative curvature directions, for large scale unconstrained optimization

We propose a new truncated Newton method for large scale unconstrained optimization, where a Conjugate Gradient (CG)-based technique is adopted to solve Newton's equation. At the current iteration, the Krylov method computes a pair of search directions: the first approximates the Newton step of the quadratic convex model, while the second is a suitable negative curvature direction. A test based...




Journal:
  • Optimization Methods and Software

Volume: 20  Issue:

Pages: -

Publication date: 2005