Updating Quasi-Newton Matrices With Limited Storage, by Jorge Nocedal

Authors

  • Jorge Nocedal

Abstract

We study how to use the BFGS quasi-Newton matrices to precondition minimization methods for problems where storage is critical. We give an update formula which generates matrices using information from the last m iterations, where m is any number supplied by the user. The quasi-Newton matrix is updated at every iteration by dropping the oldest information and replacing it by the newest information. It is shown that the matrices generated have some desirable properties. The resulting algorithms are tested numerically and compared with several well-known methods.
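The m-pair scheme described in the abstract is what later became known as L-BFGS. A minimal sketch of the standard two-loop recursion that applies the implicit inverse-Hessian approximation to a gradient is shown below; it is illustrative, not the paper's exact algorithm, and the function and variable names are our own:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the inverse-Hessian approximation
    implicitly defined by the last m curvature pairs (s_i, y_i)
    to grad, and return the resulting search direction.
    s_list/y_list are ordered oldest to newest."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * s.dot(q)
        q -= alpha * y
        alphas.append(alpha)
    # Scale by gamma = s^T y / y^T y (a common initial-Hessian choice).
    s, y = s_list[-1], y_list[-1]
    r = (s.dot(y) / y.dot(y)) * q
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * y.dot(r)
        r += (alpha - beta) * s
    return -r  # descent direction for a line search
```

Dropping the oldest pair and appending the newest one each iteration (the update policy the abstract describes) amounts to treating `s_list` and `y_list` as fixed-length queues of at most m entries.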


Related articles

Representations of quasi-Newton matrices and their use in limited memory methods

We derive compact representations of BFGS and symmetric rank-one matrices for optimization. These representations allow us to efficiently implement limited-memory methods for large constrained optimization problems. In particular, we discuss how to compute projections of limited-memory matrices onto subspaces. We also present a compact representation of the matrices generated by Broyden's update for s...

Full text

Algorithm PREQN Fortran Subroutines for Preconditioning the Conjugate Gradient Method

PREQN is a package of Fortran subroutines for automatically generating preconditioners for the conjugate gradient method. It is designed for solving a sequence of linear systems A_i x = b_i, i = 1, ..., t, where the coefficient matrices A_i are symmetric and positive definite and vary slowly. Problems of this type arise, for example, in nonlinear optimization. The preconditioners are based on limited-memory quasi-Newt...
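For context, a generic preconditioned conjugate gradient iteration of the kind PREQN plugs into can be sketched as follows; this is an illustrative numpy version, not PREQN's Fortran interface, and `apply_M_inv` stands in for whatever preconditioner (e.g. a limited-memory quasi-Newton matrix) is supplied:

```python
import numpy as np

def pcg(A, b, apply_M_inv, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive
    definite matrix A. apply_M_inv(r) applies the inverse of the
    preconditioner M to a residual vector r."""
    x = np.zeros_like(b)
    r = b - A @ x                 # initial residual
    z = apply_M_inv(r)            # preconditioned residual
    p = z.copy()
    rz = r.dot(z)
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / p.dot(Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = apply_M_inv(r)
        rz_new = r.dot(z)
        p = z + (rz_new / rz) * p  # update search direction
        rz = rz_new
    return x
```

With `apply_M_inv = lambda r: r` this reduces to plain CG; a better preconditioner clusters the eigenvalues of M^{-1}A and so cuts the iteration count on each system in the sequence.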

Full text

A Progressive Batching L-BFGS Method for Machine Learning

The standard L-BFGS method relies on gradient approximations that are not dominated by noise, so that search directions are descent directions, the line search is reliable, and quasi-Newton updating yields useful quadratic models of the objective function. All of this appears to call for a full batch approach, but since small batch sizes give rise to faster algorithms with better generalization...

Full text

Automatic Preconditioning by Limited Memory Quasi-Newton Updating

This paper proposes a preconditioner for the conjugate gradient method (CG) that is designed for solving systems of equations Ax = bi with different right-hand-side vectors or for solving a sequence of slowly varying systems Akx = bk. The preconditioner has the form of a limited memory quasi-Newton matrix and is generated using information from the CG iteration. The automatic preconditioner doe...

Full text

On the limited memory BFGS method for large scale optimization

We study the numerical performance of a limited-memory quasi-Newton method for large-scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir, which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir and is better a...

Full text


Journal title:

Volume   Issue

Pages  -

Publication date: 2010