Limited-Memory Matrix Methods with Applications

Author

  • Tamara Gibson Kolda
Abstract

The focus of this dissertation is on matrix decompositions that use a limited amount of computer memory, thereby allowing problems with a very large number of variables to be solved. Specifically, we will focus on two application areas: optimization and information retrieval. We introduce a general algebraic form for the matrix update in limited-memory quasi-Newton methods. Many well-known methods, such as the limited-memory Broyden family methods, satisfy the general form. We are able to prove several results about methods that satisfy the general form. In particular, we show that the only limited-memory Broyden family method (using exact line searches) that is guaranteed to terminate within n iterations on an n-dimensional strictly convex quadratic is the limited-memory BFGS method. Furthermore, we are able to introduce several new variations on the limited-memory BFGS method that retain the quadratic termination property. We also present a new result showing that full-memory Broyden family methods (using exact line searches) that skip p updates to the quasi-Newton matrix will terminate in no more than n+p steps on an n-dimensional strictly convex quadratic. We propose several new variations on the limited-memory BFGS method and test these on standard test problems.

We also introduce and test a new method for a process known as Latent Semantic Indexing (LSI) for information retrieval. The new method replaces the singular value decomposition (SVD) at the heart of LSI with a semi-discrete matrix decomposition (SDD). We show several convergence results for the SDD and compare some strategies for computing it on general matrices. We also compare the SVD-based LSI to the SDD-based LSI and show that the SDD-based method has a faster query computation time and requires significantly less storage. We also propose and test several SDD-updating strategies for adding new documents to the collection.
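The semi-discrete decomposition mentioned in the abstract approximates a matrix A (in LSI, a term-document matrix) by a weighted sum of outer products, A ~ sum_k d_k * x_k y_k^T, where the entries of x_k and y_k are restricted to {-1, 0, +1}, so each factor needs only about two bits per entry of storage. The following is a minimal NumPy sketch of a greedy, residual-driven construction of this kind; the function names and the initial-guess heuristic are illustrative assumptions, not code from the dissertation.

```python
import numpy as np

def best_discrete_vector(s):
    """Exact solution of max_x (x^T s)^2 / (x^T x) over x in {-1, 0, +1}^m:
    keep the J largest-magnitude entries of s (with their signs), where J
    maximizes (sum of the top-J |s_i|)^2 / J."""
    order = np.argsort(-np.abs(s))
    csum = np.cumsum(np.abs(s)[order])
    J = np.argmax(csum ** 2 / np.arange(1, s.size + 1)) + 1
    x = np.zeros_like(s, dtype=float)
    x[order[:J]] = np.sign(s[order[:J]])
    return x

def sdd(A, kmax=10, inner_iters=20):
    """Greedy SDD sketch: A ~ sum_k d_k * outer(x_k, y_k) with x_k, y_k in
    {-1, 0, +1}, built one term at a time from the current residual."""
    m, n = A.shape
    R = A.astype(float).copy()                     # current residual
    ds, xs, ys = [], [], []
    for _ in range(kmax):
        y = np.zeros(n)                            # simple initial guess:
        y[np.argmax(np.abs(R).sum(axis=0))] = 1.0  # heaviest column of R
        for _ in range(inner_iters):               # alternating inner solves
            x = best_discrete_vector(R @ y)        # fix y, solve for x
            y = best_discrete_vector(R.T @ x)      # fix x, solve for y
        d = (x @ R @ y) / ((x @ x) * (y @ y))      # optimal scalar weight
        R -= d * np.outer(x, y)
        ds.append(d); xs.append(x); ys.append(y)
    return ds, xs, ys
```

Each outer step alternately fixes one discrete vector and solves exactly for the other, which is what makes the per-term subproblem cheap despite the discrete constraint.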


Similar articles


Fast Reconstruction of SAR Images with Phase Error Using Sparse Representation

In the past years, a number of algorithms have been introduced for synthetic aperture radar (SAR) imaging. However, they all suffer from the same problem: the data size to process is considerably large. In recent years, compressive sensing and sparse representation of the signal in SAR have gained significant research interest. This method offers the advantage of reducing the sampling rate, bu...

Full text

Recursive formulation of limited memory variable metric methods

In this report we propose a new recursive matrix formulation of limited-memory variable metric methods. This approach enables approximation of both the Hessian matrix and its inverse and can be used for an arbitrary update from the Broyden class (and some other updates). The new recursive formulation requires approximately 4mn multiplications and additions for the direction determination, so i...

Full text
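For comparison, the classical two-loop recursion of limited-memory BFGS also computes the quasi-Newton direction in O(mn) operations from the m most recent iterate and gradient differences. The sketch below is a minimal NumPy version of that standard recursion, not the new formulation proposed in the report above; the function name is an assumption.

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: apply the inverse L-BFGS approximation to the
    gradient using the m most recent pairs s_i = x_{i+1} - x_i and
    y_i = g_{i+1} - g_i, stored oldest to newest. Cost is O(mn)."""
    if not s_list:                      # no curvature pairs yet:
        return -grad                    # fall back to steepest descent
    q = grad.copy()
    rhos = [1.0 / (y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * (s @ q)
        q -= alpha * y
        alphas.append(alpha)
    # scale by gamma = s^T y / y^T y from the most recent pair
    gamma = (s_list[-1] @ y_list[-1]) / (y_list[-1] @ y_list[-1])
    r = gamma * q
    # second loop: oldest pair to newest (alphas consumed in reverse)
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * (y @ r)
        r += (alpha - beta) * s
    return -r                           # quasi-Newton search direction
```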

Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization

Limited-memory BFGS quasi-Newton methods approximate the Hessian matrix of second derivatives by the sum of a diagonal matrix and a fixed number of rank-one matrices. These methods are particularly effective for large problems in which the approximate Hessian cannot be stored explicitly. It can be shown that the conventional BFGS method accumulates approximate curvature in a sequence of expandi...

Full text
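To illustrate the storage economy the snippet above alludes to, a matrix of the form B = diag(d) + sum_i c_i * u_i u_i^T (a diagonal plus a fixed number of rank-one terms) can be applied to a vector without ever materializing the n-by-n matrix. A minimal sketch with assumed names:

```python
import numpy as np

def apply_B(d, U, c, v):
    """Compute B @ v for B = diag(d) + sum_i c_i * u_i u_i^T, where the
    u_i are the columns of U and c_i = +/-1, without forming B itself.
    Storage and per-product cost are both O(n*m) for m rank-one terms."""
    return d * v + U @ (c * (U.T @ v))

# usage: a 1000-dimensional B held as a diagonal plus 5 rank-one terms
rng = np.random.default_rng(0)
n, m = 1000, 5
d = np.ones(n)
U = rng.standard_normal((n, m))
c = np.array([1.0, -1.0, 1.0, -1.0, 1.0])
v = rng.standard_normal(n)
Bv = apply_B(d, U, c, v)               # B is never stored explicitly
```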



Publication date: 2006