Analysis of Sparse Quasi-Newton Updates with Positive Definite Matrix Completion

Authors

Abstract


Related articles

Active Positive-Definite Matrix Completion

In the FindCandidates function (line 4), Select finds all the single edges that can be added to the current mask graph G_Ω while maintaining its chordal structure. To do that, we make use of the clique tree data structure as introduced by Ibarra [1]. Given a graph G = (V, E), the clique tree is a tree C = (V_C, E_C), in which each node is a maximal clique of G, i.e., V_C ⊂ 2^V. In our case the numbe...
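A brute-force way to reproduce this candidate search, without the clique-tree machinery of Ibarra [1], is to test every missing edge by adding it temporarily and checking chordality. The sketch below is only an illustration of that idea under those assumptions (the function name find_candidates and the use of networkx are not from the paper):

```python
# Illustrative sketch: enumerate single edges whose addition keeps a chordal
# mask graph chordal.  A brute-force stand-in for the clique-tree approach
# of Ibarra [1]; assumes networkx is available and G_omega is already chordal.
import itertools
import networkx as nx

def find_candidates(G_omega: nx.Graph):
    """Return all non-edges (u, v) whose addition preserves chordality."""
    candidates = []
    for u, v in itertools.combinations(G_omega.nodes, 2):
        if G_omega.has_edge(u, v):
            continue
        G_omega.add_edge(u, v)          # tentatively add the edge
        if nx.is_chordal(G_omega):      # keep it as a candidate if still chordal
            candidates.append((u, v))
        G_omega.remove_edge(u, v)       # restore the original mask graph
    return candidates

# Example: a 4-cycle with one chord is chordal; only the other diagonal qualifies.
G = nx.Graph([(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
print(find_candidates(G))               # [(1, 3)]
```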


Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms

We develop and analyze a broad family of stochastic/randomized algorithms for inverting a matrix. We also develop specialized variants maintaining symmetry or positive definiteness of the iterates. All methods in the family converge globally and linearly (i.e., the error decays exponentially), with explicit rates. In special cases, we obtain stochastic block variants of several quasi-Newton upda...
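One concrete member of such a family is the basic sketch-and-project iteration for the system A X = I: the current iterate is projected, in the Frobenius norm, onto the sketched constraint S^T A X = S^T for a freshly drawn random sketch S, which gives X_{k+1} = X_k + A^T S (S^T A A^T S)^{-1} S^T (I - A X_k). The NumPy sketch below illustrates only this simple Gaussian-sketch variant; the parameter names and the demo matrix are assumptions, not the paper's specialized symmetric or positive definite variants:

```python
# Illustrative sketch-and-project iteration for approximating A^{-1}.
# Each step projects X (in the Frobenius norm) onto {X : S^T A X = S^T}
# for a random Gaussian sketch S; the residual shrinks linearly in expectation.
import numpy as np

def randomized_inverse(A, sketch_size=5, num_iters=400, seed=0):
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    X = np.zeros((n, n))                     # initial guess for the inverse
    I = np.eye(n)
    for _ in range(num_iters):
        S = rng.standard_normal((n, sketch_size))   # random sketch
        W = A.T @ S                                  # A^T S
        R = S.T @ (I - A @ X)                        # sketched residual S^T (I - A X)
        X = X + W @ np.linalg.solve(W.T @ W, R)      # Frobenius-norm projection step
    return X

# Demo on a well-conditioned random matrix (assumed setup).
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 20)) + 20 * np.eye(20)
X = randomized_inverse(A)
print(np.linalg.norm(np.eye(20) - A @ X))    # residual should be small
```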


Quasi-Newton updates with weighted secant equations

We provide a formula for variational quasi-Newton updates with multiple weighted secant equations. The derivation of the formula leads to a Sylvester equation in the correction matrix. Examples are given.
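The abstract does not reproduce the update formula itself, but a Sylvester equation of the form A E + E B = C in the correction matrix E can be solved numerically with standard routines. The snippet below is only a generic illustration with SciPy and placeholder matrices, not the paper's derivation:

```python
# Generic illustration: solve a Sylvester equation A E + E B = C for a
# correction matrix E.  The matrices here are arbitrary placeholders.
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)   # placeholder coefficients
B = rng.standard_normal((n, n)) + n * np.eye(n)
C = rng.standard_normal((n, n))

E = solve_sylvester(A, B, C)                      # Bartels-Stewart solver
print(np.allclose(A @ E + E @ B, C))              # True: E satisfies the equation
```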


The Positive Definite Matrix Completion Problem: an Optimization Viewpoint

We look at the real positive (semi)definite matrix completion problem from the relative entropy minimization viewpoint. After the problem is transformed into the standard maxdet form, conditions are sought for the existence of positive (semi)definite completions. Using basic tools of convex analysis, results previously established using graph-theoretic or functional-analytic techniques are recovered...
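The maxdet formulation referred to here is easy to prototype with an off-the-shelf convex-optimization modeller: maximize log det X over symmetric X subject to the specified entries being fixed (positive definiteness is implicit in the domain of log det). The CVXPY sketch below is a hypothetical illustration of that formulation, not the paper's analysis; the partial matrix and the use of np.nan as a marker for unspecified entries are assumptions:

```python
# Hypothetical maxdet completion: fill the unspecified entries of a partially
# specified symmetric matrix so that the completion maximizes log det.
import cvxpy as cp
import numpy as np

# Partially specified 3x3 matrix; np.nan marks unspecified entries (assumption).
P = np.array([[2.0,    1.0, np.nan],
              [1.0,    2.0,    1.0],
              [np.nan, 1.0,    2.0]])

X = cp.Variable(P.shape, symmetric=True)
constraints = [X[i, j] == P[i, j]
               for i in range(P.shape[0])
               for j in range(P.shape[1])
               if not np.isnan(P[i, j])]

problem = cp.Problem(cp.Maximize(cp.log_det(X)), constraints)
problem.solve()
print(np.round(X.value, 3))   # maxdet completion; corner entries come out near 0.5
```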


Incorporating Function Values into Quasi-Newton Updates

The traditional quasi-Newton method for updating the approximate Hessian is based on the change in the gradient of the objective function. This paper describes a new update method that also incorporates the change in the value of the function. The method effectively uses a cubic approximation of the objective function to better approximate its directional second derivative. The cubic approximat...
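A concrete way to see how function values help is the cubic interpolant of f along the step s = x_new - x_old: matching f and its directional derivative at both endpoints gives the estimate 6(f_old - f_new) + 2 g_old^T s + 4 g_new^T s of the directional second derivative s^T H s at x_new, whereas the gradient-only secant estimate is (g_new - g_old)^T s. The snippet below merely checks this interpolation identity on a test function; it illustrates the idea of using function values, not the paper's specific update:

```python
# Numerical check of the cubic-interpolation estimate of the directional
# second derivative s^T H s at x_new, using function values and gradients.
import numpy as np

def f(x):                        # illustrative smooth test function (assumption)
    return np.sum(x**4) + np.dot(x, x)

def grad(x):
    return 4 * x**3 + 2 * x

x_old = np.array([1.0, -0.5])
x_new = np.array([0.7, -0.2])
s = x_new - x_old

f_old, f_new = f(x_old), f(x_new)
g_old, g_new = grad(x_old), grad(x_new)

secant_est = (g_new - g_old) @ s                                 # gradients only
cubic_est = 6 * (f_old - f_new) + 2 * g_old @ s + 4 * g_new @ s  # uses f-values too

hessian = np.diag(12 * x_new**2 + 2)                 # exact Hessian of f at x_new
print(secant_est, cubic_est, s @ hessian @ s)        # cubic_est is closer here
```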



Journal

Journal title: Journal of the Operations Research Society of China

Year: 2014

ISSN: 2194-668X,2194-6698

DOI: 10.1007/s40305-014-0039-x