A Modified Newton Method for Multilinear PageRank
Authors
Abstract
Similar articles
Multilinear PageRank
In this paper, we first extend the celebrated PageRank modification to a higher-order Markov chain. Although this system has attractive theoretical properties, it is computationally intractable for many interesting problems. We next study a computationally tractable approximation to the higher-order PageRank vector that involves a system of polynomial equations called multilinear PageRank. This...
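In the formulation this abstract refers to (Gleich et al.), the multilinear PageRank vector x solves the polynomial system x = αR(x ⊗ x) + (1 − α)v, where R is a column-stochastic flattening of the transition tensor. A minimal fixed-point sketch, assuming a hypothetical 2-state tensor; the matrix R and vector v below are illustrative, not from the paper:

```python
import numpy as np

def multilinear_pagerank(R, v, alpha=0.5, tol=1e-10, max_iter=1000):
    """Fixed-point iteration for x = alpha*R(x kron x) + (1-alpha)*v,
    where R is an n x n^2 column-stochastic flattening of a tensor."""
    x = v.copy()
    for _ in range(max_iter):
        x_new = alpha * R.dot(np.kron(x, x)) + (1 - alpha) * v
        if np.linalg.norm(x_new - x, 1) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical 2-state example: each column of R is a probability vector
R = np.array([[1.0, 0.0, 0.5, 0.0],
              [0.0, 1.0, 0.5, 1.0]])
v = np.array([0.5, 0.5])
x = multilinear_pagerank(R, v, alpha=0.4)
```

For α < 1/2 this simple iteration is a contraction on the probability simplex, which is why the sketch uses α = 0.4; for larger α more robust solvers (such as the Newton-type methods this paper studies) are needed.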
A Modified Newton Method for Minimization I
Some promising ideas for minimizing a nonlinear function, whose first and second derivatives are given, by a modified Newton method, were introduced by Fiacco and McCormick (Ref. 1). Unfortunately, in developing a method around these ideas, Fiacco and McCormick used a potentially unstable, or even impossible, matrix factorization. Using some recently developed techniques for factorizing an inde...
A Modified Newton Method for Solving Non-linear Algebraic Equations
The Newton algorithm based on the "continuation" method may be written as being governed by the equation ẋ_j(t) + B_ij^{-1} F_i(x_j) = 0, where F_i(x_j) = 0, i, j = 1, ..., n are the nonlinear algebraic equations (NAEs) to be solved, and B_ij = ∂F_i/∂x_j is the corresponding Jacobian matrix. Newton's algorithm is known to be quadratically convergent; however, it has some drawbacks, such as...
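The classical, quadratically convergent Newton iteration that this abstract takes as its starting point, x ← x − B⁻¹F(x), can be sketched as follows; the two-equation test system is a made-up illustration, not from the paper:

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration x <- x - J(x)^{-1} F(x) for F(x) = 0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        step = np.linalg.solve(J(x), F(x))  # solve J(x) step = F(x)
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

# Illustrative system: x^2 + y^2 = 4, x*y = 1
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0]*v[1] - 1.0])
J = lambda v: np.array([[2*v[0], 2*v[1]], [v[1], v[0]]])
root = newton_system(F, J, [2.0, 0.0])
```

The drawbacks the abstract alludes to are visible here: each step requires forming and solving with the Jacobian, and convergence depends on a good starting point.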
A Modified Regularized Newton Method for Unconstrained Nonconvex Optimization
In this paper, we present a modified regularized Newton method for the unconstrained nonconvex optimization by using trust region technique. We show that if the gradient and Hessian of the objective function are Lipschitz continuous, then the modified regularized Newton method (M-RNM) has a global convergence property. Numerical results show that the algorithm is very efficient.
A Modified Orthant-Wise Limited Memory Quasi-Newton Method
where U = V_{k−m} V_{k−m+1} ··· V_{k−1}. For L-BFGS, we need not explicitly store the approximate inverse Hessian matrix. Instead, we only require matrix-vector multiplications at each iteration, which can be implemented by a two-loop recursion with a time complexity of O(mn) (Nocedal & Wright, 1999). Thus, we only store 2m vectors of length n: s_{k−1}, s_{k−2}, ..., s_{k−m} and y_{k−1}, y_{k−2}, ..., y_{k−m} w...
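The two-loop recursion mentioned above can be sketched as below; the stored pair is illustrative, and the standard γ = sᵀy/yᵀy scaling of the initial inverse Hessian is assumed:

```python
import numpy as np

def two_loop(grad, s_list, y_list):
    """L-BFGS two-loop recursion: apply the approximate inverse Hessian
    to grad using only the stored pairs (s_i, y_i), in O(m*n) work."""
    q = grad.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * s.dot(q)
        alphas.append(a)
        q -= a * y
    # Scale by gamma = s.y / y.y (initial inverse-Hessian guess)
    s, y = s_list[-1], y_list[-1]
    r = (s.dot(y) / y.dot(y)) * q
    # Second loop: oldest pair to newest
    for s, y, rho, a in zip(s_list, y_list, rhos, reversed(alphas)):
        b = rho * y.dot(r)
        r += (a - b) * s
    return r

# Illustrative single stored pair; with one pair the result satisfies
# the secant condition H y = s exactly, so two_loop(y, ...) returns s.
s = np.array([1.0, 2.0])
y = np.array([3.0, 1.0])
r = two_loop(y, [s], [y])
```

Only the 2m stored vectors and a handful of dot products are used, which is exactly why no n × n matrix ever needs to be formed.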
Journal
Journal title: Taiwanese Journal of Mathematics
Year: 2018
ISSN: 1027-5487
DOI: 10.11650/tjm/180303