Search results for: newton step
Number of results: 283964
A new method for solving large nonlinear optimization problems is outlined. It attempts to combine the best properties of the discrete truncated Newton method and the limited-memory BFGS method to produce an algorithm that is both economical and capable of handling ill-conditioned problems. The key idea is to use the curvature information generated during the computation of the discrete Newton st...
We propose a proximal Newton method for solving nondifferentiable convex optimization problems. This method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is found approximately, and the regularization matrix is preconditioned to overcome the inexactness of this approximation. We show that such a preconditioning is possible within some ac...
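The abstract above pairs an outer proximal point loop with inner Newton solves of the regularized subproblem. A minimal sketch of that structure, assuming a toy quadratic objective (the function, the regularization constant `c`, and the iteration counts are illustrative choices, not details from the paper):

```python
import numpy as np

def proximal_point_newton(grad, hess, x0, c=1.0, outer=20, inner=5):
    """Proximal point iterations: each subproblem
    min_x f(x) + (1/(2c)) * ||x - x_k||^2 is solved by a few Newton steps.
    The added (1/c) * I term plays the role of the regularization matrix,
    keeping the inner linear systems well conditioned."""
    x = x0.copy()
    for _ in range(outer):
        xk = x.copy()
        for _ in range(inner):
            g = grad(x) + (x - xk) / c          # gradient of the subproblem
            H = hess(x) + np.eye(len(x)) / c    # regularized Hessian
            x = x - np.linalg.solve(H, g)
        # x is now an (approximate) proximal point of xk
    return x

# toy convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.diag([0.5, 1.0, 4.0])
b = np.array([1.0, 1.0, 1.0])
x = proximal_point_newton(lambda x: A @ x - b, lambda x: A, np.zeros(3))
```

For a quadratic, each inner Newton step lands exactly on the proximal point, so the sketch reduces to the classical proximal point recursion x_{k+1} = (A + I/c)^{-1}(b + x_k/c).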
The simulation of large macroeconometric models containing forward-looking variables can become impractical when using exact Newton methods. The difficulties generally arise from the use of direct methods for the solution of the linear system in the Newton step. In such cases, nonstationary iterative methods, also called Krylov methods, provide an interesting alternative. In this paper we apply s...
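As a sketch of the Krylov alternative this abstract mentions: instead of factorizing the Jacobian for the Newton step, an iterative solver such as GMRES needs only Jacobian-vector products. The two-equation toy system below is an illustrative stand-in, not a macroeconometric model:

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def F(x):
    # small nonlinear system F(x) = 0 standing in for a model's equations
    return np.array([x[0]**2 + x[1] - 2.0,
                     x[0] + x[1]**2 - 2.0])

def J_matvec(x, v):
    # analytic Jacobian-vector product J(x) v for the toy system
    return np.array([2.0 * x[0] * v[0] + v[1],
                     v[0] + 2.0 * x[1] * v[1]])

x = np.array([2.0, 0.5])
for _ in range(10):
    # matrix-free Newton step: solve J(x) dx = -F(x) with GMRES
    Jop = LinearOperator((2, 2), matvec=lambda v, x=x: J_matvec(x, v))
    dx, info = gmres(Jop, -F(x))
    x = x + dx
```

The `LinearOperator` is what makes the approach attractive at scale: the Jacobian is never formed or factorized, only applied to vectors.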
We present the design and implementation of a new inexact Newton-type algorithm for solving large-scale bundle adjustment problems with tens of thousands of images. We explore the use of Conjugate Gradients for calculating the Newton step and its performance as a function of some simple and computationally efficient preconditioners. We show that the common Schur complement trick is not limited ...
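The Conjugate Gradients idea above can be sketched as follows. The Jacobi (diagonal) preconditioner is one "simple and computationally efficient" choice; the random SPD test matrix is an illustrative stand-in for the paper's bundle adjustment Hessian:

```python
import numpy as np

def pcg(H, g, M_inv_diag, tol=1e-10, maxiter=100):
    """Preconditioned conjugate gradients for the Newton step H p = -g,
    with H symmetric positive definite and a diagonal (Jacobi)
    preconditioner whose inverse is given elementwise in M_inv_diag."""
    p = np.zeros_like(g)
    r = -g - H @ p                 # residual of H p = -g
    z = M_inv_diag * r             # preconditioned residual
    d = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Hd = H @ d
        alpha = rz / (d @ Hd)
        p += alpha * d
        r -= alpha * Hd
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        d = z + (rz_new / rz) * d
        rz = rz_new
    return p

# SPD test matrix standing in for a (reduced) Hessian
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
H = A @ A.T + 20.0 * np.eye(20)
g = rng.standard_normal(20)
step = pcg(H, g, 1.0 / np.diag(H))
```

Like GMRES, CG only needs matrix-vector products with H, which is what makes it usable at the scale of tens of thousands of images.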
Optimization using the L∞ norm is an increasingly important area in multiview geometry. Previous work has shown that globally optimal solutions can be computed reliably using the formulation of generalized fractional programming, in which algorithms solve a sequence of convex problems independently to approximate the optimal L∞-norm error. We found that the sequence of convex problems is highly rel...
Tissue stiffness is one of the qualitative properties used to distinguish abnormal tissues from normal ones, and stiffness changes are generally described in terms of the Lamé coefficient. In this paper, an all-at-once Lagrange-Newton-Krylov-Schwarz algorithm is developed to solve the inverse problem of recovering the Lamé coefficient in biological tissues. Specifically, we propose and study ...
We propose Sketched Online Newton (SON), an online second-order learning algorithm that enjoys substantially improved regret guarantees for ill-conditioned data. SON is an enhanced version of the Online Newton Step which, via sketching techniques, enjoys a linear running time. We further improve the computational complexity to linear in the number of nonzero entries by creating sparse forms of ...
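A minimal sketch of the Online Newton Step that SON builds on, under two simplifying assumptions: the projection back onto the feasible set (part of the full algorithm) is omitted, and a deterministic toy data stream stands in for real data. The step size `gamma` and regularizer `eps` are illustrative:

```python
import numpy as np

def online_newton_step(grad_stream, rounds, dim, gamma=1.0, eps=1.0):
    """Online Newton Step sketch: maintain A_t = eps*I + sum_s g_s g_s^T
    and update x <- x - (1/gamma) * A_t^{-1} g_t.  The projection onto
    the feasible set is omitted in this sketch."""
    x = np.zeros(dim)
    A = eps * np.eye(dim)
    for t in range(rounds):
        g = grad_stream(x, t)
        A += np.outer(g, g)                    # rank-one curvature update
        x -= np.linalg.solve(A, g) / gamma
    return x

# illustrative online least-squares stream: feature vectors cycle through
# the standard basis, losses 0.5 * (a @ x - a @ w_star)^2, fixed target
w_star = np.array([1.0, -2.0, 0.5])

def grad_stream(x, t):
    a = np.zeros(3)
    a[t % 3] = 1.0
    return (a @ x - a @ w_star) * a

x = online_newton_step(grad_stream, rounds=900, dim=3)
```

The rank-one update of A is also the hook for sketching: SON's linear running time comes from maintaining a low-rank approximation of A instead of the full matrix.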
Randomized subspace Newton convex methods for the sensor selection problem are proposed. The algorithm is straightforwardly applied to the convex formulation, and a customized method, in which part of the update variables are selected to be the current best candidates, is also considered. In the converged solution, almost the same results are obtained as by the original randomized-subspace-Newton methods. As expected, they require more computational ...
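The randomized subspace idea can be sketched as: each iteration draws a random subset of coordinates and takes a Newton step restricted to that subspace. The strongly convex quadratic below is an illustrative stand-in for the sensor selection objective, and subset size `k` is an assumed parameter:

```python
import numpy as np

def randomized_subspace_newton(grad, hess, x0, k, iters=2000, seed=0):
    """Each iteration draws a random subset S of k coordinates and takes
    a Newton step restricted to that subspace: solve H[S,S] p = -g[S],
    then update only x[S]."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    n = len(x0)
    for _ in range(iters):
        S = rng.choice(n, size=k, replace=False)
        g = grad(x)
        H = hess(x)
        p = np.linalg.solve(H[np.ix_(S, S)], -g[S])
        x[S] += p
    return x

# toy strongly convex quadratic: f(x) = 0.5 x^T H0 x - b^T x
n = 5
rng = np.random.default_rng(42)
M = rng.standard_normal((n, n))
H0 = M @ M.T + 10.0 * np.eye(n)
b = rng.standard_normal(n)
x = randomized_subspace_newton(lambda x: H0 @ x - b, lambda x: H0,
                               np.zeros(n), k=2)
```

Each step only factorizes a k-by-k block rather than the full Hessian, which is the source of the per-iteration savings that the extra iterations trade against.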
Self-concordant functions are a special class of convex functions in Euclidean space introduced by Nesterov. They are used in interior point methods, based on Newton iterations, where they play an important role in solving efficiently certain constrained optimization problems. The concept of self-concordant functions has been defined on Riemannian manifolds by Jiang et al. and a damped Newton m...
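A damped Newton iteration of the kind this abstract refers to can be sketched on a classical Euclidean self-concordant function; the log-barrier toy below is an illustrative assumption, and the paper's Riemannian setting is not reproduced:

```python
import numpy as np

def damped_newton(grad, hess, x0, iters=50):
    """Damped Newton for a self-concordant f: with Newton decrement
    lam = sqrt(g^T H^{-1} g), step x <- x - (1/(1 + lam)) H^{-1} g.
    The damping keeps the iterate inside the domain and gives global
    convergence for self-concordant functions."""
    x = x0.copy()
    for _ in range(iters):
        g, H = grad(x), hess(x)
        v = np.linalg.solve(H, g)          # Newton direction H^{-1} g
        lam = np.sqrt(g @ v)               # Newton decrement
        x = x - v / (1.0 + lam)
    return x

# self-concordant toy: f(x) = c^T x - sum(log x_i) on x > 0,
# minimized at x_i = 1 / c_i
c = np.array([1.0, 2.0, 4.0])
grad = lambda x: c - 1.0 / x
hess = lambda x: np.diag(1.0 / x**2)
x = damped_newton(grad, hess, np.array([5.0, 5.0, 5.0]))
```

Once the Newton decrement drops below a constant, the damping factor approaches 1 and the iteration enters the usual quadratically convergent phase.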