Search results for: sufficient descent directions
Number of results: 286567
Simple examples for the failure of Newton's method with line search for strictly convex minimization
In this paper, two simple examples of a twice continuously differentiable, strictly convex function f are presented for which Newton's method with line search converges to a point where the gradient of f is not zero. The first example uses a line search based on the Wolfe conditions. For the second example, a strictly convex function f is defined, as well as a sequence of descent directions for...
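The Wolfe-condition line search referred to above can be sketched as follows. This is a minimal bisection-based version for illustration only, not the paper's counterexample construction; the function `wolfe_line_search`, the quadratic test function, and all parameter values are illustrative assumptions.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Find a step size t satisfying the (weak) Wolfe conditions
    by bisection bracketing; d must be a descent direction."""
    t, lo, hi = 1.0, 0.0, np.inf
    fx, gx = f(x), grad(x)
    slope = gx @ d  # directional derivative; negative for a descent direction
    for _ in range(max_iter):
        if f(x + t * d) > fx + c1 * t * slope:      # Armijo fails: step too long
            hi = t
        elif grad(x + t * d) @ d < c2 * slope:      # curvature fails: step too short
            lo = t
        else:
            return t                                # both Wolfe conditions hold
        t = (lo + hi) / 2 if np.isfinite(hi) else 2 * t
    return t

# Example: one Newton step on the convex quadratic f(x) = x1^2 + 2*x2^2
f = lambda x: x[0]**2 + 2 * x[1]**2
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x = np.array([3.0, 1.0])
d = -np.array([3.0, 1.0])  # Newton direction -H^{-1}g with H = diag(2, 4)
t = wolfe_line_search(f, grad, x, d)  # the full Newton step t = 1 is accepted
```

On a convex quadratic the unit Newton step already satisfies both conditions, which is why the paper's examples, where the method nevertheless converges to a non-stationary point, are surprising.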
In this paper we present a Lagrange-multiplier formulation of discrete constrained optimization problems, the associated discrete-space first-order necessary and sufficient conditions for saddle points, and an efficient first-order search procedure that looks for saddle points in discrete space. Our new theory provides a strong mathematical foundation for solving general nonlinear discrete opti...
Abstract: In this thesis we consider solving the unconstrained minimization problem min f(x) by a modified conjugate gradient algorithm. For large-scale problems of this type, nonlinear conjugate gradient methods have attractive properties such as a simple structure, low memory requirements, efficiency, and good convergence. Moreover, the conjugate gradient descent algorithm (CG-DESCENT) enjoys special properties compared with other variants of these algorithms...
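A generic nonlinear conjugate gradient iteration of the kind discussed above can be sketched as follows. This uses the Hager-Zhang beta that underlies CG_DESCENT, but with a simple Armijo backtracking search rather than the algorithm's own Wolfe-based line search, and it is not the thesis's modified method; `nonlinear_cg` and the test problem are illustrative.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Nonlinear conjugate gradient with the Hager-Zhang update
    (the beta used in CG_DESCENT) and Armijo backtracking."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:          # safeguard: restart with steepest descent
            d = -g
        t = 1.0                 # Armijo backtracking (CG_DESCENT uses a Wolfe search)
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        y = g_new - g
        dy = d @ y              # positive for convex f with a descent step
        beta = ((y - 2 * d * (y @ y) / dy) @ g_new) / dy  # Hager-Zhang beta
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Ill-conditioned convex quadratic; minimizer is the origin
x_star = nonlinear_cg(lambda x: x[0]**2 + 10 * x[1]**2,
                      lambda x: np.array([2 * x[0], 20 * x[1]]),
                      np.array([4.0, -2.0]))
```

The low memory footprint mentioned in the abstract is visible here: only the current iterate, gradient, and direction vectors are stored, with no Hessian or matrix factorization.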
Many computer vision problems (e.g., camera calibration, image alignment, structure from motion) are solved with nonlinear optimization methods. It is generally accepted that second-order descent methods are the most robust, fast, and reliable approaches for nonlinear optimization of a general smooth function. However, in the context of computer vision, second-order descent methods have two mai...
Abstract: The paper suggests a new technique for the solution of the inverse problem for Maxwell's equations in a dissipating medium. Being non-self-adjoint, the problem generally requires computing an adjoint operator at each step of the gradient-descent minimization procedure. The suggested approach introduces a conjugation operator and develops a representation of the system of Maxwell's equations ba...
Recently, several methods were proposed for sparse optimization which make careful use of second-order information [11, 34, 20, 4] to improve local convergence rates. These methods construct a composite quadratic approximation using Hessian information, optimize this approximation using a first-order method such as coordinate descent, and employ a line search to ensure sufficient descent. Here w...
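The pipeline described in that abstract, a Hessian-based composite quadratic model, an inner coordinate-descent solve, and a sufficient-descent line search, can be sketched for an L1-regularized problem. This is a generic proximal-Newton sketch under those assumptions, not any of the cited methods; `prox_newton`, `soft_threshold`, and the tiny lasso instance are all illustrative.

```python
import numpy as np

def soft_threshold(z, r):
    """Scalar soft-thresholding: the prox operator of r*|.|."""
    return np.sign(z) * max(abs(z) - r, 0.0)

def prox_newton(x0, grad_f, hess_f, F, lam, sigma=1e-4, outer=30, cd_iters=50):
    """Sketch for min f(x) + lam*||x||_1: minimize the composite quadratic
    model g@d + 0.5*d@H@d + lam*||x+d||_1 by cyclic coordinate descent,
    then backtrack until a sufficient-descent condition holds."""
    x = x0.astype(float)
    for _ in range(outer):
        g, H = grad_f(x), hess_f(x)
        d = np.zeros_like(x)
        for _ in range(cd_iters):               # inner coordinate descent on the model
            for i in range(len(x)):
                a = H[i, i]
                b = g[i] + H[i] @ d - a * d[i]  # model gradient minus the i-th term
                d[i] = soft_threshold(x[i] - b / a, lam / a) - x[i]
        # Predicted decrease used in the sufficient-descent test
        delta = g @ d + lam * (np.linalg.norm(x + d, 1) - np.linalg.norm(x, 1))
        if delta > -1e-12:                      # model step is (near) zero: done
            break
        t = 1.0
        while F(x + t * d) > F(x) + sigma * t * delta:
            t *= 0.5                            # backtrack until sufficient descent
        x = x + t * d
    return x

# Tiny lasso instance: f(x) = 0.5*||A x - y||^2 (A, y, lam are made-up data)
A = np.array([[1.0, 0.2], [0.2, 1.0], [0.1, 0.3]])
y = np.array([1.0, 0.5, 0.2])
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - y)
hess_f = lambda x: A.T @ A
F = lambda x: 0.5 * np.sum((A @ x - y)**2) + lam * np.linalg.norm(x, 1)
x_hat = prox_newton(np.zeros(2), grad_f, hess_f, F, lam)
```

Because f here is quadratic, the model is exact and the unit step always passes the sufficient-descent test; the line search matters for genuinely nonlinear f.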
Abstract This note discusses certain aspects of computational solution of optimal control problems for fluid systems. We focus on approaches in which the steepest descent direction of the cost functional is determined using the adjoint equations. In the first part we review the classical formulation by presenting it in the context of Nonlinear Programming. In the second part we show some new re...