Search results for: squares and newton
Number of results: 16835918
In this paper, we propose a hybrid Gauss-Newton structured BFGS method with a new update formula and a new switch criterion for the iterative matrix to solve nonlinear least squares problems. We approximate the second term in the Hessian by a positive definite BFGS matrix. Under suitable conditions, global convergence of the proposed method with a backtracking line search is established. Moreov...
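A minimal sketch of the structured Gauss-Newton/BFGS idea in this abstract, written in Python with NumPy: the second-order part of the Hessian is replaced by a BFGS-style matrix and the step is globalized by a backtracking line search. The paper's specific update formula and switch criterion are not reproduced; the secant choice, safeguards, and the small fitting example are illustrative assumptions.

```python
import numpy as np

def structured_gn_bfgs(F, J, x0, iters=50, tol=1e-8):
    """Sketch of a structured Gauss-Newton/BFGS iteration for min 0.5*||F(x)||^2:
    the Hessian J^T J + S is modeled by J^T J + B, with B a BFGS-style
    estimate of the second-order term S (illustrative only)."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.zeros((n, n))                       # running approximation of S
    for _ in range(iters):
        r, Jx = F(x), J(x)
        g = Jx.T @ r                           # gradient of 0.5*||F||^2
        if np.linalg.norm(g) < tol:
            break
        H = Jx.T @ Jx + B + 1e-10 * np.eye(n)  # structured Hessian model
        p = np.linalg.solve(H, -g)
        t, f0 = 1.0, 0.5 * (r @ r)             # backtracking (Armijo) line search
        while t > 1e-12 and 0.5 * np.sum(F(x + t * p) ** 2) > f0 + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        s = x_new - x
        r_new, J_new = F(x_new), J(x_new)
        y = J_new.T @ r_new - Jx.T @ r_new     # structured secant: approximates S @ s
        if s @ y > 1e-12:                      # keep B positive semidefinite
            Bs = B @ s
            B += np.outer(y, y) / (s @ y) - np.outer(Bs, Bs) / max(s @ Bs, 1e-16)
        x = x_new
    return x

# Example: fit y = exp(a*t) to data, residuals F(a) = exp(a*t) - y; expect a ~ 0.7.
t_data = np.linspace(0, 1, 20)
y_data = np.exp(0.7 * t_data)
F = lambda x: np.exp(x[0] * t_data) - y_data
Jac = lambda x: (t_data * np.exp(x[0] * t_data))[:, None]
print(structured_gn_bfgs(F, Jac, np.array([0.0])))
```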
In this paper, a new framework for the construction of accurate and efficient numerical methods for differential algebraic equation (DAE) initial value problems is presented. The methods are based on applying spectral deferred correction techniques as preconditioners to a Picard integral collocation formulation for the solution. The resulting preconditioned nonlinear system is solved using Newt...
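As a rough illustration of the Picard integral collocation formulation mentioned above, the sketch below treats a plain ODE rather than a DAE and uses simple Picard sweeps in place of the spectral-deferred-correction preconditioning and Newton-Krylov solve; node choice and the test problem are assumptions for illustration only.

```python
import numpy as np

def integration_matrix(nodes):
    """S[m, j] = integral of the j-th Lagrange basis polynomial from nodes[0] to nodes[m]."""
    n = len(nodes)
    S = np.zeros((n, n))
    for j in range(n):
        coeffs = np.poly1d(np.polyfit(nodes, np.eye(n)[j], n - 1))
        antider = coeffs.integ()
        for m in range(n):
            S[m, j] = antider(nodes[m]) - antider(nodes[0])
    return S

def picard_collocation_step(f, t0, y0, dt, n_nodes=5, sweeps=8):
    """One step of the integral collocation form y_m = y0 + sum_j S[m, j] * f(t_j, y_j),
    iterated with plain Picard sweeps (SDC would precondition these)."""
    tau = 0.5 * (1 - np.cos(np.pi * np.arange(n_nodes) / (n_nodes - 1)))  # Lobatto-type nodes in [0, 1]
    t = t0 + dt * tau
    S = dt * integration_matrix(tau)
    y = np.full(n_nodes, y0, dtype=float)
    for _ in range(sweeps):
        y = y0 + S @ np.array([f(tj, yj) for tj, yj in zip(t, y)])
    return t[-1], y[-1]

# Example: y' = -y, y(0) = 1, one step of size 0.5; should be close to exp(-0.5).
t1, y1 = picard_collocation_step(lambda t, y: -y, 0.0, 1.0, 0.5)
print(y1, np.exp(-0.5))
```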
This paper describes a method of dogleg trust-region steps, or restricted Levenberg-Marquardt steps, based on a projection process onto the Krylov subspaces for neural networks nonlinear least squares problems. In particular, the linear conjugate gradient (CG) method works as the inner iterative algorithm for solving the linearized Gauss-Newton normal equation, whereas the outer nonlinear algor...
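A small sketch of the inner linear CG iteration on the damped Gauss-Newton normal equations, applied matrix-free so that only products with J and J^T are needed; the dogleg trust-region control of the outer iteration is omitted, and the damping value is an illustrative assumption.

```python
import numpy as np

def cg_normal_equations(J, r, damping=1e-3, iters=100, tol=1e-10):
    """Linear conjugate gradient on (J^T J + damping*I) p = -J^T r."""
    def A(v):                               # matrix-vector product with J^T J + damping*I
        return J.T @ (J @ v) + damping * v
    b = -J.T @ r
    p = np.zeros_like(b)
    res = b - A(p)
    d = res.copy()
    rs_old = res @ res
    for _ in range(iters):
        Ad = A(d)
        alpha = rs_old / (d @ Ad)
        p += alpha * d
        res -= alpha * Ad
        rs_new = res @ res
        if np.sqrt(rs_new) < tol:
            break
        d = res + (rs_new / rs_old) * d
        rs_old = rs_new
    return p

# Example: one Gauss-Newton step for residuals r(x) = A x - y, evaluated at x = 0.
rng = np.random.default_rng(0)
A_mat, y = rng.normal(size=(20, 5)), rng.normal(size=20)
print(cg_normal_equations(A_mat, A_mat @ np.zeros(5) - y))
```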
The numerical property of an adaptive filter algorithm is the most important problem in practical applications. Most fast adaptive filter algorithms have the numerical instability problem and the fast Newton transversal filter (FNTF) algorithms are no exception. In this paper, we propose a numerically stable fast Newton type adaptive filter algorithm. Two problems are dealt with in the paper. F...
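For orientation, a plain O(N^2) Newton-type (RLS) transversal adaptive filter is sketched below: the weight update uses an estimate of the inverse input correlation matrix, which is the quantity fast Newton transversal filters approximate at lower cost. The paper's stabilization scheme is not reproduced, and the system-identification example is an assumption for illustration.

```python
import numpy as np

def rls_newton_filter(x, d, order=4, lam=0.99, delta=1e2):
    """Newton-type (RLS) transversal filter: tracks the inverse correlation matrix P."""
    N = order
    w = np.zeros(N)
    P = np.eye(N) * delta                 # inverse correlation matrix estimate
    errors = []
    for n in range(N, len(x)):
        u = x[n - N + 1:n + 1][::-1]      # regressor, most recent sample first
        k = P @ u / (lam + u @ P @ u)     # gain vector
        e = d[n] - w @ u                  # a-priori error
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam
        errors.append(e)
    return w, np.array(errors)

# Example: identify a 4-tap FIR system from noisy observations.
rng = np.random.default_rng(1)
x_in = rng.normal(size=2000)
h_true = np.array([0.5, -0.3, 0.2, 0.1])
d_out = np.convolve(x_in, h_true)[:len(x_in)] + 0.01 * rng.normal(size=len(x_in))
w_est, _ = rls_newton_filter(x_in, d_out, order=4)
print(w_est)                              # close to h_true
```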
In the paper, a special approximated Newton method for minimizing a sum of squares f(x) = (1/2)‖F(x)‖² = (1/2) Σ_{i=1}^{m} [F_i(x)]² is introduced. In this Restricted Newton method, the Hessian H = G + S of f, where G = (F′)ᵀF′ and S = F ∘ F′′, is approximated by A_RN = G + B, where B = Z₂Z₂ᵀ S Z₂Z₂ᵀ is the restriction of the second-order term S to the subspace im Z₂ spanned by the eigenvectors of the G...
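A toy construction of the restricted Hessian approximation described above. Which eigenvectors of G form Z₂ is cut off in the abstract, so the choice of the k smallest-eigenvalue directions below, and the random data, are assumptions for illustration only.

```python
import numpy as np

def restricted_newton_hessian(J, S, k):
    """A_RN = G + B with G = J^T J and B = Z2 Z2^T S Z2 Z2^T,
    where Z2 collects k eigenvectors of G (here: smallest eigenvalues, assumed)."""
    G = J.T @ J
    vals, vecs = np.linalg.eigh(G)        # eigenvalues in ascending order
    Z2 = vecs[:, :k]                      # subspace on which the second-order term is kept
    P = Z2 @ Z2.T                         # orthogonal projector onto im(Z2)
    return G + P @ S @ P

# Example with random data, keeping k = 2 directions for the second-order term.
rng = np.random.default_rng(3)
J = rng.normal(size=(10, 4))
S = rng.normal(size=(4, 4))
S = 0.5 * (S + S.T)                       # symmetric second-order term
print(restricted_newton_hessian(J, S, k=2))
```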
The computation of robust regression estimates often relies on minimization of a convex functional on a convex set. In this paper we discuss a general technique for a large class of convex functionals to compute the minimizers iteratively which is closely related to majorization-minimization algorithms. Our approach is based on a quadratic approximation of the functional to be minimized and inc...
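A minimal iteratively reweighted least squares sketch of the quadratic-approximation idea, using the Huber loss as the convex functional; the general majorization-minimization machinery of the paper is not reproduced, and the outlier example is illustrative.

```python
import numpy as np

def irls_huber(X, y, delta=1.0, iters=50):
    """At each step the Huber loss is majorized by a weighted quadratic,
    whose minimizer is a weighted least squares solution."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        w = np.minimum(1.0, delta / np.maximum(np.abs(r), 1e-12))  # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(sw[:, None] * X, sw * y, rcond=None)[0]
    return beta

# Example: a line fit that resists one gross outlier.
rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)
X = np.column_stack([np.ones_like(t), t])
y = 1.0 + 2.0 * t + 0.05 * rng.normal(size=t.size)
y[10] += 10.0                              # gross outlier
print(irls_huber(X, y, delta=0.1))         # close to [1, 2]
```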
Minimizing the Sum of Euclidean Distances (Yuying Li). The Weiszfeld algorithm for continuous location problems can be considered as an iteratively reweighted least squares method. It exhibits linear convergence. In this paper, a Newton type algorithm with similar simplicity is proposed to solve a continuous multifacility location problem with Euclidean distance measure. Similar to the W...
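For reference, the single-facility Weiszfeld iteration mentioned above is just a short reweighted-averaging loop; the paper's Newton-type multifacility algorithm is not shown, and the sample points are illustrative.

```python
import numpy as np

def weiszfeld(points, iters=200, tol=1e-9):
    """Weiszfeld iteration for min_x sum_i ||x - a_i||: each step is a
    reweighted average of the points with weights 1/||x - a_i|| (an IRLS step)."""
    x = points.mean(axis=0)                       # start at the centroid
    for _ in range(iters):
        d = np.linalg.norm(points - x, axis=1)
        if np.any(d < 1e-12):                     # iterate landed on a data point
            break
        w = 1.0 / d
        x_new = (w[:, None] * points).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: geometric median of four planar points.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
print(weiszfeld(pts))
```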
This paper presents a numerically stable fast Newton-type adaptive filter algorithm. Two problems are dealt with in the paper. First, we derive the proposed algorithm from an order-recursive least squares algorithm. The result of the proposed algorithm is equivalent to that of the fast Newton transversal filter (FNTF) algorithm. However, the derivation process is different. Instead of extending...
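Since the derivation above starts from an order-recursive least squares algorithm, a classical order recursion of that kind, the Levinson-Durbin solution of the Yule-Walker equations, is sketched below for orientation; it is not the FNTF algorithm itself, and the autocorrelation values are illustrative.

```python
import numpy as np

def levinson_durbin(r, order):
    """Order-recursive solution of the Yule-Walker equations:
    r[0..order] are autocorrelation values; returns the prediction
    coefficients and the final prediction error power."""
    a = np.zeros(order)
    err = r[0]
    for m in range(order):
        # reflection coefficient for going from order m to order m+1
        k = -(r[m + 1] + a[:m] @ r[m:0:-1]) / err
        a_new = a.copy()
        a_new[m] = k
        a_new[:m] = a[:m] + k * a[:m][::-1]
        a = a_new
        err *= (1 - k * k)
    return a, err

# Example: order-2 prediction coefficients from autocorrelation values.
r = np.array([1.0, 0.5, 0.1, -0.05])
a, err = levinson_durbin(r, order=2)
print(a, err)
```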
To prove that a polynomial is nonnegative on R one can try to show that it is a sum of squares of polynomials (SOS). The latter problem is now known to be reducible to a semidefinite programming (SDP) computation much faster than classical algebraic methods (see, e.g., [Par03]), thus enabling new speed-ups in algebraic optimization. However, exactly how often nonnegative polynomials are in fact...
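A hand-worked instance of the Gram-matrix formulation behind the SOS-to-SDP reduction: p(x) = x^4 + 2x^2 + 1 is a sum of squares iff p(x) = z(x)^T Q z(x) for some positive semidefinite Q with z = [1, x, x^2]. Here a valid Q is written down directly (in general, finding one is the SDP), and its Cholesky factor yields the squares.

```python
import numpy as np

Q = np.array([[1.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])       # z^T Q z = 1 + 2 x^2 + x^4 = p(x)
L = np.linalg.cholesky(Q)             # Q = L L^T, so p = sum_i ((L^T z)_i)^2

x = 1.7                               # check the identity at an arbitrary point
z = np.array([1.0, x, x * x])
p_direct = x**4 + 2 * x**2 + 1
p_sos = np.sum((L.T @ z) ** 2)
print(p_direct, p_sos)                # the two values agree
```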