Search results for: newton iteration method
Number of results: 1,663,489
Recurrent Neural Networks (RNNs) are powerful models that achieve unparalleled performance on several pattern recognition problems. However, training of RNNs is a computationally difficult task owing to the well-known “vanishing/exploding” gradient problems. In recent years, several algorithms have been proposed for training RNNs. These algorithms either: exploit no (or limited) curvature infor...
A novel type of fractal (i.e., Zhang fractals) is generated by solving time-varying or static nonlinear equations in the complex domain with discrete-time complex-valued Zhang dynamics (DTCVZD). The DTCVZD model, which uses different types of activation functions, can generate various Zhang fractals. These fractals differ from the conventional Newton fractals discovered about 30 years ago (in 1983) ...
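As a point of reference only, the sketch below generates a classical Newton fractal for f(z) = z^3 - 1, i.e., the kind of fractal this abstract contrasts against; it is not the DTCVZD model proposed in the paper, and the grid size, iteration count, and example polynomial are illustrative assumptions.

```python
# Conventional Newton fractal: colour each starting point by the root of
# f(z) = z^3 - 1 that Newton's iteration converges to. Illustrative baseline
# only, not the Zhang/DTCVZD fractals described in the abstract.
import numpy as np

def newton_fractal(n=400, iters=30):
    # Grid of complex initial guesses in [-2, 2] x [-2i, 2i].
    x = np.linspace(-2.0, 2.0, n)
    z = x[None, :] + 1j * x[:, None]
    roots = np.exp(2j * np.pi * np.arange(3) / 3)   # cube roots of unity
    for _ in range(iters):
        # Newton update z <- z - f(z)/f'(z) for f(z) = z^3 - 1.
        z = z - (z**3 - 1) / (3 * z**2)
    # Index of the nearest root gives the basin of attraction per pixel.
    return np.argmin(np.abs(z[..., None] - roots[None, None, :]), axis=-1)

if __name__ == "__main__":
    print(newton_fractal(50)[:5, :5])   # small grid, printed instead of plotted
```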
A modified Newton method for unconstrained minimization is presented and analyzed. The modification is based upon the model trust region approach. This report contains a thorough analysis of the locally constrained quadratic minimizations that arise as subproblems in the modified Newton iteration. Several promising alternatives are presented for solving these subproblems in ways that overcome c...
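The locally constrained quadratic subproblems mentioned here have the generic form min_p g·p + ½ p·B p subject to ‖p‖ ≤ Δ. The sketch below solves that subproblem with a simple Levenberg-style shift and bisection; this is only one of several possible approaches and not necessarily among the alternatives analyzed in the report, and the shift bounds and iteration counts are illustrative assumptions.

```python
# Approximate trust-region subproblem solver: min_p g.p + 0.5 p.B.p, ||p|| <= delta.
import numpy as np

def trust_region_step(B, g, delta, lam_max=1e6, iters=60):
    n = len(g)
    eigmin = np.linalg.eigvalsh(B).min()
    if eigmin > 0:
        p = np.linalg.solve(B, -g)
        if np.linalg.norm(p) <= delta:
            return p                      # full Newton step already inside the region
    # Bisect on the shift lam so that ||-(B + lam*I)^{-1} g|| is about delta.
    lo, hi = max(0.0, -eigmin) + 1e-12, lam_max
    for _ in range(iters):
        lam = 0.5 * (lo + hi)
        p = np.linalg.solve(B + lam * np.eye(n), -g)
        if np.linalg.norm(p) > delta:
            lo = lam                      # step too long: increase the shift
        else:
            hi = lam                      # step fits: try a smaller shift
    # Note: the degenerate "hard case" is not handled by this simple sketch.
    return np.linalg.solve(B + hi * np.eye(n), -g)
```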
In this paper we present a new method for accelerating ray tracing of scenes containing NURBS (Non-Uniform Rational B-Spline) surfaces by exploiting the GPU’s fast z-buffer rasterization for regular triangle meshes. In combination with a lightweight, memory-efficient data organization, this allows for fast calculation of primary ray intersections using a Newton-iteration-based approach executed ...
In this paper, we present a new modification of Newton's method for solving nonlinear equations. Analysis of convergence shows that the new method is cubically convergent. Per iteration, the new method requires two evaluations of the function and one evaluation of its first derivative. Thus, the new method is preferable if the computational costs of the first derivative are equal to or more than tho...
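The snippet does not give the exact iteration formula. A representative third-order scheme with the stated per-iteration cost (two function evaluations, one derivative evaluation) is the two-step Potra–Pták-type correction sketched below; it illustrates the cost/order trade-off rather than the paper's specific method.

```python
# Two-step third-order Newton variant: two f-evaluations, one f'-evaluation
# per iteration. Illustrative sketch, not the method proposed in the paper.
def modified_newton(f, fprime, x0, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)                      # 1st function evaluation
        d = fprime(x)                  # the single derivative evaluation
        y = x - fx / d                 # ordinary Newton predictor
        fy = f(y)                      # 2nd function evaluation
        x_new = x - (fx + fy) / d      # corrected step (cubically convergent)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: root of x^3 - 2x - 5 near x0 = 2.
root = modified_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, 2.0)
print(root)   # ~2.0945514815
```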
We describe a generalized Levenberg-Marquardt method for computing critical points of the Ginzburg-Landau energy functional which models superconductivity. The algorithm is a blend of a Newton iteration with a Sobolev gradient descent method, and is equivalent to a trust-region method in which the trust-region radius is defined by a Sobolev metric. Numerical test results demonstrate the method t...
Many machine learning models are reformulated as optimization problems. Thus, it is important to solve large-scale optimization problems in big data applications. Recently, subsampled Newton methods have attracted much attention for optimization owing to their efficiency at each iteration; they rectify a weakness of the ordinary Newton method, namely its high cost per iteration, whil...
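A minimal sketch of the subsampled-Newton idea described here, assuming an L2-regularized logistic-regression objective: the gradient is computed on all data while the Hessian is estimated from a random subsample, which keeps each iteration cheap. The sample size, regularization, and dense linear solve are illustrative choices, not the paper's algorithm.

```python
# One subsampled Newton step for L2-regularized logistic regression, y in {-1,+1}.
import numpy as np

def subsampled_newton_step(w, X, y, sample_size=256, reg=1e-3, rng=None):
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Full gradient (cheap relative to forming the Hessian).
    p = 1.0 / (1.0 + np.exp(-y * (X @ w)))
    grad = X.T @ (-(1 - p) * y) / n + reg * w
    # Hessian estimated from a random subsample only.
    idx = rng.choice(n, size=min(sample_size, n), replace=False)
    Xs = X[idx]
    ps = 1.0 / (1.0 + np.exp(-y[idx] * (Xs @ w)))
    H = Xs.T @ (Xs * (ps * (1 - ps))[:, None]) / len(idx) + reg * np.eye(d)
    return w - np.linalg.solve(H, grad)
```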
The classical division algorithm for polynomials requires O(n^2) operations for inputs of size n. Using the reversal technique and Newton iteration, this can be improved to O(M(n)), where M(n) denotes a polynomial multiplication time. But the method requires that the degree l of the modulus x^l be a power of 2. If l is not a power of 2 and f(0) = 1, Gathen and Gerhard suggest computing the inverse f^{-1} modulo x^(2^⌈log l⌉),...
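The Newton iteration underlying fast division inverts the (reversed) divisor as a power series: given f with f(0) ≠ 0, the update g ← 2g − f·g² doubles the number of correct coefficients at each step. The sketch below uses plain quadratic-time multiplication where a fast M(n) routine would be used in practice, and truncates to min(2k, l) as one simple way to handle an l that is not a power of 2; computing modulo the next power of 2 and truncating at the end, as in the snippet, is the alternative.

```python
# Power-series inversion by Newton iteration: find g with f*g = 1 (mod x^l).
# Coefficient lists are low-order first.

def poly_mul_trunc(a, b, k):
    """Product of polynomials a and b, truncated to degree < k."""
    out = [0] * k
    for i, ai in enumerate(a[:k]):
        for j, bj in enumerate(b[: k - i]):
            out[i + j] += ai * bj
    return out

def newton_inverse(f, l):
    """Inverse of f modulo x^l, assuming f[0] != 0."""
    assert f[0] != 0
    g = [1.0 / f[0]]                     # inverse to precision 1
    k = 1
    while k < l:
        k = min(2 * k, l)                # double the precision each iteration
        fg = poly_mul_trunc(f, poly_mul_trunc(g, g, k), k)
        g = [2 * gi - fgi for gi, fgi in zip(g + [0] * (k - len(g)), fg)]
    return g[:l]

# Example: 1 / (1 - x) = 1 + x + x^2 + ...  (mod x^6)
print(newton_inverse([1, -1], 6))        # -> [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]
```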
Sugiura, H. and T. Torii, A method for constructing generalized Runge-Kutta methods, Journal of Computational and Applied Mathematics 38 (1991) 399-410. In the implementation of an implicit Runge-Kutta formula, we need to solve systems of nonlinear equations. In this paper, we analyze the Newton iteration process and a modified Newton iteration process for solving these equations. Then we propo...
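For an implicit Runge-Kutta step, the nonlinear system is solved for the stage values, and a "modified" Newton iteration typically freezes the Jacobian (and its factorization) over the step instead of recomputing it at every iterate. The sketch below shows this for the simplest one-stage case, the implicit (backward) Euler rule; the predictor, iteration count, and the explicit inverse standing in for an LU factorization are illustrative simplifications, not the iteration analyzed in the paper.

```python
# Modified Newton iteration for one implicit (backward) Euler step of y' = f(y):
# solve Y = y + h*f(Y) with the Jacobian evaluated and factored once per step.
import numpy as np

def implicit_euler_step(f, jac, y, h, newton_iters=8, tol=1e-10):
    n = len(y)
    M = np.eye(n) - h * jac(y)          # frozen iteration matrix
    Minv = np.linalg.inv(M)             # stand-in for a reused LU factorization
    Y = y.copy()                        # predictor: previous value
    for _ in range(newton_iters):
        residual = Y - y - h * f(Y)
        delta = Minv @ residual
        Y = Y - delta
        if np.linalg.norm(delta) < tol:
            break
    return Y

# Example: mildly stiff linear system y' = A y.
A = np.array([[-100.0, 1.0], [0.0, -1.0]])
print(implicit_euler_step(lambda y: A @ y, lambda y: A, np.array([1.0, 1.0]), 0.1))
```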