Search results for: conjugate gradient algorithm
Number of results: 895,617
The paper presents some open problems associated with nonlinear conjugate gradient algorithms for unconstrained optimization. Mainly, these problems concern the initial direction, the conjugacy condition, the step length computation, new formulas for computing the conjugate gradient parameter based on function values, the influence of the accuracy of the line search procedure, how we can take the pro...
This paper presents a hybrid algorithm for global optimization of the dynamic learning rate in multilayer feedforward neural networks (MLFNN). The effect of inexact line search on conjugacy was studied, and a generalized conjugate gradient method based on this effect was proposed and shown to have global convergence for error backpropagation in MLFNN. The descent property and global convergence were g...
A modification of the Dai-Yuan conjugate gradient algorithm is proposed. Under exact line search, the algorithm reduces to the original version of the Dai and Yuan computational scheme. Under inexact line search, the algorithm satisfies both the sufficient descent condition and the conjugacy condition. A global convergence result is proved when the Wolfe line search conditions are used. Computational result...
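The shape of such a method can be sketched as follows. This is a minimal illustration of a nonlinear conjugate gradient loop with the Dai-Yuan parameter and a simple Armijo backtracking line search (a stand-in for the full Wolfe conditions used in the paper); the test function, tolerances, and safeguards are assumptions, not the paper's exact scheme.

```python
# Illustrative nonlinear CG with the Dai-Yuan beta:
#   beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k))
# and an Armijo backtracking line search (not the full Wolfe conditions).

def grad_dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def dai_yuan_cg(f, grad, x0, tol=1e-8, max_iter=200):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                  # initial direction: steepest descent
    for _ in range(max_iter):
        if max(abs(gi) for gi in g) < tol:
            break
        gTd = grad_dot(g, d)
        if gTd >= 0:                       # safeguard: restart with steepest descent
            d = [-gi for gi in g]
            gTd = grad_dot(g, d)
        # Armijo backtracking line search
        alpha, c1, fx = 1.0, 1e-4, f(x)
        while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c1 * alpha * gTd:
            alpha *= 0.5
        x_new = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        den = grad_dot(d, y)
        beta = grad_dot(g_new, g_new) / den if den != 0 else 0.0
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

# usage on a convex quadratic f(x, y) = x^2 + 10 y^2, minimizer at the origin
f = lambda v: v[0] ** 2 + 10 * v[1] ** 2
grad = lambda v: [2 * v[0], 20 * v[1]]
x_star = dai_yuan_cg(f, grad, [3.0, -2.0])
```

The denominator guard and the steepest-descent restart are precautions for the Armijo-only search; under the Wolfe conditions assumed in the paper, the Dai-Yuan denominator is provably positive and the direction is a descent direction.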
This paper reports study results on neural network training algorithms based on numerical optimization techniques for multi-face detection in static images. The training algorithms involved are scaled conjugate gradient backpropagation, conjugate gradient backpropagation with Polak-Ribière updates, conjugate gradient backpropagation with Fletcher-Reeves updates, one-step secant backpropagation, and resilient ba...
Global Krylov subspace methods are among the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose the nested splitting conjugate gradient process for solving this equation. This method has inner and outer iterations; it employs the generalized conjugate gradient method as an inner iteration to approximate each outer iterate, while each...
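The inner iteration in such nested schemes is the classical conjugate gradient method for a symmetric positive definite system. Below is a minimal sketch of that inner solver for a plain matrix system A x = b (not the full coupled Sylvester form); the matrix, tolerances, and names are illustrative assumptions.

```python
# Classical conjugate gradient for an SPD system A x = b,
# the kind of inner iteration a nested splitting scheme runs
# to approximate each outer iterate.

def matvec(A, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in A]

def cg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = list(b)                            # residual r = b - A x, with x = 0
    p = list(r)                            # initial search direction
    rs = sum(ri * ri for ri in r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs / sum(pi * api for pi, api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:                   # stop on small squared residual
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# usage: a small SPD system whose exact solution is [1, 1]
A = [[4.0, 1.0], [1.0, 3.0]]
b = [5.0, 4.0]
x = cg(A, b)
```

In exact arithmetic CG terminates in at most n steps for an n-by-n system, which is why it is attractive as an inner solver: a few iterations per outer step are often enough.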
Given the importance of conjugate gradient methods for large-scale optimization, this study proposes a descent three-term conjugate gradient method based on an extended modified secant condition. In the proposed method, objective function values are used in addition to the gradient information. Also, it is established that the method is globally convergent without convexity assu...
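To show what a three-term direction looks like, here is an illustrative update in the style of Zhang, Zhou, and Li (not the paper's exact extended-secant formula): the third term is chosen so that the sufficient descent identity d_{k+1}^T g_{k+1} = -||g_{k+1}||^2 holds exactly, independent of the line search.

```python
# Illustrative three-term CG direction (Zhang-Zhou-Li style):
#   d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k,  y_k = g_{k+1} - g_k
# with beta and theta picked so the two extra terms cancel in
# g_{k+1}^T d_{k+1}, giving exact sufficient descent.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def three_term_direction(g_new, g_old, d_old):
    y = [gn - go for gn, go in zip(g_new, g_old)]
    gg = dot(g_old, g_old)
    beta = dot(g_new, y) / gg              # PRP-type parameter
    theta = dot(g_new, d_old) / gg         # third-term coefficient
    return [-gn + beta * di - theta * yi
            for gn, di, yi in zip(g_new, d_old, y)]

# the descent identity can be checked numerically on arbitrary vectors
g_old = [1.0, -2.0, 0.5]
g_new = [0.3, 0.8, -1.1]
d_old = [-1.0, 2.5, -0.4]
d_new = three_term_direction(g_new, g_old, d_old)
```

Because beta multiplies g_new^T d_old and theta multiplies g_new^T y with the same scaling, the two contributions cancel and d_new^T g_new equals -||g_new||^2 up to roundoff, without any assumption on the step length.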