A conjugate gradient based method for Decision Neural Network training
Authors
Abstract:
The Decision Neural Network is a new approach, based on artificial neural networks, for solving multi-objective decision-making problems. By using imprecise evaluation data, network training is improved and the number of required training data sets is reduced, since decision makers can simply estimate the necessary data. The existing training method is based on the gradient descent (backpropagation) method, and one of its main limitations is its slow convergence. In this paper, to increase the efficiency of Decision Neural Network training, a conjugate gradient based method is developed for network training. The key point in Decision Neural Network training is to keep the structures and parameters of the two sub-networks (multilayer perceptrons) identical throughout the training process. The efficiency of the proposed method is evaluated by estimating linear and nonlinear utility functions of multi-objective decision problems. The results of the proposed method are compared with the previously existing method and show that the proposed method converges faster and yields more favorable results.
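To make the proposed training scheme concrete, the sketch below shows nonlinear conjugate gradient training of a small multilayer perceptron on synthetic utility data. It is a minimal illustration only: the Polak-Ribiere update, Armijo backtracking line search, mean squared error loss, network sizes, and the idea of driving everything through one shared flat parameter vector (one simple way to keep two sub-networks with identical structures and parameters, as the abstract requires) are assumptions, not the paper's exact formulation.

```python
# Hedged sketch: nonlinear conjugate gradient (Polak-Ribiere) training of a small
# MLP through a single flat parameter vector. Sizes, loss, and line search are
# illustrative assumptions, not the decision neural network setup from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data standing in for (alternative scores -> utility) pairs.
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y = np.tanh(X @ np.array([0.7, -0.4, 0.2]))[:, None]

n_in, n_hid, n_out = 3, 8, 1
sizes = [(n_hid, n_in), (n_hid, 1), (n_out, n_hid), (n_out, 1)]
n_params = sum(r * c for r, c in sizes)

def unpack(w):
    """Split the flat parameter vector into W1, b1, W2, b2."""
    parts, i = [], 0
    for r, c in sizes:
        parts.append(w[i:i + r * c].reshape(r, c))
        i += r * c
    return parts

def loss_and_grad(w):
    """Mean squared error of the MLP and its gradient w.r.t. the flat vector w."""
    W1, b1, W2, b2 = unpack(w)
    A1 = np.tanh(X @ W1.T + b1.T)          # hidden activations, shape (N, n_hid)
    out = A1 @ W2.T + b2.T                 # network output, shape (N, 1)
    err = out - y
    loss = 0.5 * np.mean(err ** 2)
    # Backpropagation for the gradient of the loss.
    d_out = err / len(X)
    gW2 = d_out.T @ A1
    gb2 = d_out.sum(axis=0, keepdims=True).T
    dZ1 = (d_out @ W2) * (1.0 - A1 ** 2)   # tanh derivative
    gW1 = dZ1.T @ X
    gb1 = dZ1.sum(axis=0, keepdims=True).T
    grad = np.concatenate([g.ravel() for g in (gW1, gb1, gW2, gb2)])
    return loss, grad

def backtracking(w, d, f, g, alpha=1.0, c=1e-4, rho=0.5):
    """Simple Armijo backtracking line search along direction d."""
    while loss_and_grad(w + alpha * d)[0] > f + c * alpha * (g @ d):
        alpha *= rho
        if alpha < 1e-10:
            break
    return alpha

w = 0.1 * rng.standard_normal(n_params)
f, g = loss_and_grad(w)
d = -g
for k in range(200):
    alpha = backtracking(w, d, f, g)
    w = w + alpha * d
    f_new, g_new = loss_and_grad(w)
    # Polak-Ribiere coefficient, clipped at zero (PR+).
    beta = max(0.0, g_new @ (g_new - g) / (g @ g))
    d = -g_new + beta * d
    if g_new @ d >= 0:                     # safeguard: restart with steepest descent
        d = -g_new
    f, g = f_new, g_new
print("final training loss:", f)
```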
Similar resources
SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method
SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neura...
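As a rough illustration of the kind of hybrid the SAGRAD abstract refers to, the sketch below runs a random-walk simulated annealing phase to produce a warm-start weight vector that a (scaled) conjugate gradient phase would then refine. The loss interface, cooling schedule, and perturbation scale are illustrative assumptions and do not reflect SAGRAD's actual Fortran 77 implementation.

```python
# Hedged, high-level sketch of a simulated-annealing warm start that would be
# handed off to a conjugate-gradient refinement stage. Schedule and scales are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)

def anneal(loss, w0, steps=500, t0=1.0, cooling=0.995, scale=0.1):
    """Random-walk simulated annealing over a flat weight vector."""
    w, f, t = w0.copy(), loss(w0), t0
    for _ in range(steps):
        cand = w + scale * rng.standard_normal(w.shape)
        f_cand = loss(cand)
        # Accept improvements always, and uphill moves with Metropolis probability.
        if f_cand < f or rng.random() < np.exp((f - f_cand) / t):
            w, f = cand, f_cand
        t *= cooling
    return w  # warm start for a subsequent (scaled) conjugate gradient phase

# Usage with any scalar loss over weights, e.g.:
# w_start = anneal(lambda w: float(np.sum(w ** 2)), np.ones(5))
```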
A Spectral Version of Perry’s Conjugate Gradient Method for Neural Network Training
In this work, an efficient training algorithm for feedforward neural networks is presented. It is based on a scaled version of the conjugate gradient method suggested by Perry, which employs the spectral steplength of Barzilai and Borwein that contains second order information without estimating the Hessian matrix. The learning rate is automatically adapted at each epoch, using the conjugate gr...
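The Barzilai-Borwein spectral steplength mentioned above has a simple closed form, alpha_k = (s_{k-1}^T s_{k-1}) / (s_{k-1}^T y_{k-1}), with s_{k-1} = w_k - w_{k-1} and y_{k-1} = g_k - g_{k-1}. The sketch below computes it from two successive iterates; the non-positive-curvature guard and fallback value are assumptions for illustration, and Perry's conjugate direction itself is not reproduced.

```python
# Hedged sketch of the Barzilai-Borwein spectral steplength used as a learning rate.
import numpy as np

def bb_steplength(w_prev, w_curr, g_prev, g_curr, fallback=1e-3):
    """Barzilai-Borwein steplength alpha = (s.s)/(s.y) from two successive iterates."""
    s = w_curr - w_prev          # parameter displacement
    y = g_curr - g_prev          # gradient change
    denom = float(s @ y)
    if denom <= 0.0:             # guard: non-positive curvature estimate; use a fallback
        return fallback
    return float(s @ s) / denom
```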
A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems
In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that the search direction always satisfies the sufficient descent condition independent of the line search method, based on eigenvalue analysis. The globa...
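The sufficient descent condition referred to above is commonly written d_k^T g_k <= -c ||g_k||^2 for some constant c > 0. The snippet below is only an illustrative check of that inequality; the constant and vectors are placeholders rather than quantities from the cited methods.

```python
# Illustrative check of the sufficient descent condition d.g <= -c * ||g||^2.
import numpy as np

def satisfies_sufficient_descent(d, g, c=1e-4):
    """Return True if direction d is a sufficient descent direction for gradient g."""
    return float(d @ g) <= -c * float(g @ g)

# Example: the steepest descent direction -g satisfies the condition for any c <= 1.
g = np.array([0.3, -1.2, 0.5])
print(satisfies_sufficient_descent(-g, g))  # True
```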
Conjugate Gradient Methods in Training Neural Networks
Training of artificial neural networks is normally a time-consuming task due to the iterative search imposed by the implicit nonlinearity of the network behavior. To tackle the supervised learning of multilayer feedforward neural networks, the backpropagation algorithm has proven to be one of the most successful neural network algorithms. Although backpropagation training has proved to be effi...
A Conjugate Gradient-Neural Network Technique for Ultrasound Inverse Imaging
In this paper, a new technique for solving the two-dimensional inverse scattering problem for ultrasound inverse imaging is presented. Reconstruction of a two-dimensional object is accomplished using an iterative algorithm which combines the conjugate gradient (CG) method and a neural network (NN) approach. The neural network technique is used to exploit knowledge of the statistical characteris...
Journal
Volume 4, Issue 16
Pages 79-92
Publication date: 2019-02-20