Minimisation methods for training feedforward neural networks
Author
Abstract
Minimisation methods for training feedforward networks with back propagation are compared. Feedforward neural network training is a special case of function minimisation, where no explicit model of the data is assumed. Therefore, and due to the high dimensionality of the data, linearisation of the training problem through use of orthogonal basis functions is not desirable. The focus is on function minimisation on any basis. Quasi-Newton and conjugate gradient methods are reviewed, and the latter are shown to be a special case of error back propagation with momentum term. Three feedforward learning problems are tested with five methods. It is shown that, due to the fixed stepsize, standard error back propagation performs well in avoiding local minima. However, by using not only the local gradient but also the second derivative of the error function, a much shorter training time is required. Conjugate gradient with Powell restarts is shown to be the superior method.
Keywords: Feedforward neural network training, Numerical optimisation techniques, Neural function approximation, Error back propagation, Conjugate gradient, Quasi-Newton.
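The abstract's central point, that conjugate gradient is a special case of back propagation with a momentum term (the search direction is the negative gradient plus a data-dependent multiple of the previous direction), can be illustrated with a minimal sketch. The following toy example is not the paper's setup: it trains a small hand-coded 2-2-1 network on XOR using Polak-Ribiere conjugate gradient with a backtracking line search and periodic restarts to steepest descent (a simplified stand-in for the Powell restart criterion discussed in the paper). The network size, seed, and restart interval are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def unpack(w):
    """Split the flat 9-parameter vector into the 2-2-1 network's weights."""
    return w[:4].reshape(2, 2), w[4:6], w[6:8].reshape(2, 1), w[8:9]

def loss_grad(w):
    """Mean squared error and its gradient via back propagation."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    err = (h @ W2 + b2) - y
    L = 0.5 * np.mean(err ** 2)
    d_out = err / len(X)
    dh = (d_out @ W2.T) * (1 - h ** 2)
    g = np.concatenate([(X.T @ dh).ravel(), dh.sum(0),
                        (h.T @ d_out).ravel(), d_out.sum(0)])
    return L, g

def line_search(w, d, L, g, alpha=1.0):
    """Backtracking (Armijo) line search; returns 0.0 on failure."""
    while alpha > 1e-10:
        L_new, _ = loss_grad(w + alpha * d)
        if L_new <= L + 1e-4 * alpha * (g @ d):
            return alpha
        alpha *= 0.5
    return 0.0

def train_cg(w, iters=200, restart=9):
    """Polak-Ribiere conjugate gradient with a restart every `restart` steps."""
    L, g = loss_grad(w)
    d = -g
    for k in range(iters):
        w = w + line_search(w, d, L, g) * d
        L, g_new = loss_grad(w)
        # Momentum-like coefficient: beta = 0 reduces to plain steepest descent.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        if (k + 1) % restart == 0:
            beta = 0.0  # periodic restart
        d = -g_new + beta * d
        if g_new @ d >= 0:  # not a descent direction: fall back to gradient
            d = -g_new
        g = g_new
    return w, L

w0 = rng.normal(size=9)
w_cg, L_cg = train_cg(w0.copy())
print(f"final loss: {L_cg:.6f}")
```

The `beta * d` term plays exactly the role of the momentum term in back propagation; the difference is that conjugate gradient computes beta from the gradients rather than fixing it as a hyperparameter, and chooses the stepsize by line search rather than keeping it fixed.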
Similar Resources
Trajectory Methods for Neural Network Training
A new class of methods for training multilayer feedforward neural networks is proposed. The proposed class of methods draws from methods for solving initial value problems of ordinary differential equations, and belong to the subclass of trajectory methods. The training of a multilayer feedforward neural network is equivalent to the minimization of the network’s error function with respect to t...
Comparison of the Experimental and Predicted Data for Thermal Conductivity of Fe3O4/water Nanofluid Using Artificial Neural Networks
Objective(s): This study aims to evaluate and predict the thermal conductivity of iron oxide nanofluid at different temperatures and volume fractions by artificial neural network (ANN) and correlation using experimental data. Methods: Two-layer perceptron feedforward artificial neural network and backpropagation Levenberg-Marquardt (BP-LM) tra...
Avoiding Local Minima in Feedforward Neural Networks by Simultaneous Learning
Feedforward neural networks are particularly useful in learning a training dataset without prior knowledge. However, weight adjusting with a gradient descent may result in the local minimum problem. Repeated training with random starting weights is among the popular methods to avoid this problem, but it requires extensive computational time. This paper proposes a simultaneous training method wi...
Evaluating the Efficiency of Different Artificial Intelligence Methods and a Statistical Method in Runoff Estimation (Case Study: Shahid Noori Watershed, Kakhk, Gonabad)
Rainfall-runoff models are used in the field of hydrology and runoff estimation for many years, but despite existing numerous models, the regular release of new models shows that there is still not a model that can provide sophisticated estimations with high accuracy and performance. In order to achieve the best results, modeling and identification of factors affecting the output of the model i...
Data Mining Using Dynamically Constructed Recurrent Fuzzy Neural Networks
Abstract. Approaches to data mining proposed so far are mainly symbolic decision trees and numerical feedforward neural network methods. While decision trees give, in many cases, lower accuracy compared to feedforward neural networks, the latter show black-box behaviour, long training times, and difficulty in incorporating available knowledge. We propose to use an incrementally-generated rec...
Journal title:
- Neural Networks
Volume 7, Issue
Pages -
Publication date: 1994