On the Gradient Descent in Backpropagation and Its Substitution by a Genetic Algorithm
Abstract
Backpropagation is the standard training procedure for Multilayer Perceptron networks. It is based on gradient descent to minimize the network error. However, gradient descent can prevent the training from converging at all and restricts the choice of applicable transfer functions. This paper describes a complete substitution of gradient descent by a genetic algorithm, its implementation, and practical results. Characteristic features of both the genetic algorithm and the modified neural network are explained by means of several benchmark applications.
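To make the idea concrete, the following is a minimal sketch, not the paper's implementation, of training a small MLP with a genetic algorithm in place of gradient-descent weight updates. The XOR benchmark, the 2-4-1 network size, the fitness definition, and all GA parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR benchmark: a classic toy problem for MLP training
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
N_WEIGHTS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # length of a flat genome

def forward(genome, x):
    """Decode a flat genome into weights/biases and run the 2-4-1 network."""
    i = 0
    W1 = genome[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = genome[i:i + N_HID]; i += N_HID
    W2 = genome[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = genome[i:i + N_OUT]
    h = 1.0 / (1.0 + np.exp(-(x @ W1 + b1)))       # sigmoid hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output layer

def fitness(genome):
    """Negative mean squared error over the training set (higher is better)."""
    return -np.mean((forward(genome, X) - y) ** 2)

# GA with tournament selection, uniform crossover, Gaussian mutation, elitism
POP, GENERATIONS, MUT_RATE, MUT_SIGMA = 60, 300, 0.1, 0.3
population = rng.normal(0.0, 1.0, size=(POP, N_WEIGHTS))

for gen in range(GENERATIONS):
    scores = np.array([fitness(ind) for ind in population])
    new_pop = [population[scores.argmax()].copy()]            # keep the best individual
    while len(new_pop) < POP:
        a, b = rng.integers(0, POP, size=2)
        p1 = population[a] if scores[a] > scores[b] else population[b]
        a, b = rng.integers(0, POP, size=2)
        p2 = population[a] if scores[a] > scores[b] else population[b]
        mask = rng.random(N_WEIGHTS) < 0.5                     # uniform crossover
        child = np.where(mask, p1, p2)
        mutate = rng.random(N_WEIGHTS) < MUT_RATE              # Gaussian mutation mask
        child = child + mutate * rng.normal(0.0, MUT_SIGMA, N_WEIGHTS)
        new_pop.append(child)
    population = np.array(new_pop)

best = population[np.array([fitness(ind) for ind in population]).argmax()]
print("XOR predictions:", forward(best, X).ravel().round(3))
```

Because the genetic algorithm evaluates only the network error and never its derivatives, the transfer function in the sketch could be replaced by a non-differentiable one, which illustrates the relaxed restriction on transfer functions mentioned in the abstract.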
Similar Papers
A Note on the Descent Property Theorem for the Hybrid Conjugate Gradient Algorithm CCOMB Proposed by Andrei
In [1] (Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization, J. Optim. Theory Appl. 141 (2009) 249-264), an efficient hybrid conjugate gradient algorithm, CCOMB, is proposed for solving unconstrained optimization problems. However, the proof of Theorem 2.1 in [1] is incorrect due to an erroneous inequality which was used to establish the descent property for the s...
Forecasting GDP Growth Using ANN Model with Genetic Algorithm
Applying nonlinear models to the estimation and forecasting of economic models is now becoming more common, thanks to advances in computing technology. Artificial Neural Network (ANN) models, which are nonlinear local-optimizer models, have proven successful in forecasting economic variables. Most ANN models applied in economics use the gradient descent method as their learning algorithm. However, t...
An eigenvalue study on the sufficient descent property of a modified Polak-Ribière-Polyak conjugate gradient method
Based on an eigenvalue analysis, a new proof for the sufficient descent property of the modified Polak-Ribière-Polyak conjugate gradient method proposed by Yu et al. is presented.
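For readers unfamiliar with the term, the sufficient descent property referred to above is commonly written as follows; the constant c is generic, and the exact constant obtained in Yu et al.'s analysis may differ.

```latex
% Sufficient descent condition for search directions d_k with gradients g_k = \nabla f(x_k)
g_k^{\top} d_k \le -c \, \lVert g_k \rVert^{2}, \qquad c > 0, \quad \text{for all } k .
```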
Handwritten Character Recognition using Modified Gradient Descent Technique of Neural Networks and Representation of Conjugate Descent for Training Patterns
The purpose of this study is to analyze the performance of the backpropagation algorithm with changing training patterns and a second momentum term in feedforward neural networks. The analysis is conducted on 250 different words of three lowercase letters from the English alphabet. These words are presented to two vertical segmentation programs which are designed in MATLAB and based on portions (1...
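As background, the classical momentum term augments the backpropagation weight update as shown below (with learning rate eta and momentum coefficient alpha); the second momentum term analyzed in this study presumably adds a further history term, but its exact form is not given in this excerpt.

```latex
% Backpropagation weight update with the classical (first) momentum term
\Delta w(t) = -\eta \,\frac{\partial E}{\partial w} + \alpha \,\Delta w(t-1)
```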
A Genetic-Algorithms Based Evolutionary Computational Neural Network for Modelling Spatial Interaction Data
Building a feedforward computational neural network (CNN) model involves two distinct tasks: determination of the network topology and weight estimation. The specification of a problem-adequate network topology is a key issue and the primary focus of this contribution. Up to now, this issue has either been completely neglected in spatial application domains or tackled by search heuristics (see...